Browse Items (11810 total)
Research challenges in self-driving vehicle by using internet of things (IoT)
This article summarizes the benefits, safety hazards, and limitations of owning a self-driving vehicle (SDV). Finding a way to use an SDV that minimizes the risk of accidents is important for public and road safety. The reported accident rate for self-driving vehicles is lower than that for regular vehicles, but the total mileage accumulated by self-driving vehicles is nowhere close to that of conventional fossil-fuelled vehicles, so the comparison is not yet conclusive. Although there is no proof that self-driving vehicles will never cause accidents, it is important to note that in many of the incidents in which they have been involved, the self-driving vehicle was not the cause; that is, the fault cannot be attributed purely to the machine. The safety record of self-driving vehicles has proven to be among the best, and the number of serious accident-related injuries in self-driving vehicles has remained below the average level. The Internet of Things plays a major role in developing the self-driving vehicle concept. © 2021 IEEE.
A secured predictive analytics using genetic algorithm and evolution strategies
In the banking sector, a major challenge is retaining customers. Banks offer various schemes to attract new customers and retain existing ones. Customer details are captured by features such as account number, credit score, balance, credit card usage, and salary deposited. In this work, an attempt is made to identify, using a genetic algorithm, the churn rate of customers likely to leave the organization. Banks may use the outcome of this work to take measures that reduce the churn rate of customers likely to leave. Modern cyber-security attacks have seriously affected users. Cryptography is one technique through which assurance, authentication, integrity, availability, confidentiality, and identification of user data can be maintained, and the security and privacy of data can be provided to the user. A detailed study of identity-based encryption, which removes the need for certificates, is also presented. © 2020 by IGI Global. All rights reserved.
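The churn-prediction step above can be sketched with a toy genetic algorithm that evolves weights for a simple risk rule. Everything here (the customer tuples, the thresholds 700/20000/50000, the fitness rule) is an invented illustration, not the paper's model or data:

```python
import random

random.seed(42)

# Toy dataset: (credit_score, balance, salary) -> churned (1) or stayed (0).
# Values are invented for illustration only.
CUSTOMERS = [
    ((620, 1000, 30000), 1), ((710, 54000, 60000), 0),
    ((580, 200, 25000), 1),  ((760, 80000, 90000), 0),
    ((640, 5000, 40000), 1), ((700, 30000, 55000), 0),
]

def fitness(weights):
    """Count customers whose churn label matches a weighted-score rule."""
    correct = 0
    for (score, bal, sal), churned in CUSTOMERS:
        risk = (weights[0] * (700 - score) + weights[1] * (20000 - bal)
                + weights[2] * (50000 - sal))
        correct += int((risk > 0) == bool(churned))
    return correct

def evolve(pop_size=20, generations=30):
    # Random initial population of weight triples.
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)   # crossover: average parents
            children.append([(x + y) / 2 + random.gauss(0, 0.1)
                             for x, y in zip(a, b)])
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "of", len(CUSTOMERS), "customers classified correctly")
```

A real application would replace the hand-made risk rule with a trained classifier and evolve over its feature subset or hyperparameters.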
A novel approach with matrix based public key crypto systems
In this model, a new mechanism is proposed for public-key cryptography. A generator matrix is used to generate a field with a large prime number. The generator matrix, the prime number, and a quaternary vector are used as global variables. The generator matrix is raised to the power of a private key to generate the public key. Since the model is based on the discrete logarithm problem, which is a hard problem, the proposed algorithm supports authenticity of users as well as security and confidentiality of the transmitted data. By construction, encryption is performed on blocks of data, so the algorithm consumes few computing resources. In terms of complexity, a key length of about 72 bits is needed to provide sufficient strength against cryptanalysis. © 2017 Taru Publications.
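The "generator matrix raised to a private key" idea can be sketched as a Diffie-Hellman-style exchange with modular matrix exponentiation. The 2×2 matrix, the small prime, and the shared-key derivation below are illustrative assumptions, not the paper's actual parameters or protocol:

```python
# Toy key generation: the public key is a generator matrix raised to a
# private exponent mod a prime, relying on a matrix discrete-log problem.
P = 1_000_003            # public prime modulus (tiny, demo only)
G = [[2, 3], [1, 4]]     # public generator matrix (assumed)

def mat_mul(a, b, p):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) % p
             for j in range(2)] for i in range(2)]

def mat_pow(m, e, p):
    """Square-and-multiply exponentiation of a 2x2 matrix mod p."""
    result = [[1, 0], [0, 1]]        # identity matrix
    while e:
        if e & 1:
            result = mat_mul(result, m, p)
        m = mat_mul(m, m, p)
        e >>= 1
    return result

alice_priv, bob_priv = 123457, 654321
alice_pub = mat_pow(G, alice_priv, P)      # G^a  (Alice's public key)
bob_pub = mat_pow(G, bob_priv, P)          # G^b  (Bob's public key)
# Powers of the same matrix commute, so both sides derive G^(a*b).
shared_a = mat_pow(alice_pub, bob_priv, P)
shared_b = mat_pow(bob_pub, alice_priv, P)
assert shared_a == shared_b
```

Recovering the exponent from `alice_pub` is the discrete-log problem the abstract's security argument rests on; a real deployment would use far larger parameters than this demo.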
Fully homomorphic encryption with matrix based digital signature standard
In this work, a novel mechanism is proposed for encrypting data in asymmetric mode. A generator matrix is used to generate a field with a large prime number. The generator matrix, the prime number, and a ternary vector are used as global variables, from which the public key and sub-keys are calculated. Using the global variables and the private key, a digital signature algorithm is proposed that supports authenticity of users. The mechanism can also be used for homomorphic encryption, where computations are carried out on ciphertext to produce an encrypted result which, when decrypted, matches the result of the same operations performed on the plaintext. © 2017 Taru Publications.
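The paper's signature lives in its matrix setting; as a hedged stand-in, a toy Schnorr-style discrete-log signature over plain integers illustrates the same "authenticity from a private exponent" idea. The group parameters P = 23, Q = 11, G = 2 are demo values with no security:

```python
import hashlib
import random

# Toy Schnorr-style signature in a tiny discrete-log group (assumed analogue
# of the paper's matrix construction, not its actual scheme).
P, Q, G = 23, 11, 2          # G has prime order Q in Z_P*

def h(data):
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

priv = random.randrange(1, Q)       # signer's private key
pub = pow(G, priv, P)               # public key G^priv

def sign(msg):
    k = random.randrange(1, Q)      # per-signature nonce
    r = pow(G, k, P)
    e = h(str(r).encode() + msg)    # challenge binds r and the message
    s = (k + priv * e) % Q
    return e, s

def verify(msg, e, s):
    # Reconstruct r = G^s * pub^(-e), then re-derive the challenge.
    r = (pow(G, s, P) * pow(pub, (-e) % Q, P)) % P
    return h(str(r).encode() + msg) == e

e, s = sign(b"hello")
assert verify(b"hello", e, s)
assert not verify(b"tampered", e, s)
```

Verification works because `G^s * pub^(-e) = G^(k + priv*e - priv*e) = G^k`, recovering the original commitment; a forged message changes the hash and fails the check.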
Cubic spline curve public key cryptography
In this work, a cubic spline curve is considered for encrypting data in asymmetric mode. The cubic spline curve is used to generate a set of points under known initial boundary conditions; this set of points is taken as the generator points. Given the field, the generator points, chosen time and space steps, and a chosen private key, the public key is generated based on the discrete logarithm problem. With such a public/private key pair, data can be transmitted securely with authenticity of users as well as security and confidentiality of the transmitted data. By construction, encryption is performed on blocks of data, so the algorithm consumes few computing resources. In terms of complexity, a key length of about 120 bits is needed to provide sufficient strength against cryptanalysis. © 2017 Taru Publications.
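The point-generation step can be sketched with a natural cubic spline (second derivative zero at both ends, one plausible reading of the "known boundary conditions"); the knots below are invented. The spline is solved with the standard Thomas tridiagonal algorithm and then sampled to produce the generator points:

```python
def natural_cubic_spline(xs, ys):
    """Second derivatives M at the knots for a natural cubic spline
    (M[0] = M[-1] = 0), via the Thomas tridiagonal algorithm."""
    n = len(xs)
    h = [xs[i + 1] - xs[i] for i in range(n - 1)]
    a, b, c, d = [0.0] * n, [1.0] * n, [0.0] * n, [0.0] * n
    for i in range(1, n - 1):                 # interior-knot equations
        a[i], b[i], c[i] = h[i - 1], 2 * (h[i - 1] + h[i]), h[i]
        d[i] = 6 * ((ys[i + 1] - ys[i]) / h[i] - (ys[i] - ys[i - 1]) / h[i - 1])
    for i in range(1, n):                     # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    m = [0.0] * n
    m[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):            # back substitution
        m[i] = (d[i] - c[i] * m[i + 1]) / b[i]
    return m

def spline_eval(xs, ys, m, x):
    """Evaluate the spline at x inside [xs[0], xs[-1]]."""
    i = max(j for j in range(len(xs) - 1) if xs[j] <= x)
    h = xs[i + 1] - xs[i]
    t = x - xs[i]
    return ((m[i] * (xs[i + 1] - x) ** 3 + m[i + 1] * t ** 3) / (6 * h)
            + (ys[i] / h - m[i] * h / 6) * (xs[i + 1] - x)
            + (ys[i + 1] / h - m[i + 1] * h / 6) * t)

xs, ys = [0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 0.0, 1.0]   # invented knots
m = natural_cubic_spline(xs, ys)
# Sample 31 generator points along the curve.
points = [(x / 10, spline_eval(xs, ys, m, x / 10)) for x in range(31)]
```

In the scheme described by the abstract, these sampled points would then feed a discrete-log key-generation step like the one for the companion matrix-based papers.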
Message from IEEE InC4 2023 Program Chair
[No abstract available]
Message from IEEE InC4 2024 Program Chair
[No abstract available]
Security mechanisms in cloud computing-based big data
In the existing system, data is encrypted before being passed to the cloud and stored. For any operation on the data, it is decrypted and the computation is then performed; this decrypted data is vulnerable and prone to misuse. After the computations are done, the data and the result are encrypted and stored back in the cloud. This creates overhead in the system and increases time complexity. In this chapter, the authors aim to reduce the overhead of repeated encryptions and decryptions by allowing computations to be performed directly on the encrypted text. The result obtained by performing computations on encrypted data is the same as that obtained on the original plaintext. This security solution is well suited to the processing and retrieval of encrypted data, with broad applicability to the security of data transmission and the storage of data. The work is secured further with additional concepts such as probabilistic and timestamp-based encryption processes. © 2021, IGI Global.
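Computing directly on ciphertext, as described above, can be sketched with a toy additively homomorphic scheme in the style of Paillier (the chapter does not name a specific cryptosystem, so this is an assumed stand-in). The 6-bit primes provide no real security:

```python
import math
import random

# Tiny Paillier-style demo: multiplying two ciphertexts yields an
# encryption of the SUM of the plaintexts, so the cloud can add values
# without ever decrypting them. Primes are demo-sized only.
p, q = 47, 59
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # decryption constant

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:        # r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n2                # homomorphic addition on ciphertexts
assert decrypt(c_sum) == 42
```

Because `encrypt` draws a fresh random `r` each call, the scheme is also probabilistic: encrypting the same value twice gives different ciphertexts, matching the chapter's probabilistic-encryption note.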
A machine learning model for population analysis among different states in India which influences the socio, demographic and economic needs of society
In this work, data from the 2011 census is used to identify the state with the greatest influence on the population census among the states studied. Data is considered from Madhya Pradesh, followed by Uttar Pradesh, Bihar, Bengal, and Orissa; similar case studies are also done for the South Indian and North-Eastern states. A genetic algorithm is applied to find the optimal location for the given study. A fitting function is calculated for the 2011 population data using the Lagrange interpolation technique. This fitting function is given as input to the genetic algorithm to find the state with the maximum influence on population growth among the different states of India, as per the case studies done. © BEIESP.
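The Lagrange-interpolation fitting step can be sketched as follows; the (year, population) pairs are hypothetical placeholders, not the census figures:

```python
def lagrange(points):
    """Return a function interpolating the given (x, y) points
    using the Lagrange basis polynomials."""
    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term *= (x - xj) / (xi - xj)   # basis polynomial L_i(x)
            total += term
        return total
    return f

# Hypothetical (year, population in millions) pairs, NOT census data.
data = [(1991, 66.2), (2001, 73.3), (2011, 81.0)]
fit = lagrange(data)
print(round(fit(2006), 2))   # interpolated mid-decade estimate
```

The resulting `fit` function passes exactly through the known data points, and it is this continuous fitting function that a genetic algorithm could then search over.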
Revocable and Secure Multi-Authority Attribute-Encryption Scheme
Security is an important factor, as many systems nowadays generate and process huge amounts of data. This also leads many of us to rely on a third-party service provider for storing sensitive and confidential data. In such outsourcing, the data owner encrypts the data and stores it in a third-party storage system. In this paper, we provide solutions for two main problems. The first is the key escrow problem: the attribute authority itself can access the data, because the attributes and secret keys are known to the authority. To solve it, we propose a multi-authority system with elliptic curve cryptography. The second issue addressed in this paper is the revocation problem: when someone leaves the system, they should be prohibited from accessing subsequent data (forward security), and when someone joins the system, they should be prevented from accessing previously shared data (backward security). We address both forward and backward security. To solve this problem, we use the Lagrange interpolation technique for generating and verifying secret keys. With this technique, the secret key is dynamically altered before being used for encryption, which achieves greater security. © 2023, Ismail Saritas. All rights reserved.
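The paper does not publish its exact key construction, but the standard way Lagrange interpolation generates and reconstructs secret keys is Shamir-style (t, n) threshold sharing, sketched here with assumed demo parameters:

```python
import random

# Shamir-style (T, N) sharing over a prime field: the secret key is the
# polynomial's constant term, recovered by Lagrange interpolation at x = 0.
PRIME = 2_147_483_647        # a Mersenne prime, fine for a demo
T, N = 3, 5                  # any 3 of 5 shares rebuild the key (assumed)

def make_shares(secret):
    # Random degree-(T-1) polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(T - 1)]
    def poly(x):
        return sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, N + 1)]

def recover(shares):
    # Lagrange interpolation evaluated at x = 0.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(123456789)
assert recover(shares[:3]) == 123456789      # any T shares suffice
assert recover(shares[2:]) == 123456789
```

Re-running `make_shares` with fresh random coefficients yields a new share set for the same (or a rotated) key, which is the "dynamically altered key" idea used for revocation.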
Security Analysis for a Revocable Multi-Authority ABE-Attribute-Based Mechanism
Due to the tremendous increase in data, groups and even organizations store data with third-party providers to solve storage problems. Ciphertext-policy attribute-based encryption (CP-ABE) helps to outsource data: the data is encrypted at the data owner's end and uploaded to third-party storage with some access policy. In normal identity-based encryption, if a data owner wants to send information to a data user, it is sent under some identity of the data user, such as a mail id, so that only that particular user can read the message. The main problem is that the data owner must know each user's identity. In some organizations, for instance, where a data owner wants to send a message to a group of people with an identical designation, it can be sent using the users' attributes with attribute-based encryption. Here, the data owner does not need to know the specific details of each user; with the help of attributes and the provided access policy, the intended users can access the message. This research focuses on three aspects of CP-ABE: the access policy, the number of attribute authorities, and revocation. The currently existing access policies are not secure because of their linearity: shares are always calculated using the same linear equation. For this problem, this work develops a non-linear secret-sharing model that increases the confidentiality of the scheme. © 2024 Seventh Sense Research Group.
Predicting and improvising the performance of rocket nozzle throat using machine learning algorithms
This paper studies one-dimensional heat conduction with thermophysical properties (thermal conductivity k, density ρ, and specific heat Cp) that vary with temperature. The physical problem is characterized by a cylinder of infinite length and thickness L, with a net heat flux imposed at x = 0 and the other end insulated. The temperatures at the insulated end are measured by placing thermocouples. Since the temperatures at the heated end are very high, they cannot be measured with thermocouples, which would burn away; the problem is therefore initialized with known sensor values near the insulated end. By predicting values with an ARIMA model, the temperature distribution in the rocket nozzle throat (RNT) system is calculated. The outcome is then processed with a machine learning algorithm, a genetic algorithm, to identify the optimal sensor position, which helps improve the performance of the RNT. © 2020, Institute of Advanced Scientific Research, Inc. All rights reserved.
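The paper uses an ARIMA model on the sensor series; as a dependency-free illustration of the same predict-ahead idea, here is a least-squares AR(1) fit extrapolated forward. The temperature readings are invented, not the paper's measurements:

```python
# Stand-in for the ARIMA step: fit y[t] = a + b*y[t-1] by ordinary least
# squares and extrapolate. Readings below are hypothetical sensor values.
readings = [310.0, 318.5, 326.2, 333.1, 339.4, 345.0]

x = readings[:-1]            # y[t-1]
y = readings[1:]             # y[t]
n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n
b = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
     / sum((xi - mean_x) ** 2 for xi in x))
a = mean_y - b * mean_x

# Forecast the next three unmeasured temperatures.
forecast, last = [], readings[-1]
for _ in range(3):
    last = a + b * last
    forecast.append(round(last, 1))
print(forecast)
```

A production version would use a full ARIMA implementation (e.g. the one in statsmodels) with differencing and moving-average terms; the AR(1) stand-in keeps the sketch self-contained.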
Statistical tests for key strength identification in cryptography
A cryptographic scheme involves three algorithms: one for encryption of plaintext to ciphertext, one for decryption of ciphertext back to plaintext, and a third for generation of the key. The key generation algorithm works on the principle of randomness. In this work, the randomness of the key is studied using statistical methods, namely the runs-up and runs-down test, the runs test (above and below the mean), the chi-square test, and the autocorrelation test, to assess its usability in cryptographic applications. © 2020 IJSTR.
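One of the tests named above, the runs test about the mean, can be sketched directly; for a 0/1 key stream it compares the observed number of runs with the count expected under randomness:

```python
import math

def runs_test(bits):
    """Runs test (above/below the mean) for a 0/1 key stream: returns the
    z-score of the observed run count against its expected value."""
    n1 = sum(bits)
    n0 = len(bits) - n1
    runs = 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)
    expected = 2 * n0 * n1 / (n0 + n1) + 1
    variance = (2 * n0 * n1 * (2 * n0 * n1 - n0 - n1)) / \
               ((n0 + n1) ** 2 * (n0 + n1 - 1))
    return (runs - expected) / math.sqrt(variance)

# A strictly alternating stream has far too many runs to be random...
z_alt = runs_test([0, 1] * 50)
# ...while a long constant block has far too few.
z_block = runs_test([0] * 50 + [1] * 50)
print(round(z_alt, 2), round(z_block, 2))
```

A |z| beyond roughly 1.96 rejects randomness at the 5% level, so both extreme streams above fail, as a key generator's output should not.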
Design of a new curve based cipher
This work aims to develop a curve-based cryptographic scheme that supports confidentiality and authentication with low computing resources and equal security. © Taru Publications.
Rethinking human values
Just as there are many cultures within the world, so also are there many practices, beliefs, myths, values, and traditions within each culture. These unique ways of being can often present challenging frames of reference that may prevent a whole perspective from being attained. This essay examines the contextual formation of culture and the intricacies fundamental to the search for universal values. It also sheds light on some of the major and extreme forms of cultural practice that may make such a goal difficult to achieve.
Phytochemical analysis and antioxidant activity of leaf extracts of some selected plants of the family acanthaceae
The present era of scientific research has witnessed innumerable evidence of the immense potential of medicinal plants. In the present investigation, phytochemical analysis of Phlogacanthus pubinervius T. Anderson, Adhatoda vasica (L.) Nees, Phlogacanthus thyrsiflorus Nees, Phlogacanthus curviflorus (Wall.) Nees, and Ruellia tuberosa L. was carried out on methanol extracts of the different plants. The analysis estimated the quantities of phenols, carbohydrates, tannins, flavonoids, and proteins. The antioxidant properties of these plants were analysed using the DPPH method. The concentration of a plant sample required to decrease the DPPH concentration by 50% was calculated by interpolation from linear regression analysis and denoted the IC50 value (µg/ml). Qualitative analysis showed the presence of alkaloids, tannins, saponins, proteins, carbohydrates, and phenols in all the sample extracts. The highest amounts of tannins and phenols were observed in P. thyrsiflorus. P. pubinervius (77.83%), A. vasica (74.81%), P. curviflorus (94.20%), and R. tuberosa (70.78%) showed the highest DPPH-scavenging antioxidant activity at 150 µg/ml of methanol extract. The high scavenging activities of these plants add value to their medicinal properties. The presence of high amounts of phytochemical compounds suggests that the plants are rich in medicinal compounds and can be used extensively to extract natural compounds. © Kripasana & Xavier (2020). This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited (https://creativecommons.org/licenses/by/4.0/).
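The IC50-by-linear-regression step described above can be sketched numerically; the concentration/inhibition pairs below are invented illustration values, not the study's measurements:

```python
# IC50 from a linear regression of % DPPH inhibition on concentration,
# then solving the fitted line for 50% inhibition. Data is hypothetical.
concs = [25.0, 50.0, 100.0, 150.0]        # µg/ml
inhibition = [21.0, 38.0, 66.0, 90.0]     # % DPPH scavenging

n = len(concs)
mean_x = sum(concs) / n
mean_y = sum(inhibition) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(concs, inhibition))
         / sum((x - mean_x) ** 2 for x in concs))
intercept = mean_y - slope * mean_x

# Concentration giving 50% inhibition on the fitted line.
ic50 = (50.0 - intercept) / slope
print(round(ic50, 1), "µg/ml")
```

A lower IC50 means a smaller dose achieves half-maximal scavenging, i.e. a stronger antioxidant extract.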
Classification of adaptive back propagation neural network along with fuzzy logic in chronic kidney disease
A steady deterioration in kidney function over months or years is known as chronic kidney disease (CKD). Early identification of CKD is crucial and has a substantial influence on slowing the progression of the patient's condition through a range of techniques, such as pharmacological intervention in moderate cases and haemodialysis and renal transplant in severe cases. The outcomes reflect the present state of the patient's kidneys. It is suggested to develop a system for detecting chronic renal disease using machine learning. Since feature selection is an NP-hard problem, metaheuristic algorithms are typically used to find the best feature sets. Tabu search (TS) is frequently used for both local and global search. In this study, we employ a new hybrid of TS with stochastic diffusion search (SDS)-based feature selection. Classification is then performed with an adaptive backpropagation neural network combined with fuzzy logic (ABPNN-ANFIS), where fuzzy logic is used to combine the ABPNN findings. Consequently, these techniques can aid experts in determining the stage of chronic renal disease. The adaptive neuro-fuzzy inference system (ABPNN-ANFIS) was implemented in MATLAB. The outcomes demonstrate that the suggested ABPNN-ANFIS is 98% accurate in terms of efficiency. © 2024
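The role tabu search plays in the pipeline above, searching feature subsets while forbidding recently visited states, can be sketched as follows. The scoring function is a made-up surrogate (reward an assumed set of "useful" feature indices, penalise subset size), not a real CKD model:

```python
import random

random.seed(7)

# Minimal tabu-search sketch for feature selection. The surrogate score
# and the "useful" index set are invented for illustration only.
USEFUL = {0, 2, 5, 7}            # indices the surrogate treats as informative
N_FEATURES = 10

def score(subset):
    return 3 * len(subset & USEFUL) - len(subset)

def tabu_search(iterations=100, tabu_len=5):
    current = frozenset(random.sample(range(N_FEATURES), 3))
    best = current
    tabu = []                    # short memory of recently visited subsets
    for _ in range(iterations):
        # Neighbourhood: flip one feature in or out, skipping tabu states.
        neighbours = [s for s in (current ^ {f} for f in range(N_FEATURES))
                      if s not in tabu]
        current = max(neighbours, key=score)
        tabu.append(current)
        tabu = tabu[-tabu_len:]  # keep only the most recent states
        if score(current) > score(best):
            best = current
    return best

best = tabu_search()
print(sorted(best), score(best))
```

The tabu list is what lets the search accept temporarily worse moves without cycling back, which is the "local and global search" behaviour the abstract credits TS with.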
The Persistence of Untouchability: Working Conditions of Dalit Journalists in India
This article can be viewed as an extension of the Oxfam report of 2019, which revealed that Indian news media is dominated by upper castes and marked by the near absence of Dalit and Adivasi journalists. Using critical political economy as a framework and undertaking qualitative interviews with self-identifying Dalit journalists, this article examines their conditions of work in mainstream news media. In addition to the problems faced by journalists in general, this research reveals that Dalit journalists experience considerable psychological stress and extra intensity of work. They tolerate a toxic work environment that results in mental trauma and have to navigate rigid caste networks. Supplementing in-depth interviews with secondary data, the article argues that the conditions within which Dalit journalists function contain all dimensions of untouchability: exclusion, humiliation, and exploitation. The article concludes with a call to end this untouchability and to revive the Working Journalists Act to ameliorate the conditions of work of India's fourth estate. Specific legislation is required to ensure favourable conditions of work for Dalit journalists. Further, the article calls for a theoretical revamping of critical political economy to include caste, particularly when analysing South Asian media. © The Author 2024
Global iPhone Local Labour: Exploring ICT Production, Labour and Cultural Production
A theory of value pertinent to the contemporary iPhone era focuses on formal and informal labour circuits. This study extends this framework by examining a labour dispute in an iPhone factory near Bangalore, delving into its dissemination through media and the broader critical political economy surrounding recent iPhone production in India. Furthermore, it incorporates a geographical perspective into the circuit framework to illustrate the movement of capital and labour in Bangalore, rekindling discussions on core-periphery dynamics in the context of capital and labour migration. Further, this research builds upon the typology of worker-generated content by illustrating a specific category of such content within the iPhone labour dispute. Utilising a critical political economy of media approach, this article aims to assess the broader implications of the updated framework and to open new avenues for research within the emerging field of information communication technologies, cultural production and labour. © 2024 South Asian University.