Browse Items (9795 total)
A novel approach using steganography and cryptography in business intelligence
In the information technology community, communication is a vital issue, and image transfer plays a major role in exchanging data over various insecure channels. Security concerns may prevent the direct sharing of information, and how different parties can cooperatively carry out data mining without breaching data security presents a challenge. Cryptography converts a plaintext message into an unintelligible cipher, while steganography embeds a message into a cover medium and hides its existence. Both schemes are effectively implemented on images. To facilitate safer image transfer, many cryptosystems have been proposed for image encryption. This chapter proposes an innovative image encryption method that is faster than existing approaches. The secret key is encrypted using an asymmetric cryptographic algorithm and embedded in the ciphered image using the LSB technique. Statistical analysis shows that the proposed approach is faster and achieves optimal accuracy. 2021, IGI Global. -
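The abstract combines asymmetric encryption of a secret key with least-significant-bit (LSB) embedding in the ciphered image. The snippet below is a minimal sketch of the LSB step only, assuming an 8-bit grayscale cover image held as a NumPy array; the function names and the 16-byte stand-in for the encrypted key are illustrative, not the chapter's implementation.

```python
import numpy as np

def lsb_embed(cover: np.ndarray, payload: bytes) -> np.ndarray:
    """Embed payload bits into the least significant bits of a uint8 image."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def lsb_extract(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes previously embedded in the image's least significant bits."""
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Example: hide a 16-byte stand-in for an encrypted key in a random 64x64 cover image.
cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
key = b"0123456789abcdef"
stego = lsb_embed(cover, key)
assert lsb_extract(stego, len(key)) == key
```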
A novel approach with matrix based public key crypto systems
This model introduces a new mechanism for public key cryptography. A generator matrix is used to generate a field over a large prime number, and the generator matrix, the prime number and a quaternary vector serve as global parameters. The generator matrix is raised to the power of a private key to produce the public key. Since the model is based on the discrete logarithm problem, which is a hard problem, the proposed algorithm supports user authenticity as well as the security and confidentiality of transmitted data. By construction, encryption operates on blocks of data and therefore consumes relatively few computing resources. In terms of complexity, a key length of about 72 bits is sufficient to resist cryptanalysis. 2017 Taru Publications. -
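The scheme raises a generator matrix to the power of a private key over a prime field, so recovering the exponent is a discrete-logarithm-style problem. Below is a hedged sketch of that key-generation step using square-and-multiply matrix exponentiation; the prime, the 2x2 generator matrix and the key values are illustrative placeholders (small enough for 64-bit arithmetic), not the parameters of the published algorithm.

```python
import numpy as np

P = 1_000_003  # illustrative prime; a real deployment would use a much larger field

def mat_pow_mod(G: np.ndarray, e: int, p: int) -> np.ndarray:
    """Square-and-multiply exponentiation of a matrix modulo p."""
    result = np.eye(G.shape[0], dtype=np.int64)
    base = G % p
    while e:
        if e & 1:
            result = (result @ base) % p
        base = (base @ base) % p
        e >>= 1
    return result

# Illustrative 2x2 generator matrix; a real scheme would pick one of large multiplicative order.
G = np.array([[2, 3], [1, 4]], dtype=np.int64)

private_key = 123457
public_key = mat_pow_mod(G, private_key, P)  # recovering private_key from this is a DLP-style problem

# Two parties can derive a shared secret because powers of the same G commute.
a, b = 54321, 98765
shared_ab = mat_pow_mod(mat_pow_mod(G, a, P), b, P)
shared_ba = mat_pow_mod(mat_pow_mod(G, b, P), a, P)
assert (shared_ab == shared_ba).all()
```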
A Novel Artificial Intelligence System for the Prediction of Interstitial Lung Diseases
Interstitial lung disease (ILD) encompasses a spectrum of more than 200 fatal lung disorders affecting the interstitium, contributing to substantial mortality rates. The intricate process of diagnosing ILDs is compounded by their diverse symptomatology and resemblance to other pulmonary conditions. High-resolution computed tomography (HRCT) is the primary diagnostic tool for ILD. In response, this study introduces a computational framework powered by artificial intelligence (AI) to support medical professionals in the identification and classification of ILD from HRCT images. Our dataset comprises 3045 HRCT images sourced from distinct patient cases. The proposed framework presents a novel approach to predicting ILD categories using a two-tier ensemble strategy that integrates outcomes from convolutional neural networks (CNNs), transfer learning, and machine learning (ML) models. This approach outperforms existing methods when evaluated on previously unseen data. Initially, ML models, including Logistic Regression, BayesNet, Stochastic Gradient Descent (SGD), RandomForest, and J48, are deployed to detect ILD based on statistical measures derived from HRCT images. The J48 model achieves an accuracy of 93.08%, with the diagnostic significance of diagonal-wise standard deviation emphasized through feature analysis. Further refinement is achieved by applying Marker-controlled Watershed Transformation Segmentation and Morphological Masking techniques to the HRCT images, raising the accuracy to 95.73% with the J48 model. The computational framework also embraces deep learning techniques, introducing three new CNN models that achieve test accuracies of 94.08%, 92.04%, and 93.72%. Additionally, we evaluate five full-training and transfer learning models (InceptionV3, VGG16, MobileNetV2, VGG19, and ResNet50), with the InceptionV3 model achieving peak accuracy of 78.41% for full training and 92.48% for transfer learning. In the concluding phase, a soft-voting ensemble mechanism amplifies the training outcomes, yielding ensemble test accuracies of 76.56% for full-training models and 92.81% for transfer learning models. Notably, the ensemble comprising the three newly introduced CNN models attains the highest test accuracy of 97.42%. This research is poised to drive advancements in ILD diagnosis, presenting a resilient computational framework that enhances accuracy and ultimately improves patient outcomes. 2024, The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd. -
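The final stage described above is a soft-voting ensemble over the three CNNs. As a small illustration (not the study's code), the sketch below averages per-class probabilities from three hypothetical models and takes the argmax; the probability arrays are invented stand-ins for CNN outputs on HRCT images.

```python
import numpy as np

def soft_vote(prob_list):
    """Average class-probability predictions from several models and take the argmax."""
    avg = np.mean(np.stack(prob_list, axis=0), axis=0)  # (n_models, n_samples, n_classes) -> mean over models
    return np.argmax(avg, axis=1)

# Hypothetical probabilities from three CNNs for 4 HRCT images and 3 ILD classes.
p1 = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4], [0.2, 0.5, 0.3]])
p2 = np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.2, 0.5, 0.3], [0.1, 0.6, 0.3]])
p3 = np.array([[0.8, 0.1, 0.1], [0.1, 0.6, 0.3], [0.4, 0.3, 0.3], [0.3, 0.4, 0.3]])
print(soft_vote([p1, p2, p3]))  # predicted ILD class index per image
```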
A novel assessment of bio-medical waste disposal methods using integrating weighting approach and hesitant fuzzy MOOSRA
Bio-medical waste (BMW) management is a highly important precaution for human health and the environment. Medical practitioners follow several disposal treatments in medical waste management, and here a few of these treatments are considered as alternatives. When assessing them, it is necessary to evaluate whether each disposal treatment method is safe and hygienic. Each alternative is therefore evaluated against social acceptance, technology and operation, environmental protection, cost, noise and health risk, and the best alternative is chosen. Since disposing of BMW and selecting the best treatment method involve uncertain critical assessments, the problem leads to multi-criteria decision making (MCDM); decision makers often hesitate when giving their judgements, so a hesitant MCDM method is used here. In this work, five BMW disposal methods used in current medical practice are taken as alternatives and ranked using six criteria weights to select the best method. The main aim of this paper is to propose a new hesitant fuzzy weighting technique, named the Hesitant Fuzzy Subjective and Objective Weight Integrated Approach (HF-SOWIA), together with a new hesitant fuzzy ranking method, named Hesitant Fuzzy Multi-Objective Optimization on the basis of Simple Ratio Analysis (HF-MOOSRA). After evaluation, the results show that autoclaving is the best BMW disposal treatment method. Furthermore, a sensitivity analysis is carried out to observe how the ranking of alternatives changes when the relative importance of the subjective and objective weights changes. 2020 Elsevier Ltd -
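MOOSRA-style methods rank alternatives by the ratio of weighted beneficial criteria to weighted non-beneficial (cost) criteria. The sketch below is a crisp (non-fuzzy) simplification of that ratio step, not the paper's hesitant fuzzy HF-MOOSRA; the decision matrix, the equal weights and the split into beneficial and cost criteria are all illustrative.

```python
import numpy as np

def moosra_rank(X, weights, beneficial):
    """Crisp MOOSRA: score = weighted sum of beneficial criteria / weighted sum of cost criteria."""
    norm = X / np.sqrt((X ** 2).sum(axis=0))        # vector normalization per criterion
    W = norm * weights
    benefit = W[:, beneficial].sum(axis=1)
    cost = W[:, [not b for b in beneficial]].sum(axis=1)
    score = benefit / cost
    return np.argsort(-score), score                 # alternatives ordered best-first, plus scores

# Hypothetical decision matrix: 5 disposal methods x 6 criteria (all values positive).
X = np.random.rand(5, 6) + 0.1
weights = np.full(6, 1 / 6)
beneficial = [True, True, True, False, False, False]  # e.g. last three are cost/noise/health-risk criteria
ranking, scores = moosra_rank(X, weights, beneficial)
print(ranking)
```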
A Novel Assessment of Healthcare Waste Disposal Methods: Intuitionistic Hesitant Fuzzy MULTIMOORA Decision Making Approach
Waste produced from medical facilities includes a blend of hazardous materials that can pose risks to human and ecological receptors. Inadequate management of healthcare waste can endanger healthcare workers, patients, public health, communities and the wider environment. Hence, proper management of healthcare waste is imperative to reduce the associated health and environmental risks. In this paper, we extend the MULTIMOORA decision making method with intuitionistic hesitant fuzzy sets to evaluate healthcare waste treatment methods. An intuitionistic hesitant fuzzy set is a generalized form of a hesitant fuzzy set; it captures the uncertainty of the data in a single framework and takes more information into account. The MULTIMOORA method consists of three parts, namely the ratio system, the reference point approach and the full multiplicative form. Among the optimal ranking methods, the IHF-MULTIMOORA method is uncomplicated and can be used practically with high-dimensional intuitionistic hesitant fuzzy sets. For pathological, pharmaceutical, sharp, solid and chemical wastes, the preferred waste disposal methods are deep burial, incineration, autoclave, deep burial, and chemical disinfection, respectively. 2013 IEEE. -
A Novel Auto Encoder- Network- Based Ensemble Technique for Sentiment Analysis Using Tweets on COVID- 19 Data
The advances in digitalization have made social media sites like Twitter and Facebook very popular. People can express their opinions on any subject freely across social media networking sites. Sentiment analysis, also termed emotion artificial intelligence or opinion mining, can be considered a technique for analyzing the mood of the general public on any subject. Twitter sentiment analysis can be carried out by considering tweets on any subject. The objective of this research is to implement a novel algorithm to classify tweets as positive or negative, based on machine learning, deep learning, a nature-inspired algorithm and artificial neural networks. The proposed algorithm is an ensemble of the decision tree algorithm, gradient boosting, logistic regression and a genetic algorithm based on the auto-encoder technique. The dataset under consideration consists of tweets on COVID-19 from May 2021. 2024 Taylor & Francis Group, LLC. -
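As a rough illustration of the ensemble idea only (not the paper's auto-encoder/genetic-algorithm pipeline), the sketch below soft-votes a decision tree, gradient boosting and logistic regression over TF-IDF features; the four example tweets and their labels are invented placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus; a real study would use labelled COVID-19 tweets.
tweets = ["vaccines are working well", "hospitals are overwhelmed and it is awful",
          "recovery rates improving", "cases rising again, terrible news"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

ensemble = VotingClassifier(
    estimators=[("dt", DecisionTreeClassifier(max_depth=3)),
                ("gb", GradientBoostingClassifier(n_estimators=50)),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="soft",  # average predicted class probabilities across the three models
)
model = make_pipeline(TfidfVectorizer(), ensemble)
model.fit(tweets, labels)
print(model.predict(["great news about the vaccine"]))
```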
A novel automated method for coconut grading based on audioception
The quality of the coconuts used for various purposes is of utmost importance. Demand for better quality products is constantly on the rise due to improvements in people's standard of living. A bad coconut can go unnoticed by traders, as it is hard to decide whether a coconut is good or bad by relying only on its external appearance. Traditionally, quality assessment is carried out manually with the help of three senses: sight, hearing and smell. In the proposed work, a sound processing technique is used to automate this process, overcoming the drawbacks of manual inspection so that it can be used in large godowns and warehouses. The proposed method assesses coconut quality purely on the basis of audioception. While creating the database, coconuts varying in size, shape, color and water content were collected from several places. Features are extracted from the sound pattern produced by the dropped coconut, which forms the basis for classification. Sequential Minimal Optimization (SMO), Dagging and Naive Bayes classifiers were used, and the results obtained were found to be encouraging. 2005 ongoing JATIT & LLS. -
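A minimal sketch of the kind of pipeline described here: extract a few spectral features from a drop-impact recording and train classifiers on them. The feature set, synthetic signals and labels are placeholders, and the SVC merely stands in for the SMO classifier (SMO being a solver for SVM training); the paper's actual features are not reproduced.

```python
import numpy as np
from sklearn.svm import SVC            # SVMs are commonly trained with an SMO-style solver
from sklearn.naive_bayes import GaussianNB

def drop_sound_features(signal: np.ndarray, sr: int) -> np.ndarray:
    """Simple spectral features (centroid, bandwidth, energy) from a drop-impact recording."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sr)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    bandwidth = np.sqrt(np.sum(((freqs - centroid) ** 2) * spectrum) / np.sum(spectrum))
    energy = np.sum(signal ** 2) / signal.size
    return np.array([centroid, bandwidth, energy])

# Synthetic stand-ins for recorded drop sounds (real data would be labelled good/bad coconuts).
sr = 8000
rng = np.random.default_rng(0)
X = np.array([drop_sound_features(rng.normal(size=sr), sr) for _ in range(20)])
y = rng.integers(0, 2, size=20)

svm = SVC(kernel="rbf").fit(X, y)
nb = GaussianNB().fit(X, y)
print(svm.predict(X[:3]), nb.predict(X[:3]))
```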
A novel automated method for the detection of strangers at home using parrot sound
The sound produced by parrots can be used to gather information about their behavior. Studying variation in these sounds is important for obtaining indirect information about the characteristics of the birds. This paper is the first of a series on analyzing bird sounds and establishing meaningful relationships within them. It proposes a probabilistic method for classifying audio features over short intervals of time, and applies digital sound processing to check whether parrots behave differently when a stranger arrives. The sound is classified into different classes and the emotions of the birds are analyzed. The time-frequency content of the signal is examined using a spectrogram, which helps to analyze the parrot vocalization: the mechanical origin of the sound and its modulation are deduced from the spectrogram, and the spectrogram is also used to detect and analyze the amplitude modulation, frequency modulation and frequency of the sound. This research and its findings will help bird lovers understand bird behavior and plan accordingly, and a greater understanding of birds will help them feed and care for their birds. BEIESP. -
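The time-frequency analysis described above can be sketched with a standard short-time Fourier spectrogram. The snippet below builds a spectrogram of a synthetic frequency-modulated tone standing in for a parrot call and traces its dominant frequency over time; the sampling rate, window sizes and the synthetic call are illustrative choices, not the paper's recordings.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthetic stand-in for a parrot vocalization: a frequency-modulated tone around 2 kHz.
sr = 22050
t = np.linspace(0, 2.0, int(2.0 * sr), endpoint=False)
call = np.sin(2 * np.pi * (2000 + 500 * np.sin(2 * np.pi * 3 * t)) * t)

# Short-time Fourier spectrogram: frequency bins x time frames.
f, times, Sxx = spectrogram(call, fs=sr, nperseg=1024, noverlap=512)

dominant = f[np.argmax(Sxx, axis=0)]  # dominant frequency in each short time frame
print(dominant[:5])                   # trace of the frequency modulation over time
```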
A Novel Back-Propagation Neural Network for Intelligent Cyber-Physical Systems for Wireless Communications
Wireless sensor networks, which play a significant role in monitoring complex environments that change rapidly over time, are used here together with artificial intelligence methods. This complex behavior arises both from external factors and from the device designers themselves. Sensor networks often use machine learning techniques to adapt to such conditions, eliminating the need for excessive redesign. Cyber-physical systems (CPS) have emerged as a promising option for improving physical-virtual interactions, and the quality of the information a system processes is primarily determined by its function. Combining Artificial Intelligence (AI) with Cyber-Physical Systems (CPSs) in buildings brings many benefits: a CPS-based indoor environment supports various design schemes combining measurement and intelligent buildings, with a control system consisting of detection, tracking, execution, and communication modules. The Multi-Agent System (MAS) is the smallest control unit; it simulates interactions among neurons and provides information flexibly, so multi-agents are used to mimic the interactions between human neurons. In this paper, the information world of CPSs is built on the fundamental principle of granular formal concepts, and the theory of granular computing is investigated. The calculation module uses a Back-Propagation Neural Network (BPNN) for pattern recognition and classification of environmental information. The normalized root mean square error, peak signal-to-noise ratio, mean square error, and mean absolute error are chosen as objective assessment criteria, and the effectiveness of the proposed system is demonstrated. 2024 IETE. -
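A back-propagation neural network learns by pushing the output error backwards through the layers and adjusting the weights by gradient descent. The toy sketch below trains a tiny two-layer sigmoid network on XOR with explicit back-propagation; the architecture, learning rate and task are illustrative, not the paper's classification setup.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy pattern-classification task (XOR) for a two-layer back-propagation network.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros(1)
lr = 0.5

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)          # forward pass through the hidden layer
    out = sigmoid(h @ W2 + b2)        # network output
    d_out = (out - y) * out * (1 - out)        # error gradient back through the output sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)         # back-propagated gradient at the hidden layer
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]
```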
A novel chemical route for low-temperature curing of natural rubber using 2,4 dihydroxybenzaldehyde: improved thermal and tensile properties
A novel method for chemically curing natural rubber (NR) using 2,4-dihydroxybenzaldehyde (DHB) at low temperatures has been discovered. Adding varying amounts of DHB to NR increases the crosslinking between the NR molecular chains. The chemical reaction between NR molecular chains and DHB was confirmed through Fourier transform infrared (FTIR) and proton nuclear magnetic resonance (NMR) spectra. From the thermogravimetric analysis (TGA), the thermal stability and activation energy of degradation were determined. The variation in glass transition temperature (Tg), as an indication of increased crosslink density reducing the mobility of rubber chains, was confirmed through differential scanning calorimetry (DSC). The addition of DHB to natural rubber latex (NRL) significantly enhanced the thermal stability of the rubber. An increase of 5.52% in the activation energy was observed upon the addition of 80 mL of DHB into NRL when compared to the uncured sample. Furthermore, the tensile properties, in terms of tensile strength and modulus of elasticity, were drastically increased through DHB crosslinking. The tensile strength of the rubber was found to increase while its elongation at break decreased, owing to the formation of crosslinks between the macromolecular chains. NR cured with 80 mL of DHB exhibited the best tensile and thermal properties among the series of cured samples: the tensile strength increased by 390% and the elongation at break decreased by 10%. The advantage of this curing method is that it is an effective technique for crosslinking NR directly from NR latex at a comparatively low temperature. Iran Polymer and Petrochemical Institute 2024. -
A Novel CNN Approach for Condition Monitoring of Hydraulic Systems
In the dynamic landscape of Industry 4.0, the ascendancy of predictive analytics methods is a pivotal paradigm shift. The persistent challenge of machine failures poses a substantial hurdle to the seamless functioning of factories, compelling the need for strategic solutions. Traditional reactive maintenance checks, though effective, fall short in the face of contemporary demands. Forward-thinking leaders recognize the significance of integrating data-driven techniques to not only minimize disruptions but also enhance overall operational productivity while mitigating redundant costs. The innovative model proposed herein harnesses the robust capabilities of Convolutional Neural Networks (CNN) for predictive analytics. Distinctively, it selectively incorporates the most influential variables linked to each of the four target conditions, optimizing the model's predictive precision. The methodology involves a meticulous process of variable extraction based on a predetermined threshold, seamlessly integrated with the CNN framework. This nuanced and refined approach epitomizes a forward-looking strategy, empowering the model to discern intricate failure patterns with a high degree of accuracy. 2024 IEEE. -
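The approach described keeps only the sensor variables most associated with each target condition (via a predetermined threshold) before feeding them to a CNN. The sketch below shows one plausible realization: a correlation-based channel screen (using the median correlation as a stand-in for the paper's threshold) followed by a small 1D CNN in Keras. The synthetic data, the threshold and the architecture are all assumptions for illustration, not the published model.

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-in: 200 cycles x 60 time steps x 10 sensor channels, one binary target condition.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 60, 10))
y = rng.integers(0, 2, size=200)

# Threshold-based variable selection: keep channels whose mean signal correlates most with the target.
channel_means = X.mean(axis=1)                                   # (cycles, channels)
corr = np.array([abs(np.corrcoef(channel_means[:, c], y)[0, 1]) for c in range(X.shape[2])])
keep = corr > np.median(corr)                                    # illustrative threshold choice
X_sel = X[:, :, keep]

# Small 1D CNN on the selected channels.
model = tf.keras.Sequential([
    tf.keras.Input(shape=X_sel.shape[1:]),
    tf.keras.layers.Conv1D(16, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_sel, y, epochs=3, batch_size=32, verbose=0)
```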
A novel congestion-aware approach for ECC based secured WSN multicasting
Multicasting in Wireless Sensor Networks greatly reduces the communication complexity between the base station and a set of sensor nodes deployed in a given region. It reduces the number of packets to be sent, thus minimizing the chance of congestion. Still, congestion can arise from improper channel utilization, resulting in low throughput. In this paper, we address the issue of congestion with reference to WSN multicasting. Simulation results show that our approach is better in terms of throughput and delay compared with existing approaches. 2018, Institute of Advanced Scientific Research, Inc. All rights reserved. -
A Novel Deep Learning Approach for Identifying Interstitial Lung Diseases from HRCT Images
Interstitial lung diseases (ILDs) are defined as a group of lung diseases that affect the interstitium and cause death among humans worldwide. It is more serious in underdeveloped countries as it is hard to diagnose due to the absence of specialists. Detecting and classifying ILD is a challenging task and many research activities are still ongoing. High-resolution computed tomography (HRCT) images have essentially been utilized in the diagnosis of this disease. Examining HRCT images is a difficult task, even for an experienced doctor. Information Technology, especially Artificial Intelligence, has started contributing to the accurate diagnosis of ILD from HRCT images. Similar patterns of different categories of ILD confuse doctors in making quick decisions. Recent studies have shown that corona patients with ILD also go on to sudden death. Therefore, the diagnosis of ILD is more critical today. Different deep learning approaches have positively impacted various image classification problems recently. The main objective of this proposed research work was to develop a deep learning model to classify the ILD categories from HRCT images. This proposed work aims to perform binary and multi-label classification of ILD using HRCT images on a customized VGG architecture. The proposed model achieved a high test accuracy of 95.18% on untrained data. 2022, The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd. -
A Novel Deep Learning Approach for Retinopathy Prediction Using Multimodal Data Fusion
In contemporary research on mild cognitive impairment (MCI) and Alzheimer's disease (AD), the predominant approach involves using two data modalities to make predictions about AD stages. However, there is growing recognition of the potential benefits of fusing multiple data modalities to obtain a more comprehensive perspective on AD staging. To address this, we have employed deep learning techniques to holistically assess data from various sources, including genetic data (single nucleotide polymorphisms (SNPs)), imaging (magnetic resonance imaging (MRI)), and clinical tests, with the objective of categorizing patients into distinct groups: AD, MCI, and controls (CN). Convolutional neural networks have been employed for the analysis of imaging data. Moreover, we have introduced a novel approach to data interpretation, enabling the identification of the most influential features learned by these deep models; this interpretation process incorporates clustering and perturbation analysis, shedding light on the aspects of the data that contribute most to our classification results. Our experiments, conducted on the ADNI dataset, have yielded compelling results. Furthermore, our findings underscore the significant advantage of integrating multi-modality data over relying solely on two-modality models, as it leads to improvements in accuracy, precision, recall, and mean F1 scores. 2024, Ismail Saritas. All rights reserved. -
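A common way to fuse modalities like those listed is early fusion: concatenate per-subject feature vectors from each source and train a single classifier. The sketch below does exactly that on synthetic stand-ins for image embeddings, SNP counts and clinical scores; the feature sizes, the classifier and the data are illustrative assumptions, not the paper's architecture.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical feature blocks per subject: CNN image embeddings, SNP genotypes, clinical scores.
rng = np.random.default_rng(3)
n = 300
img_feats = rng.normal(size=(n, 64))                           # e.g. penultimate-layer CNN features from MRI
snp_feats = rng.integers(0, 3, size=(n, 50)).astype(float)     # 0/1/2 minor-allele counts
clin_feats = rng.normal(size=(n, 10))                          # cognitive test scores, demographics
y = rng.integers(0, 3, size=n)                                 # 0 = CN, 1 = MCI, 2 = AD

# Early fusion: concatenate the modalities into one feature vector per subject.
X = np.hstack([img_feats, snp_feats, clin_feats])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```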
A novel discrete slash family of distributions with application to epidemiology informatics data
This study puts forward a new class of discrete distribution that can be used by epidemiologists and medical scientists to model data relating to epidemiology informatics. The proposed distribution is superior to traditional discrete modeling alternatives, viz., the discrete Weibull and geometric distributions, in terms of model fit and flexibility in handling heavy-tailed datasets. It is a flexible three-parameter discrete distribution, grounded in the slash family, and can be considered a refined extension of the geometric distribution. We thoroughly explored the mathematical properties of this novel distribution. The model's parameters are estimated using the maximum likelihood method, and the validity of the methodology is confirmed through an extensive simulation study. Furthermore, the practical utility of the distribution for modelling epidemiology informatics data was examined with the help of eight different datasets representing three dimensions of epidemiology informatics, viz., mortality, infection and medication statistics. The Author(s), under exclusive licence to Springer Nature Switzerland AG 2024. -
A Novel Dynamic Physical Layer Impairment-Aware Routing and Wavelength Assignment (PLI-RWA) Algorithm for Mixed Line Rate (MLR) Wavelength Division Multiplexed (WDM) Optical Networks
The ever-increasing global Internet traffic will inevitably lead to a serious upgrade of current optical networks' capacity. The legacy infrastructure can be enhanced not only by increasing capacity but also by adopting advanced modulation formats with increased spectral efficiency at higher data rates. In a transparent mixed-line-rate (MLR) optical network, different line rates, on different wavelengths, can coexist on the same fiber. Migration to data rates higher than 10 Gbps requires the implementation of phase modulation schemes. However, the co-existing on-off keying (OOK) channels cause critical physical layer impairments (PLIs) to the phase-modulated channels, mainly due to cross-phase modulation (XPM), which in turn limits the network's performance. In order to mitigate this effect, a more sophisticated PLI-aware Routing and Wavelength Assignment (PLI-RWA) scheme needs to be adopted. In this paper, we investigate the critical impairment for each data rate and the way it affects the quality of transmission (QoT). In view of the above, we present a novel dynamic PLI-RWA algorithm for MLR optical networks. The proposed algorithm is compared through simulations with shortest-path and minimum-hop routing schemes, and the simulation results show that its performance is better than that of the existing schemes. 2016 by De Gruyter. -
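In impairment-aware RWA, route selection uses a cost that reflects physical-layer penalties rather than hop count alone, followed by a wavelength assignment on the chosen path. The sketch below illustrates the idea on a toy five-node topology with made-up distances, XPM penalties and busy wavelengths; the cost formula and the first-fit assignment are simplifications for illustration, not the paper's algorithm.

```python
import networkx as nx

# Toy topology: each link has a length, an illustrative XPM impairment penalty
# (e.g. from co-propagating 10 Gbps OOK channels), and a set of wavelengths already in use.
links = [("A", "B", 100, 0.0, {1}), ("B", "C", 80, 2.5, set()),
         ("A", "D", 120, 0.0, set()), ("D", "C", 90, 0.5, {1, 2}),
         ("B", "D", 60, 3.0, set())]
G = nx.Graph()
for u, v, length_km, xpm_penalty, used in links:
    # Impairment-aware link cost: distance plus a weighted penalty term (weights are illustrative).
    G.add_edge(u, v, cost=length_km + 50 * xpm_penalty, used=used)

# Routing: pick the path with the lowest combined cost rather than the fewest hops.
path = nx.shortest_path(G, "A", "C", weight="cost")

# First-fit wavelength assignment: lowest wavelength free on every link of the chosen path.
busy = set().union(*(G[u][v]["used"] for u, v in zip(path, path[1:])))
wavelength = next(w for w in range(1, 41) if w not in busy)
print(path, wavelength)
```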
A novel dynamic Physical Layer Impairment-Aware Routing and Wavelength Assignment (PLI-RWA) algorithm for Mixed Line Rate (MLR) Wavelength Division Multiplexed (WDM) optical networks /
Journal of Optical Communications, Vol. 37, Issue 4, pp. 349-356, ISSN: 2191-6322 (Online), 0173-4911 (Print). -
A Novel Edge-Based Trust Management System for the Smart City Environment Using Eigenvector Analysis
The proposed Edge-based Trust Management System (E-TMS) uses an Eigenvector-based approach for eliminating the security threats present in the Internet of Things (IoT) enabled smart city environment. In most existing trust management systems, the trust aggregation process depends entirely on the direct trust ratings obtained from both legitimate and malicious neighboring IoT devices. E-TMS employs an edge-assisted two-level trust computation approach to ensure a malicious-free trust evaluation of IoT devices, aiming to remove false contributions from the aggregated trust data. It utilizes the properties of the Eigenvector to identify compromised IoT devices, and the Eigenvector analysis also helps to avoid false detection. The analysis involves comparing all the contributed trust data about every single connected device: a spectral matrix is generated from the contributions, the received trust values are scaled by the obtained spectral values, and the absolute sum of the scaled values retains only the true contributions. The accurate identification of false data removes the effect of malicious contributions from the final trust value of a connected IoT device. Since the final trust value calculated by the edge node contains only trustworthy data, the prediction of malicious nodes is accurate. Finally, the performance of E-TMS has been validated; its throughput and network resilience are higher than those of the existing system. 2022 G. Nagarajan et al. -
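One way to realize the eigenvector idea is to weight each reporting device by how consistent its trust ratings are with everyone else's, using the principal eigenvector of the reporters' similarity matrix. The sketch below does this on a made-up rating matrix with one malicious reporter; it illustrates the spectral-weighting concept only, not the E-TMS computation itself.

```python
import numpy as np

# Hypothetical trust contributions: rows = reporting devices, columns = rated devices (values in [0, 1]).
# The last reporter is malicious and submits inflated/deflated ratings.
R = np.array([
    [0.90, 0.80, 0.70, 0.20],
    [0.85, 0.80, 0.75, 0.25],
    [0.90, 0.75, 0.80, 0.20],
    [0.10, 0.10, 0.10, 0.95],   # outlier contribution
])

# Spectral step: the principal eigenvector of the reporters' similarity (Gram) matrix
# weights each reporter by how consistent it is with the others.
S = R @ R.T
_, vecs = np.linalg.eigh(S)
w = np.abs(vecs[:, -1])          # principal eigenvector (eigh returns eigenvalues in ascending order)
w = w / w.sum()

# Aggregated trust per rated device, with the malicious contribution down-weighted.
trust = w @ R
print(np.round(w, 3), np.round(trust, 3))
```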
A Novel Energy-Efficient Hybrid Optimization Algorithm for Load Balancing in Cloud Computing
In the field of Cloud Computing (CC), load balancing is a method applied to distribute workloads and computing resources appropriately. It enables organizations to manage the needs of their applications or workloads effectively by spreading resources across numerous PCs, networks, or servers. This research paper offers a unique load balancing method named FFBSO, which combines the Firefly algorithm (FF), which reduces the search space, with Bird Swarm Optimization (BSO). BSO takes inspiration from the collective behavior of birds, representing tasks as birds and VMs as destination food patches. In the cloud environment, tasks are regarded as autonomous and non-preemptive. The BSO algorithm maps tasks onto suitable VMs by identifying the best possible positions. Simulation findings reveal that the FFBSO algorithm beats other approaches, obtaining the lowest average response time of 13 ms and maximum resource usage of 99%, while attaining a makespan of 35 s. 2023 IEEE. -
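The core objects in such schedulers are an assignment of tasks to VMs and a makespan objective; population-based metaheuristics then move candidate assignments toward better ones. The sketch below is a heavily simplified, firefly-flavoured population search over assignments with invented task lengths and VM speeds; it only illustrates the encoding and objective, not the actual FFBSO operators.

```python
import numpy as np

rng = np.random.default_rng(5)
n_tasks, n_vms = 40, 5
task_len = rng.integers(100, 1000, size=n_tasks)   # task lengths (e.g. million instructions)
vm_speed = rng.integers(500, 1500, size=n_vms)      # VM speeds (e.g. MIPS)

def makespan(assign):
    """Completion time of the busiest VM for a task-to-VM assignment vector."""
    loads = np.zeros(n_vms)
    for t, vm in enumerate(assign):
        loads[vm] += task_len[t] / vm_speed[vm]
    return loads.max()

# Candidates move toward the brightest (lowest-makespan) candidate, loosely mimicking the
# firefly attraction step; the swarm/foraging details of BSO are omitted.
pop = rng.integers(0, n_vms, size=(20, n_tasks))
for _ in range(200):
    fitness = np.array([makespan(ind) for ind in pop])
    best = pop[fitness.argmin()].copy()
    for i in range(len(pop)):
        mask = rng.random(n_tasks) < 0.3             # copy ~30% of genes from the best candidate
        pop[i, mask] = best[mask]
        mut = rng.random(n_tasks) < 0.05             # small random perturbation
        pop[i, mut] = rng.integers(0, n_vms, size=mut.sum())

print(round(makespan(best), 2))
```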
A Novel Ensemble based Model for Intrusion Detection System
In the present interconnected world, the increasing reliance on computer networks has made them susceptible to multiple security threats and intrusions. Intrusion Detection Systems (IDS) are essential for shielding these networks by detecting and mitigating potential threats in real time. This research paper presents an in-depth study of employing the Random Forest algorithm to build an effective intrusion detection system. The proposed IDS uses the power of the Random Forest algorithm, a popular ensemble learning technique, to detect various types of intrusions in network traffic effectively. The algorithm integrates multiple decision trees to produce a robust and accurate classifier, capable of handling the large-scale and complex datasets typical of network traffic. The proposed system can be used in various industries and sectors to protect critical assets, ensuring the uninterrupted operation of computer networks. Evolving cyber threats have encouraged further research into ensemble analytics methods to increase the resilience of Intrusion Detection Systems in an ever-changing threat landscape. 2024 IEEE.
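A minimal sketch of a Random Forest-based IDS in scikit-learn: train on flow-level features and report per-class detection metrics. The synthetic feature matrix and labels below are placeholders for a real benchmark such as NSL-KDD or CICIDS2017, and the hyperparameters are illustrative rather than the paper's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in for flow records (duration, bytes, packets, flag counts, ...).
rng = np.random.default_rng(7)
X = rng.normal(size=(2000, 12))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=2000) > 1).astype(int)  # 1 = intrusion

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# Ensemble of decision trees; each tree sees a bootstrap sample and random feature subsets.
clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```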