Browse Items (2150 total)
An Energy Optimized Clustering approach for Communication in Vehicular Cloud Systems
Vehicular cloud networks exhibit rapidly changing topology and high mobility, consistent with their character as ad hoc networks. Vehicular nodes are often difficult to monitor, which creates internetworking problems owing to power inadequacy during real computation. The resulting energy wastage during routing degrades node lifetime. This study therefore proposes a new clustering-based energy optimization method to enhance the efficiency of vehicular communication: k-medoid cluster analysis combined with the dragonfly optimization approach is applied to the system model. Simulations record that network lifetime, packets delivered, processing delay, and throughput all improve under the proposed model. © 2023 IEEE.
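The clustering step named above can be sketched with a plain k-medoids (PAM-style) routine. This is a minimal illustration, not the paper's actual model: it omits the dragonfly optimization stage, and the vehicle coordinates are invented for the example.

```python
import random

def k_medoids(points, k, iters=20, seed=0):
    """Partition 2-D points (e.g., vehicle positions) around k medoids.

    Plain PAM-style sketch: assign each point to its nearest medoid,
    then swap each medoid for the cluster member that minimizes total
    intra-cluster distance, until the medoid set stops changing.
    """
    rng = random.Random(seed)
    medoids = rng.sample(points, k)

    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    for _ in range(iters):
        # Assignment step: nearest medoid wins.
        clusters = {m: [] for m in medoids}
        for p in points:
            nearest = min(medoids, key=lambda m: dist(p, m))
            clusters[nearest].append(p)
        # Update step: pick the member with the lowest total distance.
        new_medoids = []
        for m, members in clusters.items():
            best = min(members or [m],
                       key=lambda c: sum(dist(c, q) for q in members))
            new_medoids.append(best)
        if set(new_medoids) == set(medoids):
            break
        medoids = new_medoids
    return medoids

# Two spatially separated groups of hypothetical vehicles.
vehicles = [(0, 0), (1, 1), (0, 1), (10, 10), (11, 10), (10, 11)]
heads = k_medoids(vehicles, k=2)
```

In a vehicular setting the returned medoids would act as cluster heads; the energy-aware refinement of that choice is what the paper's dragonfly stage would add.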
An Enhanced Data-Driven Weather Forecasting using Deep Learning Model
Predicting the present climate and the evolution of the ecosystem is more crucial than ever because of the huge climatic shift occurring in nature. Weather forecasts are normally made by compiling numerical data on the current atmospheric state and applying scientific knowledge of atmospheric processes to project how the atmosphere will evolve. Rainfall forecasting is among the most popular research subjects today because of the complexity of the required data processing and its applications in weather monitoring. In this work, temperature data from four different states were collected and deep learning methods were applied to predict temperature levels in the forthcoming months. Accuracy ranged from 92.5% to 97.2% across the different state temperature datasets. © 2023 IEEE.
An Enhanced Deep Learning Model for Duplicate Question Detection on Quora Question pairs using Siamese LSTM
The question answering platform Quora has millions of users, which increases the probability of questions being asked with similar intent. One question may be structured in two different ways by two users, and answering similar questions repeatedly impacts user experience. Manual filtration of such questions is a tedious task, so Quora attempts to detect and remove these duplicate questions using a Random Forest model, which is not completely effective. As Quora contains question answers in the form of text data, different Natural Language Processing techniques are used to transform the text into numerical vectors. In this research, log loss acts as the primary metric to evaluate different models. The primary contribution is a Siamese network that processes two questions in parallel and finds a vector representation of each question; the vectors computed by this method enable similarity detection that is more effective than existing models. GloVe word embeddings are used to capture the semantic similarity between two questions. A random classifier is built as the base model, and logistic regression, linear SVM, and XGBoost models are used to reduce the log loss. Finally, a Siamese LSTM is proposed, which reduces the loss dramatically. © 2022 IEEE.
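As a small illustration of the evaluation metric the abstract names, the snippet below computes binary log loss by hand. The labels and predicted probabilities are invented; the point is only that the metric rewards calibrated confidence and heavily penalizes confident mistakes.

```python
import math

def log_loss(y_true, y_prob, eps=1e-15):
    """Binary cross-entropy ('log loss') used to score duplicate detectors.

    y_true: 1 if the question pair is a duplicate, else 0.
    y_prob: the model's predicted probability of 'duplicate'.
    Probabilities are clipped to (eps, 1 - eps) so log(0) never occurs.
    """
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

# A confident, correct model scores much lower than a 50/50 guesser.
confident = log_loss([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.2])
random_guess = log_loss([1, 0, 1, 0], [0.5, 0.5, 0.5, 0.5])
```

The uninformed guesser lands exactly at ln 2 ≈ 0.693, which is the baseline any of the paper's models must beat.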
An enhanced framework to design intelligent course advisory systems using learning analytics
Education plays an anchor role in shaping an individual's career. To achieve success along the academic path, care should be taken in choosing an appropriate course for the learner. This research work presents a framework for designing a course advisory system efficiently. The design approach is based on the overlap of learning analytics, academic analytics, and personalized systems, and provides an efficient way to build a course advisory system. The mapping of course advisory systems onto the reference model of learning analytics is also discussed, with the course advisory system considered an enhanced personalized system. The challenges involved in implementing a course advisory system are also elaborated in this paper. © Springer Science+Business Media Singapore 2017.
An effective dynamic scheduler for reconfigurable high speed computing system
High Speed Computing is a promising technology that meets ever-increasing real-time computational demands by leveraging flexibility and parallelism. This paper introduces a reconfigurable fabric named Reconfigurable High Speed Computing System (RHSCS) that offers a high degree of flexibility and parallelism. RHSCS contains a Field Programmable Gate Array (FPGA) as its Processing Element (PE), so FPGA resources can be shared among the tasks within a single application. An efficient dynamic scheduler is proposed to take full advantage of the hardware and to speed up application execution. The scheduler distributes the tasks of an application to the resources of the RHSCS platform based on a cost function called Minimum Laxity First (MLF). Finally, the designed scheduling technique is compared with existing techniques. The proposed RHSCS platform and the MLF-based scheduler enhance application speed by up to 80.30%. © 2014 IEEE.
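Minimum Laxity First can be illustrated with a one-shot priority ordering: laxity = deadline − current time − remaining execution time, and the task with the least laxity runs first. The task tuples below are hypothetical, and a real MLF scheduler would recompute laxity as time advances rather than sorting once.

```python
import heapq

def schedule_mlf(tasks, now=0):
    """Order tasks by Minimum Laxity First (smallest laxity first).

    tasks: list of (name, deadline, exec_time) tuples.
    laxity = deadline - now - exec_time; a min-heap pops the most
    urgent (least slack) task first.
    """
    heap = [(deadline - now - exec_t, name)
            for name, deadline, exec_t in tasks]
    heapq.heapify(heap)
    order = []
    while heap:
        _laxity, name = heapq.heappop(heap)
        order.append(name)
    return order

# t2 has laxity 12-0-8 = 4, t1 has 15, t3 has 26.
tasks = [("t1", 20, 5), ("t2", 12, 8), ("t3", 30, 4)]
order = schedule_mlf(tasks)
```

In the paper's setting the "tasks" would be hardware tasks competing for FPGA resources; the cost function is the same.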
An Examination of Methodological Approaches for Segmenting Fetal Brain MRI Images - Analysis
In today's world, and particularly in a country like India, women's health needs more care. A woman's health during pregnancy plays a vital role in the care of both the mother and the baby. As per a survey, about three in every thousand fetuses are found to have brain abnormalities. If these abnormalities are predicted at an early stage, it is an added advantage in saving the lives of both mother and baby. During pregnancy a number of tests are performed to monitor fetal development, such as fetal ultrasound, chorionic villus sampling, amniocentesis, fetal echocardiogram, and fetal MRI scans. Fetal brain abnormalities can be predicted and treated at an early stage by analyzing fetal brain MRI during the gestational period. Identifying abnormalities in fetal brain MRI images involves several essential steps: segmenting the image, extracting distinctive features, refining image quality, identifying relevant patterns, and categorizing them against specific criteria. Classification then determines whether an abnormality is present. Analyzing these images is a complex undertaking owing to the diversity of shapes, spatial arrangements, and intensity levels within them. This paper reviews and compares various segmentation techniques, highlighting their respective strengths and weaknesses. © 2024 IEEE.
An Experimental Investigation on Flexural Strength of Ferrocement Slab Made of Slag Sand Partially Replaced with Iron Ore Tailings
Effective use of slag sand, iron ore tailings, and other waste from the manufacturing and mining industries, such as waste foundry sand, reduces the negative impact on the environment, provides opportunities for effective use of natural resources, and contributes to sustainability. The aim of this research project is to study, from a sustainability point of view, the flexural strength of ferrocement slabs made of slag sand partially replaced with iron ore tailings. An investigation of 48 slab panels of 700 mm × 300 mm size, with thicknesses of 25 mm and 30 mm, was conducted using 1 and 2 layers of weld mesh reinforcement cast with different percentages of iron ore tailings. The slabs were tested in a Universal Testing Machine and showed good results at 15% iron ore tailings. Published under licence by IOP Publishing Ltd.
An exploration of the impact of Feature quality versus Feature quantity on the performance of a machine learning model
About 0.62 trillion bytes of data are generated every hour globally, and these figures keep increasing as a result of digitalization and social networks. Data ecosystems capture, store, and manage this big data so that its information can be analyzed and its value extracted; this makes it a gold mine for companies researching and using such data, and underlines how essential and valuable data is in this growing age. For any machine learning model, the selection of data is necessary. In this paper, several experiments were performed to check the importance of data quality versus data quantity for model performance, comparing the richness of the data in terms of feature quality (e.g., features in images) against the sheer amount of data. Images are classified into two sets based on features, redundant features are removed, and a machine learning model is trained on each set. The model trained on non-redundant data gives the highest accuracy (>80%) in all cases versus the one trained with all features, demonstrating the importance of feature variability rather than mere feature count. © 2023 IEEE.
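One simple way to realize "quality over quantity" is to drop features that are highly correlated with features already kept. The sketch below does this with a hand-rolled Pearson correlation over invented columns; it illustrates the idea of redundancy removal, not the paper's actual experiments.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def drop_redundant(features, threshold=0.95):
    """Keep a feature only if it is not highly correlated with one
    already kept -- a crude proxy for favoring feature quality over
    raw feature count."""
    kept = {}
    for name, column in features.items():
        if all(abs(pearson(column, kept_col)) < threshold
               for kept_col in kept.values()):
            kept[name] = column
    return list(kept)

features = {
    "area":    [10, 20, 30, 40],
    "area_x2": [20, 40, 60, 80],   # exact multiple of 'area' (|r| = 1)
    "noise":   [3, 1, 4, 1],       # weakly related column
}
kept = drop_redundant(features)
```

Only `area` and `noise` survive: the duplicated column adds count but no variability, which is exactly the distinction the experiments above probe.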
A Human Islet Cell RNA-Seq for Genome-Wide Genotype Deepsec Framework Using Deep Learning Based Diabetes Prediction
Evaluating the tissues responsible for complicated human illnesses is important for ranking the significance of genetic variation connected to traits. To predict the regulatory functions of genetic variations across a wide range of epigenetic changes, this article introduces a convolutional neural network (CNN) model with upgraded filters, together with a Deepsec framework that incorporates the comprehensive human epigenetic map compiled by the ENCODE and Roadmap consortia, a map that indicates specificity to certain tissues or cell types. The Deepsec framework integrates transcription factors, histone modification markers, and RNA accessibility maps to comprehensively evaluate the consequences of non-coding alterations on the most important components, even for uncommon variations or novel mutations. Epigenetic profiling annotations were obtained on a genome-wide scale using trait-associated loci and more than 30 different human pancreatic islets, together with their subsets of cells sorted using fluorescence-activated cell sorting (FACS). The proposed model used 1,492 publicly available GWAS datasets. We show that, under CNN-based analysis, the Deepsec framework's epigenetic annotations recover important GWAS associations and uncover regulatory loci from background signals, offering fresh insight into the underlying causes of type 2 diabetes. The suggested approaches are anticipated to be used extensively in downstream GWAS analysis, making it possible to assess non-coding variations. © 2023 IEEE.
An improved AI-driven Data Analytics model for Modern Healthcare Environment
AI-driven data analytics is a swiftly advancing and impactful technology that is transforming the face of healthcare. By leveraging the power of AI computing and machine learning, healthcare organizations can quickly gain insights from their huge datasets, offering a more comprehensive and personalized approach to medical care and population health management. This paper explores the advantages of AI-driven data analytics in healthcare settings, covering key benefits such as improved diagnosis and treatment, better patient outcomes, and financial savings. The paper also addresses the main challenges associated with AI-driven analytics and offers potential solutions to enhance accuracy and relevance. Ultimately, data analytics powered by AI offers powerful opportunities to improve healthcare outcomes, and its use is expected to expand in the coming years. © 2024 IEEE.
An Improved and Efficient YOLOv4 Method for Object Detection in Video Streaming
As object detection has gained popularity in recent years, many object detection algorithms are available in today's world, yet an algorithm with better accuracy and speed is considered vital for critical applications. Therefore, in this article, the YOLOv4 object detection algorithm is combined with improved and efficient inference methods. The state-of-the-art YOLOv4 algorithm is 12% faster than its previous version, YOLOv3, and twice as fast as the EfficientDet algorithm on a Tesla V100 GPU. However, the algorithm underperforms on an average machine and on single-board machines like the Jetson Nano and Jetson TX2. In this research, we examine inference performance in several frameworks and propose a framework that uses the hardware effectively to optimize the network while consuming less than 30% of the hardware used by other frameworks. © 2022, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
An Improved Artificial Intelligence based Service Quality to Increase Customer Satisfaction and Customer Loyalty in Banking Sector
This study clarifies and determines how service quality affects customer loyalty and reliability in the public and private banking sectors, and how it connects to customer loyalty and behavioural intention. Using an upgraded 26-item SERVQUAL (BANQUAL) instrument, the survey was conducted among 802 bank customers. A behavioural intention battery was used to estimate the clients' expected behaviour, and a seven-point Likert scale was used to assess expected and perceived service quality (performance) as well as the clients' behavioural intentions. The BANQUAL instrument is the most reliable tool for quantifying the difference-score conceptualization and is used to evaluate service gaps between expectations and perceptions of service quality. The SERVQUAL instrument is modified to suit the banking industry: questions on parking at the bank, the variety of products and programmes available, and the banks' genuine efforts to address customer grievances are added (Responsiveness). The literature review draws on many sources, reflecting both Indian and foreign environments. Several hypotheses were examined using structural equation modelling, and to meet the research goals the views were tested using AMOS and SPSS. The data were analysed using confirmatory and exploratory factor analysis to confirm the BANQUAL instrument's reliability and validity for banking-industry performance and service quality. The resulting CFA model exhibits excellent psychometric qualities. Professional businesses and clients increasingly use artificial intelligence service agents (AISA) for management, yet no existing measure of service quality fully captures the essential factors affecting AISA service quality. By developing a scale for evaluating the quality of AISA service (AISAQUAL), this study seeks to address that deficiency. © 2023 IEEE.
An Improved Image Up-Scaling Technique using Optimize Filter and Iterative Gradient Method
In numerous real-time applications, image upscaling often relies on polynomial techniques to reduce computational complexity. However, in high-resolution (HR) images such polynomial interpolation can lead to blurring artifacts due to edge degradation, and various edge-directed and learning-based systems cause similar blurring in high-frequency images. To mitigate these issues, directional filtering is employed after corner-averaging interpolation, with two passes completing the corner-average process. The initial step in low-resolution (LR) picture interpolation is corner pixel refinement after averaging interpolation; a directional filter is then applied to preserve the edges of the interpolated image. This process yields two distinct outputs: a base image and a detail image. An additional cuckoo-optimized filter is applied to the base image, focusing on texture features and boundary edges to recover neighboring boundary edges, and a Laplacian filter is used to enhance intra-region information within the detail image. To minimize reconstruction errors, an iterative gradient approach combines the optimally filtered image with the sharpened detail image, generating an enhanced HR image. Empirical results indicate superior performance compared to state-of-the-art methods in both visual appeal and measured parameters. The method's superiority is demonstrated experimentally across multiple image datasets, with higher PSNR, SSIM, and FSIM values indicating better reduction of image degradation, improved edge preservation, and superior restoration, particularly when upscaling high-frequency regions of images. © 2023 IEEE.
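The iterative-gradient reconstruction step can be illustrated in one dimension with classic back-projection: repeatedly downsample the current high-resolution estimate, compare it with the low-resolution input, and push the residual back. This is a simplified stand-in for the paper's method, with invented sample values and a nearest-neighbour acquisition model.

```python
def upscale_ibp(lr, factor=2, iters=10, step=0.5):
    """1-D iterative back-projection sketch.

    Start from a blank high-resolution estimate, then on each pass:
    simulate acquisition by block-averaging back to LR size, compute
    the reconstruction error against the LR input, and spread a
    fraction of that error over the corresponding HR samples.
    """
    hr = [0.0] * (len(lr) * factor)
    for _ in range(iters):
        down = [sum(hr[i * factor:(i + 1) * factor]) / factor
                for i in range(len(lr))]
        for i in range(len(lr)):
            err = lr[i] - down[i]
            for j in range(i * factor, (i + 1) * factor):
                hr[j] += step * err
    return hr

lr = [10.0, 20.0, 30.0]
hr = upscale_ibp(lr)
```

After a handful of iterations the downsampled estimate matches the LR input to well under 0.1, i.e. the reconstruction error has been driven toward zero, which is the role the iterative gradient step plays in the 2-D pipeline above.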
An improvised grid resource allocation and classification through regression
Resource allocation is one of the important mechanisms of grid computing, helping to assign the available resources efficiently. One issue in grid computing is fixing the target nodes during grid job execution. In the existing method, resource-monitoring data are collected from the grid and jobs are allocated to resources based on the available data through a regression algorithm; with this method, the total execution time of an application and the run time of its jobs are high. The proposed method reduces running time by classifying the resources in the data collected from the grid based on dwell time, using a novel classification algorithm. It reduces job run time and fits the best available resources to the jobs in the computational grid. © 2017 IEEE.
An Innovative Approach for Osteosarcoma Bone Cancer Detection based on Attention Embedded R-CNN Approach
Osteosarcoma is a malignant bone tumor. Any bone is at risk, but long bones such as those of the limbs are more vulnerable. Although the precise cause of this malignant growth is uncertain, experts concur that it is caused by changes to the deoxyribonucleic acid (DNA) inside the bones, which can cause the breakdown of healthy tissue and the growth of aberrant, pathological bone. Osteosarcoma has a 76% cure rate if detected early and treated before it spreads to other parts of the body. An X-ray is the primary tool for detecting bone tumors: bone X-rays and other imaging tests can help detect osteosarcoma, but a biopsy is required for an accurate diagnosis. This is a time-consuming and tedious task that might be greatly reduced with the help of appropriate tools. Data preprocessing, segmentation, feature extraction, and model training are the four main pillars of the proposed approach. Preprocessing filters out unwanted noise, and segmentation separates low-spatial-frequency and high-spatial-frequency components. For feature extraction, the proposed approach employs tumor border clarity, joint distance, tumor texture, and other features. Finally, an A-Residual CNN model is trained. The success percentage of the proposed approach was 96.39%. © 2023 IEEE.
An Innovative Method for Brain Stroke Prediction based on Parallel RELM Model
Strokes occur when the blood supply to the brain is suddenly cut off or severely impaired; stroke victims may experience cell death as a result of oxygen and nutrient shortages. The effectiveness of various predictive data mining algorithms in illness prediction has been the subject of numerous studies. The suggested method comprises three stages: preprocessing, feature selection, and model training. Data preparation covers missing-value management, numeric value conversion, imbalanced-dataset handling, and data scaling. The chi-square and RFE methods are utilized for feature selection: the former assesses feature correlation, while the latter recursively searches ever-smaller feature sets to choose features. A Parallel RELM is used throughout model training. This new method outperforms both ELM and RELM, achieving an average accuracy of 95.84%. © 2024 IEEE.
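The chi-square score used in the feature-selection stage can be computed directly from a 2×2 contingency table. The sketch below scores two invented binary features against a binary stroke label; the feature that tracks the label earns the higher score, which is the basis for keeping it.

```python
def chi_square(feature, label):
    """Chi-square score of a binary feature against a binary label.

    Builds the 2x2 contingency table of observed counts, then sums
    (O - E)^2 / E over the four cells, where E is the count expected
    under independence. Higher scores suggest dependence.
    """
    obs = {(f, l): 0 for f in (0, 1) for l in (0, 1)}
    for f, l in zip(feature, label):
        obs[(f, l)] += 1
    n = len(feature)
    score = 0.0
    for f in (0, 1):
        for l in (0, 1):
            row = obs[(f, 0)] + obs[(f, 1)]
            col = obs[(0, l)] + obs[(1, l)]
            expected = row * col / n
            if expected:
                score += (obs[(f, l)] - expected) ** 2 / expected
    return score

# Hypothetical toy data: one informative feature, one unrelated one.
stroke         = [1, 1, 1, 0, 0, 0, 1, 0]
hypertension   = [1, 1, 0, 0, 0, 0, 1, 0]  # tracks the label closely
shoe_size_high = [1, 0, 1, 0, 1, 0, 1, 0]  # unrelated pattern
informative = chi_square(hypertension, stroke)
uninformative = chi_square(shoe_size_high, stroke)
```

Ranking features by this score and keeping the top-k is the chi-square half of the selection stage; RFE would instead repeatedly retrain a model and discard the weakest features.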
An Innovative Method for Election Prediction using Hybrid A-BiCNN-RNN Approach
Sentiment, volumetric, and social network analyses, among other methods, are examined for their ability to predict key outcomes using data collected from social media, where different points of view are essential for making significant discoveries. Social media have been used by individuals all over the world to communicate and share ideas for decades. Sentiment analysis, often known as opinion mining, is a technique used to glean insights about how the public feels and thinks; by gauging how people feel about a candidate on social media, it can be used to predict who will win an upcoming election. There are three main steps in the proposed approach: preprocessing, feature extraction, and model training. Negation handling often requires preprocessing, and feature extraction makes use of Natural Language Processing. Following the feature selection process, the models are trained using BiCNN-RNN. The proposed method is superior to the widely used BiCNN and RNN methods. © 2023 IEEE.
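Negation handling, named as a preprocessing step above, can be as simple as gluing a negator to the token that follows it, so that "not vote" and "vote" become distinct features for the downstream classifier. A toy version, with an invented example sentence:

```python
import re

def mark_negation(text):
    """Toy negation handler for sentiment preprocessing.

    Lowercase and tokenize the text, then merge each negator with the
    token that follows it ('not vote' -> 'not_vote'), so a bag-of-words
    model no longer confuses negated and plain mentions.
    """
    words = re.findall(r"[a-z']+", text.lower())
    out = []
    skip = False
    for i, tok in enumerate(words):
        if skip:
            skip = False
            continue
        if tok in {"not", "no", "never"} and i + 1 < len(words):
            out.append(tok + "_" + words[i + 1])
            skip = True
        else:
            out.append(tok)
    return out

tokens = mark_negation("I will not vote for him, never again")
```

Real pipelines use richer negation scopes, but even this crude merge prevents "not vote" from counting as a positive mention of "vote".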
An Innovative Method for Enterprise Resource Planning (ERP) for Business and knowledge Management Based on Tree MLP Model
This strategy highlights the benefits of utilizing cutting-edge IT to back up company goals and genuinely assist in changing internal procedures by implementing an ERP-appropriate solution. Any organization, no matter how big or small, can benefit from an enterprise resource planning (ERP) system: an integrated suite of tools designed to streamline and improve internal business operations. The approach covers preprocessing, feature selection, and model training. ERP system logs are typically combinations of alphanumeric characters, so dense vector embeddings are used to prepare the raw system logs. During feature selection, SIM uses Particle Swarm Optimization (PSO) to create uniform product configurations. The model was trained using a Tree-MLP. This new strategy outperforms previous ones, including Decision Tree and MLP; accuracy improved to 94.30% after implementing the technique. © 2024 IEEE.
An Innovative Method for Fuel Consumption and Maintenance Cost of Heavy-Duty Vehicles based on SR-GRU-CNN Algorithm
A heavy-duty vehicle's fuel usage, and thus its carbon dioxide emissions, is significantly impacted by the driver's behavior: average fuel economy varies by about 28% between drivers. Fuel efficiency can be improved by driver education, monitoring, and feedback, and fuel-efficiency-based incentives are one form of feedback that can be provided. The largest challenge for transportation companies implementing such incentive programs is how to accurately evaluate drivers' fuel consumption. The suggested method uses preprocessing, feature extraction, and model training. Principal component analysis (PCA), widely utilized in data science, is applied in the preprocessing stage, and a GMM is used for feature extraction. Afterwards, SR-GRU-CNN is used to train the models on the selected features. Compared to the two most popular alternatives, CNN and SR-GRU, the proposed method excels. © 2023 IEEE.
An Innovative Method for Housing Price Prediction using Least Square - SVM
The House Price Index (HPI) is often employed to forecast housing market shifts, but individual house prices cannot be predicted using HPI alone because of the substantial correlation between housing price and other characteristics such as location, area, and population. While several articles have used conventional machine learning methods to predict housing prices, these methods tend to focus on the market as a whole rather than on the performance of individual models. In addition, good data preprocessing methods are established to boost the precision of the machine learning algorithms: the data is normalized before use, features are selected using the correlation coefficient, and an LSSVM is employed for model training. The proposed approach outperforms models such as CNN and SVM. © 2023 IEEE.
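With a linear kernel, LS-SVM training reduces to solving a small linear system rather than a quadratic program. The sketch below solves the regularized normal equations for a single invented predictor (floor area); it is a minimal illustration of that reduction, not the paper's model or data.

```python
def lssvm_linear_1d(xs, ys, gamma=1e6):
    """Linear-kernel LS-SVM sketch for one predictor.

    LS-SVM replaces the SVM's inequality constraints with equality
    constraints, so fitting becomes a linear solve. For one feature
    with bias, that is the 2x2 normal-equation system with a ridge
    term 1/gamma on the weight:
        w * (Sxx + 1/gamma) + b * Sx = Sxy
        w * Sx              + b * n  = Sy
    """
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = sxx + 1.0 / gamma
    det = a * n - sx * sx
    w = (sxy * n - sy * sx) / det
    b = (a * sy - sx * sxy) / det
    return w, b

# Hypothetical mini dataset: price = 2 * area + 1 (arbitrary units).
areas = [1.0, 2.0, 3.0, 4.0]
prices = [3.0, 5.0, 7.0, 9.0]
w, b = lssvm_linear_1d(areas, prices)
```

With a large `gamma` (weak regularization) the fit recovers the generating line almost exactly; shrinking `gamma` pulls the weight toward zero, which is the bias-variance knob LS-SVM exposes.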