Browse Items (11808 total)
Portfolio Optimization Using Quantum-Inspired Modified Genetic Algorithm
Portfolio optimization carries an additional level of complexity and has been an area of interest for both financial leaders and artificial intelligence experts. In this article, a quantum-inspired version of an improved genetic algorithm is proposed for the task of portfolio optimization. Two different genetic versions are implemented along with their extensions in the quantum-inspired space. Improvements to the popular crossover techniques, viz. (i) arithmetic and (ii) heuristic crossover, are proposed to reduce computational time. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
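The paper's specific improvements are not reproduced here, but the two crossover operators it names have standard textbook forms. The following minimal Python sketch (illustrative only, not the authors' implementation) shows them for portfolio weight vectors, with a hypothetical `normalize` helper that keeps children on the weight simplex:

```python
import random

def arithmetic_crossover(p1, p2, alpha=0.5):
    """Blend two parent weight vectors: child = alpha*p1 + (1-alpha)*p2."""
    return [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]

def heuristic_crossover(better, worse, r=None):
    """Step from the worse parent toward, and past, the better one:
    child = better + r * (better - worse), with r drawn from [0, 1)."""
    r = random.random() if r is None else r
    return [b + r * (b - w) for b, w in zip(better, worse)]

def normalize(weights):
    """Clip negatives and rescale so portfolio weights sum to 1."""
    w = [max(x, 0.0) for x in weights]
    s = sum(w)
    return [x / s for x in w] if s else [1.0 / len(w)] * len(w)
```

For parents [0.6, 0.4] and [0.2, 0.8], arithmetic crossover with alpha = 0.5 yields the midpoint [0.4, 0.6]; heuristic crossover instead pushes the child beyond the better parent.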
Design and Development of Mobile Robot Manipulator for Patient Service During Pandemic Situations
Time and manpower are important constraints for completing large-scale tasks in this rapidly growing civilization. In most regular and frequently performed works, such as welding, painting, assembly, container filling, and so on, automation plays a vital part in reducing human effort. One of the key and most commonly performed activities is picking and placing objects from source to destination. Constant monitoring of patient bodily indicators such as temperature, pulse rate, and oxygen level, along with service of the patients, becomes challenging for nurses and medical staff in the current pandemic condition. In view of this, a mobile robot with an integrated robotic arm has been designed and developed which can be available for the service of patients continuously, alongside monitoring them in the general ward as well as in the ICU of hospitals. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
An Efficient and Optimized Convolution Neural Network for Brain Tumour Detection
Brain tumour is a life-threatening disease that can affect children and adults. This study focuses on classifying brain MRI scan images into one of four classes, namely glioma tumour, meningioma tumour, pituitary tumour, and normal brain. A person affected by a brain tumour will need treatment such as surgery, radiation therapy, or chemotherapy. Pretrained Convolution Neural Networks such as VGG19, MobileNet, and AlexNet have been widely used for image classification using transfer learning. However, due to their huge storage space requirements, they are not effectively deployed on edge devices for the creation of robotic devices. Hence, compressed versions of these models have been created using a Genetic Algorithm; these occupy only about 30–40% of the original space and also reduce inference time by around 50% relative to the original model. The accuracy provided by VGG19, AlexNet, MobileNet, and the Proposed CNN before compression was 92.18%, 89.45%, 93.75%, and 96.85%, respectively. Similarly, the accuracy after compression for VGG19, AlexNet, MobileNet, and the Proposed CNN was 91.34%, 88.92%, 94.40%, and 95.29%. © 2023, Springer Nature Switzerland AG.
On-board Converter for Electric Vehicle Charging using LCLC Resonant Topology
Due to their high efficiency, high power density, and soft-switching characteristics, LLC-based AC-DC resonant converters are a great choice for EV chargers. Adding a capacitor across the magnetizing inductance of the LLC resonant architecture (the LCLC configuration) enhances efficiency and reduces the need for a larger series inductor. The output DC voltage of the converter is generally regulated using switching frequency control. However, the power factor of the converter varies significantly with the switching frequency. As a result, any fluctuations in load may cause the converter to operate at a lower power factor. This paper proposes a single-stage topology based on the LCLC resonant structure. Zero voltage switching (ZVS) of the IGBTs used in the converter is ensured by the LCLC resonant configuration. The converter has a power factor correction (PFC) stage at its front end to achieve natural power factor correction. Since the PFC stage and the resonant stage are controlled by the same switches, the converter is smaller and less costly. Simulations in MATLAB/Simulink are used to validate the topology. © 2023 IEEE.
Human Resource Management in the Power Industry Using Fuzzy Data Mining Algorithm
Currently, data mining is a frontier study area of database and information technology. It is acknowledged as one of the essential technologies with the greatest potential. Data mining draws on numerous technologies with a comparatively high level of technical substance, including artificial intelligence, neural networks, fuzzy theory, and mathematical statistics, and its realization is challenging. Job satisfaction is one of several factors that cause employees to leave or switch jobs, and it is also closely tied to the organization's human resource management (HRM) procedures. It is consistently difficult, and at times beyond the HR department's control, to retain highly qualified and skilled workers, yet data mining can play a part in identifying those workers who are likely to leave an organization, allowing the HR department to plan an intervention strategy or search for alternatives. We have analysed the key concepts, techniques, and algorithms of association rule mining technology in this article. When integrated into the human resource management systems of schools and colleges, these methods effectively completed association mining, realized visualization, and ultimately revealed valuable information. © 2023 IEEE.
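The association rule mining the abstract refers to rests on two standard quantities, support and confidence. A minimal sketch (illustrative only, over hypothetical HR-style transactions, limited to 1- and 2-itemsets) of computing them:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Count 1- and 2-itemsets and keep those whose support (fraction of
    transactions containing the itemset) meets min_support."""
    n = len(transactions)
    counts = {}
    for t in transactions:
        items = sorted(set(t))
        for size in (1, 2):
            for combo in combinations(items, size):
                counts[combo] = counts.get(combo, 0) + 1
    return {k: v / n for k, v in counts.items() if v / n >= min_support}

def confidence(freq, antecedent, consequent):
    """confidence(A -> B) = support(A U B) / support(A)."""
    both = tuple(sorted(set(antecedent) | set(consequent)))
    return freq[both] / freq[tuple(sorted(antecedent))]
```

With transactions over hypothetical attributes like "skill" and "leave", a rule such as skill -> leave gets confidence support({skill, leave}) / support({skill}).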
A Cooperative Global Sequencing Algorithm for Distributed Wireless Sensor Networks
Data gathering is a fundamental use of wireless sensor networks. Sensor units are distributed over the area to be monitored. They can sense the quantity of interest, which could be temperature, pressure, humidity, solar radiation, or other factors. The sensed data is sent to a centralized device called a sink, or simply a base station. Networks are frequently distributed in character, meaning that more than one kind of instrument is placed in a particular area; uniform networks contain only one kind of component. After the nodes have been deployed, a tree is created and rooted at the sink. In distributed networks, flawless aggregation is challenging to accomplish; in contrast to uniform networks, nodes may receive and transmit multiple types of packets. Every message should be forwarded by a node to a parent where it can be combined, in order to increase the likelihood of aggregation. As a result, a node might need to choose more than one parent, which implies that various parameters should be taken into account while forming trees. We have extended the combined distributed scheduling and tree generation for distributed networks suggested in the literature. We find that the extended method maximizes aggregation, schedules the network with fewer time slots, and uses less energy. Additionally, distributed networks are found to require more management overhead to schedule than uniform networks. © 2023 IEEE.
Trust Model for Cloud Using Weighted KNN Classification for Better User Access Control
The majority of the time, cloud computing is a service-based technology that provides Internet-based technological services. Cloud computing has seen explosive growth since its debut, and it is now integrated into a wide variety of online services. Its primary benefit is allowing thin clients to access resources and services. Even though it may appear favorable, there are many potential weak points for various types of attacks and cyber threats. Access control is one of the several protection layers available as part of cloud security solutions. In order to improve cloud security, this research introduces a unique access control mechanism. For granting users access to various resources, the suggested approach applies the trust concept. The KNN model was recently proposed for predicting trust; however, the existing approach to classification is sensitive and unstable, particularly when an imbalanced data scenario occurs. Furthermore, it has been discovered that using the exponent distance as a weighting scheme improves classification performance and lowers variance. The prediction of users' trust levels using weighted K-nearest neighbors is presented in this research. According to the findings, the suggested approach is more effective in terms of throughput, cost, and delay. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
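The exponent-distance weighting is not fully specified in the abstract; one common reading is to weight each neighbour's vote by exp(-distance), so closer neighbours dominate. A small illustrative sketch under that assumption (not the paper's implementation):

```python
import math
from collections import defaultdict

def weighted_knn_predict(train, query, k=3):
    """Classify `query` by its k nearest neighbours, each vote weighted
    by exp(-distance). `train` is a list of (features, label) pairs."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    neighbours = sorted(train, key=lambda p: dist(p[0], query))[:k]
    votes = defaultdict(float)
    for features, label in neighbours:
        votes[label] += math.exp(-dist(features, query))
    return max(votes, key=votes.get)
```

With hypothetical "low"/"high" trust labels, a query near the "low" cluster is assigned "low" even when a distant "high" point slips into the k nearest, because its exponential weight is negligible.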
Metaheuristics-based Task Offloading Framework in Fog Computing for Latency-sensitive Internet of Things Applications
Internet of Things (IoT) applications have tremendously increased in popularity within a short span of time due to the wide range of services they offer. In the present scenario, IoT applications rely on cloud computing platforms for data storage and task offloading. Since IoT applications are latency-sensitive, depending on a remote cloud datacenter further increases the delay and response time. Most IoT applications therefore shift from cloud to fog computing for improved performance and lower latency. Fog enhances the Quality of Service (QoS) of the connected applications by providing low latency. Different task offloading schemes in fog computing have been proposed in the literature to enhance the performance of IoT-fog-cloud integration. The proposed methodology focuses on constructing a metaheuristic-based task offloading framework in the three-tiered IoT-fog-cloud network to enable efficient execution of latency-sensitive IoT applications. The proposed work utilizes two effective optimization algorithms: the Flamingo search algorithm (FSA) and the Honey badger algorithm (HBA). Initially, the FSA is executed in an iterative manner, where the objective function is optimized in every iteration. The best solutions from this algorithm are then fine-tuned using the HBA to refine the solution. The output obtained from the HBA is the optimized outcome of the proposed framework. Finally, evaluations are carried out separately on different scenarios to prove the performance efficacy of the proposed framework. The proposed framework obtains a task offloading time of 71 s and also achieves a lower degree of imbalance and lower latency compared with existing techniques. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
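The FSA and HBA update rules are not given in the abstract, so the sketch below only mirrors the two-stage pipeline it describes: a global search stage produces the best candidate task-to-node assignment, and a local refinement stage fine-tunes it. Random sampling and single-task reassignment stand in for the two metaheuristics; `cost` is any user-supplied objective such as makespan:

```python
import random

def two_stage_offload(cost, n_tasks, n_nodes,
                      global_iters=300, refine_iters=300, seed=0):
    """Stage 1 (global search): sample random task-to-node assignments
    and keep the best. Stage 2 (local refinement): repeatedly reassign
    a single task, accepting only strict improvements. Lower is better."""
    rng = random.Random(seed)
    best = [rng.randrange(n_nodes) for _ in range(n_tasks)]
    best_cost = cost(best)
    for _ in range(global_iters):
        cand = [rng.randrange(n_nodes) for _ in range(n_tasks)]
        c = cost(cand)
        if c < best_cost:
            best, best_cost = cand, c
    for _ in range(refine_iters):
        cand = list(best)
        cand[rng.randrange(n_tasks)] = rng.randrange(n_nodes)
        c = cost(cand)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost
```

For four tasks of sizes [3, 1, 2, 2] on two fog nodes with a makespan objective, the search settles on the balanced split {3, 1} versus {2, 2}.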
Swarm Intelligence Decentralized Decision Making In Multi-Agent System
This research aims to understand how groups of agents can make decisions collectively without relying on a central authority. The research could focus on developing algorithms and models for distributed problem solving, such as consensus-reaching and voting methods, or for coordinating actions among agents in a decentralized manner. The research could also look into the application of these methods in various fields like distributed robotics, swarm intelligence, and multi-agent systems in smart cities and transportation networks. Swarm intelligence in decentralization is an emerging field that combines the principles of swarm intelligence and decentralized systems to design highly adaptive and scalable systems. These systems consist of a large number of autonomous agents that interact with each other and the environment through local communication and adapt their behaviors based on environmental cues. The decentralized nature of these systems makes them highly resilient and efficient, with potential applications in areas such as robotics, optimization, and blockchain technology. However, designing algorithms and communication protocols that enable effective interaction among agents without relying on a centralized controller remains a key challenge. This article proposes a model for swarm intelligence in decentralization, including agents, communication, environment, learning, decision-making, and coordination, and presents a block diagram to visualize the key components of the system. The paper concludes by highlighting the potential benefits of swarm intelligence in decentralization and the need for further research in this area. © 2023 IEEE.
Efficient Method for Tomato Leaf Disease Detection and Classification based on Hybrid Model of CNN and Extreme Learning Machine
Throughout India, most people make a living through agriculture or a related industry. Crops and other agricultural outputs suffer significant quality and quantity losses when plant diseases are present, so detecting these diseases is key to preventing losses in the harvest and quantity of agricultural products. Improving classification accuracy while decreasing computational time is the primary focus of the suggested method for identifying leaf disease in tomato plants. Pests and diseases wipe out thousands of tons of tomatoes in India's harvest every year, and tomato leaf disease endangers the agricultural industry by generating substantial losses for producers. Scientists and engineers can improve their models for detecting tomato leaf diseases if they have a better understanding of how algorithms learn to identify them. This work proposes a unique method for detecting diseases on tomato leaves using a five-step procedure that begins with image preprocessing and ends with feature extraction, feature selection, and model classification. Preprocessing is done to improve image quality. An improved K-Means image segmentation technique is proposed as a key intermediate step. The GLCM feature extraction approach is then used to extract relevant features from the segmented image. Relief feature selection is used to discard features irrelevant to the classification results. Finally, classification techniques such as CNN and ELM are used to categorize infected leaves. The proposed hybrid approach outperforms the two standalone models, CNN and ELM. © 2023 IEEE.
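The GLCM step above has a compact standard definition: count co-occurring grey-level pairs at a fixed pixel offset, normalize, and derive texture statistics from the matrix. A minimal sketch (horizontal offset only, two common statistics, illustrative rather than the paper's implementation):

```python
def glcm_features(img, levels=4):
    """Build a grey-level co-occurrence matrix for the horizontal
    neighbour offset (dx=1) and derive contrast and energy features.
    `img` is a 2-D list of integer grey levels in [0, levels)."""
    glcm = [[0] * levels for _ in range(levels)]
    total = 0
    for row in img:
        for a, b in zip(row, row[1:]):
            glcm[a][b] += 1
            total += 1
    contrast = energy = 0.0
    for i in range(levels):
        for j in range(levels):
            p = glcm[i][j] / total
            contrast += (i - j) ** 2 * p
            energy += p * p
    return {"contrast": contrast, "energy": energy}
```

A perfectly uniform patch has zero contrast, while a patch of alternating grey levels maximizes it; in practice several offsets and angles are accumulated.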
Heart Disease Prediction: A Computational Machine Learning Model Perspective
Relying on medical instruments to predict heart disease is either expensive or inefficient. It is important to detect cardiac diseases early to avoid complications and reduce the death rate. This research aims to compare various machine learning models using supervised learning techniques to find the model that gives the highest accuracy for heart disease prediction. This research compares standalone and ensemble models for prediction analysis. The six standalone models are logistic regression, Naive Bayes, support vector machine, K-nearest neighbors, artificial neural network, and decision tree. The three ensemble models are random forest, AdaBoost, and XGBoost. Feature engineering is done with principal component analysis (PCA). The experimental process resulted in random forest giving the best prediction performance, with 92% accuracy. Random forest can handle both regression and classification tasks. The predictions it generates are accurate and simple to comprehend, it is capable of effectively handling big datasets, and utilizing numerous trees inhibits overfitting. Instead of searching for the most prominent feature when splitting a node, it seeks out an optimal feature among a randomly selected feature set in order to minimize variance. For all these reasons, it has performed better. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
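The random forest behaviour described above (bootstrap samples of the rows plus a random feature subset at each split) can be sketched in miniature with one-feature decision stumps. This toy version is illustrative only, not the model used in the study:

```python
import random

def stump_fit(X, y, feat_idx):
    """Fit a one-feature threshold classifier: over the allowed features,
    pick the (feature, threshold, direction) with fewest training errors."""
    best = None
    for f in feat_idx:
        for t in sorted(set(x[f] for x in X)):
            for sign in (1, -1):
                pred = [1 if sign * (x[f] - t) >= 0 else 0 for x in X]
                err = sum(p != yy for p, yy in zip(pred, y))
                if best is None or err < best[0]:
                    best = (err, f, t, sign)
    _, f, t, sign = best
    return lambda x: 1 if sign * (x[f] - t) >= 0 else 0

def forest_fit(X, y, n_trees=25, seed=1):
    """Bagging as the abstract describes: each stump sees a bootstrap
    sample of the rows and a random subset of the features; prediction
    is a majority vote over all stumps."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    trees = []
    for _ in range(n_trees):
        rows = [rng.randrange(n) for _ in range(n)]
        feats = rng.sample(range(d), max(1, d // 2))
        trees.append(stump_fit([X[i] for i in rows], [y[i] for i in rows], feats))
    return lambda x: 1 if 2 * sum(t(x) for t in trees) >= len(trees) else 0
```

The averaging over trees is what reduces variance: any single stump may overfit its bootstrap sample, but the majority vote is far more stable.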
A Data Mining approach on the Performance of Machine Learning Methods for Share Price Forecasting using the Weka Environment
It is widely agreed that share prices are too volatile to be reliably predicted. Several experts have worked to improve the likelihood of generating a profit from share investing using various approaches and methods. When used in reality, these methods and algorithms often have too low a success rate to be helpful; the extreme volatility of the marketplace is a significant contributor. This article demonstrates the use of data mining tools like WEKA to study share prices. For this research, we have selected the HCL Tech share. Multilayer Perceptron, Gaussian Process, and Sequential Minimal Optimization have been employed as the three prediction methods; these algorithms, which develop optimal rules for share market analysis, are incorporated into Weka. Using the attributes of open, high, low, close, and adjusted-close prices, we forecast the share price for the next 30 days and compare the actual and predicted values of the three models side by side. We visualize the 1-step-ahead and future forecasts of the three models, and the evaluation metrics RMSE, MAPE, MSE, and MAE are calculated. The outcomes achieved by the three methods have been contrasted. Our experimental findings show that Sequential Minimal Optimization provided more precise results than the other methods on this dataset. © 2023 IEEE.
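The four reported error metrics have standard definitions; a small illustrative helper computing all of them from actual and predicted price series:

```python
import math

def forecast_errors(actual, predicted):
    """Compute the four metrics reported in the study: MSE, RMSE,
    MAE, and MAPE (in percent). Inputs are equal-length sequences."""
    n = len(actual)
    errs = [a - p for a, p in zip(actual, predicted)]
    mse = sum(e * e for e in errs) / n
    mae = sum(abs(e) for e in errs) / n
    mape = 100 * sum(abs(e) / abs(a) for e, a in zip(errs, actual)) / n
    return {"MSE": mse, "RMSE": math.sqrt(mse), "MAE": mae, "MAPE": mape}
```

MAPE is scale-free (a percentage), which makes it convenient for comparing models across shares with different price levels, though it is undefined when an actual value is zero.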
Forecasting Bitcoin Price During Covid-19 Pandemic Using Prophet and ARIMA: An Empirical Research
Bitcoin and other cryptocurrencies are alternative and speculative digital financial assets in today's growing fintech economy. Blockchain technology, a decentralized technology, is essential for ensuring ownership of bitcoin. These coins display high volatility and bubble-like behavior. The widespread acceptance of cryptocurrencies poses new challenges to the corporate community and the general public. Currency market traders and fintech researchers have classified cryptocurrencies as speculative bubbles. This study identifies the bitcoin bubble and its breaks during the COVID-19 pandemic. From 1 April 2018 to 31 March 2021, we used high-frequency data to calculate the daily closing price of bitcoin. Both the Prophet model and the ARIMA forecasting method were applied. We also examined the explosive bubble and found structural breaks in bitcoin using the ADF, RADF, and SADF tests; five breaks were detected in bitcoin prices from 2018 to 2021. ARIMA(1,1,0) was fitted as the best model for price prediction. The ARIMA and Facebook Prophet models were applied in forecasting, and the Prophet model was found to be the best at forecasting prices. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
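An ARIMA(1,1,0) model is simply an AR(1) fitted to first differences. A minimal sketch of the mechanics (ordinary least squares on the differences, no constant term, illustrative rather than the study's estimation procedure):

```python
def arima_110_forecast(series, steps=1):
    """ARIMA(1,1,0): difference once, fit AR(1) on the differences by
    least squares (phi = sum(d_t * d_{t-1}) / sum(d_{t-1}^2)), then
    integrate the forecast differences back to the price level."""
    d = [b - a for a, b in zip(series, series[1:])]
    num = sum(x * y for x, y in zip(d[1:], d[:-1]))
    den = sum(x * x for x in d[:-1])
    phi = num / den if den else 0.0
    level, last_d = series[-1], d[-1]
    out = []
    for _ in range(steps):
        last_d = phi * last_d
        level += last_d
        out.append(level)
    return out
```

On a perfectly linear series the fitted phi is 1, so the forecast extends the trend; with phi < 1 the forecast differences decay and the path flattens out.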
Bipolar Disease Data Prediction Using Adaptive Structure Convolutional Neuron Classifier Using Deep Learning
The symptoms of bipolar disorder include extreme mood swings. It is the most common mental health disorder and is often overlooked in all age groups. Bipolar disorder is often inherited, but not all siblings in a family will have it. In recent years, bipolar disorder has been characterised by unsatisfactory clinical diagnosis and treatment; relapse rates and misdiagnosis are persistent problems, and a precise means of identifying the disorder has yet to be determined. To overcome this issue, the proposed work uses the Adaptive Structure Convolutional Neuron Classifier (ASCNC) method to identify bipolar disorder. Imbalanced Subclass Feature Filtering (ISF2) for visualising bipolar data is intended to extract and communicate meaningful information from complex bipolar datasets in order to predict and improve day-to-day analytics. Using Scaled Features Chi-square Testing (SFCsT), the maximum-dimensional features in the bipolar dataset are extracted and assigned weights. To select the features with the largest Chi-square scores, the Chi-square value for each feature is calculated between it and the target. Before extracting features for the training and testing method, the Softmax neural activation function is evaluated to compute the average weight of the features. Diagnostic criteria for bipolar disorder are discussed as an assessment strategy that helps diagnose the disorder. Appropriate treatments for children and their families are then discussed. Finally, some conclusions about managing people with bipolar disorder are presented. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
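The Chi-square scoring step above has a standard form: compare observed against expected cell counts in the feature-target contingency table, and keep features with the largest statistic. A minimal sketch for binary features (illustrative; the paper's SFCsT variant is not specified here):

```python
def chi2_score(feature, target):
    """Chi-square statistic between a binary feature and binary target,
    from the 2x2 contingency table of observed vs expected counts."""
    n = len(feature)
    score = 0.0
    for fv in (0, 1):
        for tv in (0, 1):
            obs = sum(1 for f, t in zip(feature, target) if f == fv and t == tv)
            exp = (sum(1 for f in feature if f == fv) *
                   sum(1 for t in target if t == tv)) / n
            if exp:
                score += (obs - exp) ** 2 / exp
    return score
```

A feature perfectly aligned with the target scores n (here 4), while a feature independent of the target scores 0, which is exactly the ordering a largest-Chi-square selection exploits.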
Employee Attrition, Job Involvement, and Work Life Balance Prediction Using Machine Learning Classifier Models
Employee performance is an integral part of organizational success, for which talent management is highly required, and the motivating factors of employees depend on employee performance. Certain variables have been observed as outliers, but none of those variables were operated on or predicted. This paper aims at creating predictive models for employee attrition by using classifier models for attrition rate, Job Involvement, and Work-Life Balance. Job Involvement is specifically linked to employees' intention to turn around, that is, a minimal turnover rate. To obtain a justifiable solution, this paper presents novel and accurate classification models. The Ridge Classifier model is the first; it has been used to classify IBM employee attrition and gave an accuracy of 92.7%. Random Forest had the highest accuracy for predicting Job Involvement, with an accuracy rate of 62.3%. Similarly, Logistic Regression was the model selected to predict Work-Life Balance; it has a 64.8% accuracy rate, making it an acceptable classification model. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023.
An exploration of the impact of Feature quality versus Feature quantity on the performance of a machine learning model
About 0.62 trillion bytes of data are generated every hour globally, and these figures keep increasing as a result of digitalization and social networks. Data ecosystems capture, store, and manage this big data; the basis is to be able to analyze the information and extract its value. This is a gold mine for companies researching and using this data, and it shows how essential and valuable data is in this growing age. For any machine learning model, the selection of data is necessary. In this paper, several experiments have been performed to check the impact of data quality vs. data quantity on model performance. This amounts to comparing the data's richness in terms of feature quality (e.g., features in images) against the amount of data available to a machine learning model. Images are classified into two sets based on features; redundant features are then removed, and a machine learning model is trained. The model trained on non-redundant data gives the highest accuracy (>80%) in all cases versus the one trained with all features, proving the importance of feature variability and not just feature count. © 2023 IEEE.
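One simple way to realize the redundant-feature removal described above is a greedy filter on pairwise Pearson correlation: keep a feature only if it is not nearly collinear with a feature already kept. This sketch is an assumption about the mechanism, not the paper's exact procedure:

```python
def drop_redundant(features, names, threshold=0.95):
    """Greedily drop every feature whose absolute Pearson correlation
    with an already-kept feature exceeds `threshold`. `features` is a
    list of columns (one list of values per feature)."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a) ** 0.5
        vb = sum((y - mb) ** 2 for y in b) ** 0.5
        return cov / (va * vb) if va and vb else 0.0
    kept = []
    for i, col in enumerate(features):
        if all(abs(corr(col, features[j])) <= threshold for j in kept):
            kept.append(i)
    return [names[i] for i in kept]
```

A feature that is an exact rescaling of another (correlation 1) is dropped, while a weakly correlated feature survives; the surviving set preserves feature variability with fewer columns.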
Assessing Academic Performance Using Ensemble Machine Learning Models
Artificial Intelligence (AI) can play a vital role in forecasting and predicting the academic performance of students. Societal factors such as family size, education and occupation of parents, and students' health, along with details of their behavioral absenteeism, are used as independent variables for the analysis. To perform this study, a standardized dataset is used with 1044 data instances and a total of 33 unique variables constituting the feature matrix. Machine learning (ML) algorithms such as Support Vector Machine (SVM), Random Forest (RF), Multilayer Perceptron (MLP), LightGBM, and Ensemble Stacking (ES) are used to assess the specified dataset. Finally, an ES model is developed and used for assessment. Comparatively, the ES model outclassed the other ML models with a test accuracy of 99.3%. Apart from accuracy, other performance metrics are used to evaluate the algorithms. © 2023 IEEE.
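Ensemble stacking combines base-model outputs through a meta-learner. A minimal sketch of the prediction side (a logistic meta-learner over base-model probabilities, with hypothetical weights, not the study's trained model):

```python
import math

def stack_predict(base_probs, meta_weights, bias=0.0):
    """Ensemble stacking at inference time: base-model probabilities act
    as meta-features, and a logistic-regression meta-learner combines
    them into a final probability and class label."""
    z = bias + sum(w * p for w, p in zip(meta_weights, base_probs))
    prob = 1.0 / (1.0 + math.exp(-z))
    return int(prob >= 0.5), prob
```

In a full pipeline the meta-weights are fitted on out-of-fold base-model predictions, so the meta-learner learns how much to trust each base model rather than memorizing their training-set outputs.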
User Sentiment Analysis of Blockchain-Enabled Peer-to-Peer Energy Trading
A new way for the general public to consume and trade green energy has emerged with the introduction of peer-to-peer (P2P) energy trading platforms. Thus, how a peer-to-peer energy trading platform is designed is crucial to facilitating the trading experience for users. A data mining method is used in this study to assess the elements affecting the P2P energy trading experience. A Natural Language Processing (NLP) approach is also used to evaluate the variables that affect the P2P energy trading experience and to examine the role of topic modeling in topic extraction using LDA. The findings show that the general public was more interested in the new technology and in how the energy coin payment system operated during the trade process. This explanation of energy as a CC is an outlier that fits well with the conventional literature. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023.
Probing the Role of Information and Communication Technology (ICT) in Enhancing Research: An Epilogue of Accessible Research Tools
Information and Communication Technology (ICT) has revolutionized the way researchers conduct their work. It has enabled them to access a wealth of information through online databases, collaborate with colleagues across the globe, and analyze vast amounts of data quickly and accurately. This paper explores the role of ICT in enhancing research tools, highlighting the benefits it provides to researchers in terms of increased efficiency, improved accuracy, and greater access to resources. It also discusses some of the challenges associated with using ICT in research, such as data security and privacy concerns, and offers potential solutions. Overall, the paper concludes that ICT is an essential tool for researchers and will continue to play an increasingly important role in advancing scientific knowledge and innovation. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2023.
A Scoping review of Deep Reinforcement Learning methods in Visual Navigation
Reinforcement Learning (RL) is a subset of Machine Learning that trains an agent to make a series of decisions and take action by interacting directly with the environment. In this approach, the agent learns to attain the goal by the response from its action as rewards or punishment. Recent advances in reinforcement learning combined with deep learning methods have led to breakthrough research in solving many complex problems in the field of Artificial Intelligence. This paper presents recent literature on autonomous visual navigation of robots using Deep Reinforcement Learning (DRL) algorithms and methods. It also describes the algorithms evaluated, the environment used for implementation, and the policy applied to maximize the rewards earned by the agent. The paper concludes with a discussion of the new models created by various authors, their merits over the existing methods, and a briefing on further research. © 2023 IEEE.