Browse Items (11855 total)
HULA: Dynamic and Scalable Load Balancing Mechanism for Data Plane of SDN
Multi-rooted topologies are used in large-scale networks to provide greater bisection bandwidth. These topologies support a high degree of multipathing, probing, and link utilization, and an end-to-end load-balancing strategy is required to use the bisection bandwidth effectively. HULA (Hop-by-hop Utilization-aware Load balancing Architecture) monitors congestion to determine the best path to the destination, but needs to be evaluated in terms of scalability. The authors of this paper, through artifact research methodologies, stretch the scalability up to 1000 nodes and further evaluate the performance of HULA on a software-defined networking platform over the ONOS controller. The HULA algorithm is investigated in detail and compared with four proficient large-scale load-balancing mechanisms: connection hash, weighted round-robin, the Data Plane Development Kit (DPDK) technique, and a Stateless Application-Aware Load-Balancer (SHELL). © 2023 IEEE. -
Rescue Operation with RF Pose Enabled Drones in Earthquake Zones
The main objective of this research is to use machine learning algorithms to locate people stranded by an earthquake or other major disaster. Disasters are often unpredictable, they can result in significant economic loss, and survivors may struggle with despair and other mental health issues. The time available, the victim's precise location, the victim's likely condition, and the resources and manpower on hand are the main challenges the rescue team must deal with. This article examines a model that gathers data and uses it to predict risk and the probability of finding the shortest route to the person in need. A drone equipped with RF-pose technology and EHT sensors can locate individuals trapped inside a collapsed structure, and Dijkstra's algorithm is used to determine the dataset's extreme points and the shortest route to the victim's location. The primary aim of this article is to discuss the idea of applying these ML (Machine Learning) algorithms and creating a model that aids in rescuing those trapped beneath collapsed buildings. Devices that are part of the Internet of Things (IoT) have grown in popularity over the past few years as a result of their capacity for data collection and transmission. Particularly in disaster management, search and rescue operations, and related disciplines, drones have proven to be useful IoT devices. These tools are well suited to emergency response because they can reach locations that are hard to access or too dangerous for humans. Drones with cameras and other sensors can be used in disaster management to gather real-time data on the severity of the damage caused by earthquakes and other disasters. They can help map out the afflicted area, find survivors, and spot dangerous places that should be avoided.
The rescue operation can then be planned and resources allocated more efficiently using this information. In search and rescue operations, drones can find and follow people who are stuck or lost. Drones can be equipped with the RF-pose sensors used in this research to assist in locating people buried under debris, and thermal-camera-equipped drones can locate people in low-light or night-time conditions by detecting their body heat. The capacity of drones to offer real-time data is one of their key benefits, particularly in disaster management. © 2023 IEEE. -
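The shortest-route step above relies on Dijkstra's algorithm. As a minimal illustration (not the authors' implementation), here is a priority-queue sketch over a hypothetical debris-field waypoint graph; the node names and edge costs are invented:

```python
import heapq

def dijkstra(graph, source):
    # graph: {node: [(neighbor, edge_cost), ...]}
    dist = {source: 0.0}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical debris field: nodes are waypoints, weights are traversal cost.
field = {
    "base": [("a", 2), ("b", 5)],
    "a": [("victim", 4)],
    "b": [("victim", 1)],
}
print(dijkstra(field, "base")["victim"])  # → 6.0
```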
GNSS Signal Obstruction Removal Tool for Evaluating and Improving Position Accuracy in Satellite Networks
The positioning accuracy of Global Navigation Satellite System (GNSS) is largely affected by the site's surroundings. However, the methods to simulate GNSS signal obstruction and the nature of signal obstruction have not yet been explored fully. In this research, we investigated a way to remove the signals received from a specific region by specifying azimuth and elevation from GNSS observation files, and evaluated how the removal of signals affects GNSS positioning accuracy. In addition, we also investigated the signal blockage for buildings of certain dimensions and a mountain. Python was used as the programming language to develop a program for the signal removal. RTKPOST was used for the GNSS data processing, and RTKPLOT was used for the visualisation of processed data and analysis of positioning accuracy. We successfully developed a Python script to remove the signals in a GNSS data file from a specific region by specifying azimuth and elevation. It was also found that removing signals from azimuth 0 to 100 degrees and elevation 0 to 30 degrees increased the positioning accuracy within a low-multipath dataset. However, when the maximum elevation angle was increased to 45 degrees, positioning accuracy degraded, indicating that signals from certain elevations have a positive or negative impact on positioning accuracy. Further research avenues are explored as an extension of the work done here. © 2023 IEEE. -
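The signal-removal step described above amounts to filtering out observations whose satellite azimuth and elevation fall inside a masked sky region. A minimal sketch, assuming hypothetical observation records with `az`/`el` fields rather than the authors' RINEX-parsing code:

```python
def remove_obstructed(observations, az_range=(0, 100), el_range=(0, 30)):
    """Drop observations whose satellite lies inside the masked sky region.

    observations: list of dicts with 'az' and 'el' in degrees (hypothetical
    records that would be parsed from a GNSS observation file).
    """
    az_lo, az_hi = az_range
    el_lo, el_hi = el_range
    return [o for o in observations
            if not (az_lo <= o["az"] <= az_hi and el_lo <= o["el"] <= el_hi)]

obs = [{"sat": "G01", "az": 45, "el": 10},   # inside mask -> removed
       {"sat": "G07", "az": 200, "el": 50},  # azimuth outside -> kept
       {"sat": "G12", "az": 90, "el": 40}]   # elevation outside -> kept
print([o["sat"] for o in remove_obstructed(obs)])  # → ['G07', 'G12']
```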
System Design for Financial and Economic Monitoring Using Big Data Clustering
Economic data management is becoming increasingly important for the longevity and improvement of enterprises due to the constant expansion in the influence of information technology. This study lays out an enterprise economic data management structure for the intricate internal business of managing enterprise economic data. It also applies web-based big data technology to ensure the fairness, reliability, and security of system database calculations, mainly to improve office capabilities and solve daily project management problems. The aim is to evaluate the suitability of the transfer clustering algorithm (DCA) for managing large amounts of data in energy systems and the suitability of data-economic dispatch methods for harnessing new energy sources. Day-ahead dispatch plans are then combined with continuous dispatch plans to create a multi-period, data-economic dispatch model, and the calculations are illustrated using a case study on the use of new energy. This enables new energy in multi-period data-economic dispatch models while meeting demand-response (DR) requirements on the customer side. © 2023 IEEE. -
An Innovative Method for Housing Price Prediction using Least Square - SVM
The House Price Index (HPI) is often employed to forecast housing-market shifts. However, individual house prices cannot be predicted using HPI alone due to the substantial correlation between housing price and other characteristics such as location, area, and population. While several articles have used conventional machine learning methods to predict housing prices, these methods tend to focus on the market as a whole rather than on the performance of individual models. In addition, good data preprocessing methods are established to boost the precision of the machine learning algorithms. The data is normalized and put to use, features are selected using the correlation coefficient, and LSSVM is employed for model training. The proposed approach outperforms other models such as CNN and SVM. © 2023 IEEE. -
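For readers unfamiliar with LSSVM, the model replaces the SVM's inequality constraints with equalities, so training reduces to solving a single linear system. A minimal NumPy sketch of the standard LS-SVM regression dual (not the paper's implementation; the kernel width, regularization value, and toy data are illustrative assumptions):

```python
import numpy as np

def lssvm_fit(X, y, gamma=1000.0, sigma=1.0):
    """Solve the LS-SVM dual system [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))        # RBF kernel matrix
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma         # ridge term from the equality constraints
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0], X, sigma          # alpha, bias, support data, kernel width

def lssvm_predict(model, Xq):
    alpha, b, X, sigma = model
    d2 = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)) @ alpha + b

# Toy "housing" data: one feature, near-linear prices
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([10.0, 12.0, 14.0, 16.0])
model = lssvm_fit(X, y)
print(lssvm_predict(model, X))  # close to [10, 12, 14, 16]
```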
Into the Dark World of User Experience: A Cognitive Walkthrough Study
In this age of AI, the unison of man and machine is going to be more prominent than ever, creating a need to understand, from a psychological point of view, the underlying frameworks adopted by app designers and developers. Research on the benefits and harmful effects of user experience design, and on developing interventions and regulations to moderate the use of dark strategies in digital tools, is the need of the hour. This paper calls for ethical consideration in designing the experience of users by looking at the unethical practices that currently exist. The purpose of the study was to understand the cognitive, behavioural, and affective experience of dark patterns in end users. There is a scarcity of scientific literature with regard to dark patterns. This paper adopts the methodology of user cognitive walkthrough with 6 participants, whose transcripts were analysed using thematic network analysis. The results are presented in the form of a thematic network. A few examples of the themes found are the experience of manipulation in users, rebellious attitudes, and automatic or habitual responses. These findings provide a basis for an in-depth understanding of dark patterns in user experience and provide themes that will help future researchers and designers develop ethical and more enriching user experiences. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG. -
Artificial Intelligence and Machine Learning Combined Security Enhancement Using ENIGMA
Enigma is a relatively new and emerging field that has the potential to bring significant benefits to the way contracts are executed and managed. The integration of Artificial Intelligence (AI) into smart contract technology can automate repetitive tasks, reduce the need for human intervention, improve decision-making, and provide transparency and trust. It can also provide more flexibility, handle more complex tasks, learn from past experiences, offer predictive capabilities, and retain human oversight and intervention. All these features make Enigma contracts more advanced than traditional smart contracts. AI-powered smart contracts, or Enigma contracts, can also improve contract execution, increase efficiency, facilitate better negotiation, and enable automated dispute resolution. However, as the technology is still in its early stages, major challenges and risks remain, including the need for robust security and the potential for AI to make decisions that are not in the best interests of the contracting parties. Despite these challenges, the potential benefits of AI-powered smart contracts make them an area of ongoing research and development that is worth exploring further. Enigma can be applied in various fields and can be used to secure sensitive information through a robust security system. An Enigma contract is an AI-powered smart contract used to automate decision-making processes and improve their efficiency; as the name suggests, Enigma is a complex security network that has the potential to revolutionize security systems by increasing efficiency. © 2023 IEEE. -
Portfolio Optimization Using Quantum-Inspired Modified Genetic Algorithm
Optimization of portfolios has an additional level of complexity and has been an area of interest for both financial leaders and artificial intelligence experts. In this article, a quantum-inspired version of an improved genetic algorithm is proposed for the task of portfolio optimization. An effort is made to implement two different genetic versions along with their extension in the quantum-inspired space. Improvements to the popular crossover techniques, viz. (i) arithmetic and (ii) heuristic crossover are proposed to reduce computational time. 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. -
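The two crossover variants named in the abstract can be illustrated on real-valued chromosomes such as portfolio weight vectors. This is a textbook-style sketch under assumed parameters (`alpha`, `r`), not the authors' modified or quantum-inspired operators:

```python
def arithmetic_crossover(p1, p2, alpha=0.5):
    # Child is a convex combination of the two parent weight vectors.
    return [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]

def heuristic_crossover(better, worse, r=0.7):
    # Move from the worse parent toward, and beyond, the fitter one.
    return [b + r * (b - w) for b, w in zip(better, worse)]

p1, p2 = [0.2, 0.8], [0.6, 0.4]       # hypothetical portfolio weight vectors
print(arithmetic_crossover(p1, p2))   # ≈ [0.4, 0.6]
print(heuristic_crossover(p1, p2))    # child extrapolated past p1
```

In practice a repair step (renormalizing weights to sum to one, clipping negatives) would follow each crossover so children remain valid portfolios.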
Design and Development of Mobile Robot Manipulator for Patient Service During Pandemic Situations
Time and manpower are important constraints for completing large-scale tasks in this rapidly growing civilization. In most regular and frequently performed works, such as welding, painting, assembly, container filling, and so on, automation plays a vital part in reducing human effort. One of the key and most commonly performed activities is picking and placing objects from source to destination. Constant monitoring of patient bodily indicators such as temperature, pulse rate, and oxygen level, along with service of the patients, becomes challenging for nurses and medical staff in the current pandemic condition. In consideration of this, a mobile robot with an integrated robotic arm has been designed and developed which can be available for the service of patients continuously, alongside monitoring them in the general ward as well as in the ICU of hospitals. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. -
An Efficient and Optimized Convolution Neural Network for Brain Tumour Detection
Brain tumour is a life-threatening disease and can affect children and adults. This study focuses on classifying MRI scan images of the brain into one of 4 classes: glioma tumour, meningioma tumour, pituitary tumour, and normal brain. A person affected by a brain tumour will need treatments such as surgery, radiation therapy, or chemotherapy. Pretrained Convolution Neural Networks such as VGG19, MobileNet, and AlexNet have been widely used for image classification using transfer learning. However, due to huge storage-space requirements, these are not effectively deployed on edge devices for the creation of robotic devices. Hence, a compressed version of these models has been created using a Genetic Algorithm, occupying only around 30-40% of the original space with an inference time reduced by around 50% relative to the original model. The accuracy provided by VGG19, AlexNet, MobileNet, and the proposed CNN before compression was 92.18%, 89.45%, 93.75%, and 96.85% respectively. Similarly, the accuracy after compression for VGG19, AlexNet, MobileNet, and the proposed CNN was 91.34%, 88.92%, 94.40%, and 95.29%. © 2023, Springer Nature Switzerland AG. -
On-board Converter for Electric Vehicle Charging using LCLC Resonant Topology
Due to their high efficiency, high power density, and soft-switching characteristics, LLC-based AC-DC resonant converters are a great choice for EV chargers. Adding a capacitor across the magnetizing inductance of the LLC resonant architecture (the LCLC configuration) enhances efficiency and reduces the need for a larger series inductor. The output DC voltage of the converter is generally regulated using switching-frequency control. However, the power factor of the converter varies significantly with the switching frequency, so any fluctuation in load may cause the converter to operate at a lower power factor. This paper proposes a single-stage topology based on the LCLC resonant structure. Zero-voltage switching (ZVS) of the IGBTs used in the converter is ensured by the LCLC resonant configuration. The converter has a power factor correction (PFC) stage at its front end to achieve natural power factor correction. Since the PFC stage and the resonant stage are controlled by the same switches, the converter is smaller and less costly. Simulations in MATLAB/Simulink are used to validate the topology. © 2023 IEEE. -
Human Resource Management in the Power Industry Using Fuzzy Data Mining Algorithm
Currently, data mining is a frontier study area of database and information technology. It is acknowledged as one of the essential technologies with the greatest potential. Data mining draws on numerous technologies with comparatively high technical substance, including artificial intelligence, neural networks, fuzzy theory, and mathematical statistics, and its realization is challenging. Job satisfaction is one of several factors that cause employees to leave or switch jobs, and it is also closely tied to the organization's human resource management (HRM) procedures. It is continuously difficult, and at times beyond the HR office's control, to retain highly qualified and talented workers, yet data mining can play a part in identifying those workers who are likely to leave an organization, permitting the HR division to plan an intervention methodology or search for alternatives. We have analysed the major concepts, techniques, and algorithms of association rule mining technology in this article. When integrated into the human resource management systems of schools and colleges, they effectively completed association mining, realised visualisation, and ultimately revealed valuable information. © 2023 IEEE. -
A Cooperative Global Sequencing Algorithm for Distributed Wireless Sensor Networks
Data gathering is a fundamental use of wireless sensor networks. Sensor units are distributed over the area to be monitored and can measure quantities of interest such as temperature, pressure, humidity, and sunlight. The sensed data is sent to a centralized device called a sink, or simply a base station. Networks are frequently distributed in character, meaning that more than one kind of instrument is placed in a particular area; uniform networks contain only one kind of component. A tree rooted at the sink is created after the nodes have been distributed. In distributed networks, flawless aggregation is challenging to accomplish because, in contrast to uniform networks, nodes may receive and transmit multiple types of packets. Every message should be forwarded by a node to a parent where it can be combined, in order to increase the likelihood of aggregation; as a result, a node might need to choose more than one parent. This implies that various parameters should be taken into account while forming trees. We have improved the combined distributed scheduling and tree-generation method for distributed networks suggested in the literature. We find that the extended method maximizes aggregation, schedules the network with fewer time slots, and uses less energy. Additionally, it is found that distributed networks require more management cost to schedule than uniform networks do. © 2023 IEEE. -
Trust Model for Cloud Using Weighted KNN Classification for Better User Access Control
The majority of the time, cloud computing is a service-based technology that provides Internet-based technological services. Cloud computing has seen explosive growth since its debut and is now integrated into a wide variety of online services. Its primary benefit is allowing thin clients to access resources and services. Even while it may appear favorable, there are many potential weak points for various types of attacks and cyber threats. Access control is one of the several protection layers available as part of cloud security solutions. In order to improve cloud security, this research introduces a unique access control mechanism. For granting users access to various resources, the suggested approach applies the concept of trust. For the purpose of predicting trust, the KNN model was recently proposed; however, the existing classification approach is sensitive and unstable, particularly when an unbalanced data scenario occurs. Furthermore, it has been discovered that using the exponent distance as a weighting scheme improves classification performance and lowers variance. The prediction of users' trust levels using weighted K-nearest neighbors is presented in this research. According to the findings, the suggested approach is more effective in terms of throughput, cost, and delay. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. -
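The exponent-distance weighting idea above can be sketched as follows. The trust labels, feature vectors, and decay rate `beta` are invented for illustration; this is not the paper's model:

```python
import math
from collections import defaultdict

def weighted_knn(train, query, k=3, beta=1.0):
    """train: list of (feature_vector, label); votes decay as exp(-beta * distance)."""
    nearest = sorted((math.dist(x, query), label) for x, label in train)[:k]
    votes = defaultdict(float)
    for d, label in nearest:
        votes[label] += math.exp(-beta * d)   # exponent-distance weight
    return max(votes, key=votes.get)

# Hypothetical user behaviour features mapped to trust labels
train = [([0.10, 0.20], "trusted"),   ([0.15, 0.25], "trusted"),
         ([0.90, 0.80], "untrusted"), ([0.85, 0.90], "untrusted")]
print(weighted_knn(train, [0.2, 0.2]))  # → trusted
```

Because a far-away neighbour's vote decays exponentially, a minority class close to the query can still win, which is the claimed benefit under class imbalance.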
Metaheuristics-based Task Offloading Framework in Fog Computing for Latency-sensitive Internet of Things Applications
Internet of Things (IoT) applications have increased tremendously in popularity within a short span of time due to the wide range of services they offer. In the present scenario, IoT applications rely on cloud computing platforms for data storage and task offloading. Since IoT applications are latency-sensitive, depending on a remote cloud datacenter further increases the delay and response time. Most IoT applications therefore shift from cloud to fog computing for improved performance and lower latency. Fog enhances the Quality of Service (QoS) of connected applications by providing low latency. Different task-offloading schemes in fog computing have been proposed in the literature to enhance the performance of IoT-fog-cloud integration. The proposed methodology focuses on constructing a metaheuristic-based task-offloading framework in the three-tiered IoT-fog-cloud network to enable efficient execution of latency-sensitive IoT applications. The proposed work utilizes two effective optimization algorithms: the Flamingo search algorithm (FSA) and the Honey badger algorithm (HBA). Initially, the FSA is executed in an iterative manner, with the objective function optimized in every iteration. The best solutions from this algorithm are then fine-tuned using the HBA to refine the solution, and the output of the HBA is the optimized outcome of the proposed framework. Finally, evaluations are carried out separately on different scenarios to prove the performance efficacy of the proposed framework, which obtains a task-offloading time of 71 s along with a lower degree of imbalance and lower latency when compared with existing techniques. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. -
Swarm Intelligence Decentralized Decision Making In Multi-Agent System
This research aims to understand how groups of agents can make decisions collectively without relying on a central authority. The research could focus on developing algorithms and models for distributed problem solving, such as consensus-reaching and voting methods, or for coordinating actions among agents in a decentralized manner. The research could also look into the application of these methods in various fields like distributed robotics, swarm intelligence, and multi-agent systems in smart cities and transportation networks. Swarm intelligence in decentralization is an emerging field that combines the principles of swarm intelligence and decentralized systems to design highly adaptive and scalable systems. These systems consist of a large number of autonomous agents that interact with each other and the environment through local communication and adapt their behaviors based on environmental cues. The decentralized nature of these systems makes them highly resilient and efficient, with potential applications in areas such as robotics, optimization, and blockchain technology. However, designing algorithms and communication protocols that enable effective interaction among agents without relying on a centralized controller remains a key challenge. This article proposes a model for swarm intelligence in decentralization, including agents, communication, environment, learning, decision-making, and coordination, and presents a block diagram to visualize the key components of the system. The paper concludes by highlighting the potential benefits of swarm intelligence in decentralization and the need for further research in this area. © 2023 IEEE. -
Efficient Method for Tomato Leaf Disease Detection and Classification based on Hybrid Model of CNN and Extreme Learning Machine
Throughout India, most people make a living through agriculture or a related industry. Crops and other agricultural output suffer significant quality and quantity losses when plant diseases are present, and detecting these illnesses is the key to preventing losses in the harvest and quantity of agricultural products. Improving classification accuracy while decreasing computational time is the primary focus of the suggested method for identifying leaf disease in the tomato plant. Pests and illnesses wipe out thousands of tons of tomatoes in India's harvest every year, and tomato leaf disease endangers the agricultural industry, generating substantial losses for producers. Scientists and engineers can improve their models for detecting tomato leaf diseases if they have a better understanding of how algorithms learn to identify them. This paper proposes a unique method for detecting diseases on tomato leaves using a five-step procedure that begins with image preprocessing and ends with feature extraction, feature selection, and model classification. Preprocessing is done to improve image quality, and an improved K-Means image segmentation technique is proposed as a key intermediate step. The GLCM feature extraction approach is then used to extract relevant features from the segmented image, and Relief feature selection is used to discard irrelevant features before classification. Finally, classification techniques such as CNN and ELM are used to categorize infected leaves. The proposed hybrid approach outperforms the two standalone models, CNN and ELM. © 2023 IEEE. -
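The K-Means segmentation step can be illustrated on grayscale intensities. This is plain 1-D K-Means on a toy pixel list (the "leaf image" values are invented), not the paper's improved variant:

```python
import random

def kmeans_1d(pixels, k=2, iters=20, seed=0):
    """Cluster grayscale pixel intensities into k groups (Lloyd's algorithm)."""
    random.seed(seed)
    centers = random.sample(pixels, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            nearest = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Hypothetical leaf image: dark lesion pixels vs bright healthy tissue.
pixels = [12, 15, 18, 20, 200, 210, 215, 220]
print(sorted(round(c) for c in kmeans_1d(pixels, k=2)))  # → [16, 211]
```

On a real image the same loop would run over all pixel values (or colour vectors), and the cluster assignments would form the segmentation mask handed to GLCM feature extraction.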
Heart Disease Prediction: A Computational Machine Learning Model Perspective
Relying on medical instruments to predict heart disease is either expensive or inefficient. It is important to detect cardiac diseases early to avoid complications and reduce the death rate. This research aims to compare various machine learning models using supervised learning techniques to find a better model that gives the highest accuracy for heart disease prediction. This research compares standalone and ensemble models for prediction analysis. Six standalone models are logistic regression, Naive Bayes, support vector machine, K-nearest neighbors, artificial neural network, and decision tree. The three ensemble models include random forest, AdaBoost, and XGBoost. Feature engineering is done with principal component analysis (PCA). The experimental process resulted in random forest giving better prediction analysis with 92% accuracy. Random forest can handle both regression and classification tasks. The predictions it generates are accurate and simple to comprehend. It is capable of effectively handling big datasets. Utilizing numerous trees avoids and inhibits overfitting. Instead of searching for the most prominent feature when splitting a node, it seeks out an optimal feature among a randomly selected feature set in order to minimize the variance. Due to all these reasons, it has performed better. 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. -
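The PCA feature-engineering step mentioned above can be sketched via the SVD of the centred data matrix. The patient-feature matrix here is invented for illustration and is not the study's dataset:

```python
import numpy as np

def pca_transform(X, n_components=2):
    """Project rows of X onto the top principal components (via SVD)."""
    Xc = X - X.mean(axis=0)          # centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # scores on the leading components

# Hypothetical patient records: 4 samples x 3 correlated features.
X = np.array([[1.0, 2.0, 0.5],
              [2.0, 4.1, 1.0],
              [3.0, 5.9, 1.4],
              [4.0, 8.0, 2.1]])
Z = pca_transform(X, n_components=2)
print(Z.shape)  # → (4, 2)
```

The reduced matrix `Z` would then feed the downstream classifiers (random forest and the others compared in the study).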
A Data Mining approach on the Performance of Machine Learning Methods for Share Price Forecasting using the Weka Environment
It is widely agreed that share prices are too volatile to be reliably predicted. Several experts have worked to improve the likelihood of generating a profit from share investing using various approaches and methods. When used in reality, these methods and algorithms often have too low a success rate to be helpful; the extreme volatility of the marketplace is a significant contributor. This article demonstrates the use of data mining tools such as WEKA to study share prices. For this research, we have selected an HCL Tech share. Multilayer perceptron, Gaussian process, and sequential minimal optimization have been employed as the three prediction methods; these algorithms, which develop optimal rules for share market analysis, are available in Weka. We used the attributes of open, high, low, close, and adj-close prices and forecasted the share price for the next 30 days. Actual and predicted values of the three models are compared side by side, and the 1-step-ahead and future forecasts of the three models are visualized. The evaluation metrics RMSE, MAPE, MSE, and MAE are calculated, and the outcomes achieved by the three methods are contrasted. Our experimental findings show that sequential minimal optimization provided more precise results than the other methods on this dataset. © 2023 IEEE. -
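The four evaluation metrics named above have standard definitions, sketched here in plain Python; the actual/predicted price lists are hypothetical:

```python
def regression_metrics(actual, predicted):
    """Compute MAE, MSE, RMSE, and MAPE (in percent) for paired series."""
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    mae = sum(abs(e) for e in errors) / n
    mse = sum(e * e for e in errors) / n
    rmse = mse ** 0.5
    mape = 100.0 * sum(abs(e / a) for e, a in zip(errors, actual)) / n
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "MAPE": mape}

# Hypothetical actual vs. predicted closing prices
m = regression_metrics([100.0, 102.0, 101.0], [101.0, 100.0, 101.0])
print(round(m["MAE"], 3))  # → 1.0
```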
Forecasting Bitcoin Price During Covid-19 Pandemic Using Prophet and ARIMA: An Empirical Research
Bitcoin and other cryptocurrencies are alternative and speculative digital financial assets in today's growing fintech economy. Blockchain, a decentralized technology, is essential for ensuring ownership of bitcoin. These coins display high volatility and bubble-like behavior, and the widespread acceptance of cryptocurrencies poses new challenges to the corporate community and the general public. Currency-market traders and fintech researchers have classified cryptocurrencies as speculative bubbles. This study identifies the bitcoin bubble and its breaks during the COVID-19 pandemic. From 1st April 2018 to 31st March 2021, we used high-frequency data on the daily closing price of bitcoin. Both the Prophet model and the ARIMA forecasting method were employed. We also examined the explosive bubble and found structural breaks in bitcoin using the ADF, RADF, and SADF tests; five breaks were detected in bitcoin prices from 2018 to 2021. ARIMA(1,1,0) was the best-fitting model for price prediction. Both the ARIMA and Facebook Prophet models were applied to forecasting, and the Prophet model was found to be best at forecasting prices. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
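An ARIMA(1,1,0) model is simply an AR(1) fitted to the first differences of the series. A minimal sketch (plain least squares with no drift/constant term, unlike a full ARIMA fit; the price series is invented):

```python
def arima_110_forecast(series, steps=1):
    """ARIMA(1,1,0) sketch: fit AR(1) on first differences by least squares,
    then integrate the forecast back to the price level."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    x, y = diffs[:-1], diffs[1:]                      # lagged vs. current diff
    denom = sum(v * v for v in x)
    phi = sum(a * b for a, b in zip(x, y)) / denom if denom else 0.0
    level, d = series[-1], diffs[-1]
    forecasts = []
    for _ in range(steps):
        d = phi * d                                    # next expected change
        level += d                                     # re-integrate to level
        forecasts.append(level)
    return forecasts

# Hypothetical daily closing prices
print(arima_110_forecast([100.0, 102.0, 103.0, 105.0], steps=1))  # ≈ [106.6]
```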