Browse Items (11855 total)
Enzyme immobilized conducting polymer-based biosensor for the electrochemical determination of the eco-toxic pollutant p-nonylphenol
The unbridled release of harmful endocrine disruptors (EDs) into the environment is deteriorating human and animal health. A facile and efficacious biosensor was developed by immobilizing laccase over electropolymerized poly(anthranilic acid) on a carbon fiber paper (CFP) electrode (Lac/PAA/CFP) for the detection of p-nonylphenol (PNP). PNP is a persistent phenolic endocrine disruptor and a harmful eco-toxic pollutant. Physico-chemical and electrochemical characterization of the fabricated electrode was carried out to study the modification of the Lac/PAA/CFP electrode. Cyclic voltammetric studies revealed that the prepared sensor has approximately twice the catalytic activity of the bare CFP electrode. The influence of pH and scan rate was scrutinized for the modified electrode. Under optimized conditions, differential pulse voltammetric studies were used for quantification, and the results revealed that the biosensor has a low limit of detection (LOD) and limit of quantification (LOQ) of 1.74 nM and 5 nM, respectively, with a broad linear dynamic range of 5-250 nM. In the presence of interferants, the developed biosensor exhibited good selectivity toward the electrochemical detection of PNP. Molecular docking studies revealed the hydrogen-bonding interaction with the Asn264 residue of Trametes versicolor laccase. Further, the fabricated biosensor was assessed for its practicality in real samples collected from tap water and lake water. © 2023 Elsevier Ltd -
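LOD and LOQ figures of the kind quoted above are conventionally derived from a calibration curve using the 3.3σ/S and 10σ/S rules; a minimal sketch with hypothetical DPV calibration data (the concentrations and currents below are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Hypothetical calibration data: DPV peak current (uA) vs PNP concentration (nM)
conc = np.array([5, 25, 50, 100, 150, 200, 250], dtype=float)   # linear range
current = np.array([0.51, 1.02, 1.63, 2.88, 4.11, 5.35, 6.60])  # illustrative readings

# Least-squares fit of the calibration line: i = S*c + b
S, b = np.polyfit(conc, current, 1)      # S = sensitivity (slope)
resid = current - (S * conc + b)
sigma = resid.std(ddof=2)                # standard error of the regression

LOD = 3.3 * sigma / S                    # common IUPAC-style convention
LOQ = 10.0 * sigma / S
```

By construction LOQ is about three times LOD, which is consistent with the roughly 1.74 nM / 5 nM pair reported in the abstract.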
EPCA: Enhanced Principal Component Analysis for Medical Data Dimensionality Reduction
Innovations in technology over the last decade have led to the generation of colossal amounts of medical data at comparably low cost. Medical data should be collected with utmost care. Sometimes the data have many features, but not all of the features play an important role in drawing the relations to the mining task. Not all attributes in the dataset are relevant for training machine learning algorithms: some characteristics may be negligible, and some may not influence the outcome of the forecast. The pressure on machine learning algorithms can be minimized by ignoring or removing the irrelevant attributes, but reducing the attributes carries the risk of information loss. In this research work, an Enhanced Principal Component Analysis (EPCA) is proposed, which reduces the dimensions of the medical dataset while taking paramount care not to lose important information, thereby achieving good and enhanced outcomes. The prominent dimensionality reduction techniques Principal Component Analysis (PCA), Singular Value Decomposition (SVD), Partial Least Squares (PLS), Random Forest, Logistic Regression and Decision Tree, together with the proposed EPCA, are investigated on the following Machine Learning (ML) algorithms: Support Vector Machine (SVM), Artificial Neural Networks (ANN), Naïve Bayes (NB) and Ensemble ANN (EANN), using statistical metrics such as F1 score, precision, accuracy and recall. To optimize the distribution of the data in the low-dimensional representation, EPCA directly mapped the data to a space with fewer dimensions. This is a result of feature correlation, which made it easier to recognize patterns. Additionally, because the dataset under consideration was multicollinear, EPCA aided in speeding computation by lowering the data's dimensionality and thereby enhanced the classification model's accuracy.
Due to these reasons, the experimental results showed that the proposed EPCA dimensionality reduction technique performed better when compared with other models. © 2023, The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd. -
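The abstract does not specify the EPCA enhancement itself, but the baseline it builds on, standard PCA on multicollinear data, can be sketched via the SVD (the "medical" data below are synthetic and purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic multicollinear data: 100 samples, 6 features, where the last
# three features are noisy copies of the first three (high correlation)
base = rng.normal(size=(100, 3))
X = np.hstack([base, base + 0.05 * rng.normal(size=(100, 3))])

# Standard PCA: centre the data, take the SVD, project onto top-k components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
Z = Xc @ Vt[:k].T                          # low-dimensional representation
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
```

Because the six features carry only three independent signals, three principal components retain nearly all the variance, which is exactly the multicollinearity argument the abstract makes.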
Epidemic Prediction using Machine Learning and Deep Learning Models on COVID-19 Data
A catastrophic epidemic of Severe Acute Respiratory Syndrome Coronavirus, commonly recognised as COVID-19, introduced a worldwide vulnerability to the human community. All nations around the world are making enormous efforts to tackle the outbreak of this deadly virus through various aspects such as technology, economy, relevant data, protective gear, life-saving medications and all other instruments. Artificial intelligence researchers apply knowledge, experience and skill sets to national-level data to create computational and statistical models for investigating such a pandemic condition. In order to make a contribution to this worldwide human community, this paper recommends using machine-learning and deep-learning models to understand its daily accelerating actions together with predicting the future reachability of COVID-19 across nations by using real-time information from the Johns Hopkins dashboard. In this work, a novel Exponential Smoothing Long Short-Term Memory Networks Model (ESLSTM) is proposed to predict the virus spread in the near future. The results are evaluated using RMSE and R-squared values. © 2022 Informa UK Limited, trading as Taylor & Francis Group. -
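The exponential-smoothing half of the hybrid ESLSTM can be sketched independently of the LSTM; a minimal version of simple exponential smoothing applied to a case-count series (the daily counts and the alpha value below are illustrative assumptions, not the paper's data):

```python
import numpy as np

def exp_smooth(series, alpha=0.3):
    """Simple exponential smoothing: s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    out = np.empty(len(series), dtype=float)
    out[0] = series[0]
    for t in range(1, len(series)):
        out[t] = alpha * series[t] + (1 - alpha) * out[t - 1]
    return out

# Hypothetical daily case counts with reporting spikes; smoothing damps the
# noise before the sequence would be fed to an LSTM
daily = np.array([10, 12, 40, 11, 13, 15, 60, 14, 16, 18], dtype=float)
smoothed = exp_smooth(daily, alpha=0.3)
```

The smoothed series suppresses the isolated spikes (40, 60) while tracking the underlying trend, which is the usual motivation for pre-smoothing noisy epidemic data.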
Epidemiological Transition in India and Determinants that Are Shifting Disease Burden: A Systematic Review
India's disease burden patterns are shifting towards increased morbidity and mortality from non-communicable and chronic diseases. This is one of the first studies conducted using the PRISMA guidelines and checklist to understand the role played by various determinants of health in this epidemiological transition happening in India. The search on 9 reputed bibliographic databases yielded 459 articles, and finally 58 articles were selected based on carefully curated selection criteria. The results confirm the relation between India's demographic transition and the increasing disease burden from non-communicable diseases (NCDs). 21 studies significantly associated urban residential status, increasing income, better living conditions and education with increasing NCD prevalence. 12 studies found that NCDs were more prevalent among women than men. Increased physical activity, a healthy diet and a lower hip-to-waist ratio were observed to protect against NCDs, while 9 studies found that smoking tobacco and alcohol consumption were not significantly associated with the prevalence of NCDs. It is of foremost importance that India's public health policy focus shifts towards inclusivity, as there is an affluent gradient to the increased morbidity and mortality from NCDs. Copyright © 2024 by authors, all rights reserved. -
Epigenetic Mechanisms Induced by Mycobacterium tuberculosis to Promote Its Survival in the Host
Tuberculosis, caused by the obligate intracellular pathogen Mycobacterium tuberculosis, is one of the prime causes of death worldwide. An urgent remedy against tuberculosis is of paramount importance in the current scenario. However, the complex nature of this appalling disease contributes to the limitations of existing medications. The quest for better treatment approaches is driving research in the field of host epigenomics forward in the context of tuberculosis. The interplay between various host epigenetic factors and the pathogen is under investigation. A comprehensive understanding of how Mycobacterium tuberculosis orchestrates such epigenetic factors and favors its survival within the host is in increasing demand. The modifications beneficial to the pathogen are reversible and possess the potential to be better targets for various therapeutic approaches. The mechanisms, including histone modifications, DNA methylation, and miRNA modification, are being explored for their impact on pathogenesis. In this article, we decipher the role of mycobacterial epigenetic regulators in various strategies like cytokine expression, macrophage polarization, autophagy, and apoptosis, along with a glimpse of the potential of host-directed therapies. © 2024 by the authors. -
Epilepsy Detection Using Supervised Learning Algorithms
In the current scenario, people with epilepsy suffer and isolate themselves because of seizures, so seizure detection and prediction are essential, and it is highly desirable that seizures be identified through wearable devices. Researchers have discussed this issue and outlined future developments in this field, suggesting that Machine Learning (ML) techniques could radically change how we diagnose and manage patients with epilepsy. However, as data availability has increased, Deep Learning (DL) techniques have become the most cutting-edge approach to adopt and use with wearable devices. On the other hand, large amounts of data are needed to train DL models, making overfitting problematic. DL models are created with open-source toolboxes and Python, allowing researchers to create automated systems and broaden computational accessibility. This work thoroughly overviews deep learning (DL) methods and neuroimaging modalities for automated epileptic seizure identification. It covers several MRI and EEG techniques for epileptic seizure diagnosis, along with treatment programmes designed to manage these seizures. The study also covers the difficulties in precise detection, the benefits and drawbacks of DL-based strategies, potential DL models and upcoming research in this area. © 2024 IEEE. -
Epileptic seizure detection using EEG signals and multilayer perceptron learning algorithm
Purpose: Epilepsy is a chronic neurological disorder that causes unprovoked, recurrent seizures. A seizure is a sudden rush of electrical activity in the brain, characterized by loss of consciousness and convulsions affecting the central nervous system. Epilepsy is caused by abnormal electrical discharges that lead to uncontrollable movements, loss of consciousness and convulsions. 50-80 million people in the world are affected by this disorder. Nowadays, children and adults are affected the most, and the condition is medically treated. Sometimes it may lead to death and serious injuries. In this technological world, computerized detection is an enhanced solution to protect epileptic patients from dangers at the time of a seizure. Method: The perceptron learning algorithm is a supervised learning method for binary classifiers and a simple prototype of a biological neuron in an artificial neural network. EEG is extensively documented for diagnosing and assessing brain activity and related disorders. In this paper, EEG signals are taken as the dataset for epilepsy detection. The data are represented in three domains, namely frequency, time and time-frequency, with a Chebyshev filter applied for processing the signals. Result: The approach helps protect patients from dangers at the time of a seizure. Conclusion: The neurological manifestations can be divided into two: loss of consciousness and convulsions. In this technological world, seizures can be detected in a computerized way, such as through EEG. This paper proposes epileptic seizure detection using EEG (electroencephalogram) signals and the perceptron learning algorithm. © 2020, IJSTR. -
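The perceptron learning rule named in the Method section can be sketched directly; a minimal version of the classic mistake-driven update (the two-dimensional toy features below are illustrative stand-ins for EEG-derived features, not the paper's data):

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Perceptron rule: on each misclassified sample, w += lr * y_i * x_i."""
    w = np.zeros(X.shape[1] + 1)                   # last entry is the bias
    Xb = np.hstack([X, np.ones((len(X), 1))])      # append constant-1 feature
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:                 # mistake (labels are +/-1)
                w += lr * yi * xi
    return w

# Toy linearly separable features standing in for extracted EEG features
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])                       # +1 = seizure, -1 = normal
w = train_perceptron(X, y)
pred = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

On linearly separable data the update is guaranteed to converge, which is why the perceptron is a reasonable baseline binary classifier for filtered EEG features.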
Epileptic Seizure Prediction from EEG Signals Using DenseNet
Epilepsy is a disorder in which the normal electrical pattern in the brain is disrupted, causing seizures or loss of consciousness. Seizures are harmful during various activities like swimming or driving. The electroencephalogram (EEG) is the measurement of electrical activity received from the nerve cells of the cerebral cortex. Forthcoming seizures can be predicted from the scalp EEG signal to improve quality of life. The study proposes a method of automatic epileptic seizure prediction from the raw EEG signal. The raw EEG signal is converted into an EEG signal image for automatic feature extraction and classification of the inter-ictal and pre-ictal states using a Dense Convolutional Network (DenseNet). This classification process is carried out in a manner similar to the process followed by a medical practitioner, without resorting to hand-crafted features. The public CHB-MIT EEG database is used for training, validation, and testing. An EEG signal of 1-second duration is taken as one sample. The accuracy for the classification of the inter-ictal and pre-ictal states reaches up to 94% using 5-fold cross-validation. However, the accuracy falls short in the presence of common artifacts caused by eye-blinking and muscle activities during EEG recordings. Hence, a 30-second pool-based technique is used to decide on correct state identification. The proposed pool-based technique provides an average specificity of 95.87% and a false prediction rate of 0.0413/hour. It also provides average sensitivities of 100%, 97%, and 90% for the time slots 0-5 minutes, 5-10 minutes, and 10-15 minutes before the seizure event. © 2019 IEEE. -
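The 30-second pool-based decision described above amounts to vote pooling over per-second classifier outputs; a minimal sketch (the 0.5 vote threshold and the synthetic prediction sequences are assumptions for illustration, not the paper's exact rule):

```python
import numpy as np

def pooled_decision(window_preds, threshold=0.5):
    """Declare pre-ictal only if the fraction of pre-ictal votes in the pool
    exceeds the threshold, suppressing isolated artifact-driven errors."""
    return int(np.mean(window_preds) > threshold)

# 30 per-second DenseNet-style predictions (1 = pre-ictal, 0 = inter-ictal)
noisy_interictal = [0] * 26 + [1] * 4    # a few artifact-driven false positives
mostly_preictal = [1] * 24 + [0] * 6     # genuinely pre-ictal window
flag_a = pooled_decision(noisy_interictal)
flag_b = pooled_decision(mostly_preictal)
```

Pooling trades a little latency (30 s) for robustness: isolated eye-blink or muscle artifacts cannot flip the pooled decision on their own.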
Equality Versus Discretion in Imposing Death Penalty in The Criminal Justice System : A Comparative Analysis Between India, UK and USA
The criminal justice system has two phases, namely pre-conviction and post-conviction, which are based on theories that have to be exercised by the four major organs of the administration of criminal justice, namely the police (investigation), prosecution, defence and judiciary, as well as correctional institutions. For this purpose, every legal system permits this mechanism to exercise equality and discretion at various phases such that justice is served according to the procedure established by law. Attempts to maintain a balance between the two in the sphere of criminal justice began long ago, although no country has yet fully succeeded. In the United States, more equality is emphasised in the post-conviction stage; it focuses on "offence egalitarianism" rather than "offender egalitarianism". In Europe, the position is almost the contrary. In India, strict adherence to either equality or discretion at any step cannot be traced out. However, when it comes to sentencing cruel and heinous crimes, almost all countries fix a definite punishment while leaving broad scope for judicial discretion, often ending up squeezing the discretion to attain the idealistic concept of equality. This study aims to discuss and point out the merits and demerits of the said systems, with suggestions. -
Equalization of Finite-Alphabet MMSE for All-Digital Massive MU-MIMO mm-Wave Communication
For more than twenty years, improving the performance and efficiency of wireless communication systems using antenna arrays has been an active field of study. Wireless networks with multiple-input multiple-output (MIMO) are also part of current standards and are implemented around the world. Access points or BSs with comparatively few antennas are used in standard MIMO systems, and the resulting increase in spectral efficiency was relatively modest. The capacity of a MIMO platform is researched where the transmitter outputs are processed and quantized by a set of limit quantizers through an analogue linear combining network. The linear combining weights and cutoff levels are chosen from a collection of possible combinations as a function of the transmitted signal. Millimetre-wave networking requires optimum data transmission to various devices on the same network at the same moment, in combination with massive multi-user MIMO. In order to guarantee efficient data transmission, the heavy insertion loss of wave propagation at such high frequencies needs proper channel estimation. A new channel estimation algorithm called Beamspace Channel Estimation (BEACHES) is suggested. From a set of possible configurations, the linear combining weights and quantization levels are selected for a massive antenna stream whose signals are handled by an analogue combining network as part of the channel matrix. Probable implementations of specific analogue receiver designs for the combined network model include smart antenna selection, sign thresholding of antenna outputs, or linear output processing. To demonstrate the effectiveness of BEACHES in service and report FPGA implementation results, we develop a VLSI architecture. Our results show that for large MU-MIMO mm-wave communications with hundreds of antennas, specially made denoising can be done at maximum bandwidth and in hardware form. Published under licence by IOP Publishing Ltd. -
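The finite-alphabet MMSE equalization in the title builds on the unquantized linear MMSE baseline; that baseline can be sketched for a small massive-MU-MIMO uplink (the antenna counts, noise level, seed and QPSK constellation below are illustrative assumptions, not the paper's system parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
n_tx, n_rx, sigma2 = 4, 16, 0.1          # users, BS antennas, noise variance

# Rayleigh-fading channel and unit-energy QPSK transmit symbols
H = (rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)
constellation = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
x = rng.choice(constellation, size=n_tx)
noise = np.sqrt(sigma2 / 2) * (rng.normal(size=n_rx) + 1j * rng.normal(size=n_rx))
y = H @ x + noise

# Linear MMSE equalizer: W = (H^H H + sigma2 I)^-1 H^H
W = np.linalg.solve(H.conj().T @ H + sigma2 * np.eye(n_tx), H.conj().T)
x_hat = W @ y
# Hard decisions back onto the QPSK alphabet
detected = (np.sign(x_hat.real) + 1j * np.sign(x_hat.imag)) / np.sqrt(2)
```

With 16 BS antennas serving 4 users, the array gain makes post-equalization SNR high enough that the hard decisions recover the transmitted symbols; finite-alphabet variants then quantize the equalizer itself to cut hardware cost.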
Equitable and inclusive online learning: A framework for supporting students with disabilities
Online learning has become a widely adopted mode of education, particularly during the COVID-19 pandemic. In general, individuals with disabilities face challenges when using non-technology components for studying. This chapter proposes a framework for equitable and inclusive online learning practices that support students with disabilities. The framework is based on a review of current research and best practices for online learning and disability accommodations. The framework emphasizes a collaborative, student-centered approach to online learning that acknowledges the unique needs and experiences of students with disabilities. Depending on the disabilities, the framework is divided into two phases, namely Prevalent Learning and Discrete Learning. The former comprises the components Accessibility, Accommodation, and Engagement, while the latter comprises components like Methodology and Evaluation. The proposed framework provides a roadmap for addressing the challenges faced by students with disabilities in online learning environments. © 2023 by IGI Global. All rights reserved. -
Eradication of Global Hunger at UN Initiative: Holacracy Process Enriched by Human Will and Virtue
The researchers have directed their attention to the UN's 2030 Agenda for Sustainable Development and its 17 Sustainable Development Goals (SDGs), with a specific focus on two critical objectives: hunger and poverty alleviation. While the UN has been vocal about eradicating hunger and poverty, the researchers believe that a fundamental shift in human perspective is needed. They propose a novel approach rooted in holacracy to revolutionize food production, distribution, and management. At the core of their proposal lies the ancient Indian principle Vasudhaiva Kutumbakam, which translates to "The World Is One Family". While it may seem utopian, the researchers see it as a reachable goal through holacracy. Their hypothesis centres on producing food for all and collectively utilizing it, transcending national boundaries and individual interests. The researchers advocate for a transformation in the way the UN operates by embracing holacracy as a practical social technology rather than a mere concept. Holacratic organizations, they argue, have the potential to remove barriers obstructing progress. The implementation of their vision begins with the UN functioning as a global nerve centre for data, with its 193 member nations acting as equal and interdependent contributors. This centre would display the worldwide food landscape and foster a moral and ethical awakening, emphasizing the shared responsibility for all humanity. Real-time data on food availability, supply chains, and consumption would be accessible on a public website. Holacracy, they contend, should inspire individuals to prioritize love for humanity as a panacea. Power circles interconnect to collaboratively address issues. The UN could serve as a catalyst for this transformation. The knowledge nerve centre would provide critical data on arable land, water resources, and supply chain infrastructure to facilitate problem-solving at various levels.
Timely responses and actions would be driven by the principles of holacracy and advanced digital technologies, addressing concerns hindering the achievement of UN goals. This data-driven approach, coupled with actionable plans, aims to eliminate food shortages and subsequently combat poverty and hunger worldwide. In conclusion, the researchers envision a future where holacracy and a shared sense of responsibility propel humanity towards ending hunger and poverty, with the UN playing a pivotal role as a catalyst for change and a provider of essential data and guidance. The Author(s), under exclusive license to Springer Nature Switzerland AG 2024. -
Ergos: redefining storage infrastructure and market access for small farmers in India
Learning outcomes: After completion of the case study, students will be able to analyse the path of entrepreneurship from idea generation to market development to scaling up a business, examine the impact of start-ups like Ergos on India's agriculture value chain, discuss the challenges faced by tech entrepreneurs in growing a business, identify problems solved by the Grain Bank Model, and evaluate the digitisation of farming's custodial services such as warehousing, market linkages and loans. Case overview/synopsis: The case study discusses how the founders of Ergos, a leading India-based digital AgriTech start-up, Kishor Kumar Jha and Praveen Kumar, started one of the unique models in the AgriTech landscape in India. After noticing the grim condition of small and marginal farmers in Bihar, India, Kishor and Praveen decided to put their banking and corporate experience to use in the farming sector. Ergos aimed to empower farmers by providing them with a choice on when, in what quantity, and at what price they should sell their farm produce, thus maximising their income. As a result, Ergos launched the Grain Bank Model, which provided farmers with doorstep access to end-to-end post-harvest supply chain solutions by leveraging a robust technology platform to ensure seamless service delivery. Ergos faced many challenges in its journey related to financing, marketing and distribution. Amidst these developments, it remained to be seen how Kishor and Praveen would be able to realise their goal to serve over two million farmers across India by 2025 and create a sustainable income for them through its GrainBank platform. Complexity academic level: This case study was written for use in teaching graduate and postgraduate management courses in entrepreneurship and business strategy. Supplementary materials: Teaching notes are available for educators only. Subject code: CSS 3: Entrepreneurship. © 2024, Emerald Publishing Limited. -
Escape velocity backed avalanche predictor: neural evidence from Nifty
International Journal of Recent Technology and Engineering, Vol. 8, Issue 4, pp. 486-490, ISSN: 2277-3878. -
Escitalopram treatment ameliorates chronic immobilization stress-induced depressive behavior and cognitive deficits by modulating BDNF expression in the hippocampus
Major depressive disorder (MDD) affects 21% of the global population. Chronic exposure to stressful situations may affect the onset, progression, and biochemical alterations underlying MDD and associated cognitive impairments. Patients exhibiting MDD are mainly treated with several antidepressants; one is escitalopram, a selective serotonin reuptake inhibitor. However, whether or not it mitigates chronic stress-induced cognitive deficits is unknown. The present study exposed rats to chronic immobilization stress (CIS) 2 hours/day for 10 days. Then, escitalopram (5 and 10 mg/kg, i.p.) was administered for 14 days, and the animals were subjected to the elevated plus maze, open field test, forced swim test, sucrose preference test, and radial arm maze task. A different set of animals was used to assess vascular endothelial growth factor (VEGF), glial fibrillary acidic protein (GFAP), and brain-derived neurotrophic factor (BDNF) levels in the hippocampus, frontal cortex, and amygdala. Our data suggest that escitalopram significantly protected against CIS-induced spatial learning and memory deficits, behavioral depression, and anxiety. Furthermore, escitalopram (10 mg/kg) showed a remarkable recovery of dentate gyrus and hippocampal atrophy. In addition, the restoration of the molecular markers BDNF, VEGF, and GFAP expression is also implicated in the neuroprotective mechanisms of escitalopram. Our results suggest that escitalopram restores cognitive impairments in stressed rats by regulating neurotrophic factors and astrocytic markers. © 2024 Shilpa Borehalli Mayegowda et al. This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). All Rights Reserved. -
ESG efficiency analysis in the IT industry: a DEA-based approach
Unlocking the power of sustainable growth, Environmental, Social, and Governance (ESG) principles are redefining the future of responsible investment and corporate excellence. ESG regulations ensure that organizations maintain sustainable development and improve non-monetary metrics, such as stakeholder engagement, customer satisfaction, market acceptability, societal ethics, and values. Higher ESG scores demonstrate commitment towards responsible business practices and indicate higher market value for companies, which is valid for all sectors, including IT. However, existing literature reveals that IT sector companies pay less attention to planning their operations to make them more sustainable. Therefore, IT firms must identify methods and practices to maintain high ESG scores to achieve sustainable growth. The current study leads readers into a new area of ESG with the help of an advanced method, Data Envelopment Analysis (DEA). The DEA methodology has been used to identify the decision units' relative efficiency scores and helps identify peers and followers based on ESG scores. The study reveals that among the selected IT firms, using the output-oriented strategy, 56.25% experience increasing returns to scale, 18.75% experience decreasing returns to scale, and the remaining 25% report constant returns to scale. This indicates that most IT industry firms can generate greater output change in proportion to the input change. © 2025 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group. -
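The output-oriented DEA efficiency scores mentioned above come from solving one linear program per decision-making unit (DMU); a minimal CCR-model sketch under constant returns to scale (the three firms' input/output figures below are invented for illustration, not the study's data):

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_oriented(X, Y, o):
    """Output-oriented CCR DEA for DMU o: maximise phi such that a non-negative
    combination of peers produces at least phi * y_o using at most x_o.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). phi == 1 means efficient."""
    n = X.shape[0]
    c = np.zeros(n + 1)
    c[0] = -1.0                                        # linprog minimises, so -phi
    A, b = [], []
    for i in range(X.shape[1]):                        # inputs: sum(l * x) <= x_o
        A.append(np.concatenate(([0.0], X[:, i]))); b.append(X[o, i])
    for r in range(Y.shape[1]):                        # outputs: sum(l * y) >= phi * y_o
        A.append(np.concatenate(([Y[o, r]], -Y[:, r]))); b.append(0.0)
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Hypothetical ESG data for three IT firms (illustrative numbers)
X = np.array([[10.0], [20.0], [15.0]])   # input, e.g. ESG risk exposure
Y = np.array([[50.0], [60.0], [45.0]])   # output, e.g. ESG score
phi = [dea_output_oriented(X, Y, o) for o in range(3)]
```

Here the first firm achieves the best output-to-input ratio, so its expansion factor phi is 1 (efficient); the other two could proportionally expand output by phi > 1 and are therefore inefficient, which is the peers-and-followers reading the study describes.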
EShield: An effective detection and mitigation of flooding in DDoS attacks over large scale networks
Distributed Denial-of-Service (DDoS) attacks are very hard to mitigate in wireless network environments. In this manuscript, an effective flood detection and mitigation architecture named eShield is proposed, which detects and prevents flooding attacks through a spoof-detection technique. The proposed method comprises an architecture and an algorithm. eShield employs Intrusion Protection and Detection Systems (IPDS), which collaboratively defend against flooding attacks at different points in the network. eShield detects the source node and the port numbers that are under attack. In order to reduce the burden on the global IPDS, eShield makes use of distinct local IPDS that defend collaboratively against the flooding attacks. The assessment is done through extensive simulation of eShield, and it is evaluated on the basis of time delay, false-positive rate, and computation and communication overhead. © BEIESP. -
ESIPT-AIE Active Schiff Base Fluorescent Organic Nanoparticles Based on 2-(2-(4-(4-bromo Phenyl) Thiazol-2-yl)Vinyl)Phenol (BTVP) Utilized as a Multi-Functional Fluorescent Probe
The present study reports the synthesis and characterization of the Aggregation-Induced Emission (AIE) and Excited-State Intramolecular Proton Transfer (ESIPT) active 2-(2-(4-(4-bromo phenyl) thiazol-2-yl)vinyl)phenol (BTVP). The AIE properties of BTVP in acetone/water solution are investigated, and fluorescent organic nanoparticles (FONs) with sizes ranging from 150-200 nm are prepared in various water fractions (fH2O). The established visco-chromic property suggests that the restriction of intramolecular rotation is responsible for the AIE-ESIPT behavior of the molecule, providing a means to sense viscosity. The synthesized FONs act as fluorescence chemosensors to detect Al3+ ions via a photoinduced electron transfer (PET) mechanism. Job's method, the Benesi-Hildebrand method, and 1H NMR titration confirm the 1:1 binding of BTVP with the metal ion. Studies on the emission with respect to pH reveal the high stability of the FONs over a broad range of pH, and a gradual change in the emission wavelength for the BTVP-Al3+ complex (BTVP-Al) is observed, providing a means to sense pH in the range 2-8. The solid-state photoluminescence of BTVP is used for latent fingerprint detection, demonstrating its efficiency in detecting both primary and secondary information. Additionally, both BTVP FONs and BTVP-Al are used in cell imaging, where specific nuclear staining with BTVP-Al and cytoplasm staining with BTVP are observed. © 2023 Wiley-VCH GmbH. -
ESSA Scheduling Algorithm for Optimizing Budget-Constrained Workflows
Workflows are a systematic approach for defining various scientific applications of distributed systems. They break down complicated, data-intensive processes into minor activities that can be executed serially or in parallel according to the type of application. Cloud systems need to allocate resources and schedule workflows efficiently. Despite many studies on job scheduling and resource provisioning, an efficient solution has not been found. Therefore, techniques are required to enhance resource utilization for optimal cloud computing platforms. Hence, user and provider quality of service (QoS) goals, like shortening workflows and ensuring budget limits with low energy utilization, must be considered. Enhanced Salp Swarm Optimization (ESSA) is designed to optimize makespan and QoS metrics in cloud systems. A Virtual Machine's (VM) compute capacity is related to Central Processing Unit (CPU) and memory. Size and memory demand are considered for tasks in the workflow, and task execution time is evaluated using both CPU and memory. The collated experimental outcomes convey that the newly presented technique improves the workflows' energy utilization (by up to 89%) and pushes the normalized makespan results to 3.2 ms. © 2022 IEEE.
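The abstract does not detail the ESSA enhancements, but the baseline salp swarm optimizer it builds on can be sketched; here it minimises a toy quadratic standing in for a makespan/QoS objective (the population size, iteration count, bounds and target vector are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def salp_swarm(fobj, lb, ub, n_salps=20, n_iter=100, seed=0):
    """Baseline salp swarm algorithm: a leader explores around the best-known
    solution (the 'food source') while followers chain behind it."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    pos = rng.uniform(lb, ub, (n_salps, dim))
    fit = np.apply_along_axis(fobj, 1, pos)
    best, best_f = pos[fit.argmin()].copy(), fit.min()
    for t in range(n_iter):
        c1 = 2 * np.exp(-(4 * (t + 1) / n_iter) ** 2)   # anneal exploration radius
        for i in range(n_salps):
            if i == 0:                                   # leader moves around best
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                pos[i] = np.where(c3 < 0.5, best + step, best - step)
            else:                                        # followers average forward
                pos[i] = (pos[i] + pos[i - 1]) / 2
            pos[i] = np.clip(pos[i], lb, ub)
            f = fobj(pos[i])
            if f < best_f:
                best, best_f = pos[i].copy(), f
    return best, best_f

# Toy stand-in objective: squared distance from an "ideal" schedule vector
lb, ub = np.zeros(4), np.ones(4) * 10
target = np.array([2.0, 5.0, 7.0, 1.0])
best, best_f = salp_swarm(lambda v: ((v - target) ** 2).sum(), lb, ub)
```

An enhanced variant like ESSA would typically modify the leader update or add problem-specific terms (e.g. CPU and memory costs per task), but the chain structure above is the core of the method.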