Browse Items (472 total)
An Experimental Study on Improving Speaking Skills Through the Integration of Existential Intelligence for Post Graduate Learners of Business Studies
Within the field of ESP, constrained access to discipline-specific materials has intensified the demand for refined speaking skills, particularly in conversational contexts. This reflects an evolving paradigm that emphasizes the critical need for refined and specialized speaking competencies within learners' scholarly domains. The dissertation examines the use of a learning module created for postgraduate students of Business Studies to improve their speaking skills. The study uses the Task-Based Language Teaching (TBLT) approach and excerpts from literature for pedagogical instruction, employing the Dialogic Inquiry Model (DIM) as a framework and Existential Intelligence as a guiding lens. Language and intelligence are closely intertwined. Educational psychologist Howard Gardner challenged the conventional, IQ-test-based notion of intelligence by introducing the Theory of Multiple Intelligences. Gardner's framework initially comprised seven intelligences. In 1999, he expanded the model to include Existential Intelligence (EI) as a 'half' intelligence; owing to its abstract nature and lack of clear brain localization, it has resisted precise quantification. Gardner's Intelligence Reframed (1999) was used as the primary text for the study. By contextualizing the study within the Business Studies domain of higher education, it examines how existential principles influence the development of speaking skills. Additionally, it explores how these principles contribute to shaping aspiring business entrepreneurs, providing added motivation and instilling a sense of purpose that enhances their managerial attributes. The research argues that infusing unexplored existential elements into the curriculum can stimulate critical thinking among Business Studies students, resulting in notable improvements in specific speaking dimensions: Fluency, Turn-Taking, Presentation and Negotiation skills (FTPN). Moreover, it highlights the pivotal role of this integration in reshaping ESP curricula to better cater to the needs of learners in this discipline. The research comprised two cohorts of students. The first consisted of first-year postgraduate Business Studies students at Sinhagad Institute of Management (SIOM), Pune, while the second included students from CHRIST (Deemed to be University), Pune Lavasa Campus, for a cumulative total of approximately 68 participants. The study extended over 15 days per phase to complete a comprehensive 30-hour module. The study employs a mixed-method approach combining qualitative and quantitative analysis. The qualitative analysis involved the researcher's observations and an examination of achievement-test questionnaires employing the Likert scale. For the quantitative analysis, a series of six TBLT activities was conducted, encompassing pre- and post-achievement tests, each assessed against the study's objectives. The assessment tool was the Communicative Skills Rating Scale (CSRS) by Spitzberg and Cupach (2002), featuring a four-tier evaluation system: self, partner, observer, and external evaluations. Data was collected via audio-visual methods for documentation and to cross-reference any data overlooked during the concurrent evaluation process.
The collected data underwent systematic analysis to investigate how applying EI principles can improve conversational speaking skills. The efficacy of the learning module was evaluated based on the proficiency demonstrated in TBLT activities concerning FTPN skills. TBLT activities were administered both before and after the completion of each unit. The effectiveness of the classroom pedagogy (independent variable) was gauged through the researcher's observations and achievement-test questionnaires, while participants' specific conversational skills (dependent variable) were evaluated throughout the experimental study using the CSRS scale. Data was analysed using descriptive statistics in Excel and verified using R. The research delineates apparent improvements in FTPN within the DIM framework upon the integration of this intervention. Noteworthy enhancements also included heightened motivation levels across the sample population. Both fast and slow learners exhibited advancement, with a more pronounced improvement observed among the latter group. Significant strides were also observed in non-verbal proficiencies, notably body posture and refined listening and responsive non-verbal skills, as a byproduct of the intervention. A gender-based analysis revealed an overall positive trend among both male and female students, with a comparatively greater enhancement among male students in assimilating and applying the interventions. The analysis of data obtained from the CSRS tool shows a statistically significant influence on the participants' overall English language production in terms of the FTPN variables. Moreover, progress tests provided statistically significant evidence for the efficacy of the researcher-developed learning module based on the TBLT and DIM approach integrated using EI subsets. Each phase of participants underwent separate experimentation and assessment of proficiency both before and after the intervention. The pre-achievement test revealed inadequate speaking skills and a lack of basic conversational understanding in both cohorts. Phase 1 (SIOM) showcased noticeable improvement, with growth in Fluency (39.7%), Presentation (32.1%), Negotiation (37.3%), and Turn-Taking (38.5%) on the CSRS tool. Fast learners improved by an average of 24.1%, while slow learners showed a significant average increase of 51% from their pretest scores. Moreover, there was a 15.7% increase in motivation levels during the intervention. Phase 2 (CUL) exhibited improvements in Fluency (36.35%), Presentation (35.8%), Negotiation (43.7%), and Turn-Taking (40.5%). Fast learners improved by an average of 26.9%, and slow learners saw an average increase of 49.6% from their pretest scores. Additionally, there was an 18.2% rise in motivation levels during the intervention. Overall, the analysis of CSRS data and progress tests strongly supports the effectiveness of the researcher-developed learning module based on existential principles: it significantly enhanced oral participation and the achievement of learning outcomes across both groups. The post-achievement test results showed that the module had a statistically significant influence on the participants' overall English language production.
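A minimal Python sketch of the pre/post comparison described above (the study itself used descriptive statistics in Excel verified with R; the scores, and the use of a paired t-test, are illustrative assumptions):

    import numpy as np
    from scipy import stats

    # Hypothetical CSRS pre/post means for a handful of participants.
    pre  = np.array([2.1, 2.4, 1.9, 2.8, 2.2])
    post = np.array([3.0, 3.1, 2.7, 3.6, 3.2])

    print("mean gain:", (post - pre).mean())
    print("percent gain:", 100 * (post - pre).mean() / pre.mean())
    t, p = stats.ttest_rel(post, pre)   # paired comparison on the same cohort
    print(f"paired t = {t:.2f}, p = {p:.4f}")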
In educational psychology, the theory of Multiple Intelligences has garnered substantial research attention for its application in ESL/EFL and broader school curricula, particularly in teaching English and various subjects. However, the integration of Existential Intelligence within the context of ESP remains unexplored, and its potential significance and applicability within higher education for business students could be substantial. This intelligence category, rooted in philosophy, mysticism, aesthetics, and related domains, aligns closely with fundamental areas of interest for MBA students, and its introduction could have profound implications for their learning experience and academic endeavors. The research attempts to contribute to the growing field of English for Specific Purposes for Business students by situating the study within the Pune district of Maharashtra and by analysing only FTPN, which offers further scope for exploration. -
Smart Beta Investing in India: Portfolio Construction, Implementation, and Evaluation
Smart beta strategies, having made their mark in developed markets over recent decades against the backdrop of active investing's underperformance, have recently been capturing emerging markets such as India. In this regard, the study examines the performance of smart beta strategies in long-only, multifactor, and alternative indexing frameworks in India. The study builds alternatively weighted (AW) univariate portfolios. First, cap-weighted (CW) single-factor portfolios are built. Subsequently, the portfolios are alternatively weighted and compared to the CW portfolios. Next, CW multifactor portfolios are built and compared with the single-factor portfolios. Finally, AW multifactor portfolios are built and compared with the CW multifactor portfolios. All portfolios are tested for significant performance relative to the risk-free rate, the market, and alpha under factor models. The portfolios were constructed from the constituents of the NIFTY 500, adjusting for survivorship bias. The sample period spanned 21 years, from 01/10/2000 to 30/09/2021. The hypotheses were tested using the one-sample t-test or the Wilcoxon signed-rank test for the difference in return, depending on the return distribution, and the Wald test for the difference in alpha and exposure using the Seemingly Unrelated Regression framework. The portfolios were constructed and analyzed using Python. We find mixed evidence of factor presence; the factor portfolios built on market data, such as Illiquid, Winner, Stable, and Size, offered better performance than those built on fundamental data, such as Value, Strong, and Conservative. The Integrated portfolio does not differ from the Mixed and single-factor portfolios, except for underperformance against the Illiquid portfolio. The alternative weighting offered mixed performance at the single- and multifactor levels. -
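The return-significance testing described above chooses between a one-sample t-test and a Wilcoxon signed-rank test based on the return distribution. A sketch of that selection logic in Python (the study's language), with a simulated excess-return series standing in for a real portfolio and an assumed 0.05 normality cut-off:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    excess_returns = rng.normal(0.008, 0.05, 252)   # portfolio minus risk-free (simulated)

    # Normality screen decides which test applies.
    if stats.shapiro(excess_returns).pvalue > 0.05:
        stat, p = stats.ttest_1samp(excess_returns, 0.0)
        test = "one-sample t-test"
    else:
        stat, p = stats.wilcoxon(excess_returns)    # is the median different from zero?
        test = "Wilcoxon signed-rank test"
    print(test, round(stat, 3), round(p, 4))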
Optimal Benchmarking of Quality of Service and Quality of Experience Metrics for Telecom Service Providers Using a Slack-Based Measure in Data Envelopment Analysis
With new devices and new network technologies emerging, providing services of a minimum quality has become an inevitable task. Setting feasible Service Level Agreements (SLAs) is the need of the hour. This is part of network provisioning, and providing the best possible Quality of Service (QoS) is vital and helps improve user-perceived quality, or the Quality of Experience (QoE). QoE evaluation helps Internet Service Providers (ISPs) better understand user satisfaction, and this goes hand in hand with providing adequate network QoS. Moreover, in this era of competition, the ISPs themselves must be evaluated on their QoE and QoS metrics to know their true position in the market relative to their peers/competitors. This evaluation is usually done on a per-metric basis. However, current performance data show that every ISP fares well on some metrics and needs improvement on others; no ISP fares badly on all metrics, which suggests that per-metric evaluation may be a biased way of assessing performance. Hence, this research employs an intelligent, robust mathematical technique, Data Envelopment Analysis (DEA), with its Slack-Based Measure (SBM) approach. DEA is a proven, tried-and-tested technique still used in major industries today. As a multiple-criteria evaluation methodology based on linear programming, it works well with multiple outputs and multiple inputs. DEA gives the overall relative efficiency of the ISPs, which reveals each provider's true position against its peers. The Slack-Based Measure provides the output slacks, which show the potential improvement that lagging ISPs can make to be on par with their peers/competitors. The output targets provided by the technique can be used as benchmarks for SLAs. -
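To make the slack idea concrete, here is a sketch of the closely related additive DEA model (not the exact SBM formulation of the thesis) solved with scipy's linear-programming routine; the data shapes and the evaluated-ISP index are assumptions:

    import numpy as np
    from scipy.optimize import linprog

    def additive_dea_slacks(X, Y, o):
        """X: (m, n) inputs, Y: (s, n) outputs for n ISPs; o: index of evaluated ISP."""
        m, n = X.shape
        s = Y.shape[0]
        # Variables: lambda (n), input slacks (m), output slacks (s); maximize total slack.
        c = np.concatenate([np.zeros(n), -np.ones(m), -np.ones(s)])
        A_eq = np.vstack([
            np.hstack([X, np.eye(m), np.zeros((m, s))]),   # X @ lam + s_minus = x_o
            np.hstack([Y, np.zeros((s, m)), -np.eye(s)]),  # Y @ lam - s_plus  = y_o
        ])
        b_eq = np.concatenate([X[:, o], Y[:, o]])
        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (n + m + s))
        # Output slacks are the improvement targets usable as SLA benchmarks.
        return res.x[n:n + m], res.x[n + m:]

Zero slacks mark an efficient ISP; positive output slacks quantify how far a lagging ISP sits from its peer-defined benchmark.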
Prognosis of Kidney Disease on Ultrasound Images Using Machine Learning
Kidney disease can impair the body's ability to clean the blood and filter extra water out of it. Kidney failure affects control over blood pressure and sugar levels. It can also affect red blood cell production and vitamin D metabolism, which is very important for bone health. When the kidneys are damaged, waste products and fluid build up in the body, which is harmful to health. Kidney function can worsen over time, and when the kidneys stop working completely, the condition is called kidney failure or end-stage renal disease. Not all patients with kidney disease progress to kidney failure. This disease has emerged as one of the most prominent causes of death and suffering in this century. Recent studies state that kidney disease affects much of the population and that over two million people require kidney replacement. To help prevent chronic kidney disease (CKD) and lower the risk of kidney failure, patients should control risk factors for CKD, get tested yearly, make lifestyle changes, and take medicine as needed. Detecting kidney abnormalities at an early stage helps avoid kidney impairment. Ultrasound (US) imaging is considered a preliminary diagnostic tool for finding various kidney diseases in clinical imaging, and it is one of the most commonly used modalities owing to its low cost and non-ionising nature. However, the presence of noise in US images degrades their quality and clarity, and the heterogeneous structure of the kidney makes it very difficult to detect and measure the size of stones and cysts. Hence, an automatic kidney disease detection system is in high demand. The proposed model can assist the radiologist in accurate abnormality detection. It includes several phases: pre-processing, feature extraction, classification and segmentation. The pre-processing phase includes cropping and noise removal. Further, GLCM and intensity-based features are extracted for the classification of abnormal kidney images. -
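A sketch of the GLCM texture-feature step described above, using scikit-image (the exact feature set and parameters of the proposed model are not specified, so the distances, angles, and properties here are assumptions):

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_features(roi):
        """roi: 2-D uint8 kidney region after cropping and denoising."""
        glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        # Average each Haralick-style property over the distance/angle grid.
        return {prop: graycoprops(glcm, prop).mean()
                for prop in ("contrast", "homogeneity", "energy", "correlation")}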
Subjectivity Analysis Using Social Opinion on Stress and Strain During Covid-19 Pandemic
The psychological health of people across the globe has been at great risk as a result of the COVID-19 pandemic that shook the entire world. The ubiquitous pandemic created a tectonic shift in everyone's life. People's lives underwent a severe transition with strict measures such as lockdowns and social distancing imposed by the governments of several countries to stop the spread of the viral infection. Coping with this adverse situation has been onerous, causing stress among people. The transition from normal life to a life filled with restrictions has been stressful and strenuous. Stress can be considered a state of emotional or physical tension; it can cause frustration, depression, nervousness and other mental health issues, and it also leads to strain. Social media networking sites such as X (earlier Twitter) and Facebook have become popular, and during lockdown and social distancing they served as a great platform for expressing opinions and exchanging ideas and thoughts. People expressed their stressful situations and coping mechanisms through tweets, Facebook posts and several other social media sites during the pandemic. The underlying stress and strain of a person can be analyzed through the posts that person shares on social media. Early detection of the prevalence of stress and strain is important, as medical help can then be sought quickly and the affected person can return to normalcy. Subjectivity analysis is the study of the emotions, feelings, attitudes and polarity of opinions concerning any subject matter. The present research focuses on subjectivity analysis through social opinion mining during the COVID-19 pandemic. Social opinion mining incorporates Natural Language Processing and Computational Linguistics to identify subjectivity across social media posts. -
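A minimal sketch of subjectivity/polarity scoring of the kind described above, using TextBlob as one possible NLP toolkit (the thesis does not name its tooling; the tweet is invented):

    from textblob import TextBlob

    tweet = "Lockdown has been so stressful, I can't focus on anything anymore."
    sentiment = TextBlob(tweet).sentiment
    print("polarity:", sentiment.polarity)          # -1 (negative) .. +1 (positive)
    print("subjectivity:", sentiment.subjectivity)  # 0 (objective) .. 1 (subjective)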
Artificial Intelligence Based Computational Framework for Identification and Classification of Interstitial Lung Diseases Using HRCT Images
Interstitial Lung Diseases (ILDs) refer to a wide array of respiratory disorders characterised by inflammation and scarring of the lung's interstitial tissue. These conditions affect the space within the air sacs, compromising the lungs' ability to expand and contract properly. ILDs manifest with a range of symptoms, including persistent cough, shortness of breath, and fatigue. Diagnosis of ILDs often involves imaging, mainly High-Resolution Computed Tomography (HRCT), to assess lung abnormalities. ILDs can have lasting effects on respiratory function, leading to progressive fibrosis. The primary obstacle in identifying ILDs lies in the diverse array of symptoms they present, making it challenging to distinguish them from other pulmonary disorders. HRCT is a commonly employed method in ILD diagnosis: the images provide a detailed depiction of lung tissue, revealing its size, shape, and any notable abnormalities or changes, and HRCT plays a crucial role in monitoring disease progression over time. Deep Learning (DL) excels at detecting patterns in intricate medical images that may pose challenges for traditional methods. DL algorithms can identify subtle changes in medical images indicative of pathology and can automate object detection tasks; their application in medical contexts can enrich the precision and rapidity of diagnoses. In this research, aimed at improving the accuracy of artificial intelligence (AI)-based ILD identification, we harnessed the benefits of deep learning, employing full training, Transfer Learning (TL), and ensemble voting techniques. Our approach involved constructing three Convolutional Neural Networks (CNNs) from scratch for ILD detection. Additionally, we customized InceptionV3, VGG16, MobileNetV2, VGG19, and ResNet50 for both full-training and TL strategies. This comprehensive methodology aimed to leverage DL architectures to enhance the precision of ILD identification in medical imaging. Two datasets were employed in our study: the first consisting of HRCT images and the second comprising chest X-rays. During the initial training phase of the TL models, we utilized pre-trained ImageNet weights, and the classification layers of all five models were modified for both the TL and full-training processes. To further improve outcomes, a soft-voting ensemble combined the predictions of all three newly developed CNN models (ILDNetV1, ILDNetV2 and ILDNetV3); among the individual models, ILDNetV1 achieved the highest test accuracy at 98.14%. Additionally, we incorporated machine learning (ML) models, including Logistic Regression, BayesNet, RandomForest, Multilayer Perceptron (MLP), and J48, using statistical measurements derived from the HRCT images. Our study introduces a novel AI-based system for predicting ILD categories. This system demonstrated superior performance on unseen data by leveraging the results from the newly constructed CNNs, transfer learning, and ML models, providing a more robust and accurate tool for medical diagnosis and decision-making. -
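A minimal sketch of the soft-voting step described above: average the softmax outputs of the models and take the argmax per image (the probability arrays below are invented stand-ins for ILDNetV1/V2/V3 outputs):

    import numpy as np

    def soft_vote(probabilities):
        """probabilities: list of (n_images, n_classes) softmax arrays."""
        return np.mean(np.stack(probabilities), axis=0).argmax(axis=1)

    p1 = np.array([[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]])  # ILDNetV1 (hypothetical values)
    p2 = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]])  # ILDNetV2
    p3 = np.array([[0.8, 0.1, 0.1], [0.1, 0.3, 0.6]])  # ILDNetV3
    print(soft_vote([p1, p2, p3]))                     # predicted ILD class per image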
Load Balancing Strategy for Large Scale Software Defined Networks
Programmability has left its mark on every facet of business, with technology playing an integral role. Social networking industry trends underscore technology's ubiquity in nearly every business transaction. Traditional networks grapple with numerous challenges, rendering them ill-equipped to process and handle the demands of the modern landscape effectively. The lack of programmability in these networks leads to stagnation, inhibiting their ability to evolve or enhance performance. The advent of Software Defined Networks (SDN) has introduced increased flexibility into conventional networks, opening avenues for creating innovative services. SDN technology addresses challenges in large-scale networks, offering solutions for high throughput, virtualization, fault detection, and load balancing, and providing effective network management. The rapid expansion of network services and applications in SDN environments demands sophisticated load-balancing solutions that adapt to dynamic traffic patterns and varying service requirements. This study presents a pioneering algorithm, the Dynamic Load Balancing Algorithm (DLBA), which utilizes the Programming Protocol-independent Packet Processors (P4) language. The algorithm is specifically crafted to tackle the issues associated with optimizing traffic distribution in the data plane of SDN. The P4 programming language, recognized as one of the most robust languages in this space, addresses the limitations of traditional networking, enhancing programmability and agility by distributing load across the network. The research implements the novel "Dynamic Load Balancing Algorithm" using P4 to instill dynamism and achieve load balance in large-scale networks. The P4-based implementation showcases dynamicity, scalability, flexibility, and adaptability. The research commences with a thorough examination of existing load-balancing algorithms implemented in P4, followed by a comparative analysis between these algorithms and DLBA. -
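The thesis implements DLBA in P4 in the data plane; as a language-neutral illustration only, the core idea (steer each new flow to the currently least-loaded path and update per-path counters) can be sketched in Python with invented paths:

    from collections import defaultdict

    path_load = defaultdict(int)            # bytes assigned per path so far
    paths = ["s1-s2-s4", "s1-s3-s4"]        # hypothetical equal-cost paths

    def assign_flow(flow_id, flow_bytes):
        best = min(paths, key=lambda p: path_load[p])   # least-loaded path wins
        path_load[best] += flow_bytes
        return best

    print(assign_flow("10.0.0.1->10.0.0.2:80", 1_500_000))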
Computational Methods for Detection and Recognition of Coronary Artery Stenosis in Angiogram Images
Coronary Artery Disease (CAD) is caused by stenosis of the coronary artery's lumen. This heart disease is one of the leading causes of mortality worldwide. The illness manifests as stenosis or plaque in the coronary arteries and causes atherosclerosis; it damages or clogs the heart arteries, reducing blood flow to the heart muscles and leading to heart attack. Different medical modalities exist to diagnose the disease. The standard method used by cardiologists to assess its severity is coronary angiography, in which an X-ray machine captures angiogram images at various angles during cardiac catheterization. Experts examine the data and offer differing opinions. However, most angiogram videos contain unclear images with artifacts, and because of the complex structure of the arteries, medical experts can fail to get accurate information about the damage and blockages. Based on cardiologists' suggestions, a computational model is proposed as a secondary method to detect and recognize the stenosis level from coronary angiogram images: Coronary Artery Stenosis Detection Using Digital Image Processing (CASDDIP). The proposed model can identify stenosis in an angiogram image with a precision of 98.06%, and it outperforms existing methods from the literature when evaluated on a real-time dataset. A dataset of angiogram videos and images of patients across varying age groups, acquired from a healthcare center with due consent, is used to train the model. The CASDDIP model consists of four modules: keyframe extraction and preprocessing; coronary artery segmentation; feature extraction and stenosis detection; and stenosis level classification. Initially, a novel keyframe extraction method is proposed to find the keyframe in the angiogram video. Next, a hybrid segmentation method extracts the coronary artery region from the image. Further, a method is proposed to detect stenosis by extracting and fusing different features, and the detected stenosis is categorized using the proposed stenosis level classification method. The CASDDIP model is a supporting tool to help the cardiologist during diagnosis. -
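As a stand-in for the novel keyframe extraction step named above (whose details are not given here), a generic frame-difference selector in Python with OpenCV; the threshold value is an assumption:

    import cv2

    def keyframes(video_path, threshold=30.0):
        """Return indices of frames that differ strongly from their predecessor."""
        cap, prev, picks, idx = cv2.VideoCapture(video_path), None, [], 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            if prev is not None and cv2.absdiff(gray, prev).mean() > threshold:
                picks.append(idx)           # large change => candidate keyframe
            prev, idx = gray, idx + 1
        cap.release()
        return picks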
Online Higher Education: A Mixed Method Study of Delhi NCR, India
The education system in the new era displays extensive adaptability, as proven by present-day realities. In ancient India, the Gurukul system prevailed during the Vedic period, with Gurus guiding students in an ashram or hermitage. This is quite different from the contemporary education system, which involves structured classrooms where teachers guide students in an institutional setting. The COVID-19 pandemic brought about significant changes in the contemporary education system worldwide. This, together with the emergence of Digital India and the internet, led to a structural change in modern education, from physical classrooms to online and remote classrooms. A paradigm shift occurred in which Gurus are replaced by e-gadgets, books by eBooks, and traditional classrooms by smart classrooms and online classes. The shift to remote learning has brought several challenges to the educational system, including a lack of meaningful teacher-student interaction, a lack of motivation and engagement, and impacts on mental health. This opens scope for understanding the pros and cons of the different modes of educational practice followed in Indian online higher education. The present research therefore captured the experiences of teachers and students from the Delhi NCR region regarding online higher education and the learning environment, to understand the effectiveness of online learning. The research also examines social and psychological behavior along with teachers' and students' perceptions of online learning, and explores the challenges and problems they face in it. A mixed-method approach was used to understand the changing structure of the digital classroom: along with a digital participatory approach, structured interviewing was used to gain a better, in-depth understanding of online learning. -
Sexual Function and Sexual Satisfaction among Non-Working Married Women in Bengaluru
Background: Sexual function and satisfaction are two important components of women's sexual health. Both are influenced by various external and internal factors over the life cycle. This study aims to explore the factors of sexual function and satisfaction among non-working married women in Bengaluru using an exploratory sequential research design, highlighting the implications for social work practice. Materials and Methods: The study had two phases. The first phase was a qualitative exploratory study that adopted inductive thematic data analysis. In-depth qualitative interviews were conducted with 11 non-working married women; the interviews were audio recorded and the transcribed data were analyzed with ATLAS.ti software, with results presented thematically. The second phase was a cross-sectional survey of 180 non-working married women. Data were collected through semi-structured interviews using the Female Sexual Function Index, the New Sexual Satisfaction Scale, the Psychological Distress Scale, the Subjective Happiness Scale, and questions on socio-demographic details and health. Descriptive and inferential statistics were carried out, and multiple regression analyses were conducted with jamovi v2.3 to find the predictors of both sexual function and satisfaction. Results: In the qualitative phase, various factors of sexual function and satisfaction were explored and organized into three global themes: somatic and personal factors, factors related to the mind, and situational and extrinsic factors. The quantitative study found that physical, psychological, couple-characteristics-related, family, and socio-cultural factors together predict 17.3% of the variance in sexual function and 78.6% of the variance in sexual satisfaction. Conclusion: The study identified both positive and negative factors of sexual function and satisfaction. -
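An illustrative Python analogue of the multiple-regression step described above (the study used jamovi v2.3; the file name and predictor names here are hypothetical):

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey.csv")   # hypothetical file holding the survey scores
    model = smf.ols(
        "sexual_satisfaction ~ fsfi_total + distress + happiness + age",
        data=df,
    ).fit()
    print(model.rsquared)   # cf. the 78.6% variance in satisfaction reported above
    print(model.summary())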
Computational Chemical Property Prediction and Anticancer Simulation of Heterocyclic Molecules
The Density Functional Theory (DFT) technique is popularly employed in establishing organic molecules' structural properties and reactivities. The B3LYP hybrid functional with the 6-311++G(d,p) basis set is utilised in the computational calculations with Gaussian 09W software. The DFT studies include energy minimisation (geometry optimisation), frontier molecular orbital (FMO) analyses, theoretical UV spectral computation, and natural bond orbital (NBO) evaluation. Topological analyses using Multiwfn 3.8 software are performed to evaluate the Pauli repulsion in atomic orbitals (as shown by ELF (Electron Localisation Function) maps), the areas of strong and weak pi-delocalisation in the molecules (as depicted in LOL (Localised Orbital Locator) maps), and the weak non-covalent intra-molecular interactions (as indicated in colour maps of the RDG (Reduced Density Gradient)). Pharmacological evaluation is performed using the SwissADME, ADMETLab 2.0, and PreADMET online tools. Molecular docking is performed using AutoDock Tools 1.5.6 with selected anticancer target proteins to predict the bioactivity potential of the title molecules. The molecules studied include a spiro compound, spiro[1H-indole-3,2-3H-1,3-benzothiazole]-2-one; a 2(3H)-furanone, 3,3,5-triphenylfuran-2(3H)-one; and a benzo[d]imidazole, 5,6-dichloro-1-cyclopentyl-2-(methylsulfinyl)-1H-benzimidazole. In addition, comparative studies are performed on the structure and reactivity of spirobrassinin derivatives, spirocyclic isatin-derivative analogues, and 3(2H)-furanones; these three classes of molecules have already been predicted to possess anticancer properties in vitro. Interesting properties emerge in the preliminary theoretical investigations for these molecules, particularly in the FMO, NLO and molecular docking studies. The theoretical studies explore the reactivity, structure, and stability of the molecules under study, and the biological evaluation examines their potential as lead compounds for cancer therapeutics. These studies can be extended to experimental validation and in vitro and in vivo tests to further confirm the efficacy of the anticancer action as well as the potential toxicity of the compounds. The theoretical investigations provide a database of information that could be useful to experimental scientists and medicinal chemists focused on drug design and discovery, helping them narrow down possible lead compounds from the vast chemical space of organic compounds with drug-like characteristics. -
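A small sketch of the kind of drug-likeness screen SwissADME performs, recreated with RDKit (a substitution for the online tools named above, not the study's own pipeline); the SMILES is an approximate rendering of the dichloro-benzimidazole and should be verified before use:

    from rdkit import Chem
    from rdkit.Chem import Crippen, Descriptors, Lipinski

    # Approximate SMILES for 5,6-dichloro-1-cyclopentyl-2-(methylsulfinyl)-1H-benzimidazole
    mol = Chem.MolFromSmiles("Clc1cc2nc(S(=O)C)n(C3CCCC3)c2cc1Cl")
    print("MolWt:", Descriptors.MolWt(mol))                  # rule of five: < 500
    print("LogP:", Crippen.MolLogP(mol))                     # rule of five: < 5
    print("H-bond donors:", Lipinski.NumHDonors(mol))        # rule of five: < 5
    print("H-bond acceptors:", Lipinski.NumHAcceptors(mol))  # rule of five: < 10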
Electrochemical Synthesis of Carbocyclic and Heterocyclic Motifs
-
A Study on Labeling Problems in Signed Graphs
A signed graph can be considered a weighted graph G in which the edges of G are given the weights + and −. We discuss two ways to assign signs to the vertices of the line signed graph of a signed graph, namely S-marking and canonical marking (C-marking) of L(S). We characterize signed graphs whose line signed graphs are S-cordial or total S-cordial when the vertices of L(S) receive signs through S-marking, and signed graphs whose line signed graphs are C-cordial or total C-cordial when the vertices of L(S) receive signs through C-marking. Signed graphs can be effectively used to model relationships and individual preferences toward one another. To represent this scenario we define a signed graph from a given graph: the colors of the vertices can represent individuals or sets, and the signs on the edges represent the relationships between them. We therefore define two labelings of a properly colored graph, namely parity labeling and product labeling. The parity labeling of a properly colored graph G under χ(G) colors is defined by assigning the sign of an edge of G as + if the colors on its end vertices are both even or both odd, and − otherwise. The resulting signed graph is known as the parity colored signed graph of the graph. We characterize signed graphs which admit parity coloring, and signed graphs whose line signed graphs admit parity coloring. We initiate the study of parity labeling of a properly colored graph and the chromatic rna number of a graph. We also define the product labeling of a properly colored graph: under χ(G) colors, the sign of an edge of G is + if the color of one of its incident vertices is even, and − otherwise. The resulting signed graph is known as the color product signed graph of the graph. We characterize signed graphs which admit color product coloring. -
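The parity labeling defined above translates directly into code. A sketch with networkx, under the caveat that greedy coloring may use more than χ(G) colors on some graphs:

    import networkx as nx

    def parity_signed_graph(G):
        """Sign each edge '+' when its endpoint colors share parity, '-' otherwise."""
        coloring = nx.greedy_color(G)   # proper coloring (not guaranteed chi(G) colors)
        for u, v in G.edges:
            G[u][v]["sign"] = "+" if coloring[u] % 2 == coloring[v] % 2 else "-"
        return G

    S = parity_signed_graph(nx.cycle_graph(5))
    print(nx.get_edge_attributes(S, "sign"))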
Graphs Emerging from Finite Dimensional Vector Spaces
A vector space over a field is defined as a collection closed under finite vector addition and scalar multiplication. Over the course of time, researchers have delved into the intricate relationships between existing algebraic structures and graphs. This exploration led to the emergence of a distinctive class of graphs derived from vector spaces, following investigations into graphs originating from groups and rings. This thesis undertakes a thorough examination of a well-established algebraic structure known as the non-zero component graph of a finite-dimensional vector space over a finite field. Expanding on this, the thesis introduces the concept of orthogonal component graphs over finite-dimensional vector spaces, with a particular emphasis on the field Zp. The non-zero component graph of a finite-dimensional vector space over a finite field is a graph whose vertices represent all possible non-zero vectors in the vector space; vertices are made adjacent if they share a common basis vector in their linear combination. The thesis explores a variety of properties relating to distances, domination, and connectivity. Furthermore, it conducts an in-depth study of coloring, color connections, topological indices, and centrality-based sensitivity specifically for non-zero component graphs. The concept of orthogonality among vectors in the vector space paves the way for a novel algebraic graph structure, the orthogonal component graph. In this graph, vertices represent all possible non-zero vectors in the vector space, and adjacent vertices correspond to orthogonal vectors. The study extends to determining the properties of the orthogonal component graph, particularly in the context of the field Zp. Additionally, it characterises the relationship between non-zero component graphs and orthogonal component graphs. In the latter chapters, the concept of non-zero component signed graphs is introduced and thoroughly discussed. -
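The adjacency rule of the non-zero component graph (a shared basis vector, i.e., intersecting supports) admits a direct construction over Zp^n; a sketch:

    import itertools
    import networkx as nx

    def nonzero_component_graph(p, n):
        """Vertices: non-zero vectors of Zp^n; edges: supports intersect."""
        vectors = [v for v in itertools.product(range(p), repeat=n) if any(v)]
        G = nx.Graph()
        G.add_nodes_from(vectors)
        for u, v in itertools.combinations(vectors, 2):
            if any(a and b for a, b in zip(u, v)):   # common basis vector
                G.add_edge(u, v)
        return G

    G = nonzero_component_graph(2, 3)
    print(G.number_of_nodes(), G.number_of_edges())   # 7 vertices for Z2^3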
A Study on Coloring Parameters and Topological Indices of Graphs
Graph coloring/labeling is a fundamental concept in graph theory that involves the assignment of weights, integers, or colors to the vertices, the edges, or both of a graph while adhering to specific constraints. Graph colorings serve as a powerful mathematical model with broad applications in real-world scenarios, including network analysis, genomics, routing, optimization techniques, and digital networks. A research domain has been established in which the vertices of a graph are colored based on specific conditions and color degrees are taken into consideration, leading to the exploration of chromatic topological indices and various chromatic polynomials. The introduction of chromatic topological indices in response to challenges in chemical graph theory has sparked significant research interest, creating a dynamic and expansive field within graph theory. Motivated by this, our study presents a comprehensive exploration of topological indices in the context of graph theory, specifically focusing on the Zagreb index and its chromatic variants. The study calculates the first and second rainbow chromatic Zagreb indices, rainbow chromatic irregularity indices, and rainbow chromatic total irregularity indices for well-known graph classes. The concepts of b-chromatic Zagreb indices and b-chromatic irregularity indices are then introduced, and exact values are calculated for some standard graphs. Further, the rainbow chromatic and b-chromatic topological indices are determined for various derived graphs, such as the line, middle, total, and central graphs of some graph classes. Novel graph polynomials, namely the b-chromatic Zagreb polynomials and b-chromatic irregularity polynomials, are introduced for some classes of graphs and for derived graphs such as the degree splitting graph and the Mycielski graph. This comprehensive approach not only enhances our theoretical understanding of graph coloring but also offers practical insights into the predictive power of chromatic topological indices in diverse chemical contexts. -
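For orientation, the classical first and second Zagreb indices that the chromatic variants above generalise (with color degrees in place of ordinary degrees, per the thesis's definitions) can be computed in a few lines:

    import networkx as nx

    def zagreb_indices(G):
        m1 = sum(d * d for _, d in G.degree())                     # first Zagreb index
        m2 = sum(G.degree(u) * G.degree(v) for u, v in G.edges())  # second Zagreb index
        return m1, m2

    print(zagreb_indices(nx.cycle_graph(6)))  # C6: (24, 24)
    print(zagreb_indices(nx.star_graph(4)))   # K_{1,4}: (20, 16)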
QSPR Studies of Chemical Compounds Using Topological Indices
In this study, we explore the intersection of graph theory and chemistry, focusing on how graph theory's principles can model molecular structures and predict their physicochemical properties. Specifically, it applies topological indices (mathematical descriptors) derived from the graph-theoretic representation of molecules to establish quantitative structure-property relationships (QSPR) for octane isomers and polychlorobiphenyls. The investigation encompasses 30 topological indices, including the Harary, Wiener, Zagreb, and connectivity indices, and assesses their correlation with key physicochemical properties. Through rigorous analysis, the study develops QSPR models capable of predicting properties such as BP, HVAP, DHVAP, HFORM, AcenFac, TSA, and RRT, significantly advancing the predictive accuracy of chemical properties. The study of inverse problems concerning topological indices and QSPR is a testament to the interdisciplinary nature of modern scientific research, bridging gaps between mathematics, chemistry, and computer science. Inverse problems in graph theory hold a special place due to their capacity to address fundamental questions about the structure and behavior of graphs based on given properties, and they are often more complex and challenging than direct problems. The study of inverse problems in our research contributes to the theoretical foundations of chemical graph theory by characterizing trees and unicyclic graphs with specific topological indices and by offering novel insights into the inverse problem for Zagreb indices. The inclusion of Python programs for calculating various topological indices further bridges theoretical chemistry with practical application, highlighting the thesis's aim to enhance both the efficiency and accuracy of predicting chemical compound properties. This work not only demonstrates the profound impact of graph theory on chemical informatics but also opens new avenues for research in the field. -
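A sketch of the index-computation end of the QSPR workflow described above (the thesis includes its own Python programs; this example only computes the Wiener index for one octane isomer's hydrogen-suppressed carbon skeleton):

    import networkx as nx

    # 2,3-dimethylhexane: chain 0-1-2-3-4-5 with methyl branches at carbons 1 and 2
    octane_isomer = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 4), (4, 5),
                              (1, 6), (2, 7)])
    print("Wiener index:", nx.wiener_index(octane_isomer))

Regressing such indices against measured values of BP, HVAP, and the other properties listed above yields the QSPR models discussed.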
A Theoretical Study of Rayleigh-Benard Convection Problem with Realistic and Artificial Boundary Conditions
In this thesis we present a linear and weakly non-linear study of Rayleigh-Bénard convection subject to general boundary conditions, which include both physically realistic and artificial boundaries. A horizontal configuration is adopted, wherein the horizontal surfaces are attached to porous blocks, which allows for the inclusion of rough boundaries modelled by the Robin boundary condition on velocity. The Robin boundary condition is utilised to model the boundary condition on temperature as well. Adding nanoparticles to a base fluid increases the thermal conductivity of the base fluid. The objective of this research is to present a conducive understanding of the effect of nanoparticles, and of their enhanced thermophysical properties, on the onset of convection. Effects of rough boundaries on Rayleigh-Bénard convection in nanofluids: a linear and weakly non-linear stability analysis of Rayleigh-Bénard convection in a Newtonian nanofluid between two rough boundaries is carried out. A single-phase description of nanofluids is adopted in the study; water-alumina and water-copper are the nanofluids considered. The values of the thermophysical quantities of the nanofluids are obtained using either mixture theory or phenomenological laws. The boundary eigenvalue problem arising in the study is solved using the Maclaurin series. A single-term Galerkin technique is adopted to obtain guess values of the Rayleigh number and the wave number, and improved values are then obtained using the Newton-Raphson method. The minimal Fourier series representation is used to arrive at the generalised Lorenz model. A detailed discussion is made of the effect of rough boundaries on the onset of convection in nanofluids. The study aims to present a theoretical comparison between the results for the two nanofluids considered and the destabilizing effect of each of the nanoparticles on the onset of convection. -
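Since the minimal Fourier representation above leads to a generalised Lorenz model, integrating the classical Lorenz system gives a feel for the computation; the parameters below are the textbook values, not the rough-boundary nanofluid coefficients derived in the thesis:

    from scipy.integrate import solve_ivp

    def lorenz(t, state, pr=10.0, r=28.0, b=8.0 / 3.0):
        x, y, z = state   # convection amplitudes from the minimal Fourier modes
        return [pr * (y - x), r * x - y - x * z, x * y - b * z]

    sol = solve_ivp(lorenz, (0.0, 40.0), [1.0, 1.0, 1.0], dense_output=True)
    print(sol.y[:, -1])   # state at the final time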
Impact of Station Rotation Model in Enhancing Writing Skills and Academic Performance of Primary School Children
The core pursuit of the research is to evaluate the efficacy of technology-integrated English language instruction using one of the blended learning approaches. The integration of technology in education has increased rapidly ever since COVID-19, which altered the spheres of learning and teaching. The intersection of education and technology has become broader and more robust than before because of the pandemic, which led to the exploration of varied dimensions of technology-integrated teaching. The field of English language teaching has also undergone a significant transformation due to the advent of technology. The introduction of interactive multimedia tools, online platforms, and language learning applications has enabled educators to engage students in more effective ways. Virtual classrooms have brought about global connection, breaking geographical barriers and fostering cross-cultural communication. The use of adaptive learning systems and AI-driven language applications has personalised learning experiences, catering to individual needs and pace. Furthermore, technology has provided immersive experiences through virtual and augmented reality, which enhance language acquisition by providing real-life contexts for practice. Thus, technology has diversified teaching approaches and made English language learning more accessible, interactive, and tailored to the needs of diverse learners. However, learners from socio-economically disadvantaged backgrounds are deprived of these advantages, which could help improve their English language proficiency. The aim of the study is to evaluate the efficacy of technology-based English language instruction that integrates one of the blended learning approaches to improve the writing skills of primary school students from socio-economically deprived backgrounds who have little or no exposure to English language learning outside their classrooms. -
Towards An 'Alternate' Mythical Reality: A Postmodern Reading of the Graphic Narratives of Appupen and Amruta Patil
The graphic novel landscape in India has witnessed a significant change with innovative and revolutionary ideas. The new age writers of the gra -
Synergetic Effect of Metal Nanoparticle Embedded Graphene Membrane: A Novel Approach for Antimicrobial Filtration
Water, the elixir of life, holds a profound significance that extends far beyond its essential utility. It is not just a resource; it pulsates as the life force of our existence, intricately woven into the very fabric of our daily lives. Water is the silent force that shapes our world, from nurturing our health and sustaining social structures to fueling economic development and fostering the environment. However, the adequacy of potable water quality faces adverse impacts stemming from inadequate wastewater treatment, escalating domestic and industrial waste, and the microbial contamination of surface water sources. Furthermore, climate change emerges as a pivotal factor intensifying the depletion of water levels in natural resources due to diminished rainfall. Reports project that, by 2025, two-thirds of the global population might contend with water scarcity. If the current scenario persists, there is notable potential for significant conflicts among nations stemming from water scarcity. Such a predicament can be mitigated through proactive measures, including the preservation of natural resources and the implementation of advanced technologies to recover fresh water from contaminated sources. Advanced technologies for the purification of contaminated water encompass sedimentation, precipitation, filtration, and ion exchange, which can effectively extract clean water from diverse impurities. Notably, membrane-based purification has gained prominence in recent years owing to its cost-effectiveness and energy-saving attributes. Carbon-based nanomaterials, including carbon nanotubes, fullerenes and graphene, have garnered considerable attention in recent research, particularly in the realm of membrane applications. Within this field, membranes fabricated from carbon nanotubes (CNTs) stand out, showcasing exceptional filtering properties attributed to their tubular carbon structure. However, impediments to cost-effectiveness and ease of synthesis pose significant challenges, acting as bottlenecks for their widespread application in water purification. Consequently, graphene-based membranes emerge as a promising alternative to CNT membranes, demonstrating selective separation of ions and molecules. Specifically, membranes derived from graphene oxide (GO) and reduced graphene oxide (rGO) exhibit superior filtering capabilities compared to ceramic and polymeric counterparts, owing to their layered structure featuring tunable nanochannels, hydrophilic or hydrophobic character, and commendable mechanical resilience. Graphene oxide solution was synthesised using Hummers' method, followed by the fabrication of high-quality membranes through vacuum filtration. The current work emphasises the pivotal influence of membrane thickness on both water flux and dye rejection, with meticulous optimization of filtration properties achieved by producing GO membranes at various concentrations. Furthermore, graphene oxide was reduced via the hydrothermal method, enabling a comprehensive comparative analysis of water flux and rejection between GO and rGO membranes. In our investigation, the results validate that the GO 500 sample exhibits optimized filtration properties, and that the rGO variant surpasses GO in filtration efficacy, demonstrating superior filtering properties.
It is noteworthy that reduced graphene oxide (rGO) exhibits weaker antibacterial properties than graphene oxide (GO). The disinfection capability of the membrane is pivotal in ensuring the recovery of pure water. To bolster the antibacterial features of GO, we undertook an enhancement strategy by incorporating silver nanoparticles. Silver nanoparticles showcase multifaceted properties, including surface plasmon resonance and unique morphologies, which contribute significantly to the inactivation of bacteria. The studies conducted reveal that membranes incorporating graphene oxide with silver (GO-Ag) exhibit remarkable antibacterial properties against both gram-positive and gram-negative bacteria. Additionally, these membranes demonstrate appreciable filtration capabilities and effective antifouling properties, further emphasizing their potential for advanced applications in water purification systems. Fouling is a significant challenge in membrane technology, as the continuous passage of contaminants forms layers on the membrane surface, diminishing its filtration efficiency. Despite the antifouling properties exhibited by GO-Ag membranes, there is room to further improve performance and extend the membrane's lifespan. To address this, we reduced the graphene oxide and incorporated silver nanoparticles, aiming to augment the antifouling properties and overall efficacy of the membrane. The conclusive findings indicate that the fine-tuned membrane exhibits remarkable antibacterial properties, superior filtration capabilities, and a minimal irreversible fouling ratio. These outcomes confirm that the fabricated membranes are potential materials for water purification applications, showcasing a well-rounded set of properties essential for effective and sustainable water treatment.