Browse Items (472 total)
Critical analysis of national and international laws in relation to intimate partner violence
The term violence refers to varied perceptions of unacceptable behaviour. The nature of the act suggests that it is inflicted by the superior over the inferior; gender thus constitutes a significant factor of the study. The research identifies that women are subjected to gender-based inequalities in different areas of life, and the domestic environment is not an exception. Domestic violence, a manifestation of gender inequality, is prevalent in many forms. The most common form of domestic violence is identified as Intimate Partner Violence (IPV) by national and international statistics. The research is drawn upon the objective of understanding the different factors that contribute to IPV and of critically examining the legal framework for the protection of women against IPV in India. It adopts a human rights-based approach to understand the inferior status of women rooted in gender inequality, and thereby critically examines the Indian legal framework for the protection of women against IPV. The study is divided into five chapters, which analyse the status of women in Indian society and the role of law in protecting women in intimate relationships. A comparative study of UK and US laws, with special reference to international instruments, is conducted. Identifying the major drawbacks of the law, the study shows that the existing legal framework is inadequate to protect women from intimate partner violence.
Cultural Memory in the Captivity Novels of Na D'Souza and Alan Machado
Captivity novels are stories of men and women who were abducted, forcibly taken as captives, and subjected to slavery, usually for religious or political reasons. The research critically engages the captivity novels of Na D'Souza and Alan Machado, which vividly evoke the harrowing captivity experience of the Mangalore Konkani Catholic community during Tipu Sultan's regime. It is alleged that Tipu, after the Second Anglo-Mysore War, wreaked vengeance on Konkani Catholic Christians on suspicion of betraying him and supporting the British. After two hundred years of great silence, the struggle for identity and the quest for history led the post-conflict community to articulate the contents of the archives into the fabric of literary composition. The literary works of D'Souza and Machado are an essential bridge between generations, problematizing history and memory while illustrating the events of the Great Captivity. The captivity narratives are cultural artefacts of memory that present an alternative history of the Mysore Kingdom and revive the memories of the captivity experience of the Mangalore Konkani Catholic Christians. The memories of the miseries revived in the writings of D'Souza and Machado at the beginning of the twenty-first century, from the victims' point of view, expose the gaps in the official records of the Mysore Kingdom and emphasise the community's resilience and cultural significance. These narratives constitute a melancholic representation of the traumatic experience of the community and enable the community of sufferers to re-live the torments, which in turn acts as a therapeutic agent. Thus, the imaginative recounting of the Great Captivity by D'Souza and Machado in the form of novels, using memory as a tool, challenges the historical construct and calls for a legitimate space for the vanquished version in the construction of history.
Cultural Politics of Sports and Nationalism in Indian Popular Cinema
The idea of India does not arise naturally. As Benedict Anderson points out in Imagined Communities, the media plays a huge role in ensuring that the public upholds certain notions of the nation. The imagining of a nation into existence, as well as the sustenance of the collective, is thus media-enabled. If print media influenced people to envisage a nation by narrativizing a cultural commonality among the members of a particular political setup in a particular geography (as in the case of 19th-century Europe), the part played by popular cinema in independent India in feeding the imagining of Indianness cannot be considered less significant. Cinema texts based on war, terrorism, partition, etc. lend themselves to a nationalist treatment. Sports-themed Indian popular cinema too chooses to establish a marriage of convenience with nationalism. While sports and nationalism have had a history of helping each other thrive right from the first decade of the 20th century, owing to nationality-based participation in the Olympics and a similar format adopted by most other global events, it is in the 21st century that Indian popular cinema started exploiting the sports-nationalism relationship for its own progress. This research studies the discourse on Indian nationalism that three sports-themed texts of Indian popular cinema offer. Apart from proving to be huge box office hits, Lagaan: Once Upon a Time in India, Chak De! India, and M S Dhoni: The Untold Story offer a lot for culture and nation theorists to ponder.
Design of an efficient protocol for secure, energy-efficient routing in large-scale wireless sensor networks
Wireless Sensor Networks have played a significant role in enabling communication and connectivity in areas unreachable by humans for more than a decade. Apart from their beneficial features, they also suffer from various problems that remain unsolved despite massive research in this area. The proposed work jointly addresses the problems of energy efficiency and secure routing in wireless sensor networks. The existing literature offers little scope for an effective solution that addresses these issues jointly. Hence, the prime goal of the proposed work is to introduce a mechanism that uses a lightweight cryptosystem as part of a new hierarchical routing protocol. The mechanism of the proposed work is discussed in three different modules, each an enhanced version of the previous one. The first module, named Secured Tree Based Routing with Energy Efficiency (STREE), introduces a new energy-efficient selection of cluster heads along with a very simple and lightweight encryption mechanism for routing messages using Keccak, a recently standardized cryptographic hash function. The second module, named Secured Authentication Based Routing (SABR), introduces node-to-node authentication along with identification and compensation of network-related delay owing to the incorporated cryptography. The third module, named Secured Anonymous Routing with Digital Signature (SARDS), introduces a distinct mechanism using enhanced elliptic curve cryptography and a new usage of digital signatures. The modelling of the proposed study is done using an analytical research methodology, and the outcome has been compared with the existing standard routing protocol SecLeach, showing that the proposed system presents a superior mechanism for balancing security, energy efficiency, and communication performance in wireless sensor networks.
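The abstract above mentions lightweight protection of routing messages using the Keccak hash function. As a rough illustration only (not the thesis's actual STREE scheme; the pre-shared key and message format below are hypothetical), a cluster head could tag each routing message with an HMAC built on SHA3-256, the standardized Keccak variant:

```python
import hashlib
import hmac

def tag_message(shared_key: bytes, payload: bytes) -> bytes:
    """Compute a lightweight SHA3-256 (Keccak-based) authentication tag."""
    return hmac.new(shared_key, payload, hashlib.sha3_256).digest()

def verify_message(shared_key: bytes, payload: bytes, tag: bytes) -> bool:
    """Constant-time comparison of the received tag against a fresh one."""
    return hmac.compare_digest(tag, tag_message(shared_key, payload))

key = b"cluster-head-shared-key"        # hypothetical pre-shared key
msg = b"route:node17->ch3|energy:0.82"  # hypothetical routing message
tag = tag_message(key, msg)
assert verify_message(key, msg, tag)            # intact message accepted
assert not verify_message(key, msg + b"!", tag) # tampered message rejected
```

A hash-based tag like this gives integrity and authentication at low cost, which is why hash functions are attractive for energy-constrained sensor nodes.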
Design and Development of a Generic Framework for Surface Water Delineation and Monitoring Using a Hybrid Level Set Algorithm on Landsat Multi-Spectral Data
Surface water bodies are critical to the existence and sustenance of civilizations. Water bodies in urban cities across the world have undergone a drastic decline in quality and quantity as a result of a multitude of factors such as population growth, urbanization, and encroachment. Monitoring changes to water bodies is a necessary requirement in devising strategies to conserve them. This thesis proposes a generic framework for monitoring and forecasting changes in the surface area of lakes using a hybrid level set algorithm for water body delineation followed by a double exponential smoothing model for forecasting. The proposed hybrid level set algorithm combines the advantages of edge-based and region-based level sets. An edge detection term is introduced into the formulation, which improves the delineation accuracy by forcing the level set evolution to stop at the boundaries of the region of interest. The performance of the algorithm was analyzed using Pearson's Correlation Coefficient (PCC), the Structural Similarity Index (SSIM), and the Dice Similarity Index, and was found to be superior to established methods in the literature. The study uses Landsat multi-spectral data from the last 30 years to build the proposed framework for forecasting the changes in the surface area of water bodies. The experiments were conducted for nine lakes in Bangalore, a fast-growing city in India, and a steady decrease in surface area is observed for most of the lakes studied. The city's renovation attempts have also ensured that some of the lakes are withstanding the rapid urbanization. The proposed forecast model has yielded acceptable results, with an average error of 0.22% and a correlation coefficient of 0.94 between the actual and forecasted surface areas. The framework can be customized in the future to study specific water bodies by plugging in external parameters to improve the forecasting accuracy.
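The double exponential smoothing model used for forecasting above can be sketched in a few lines. This is a generic Holt's-method sketch, not the thesis's calibrated model; the surface-area series and smoothing constants are invented for illustration:

```python
def double_exponential_forecast(series, alpha=0.5, beta=0.3, horizon=1):
    """Holt's double exponential smoothing: track a level and a trend,
    then extrapolate the trend `horizon` steps ahead."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]

# hypothetical lake surface areas in hectares, one reading per year
areas = [120.0, 118.5, 117.2, 115.8, 114.1, 112.9]
print(double_exponential_forecast(areas, horizon=2))
```

Unlike single exponential smoothing, the trend term lets the forecast continue a steady decline, which matches the shrinking-lake scenario described in the abstract.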
Design and Development of Adaptive Authentication Model to Detect User Behavior Anomalies
Password-based authentication systems have recently become more secure as Risk-Based Authentication (RBA) has been adopted. Recent research in the area has shown significant use of Two-Factor Authentication (2FA) and Multi-Factor Authentication (MFA) in many commercial applications using RBA. The RBA system monitors the parameters extracted during the user login process and, based on the proposed model, raises a multi-factor authentication challenge to the user. As vulnerabilities concerning passwords and fingerprints have increased, easy access to any web application may result in a security flaw. The reasons include the existing methodology of RBA systems and the unavailability of user data during the initial login process, which hinders the authentication system at that stage, as there is no standard method to incorporate RBA into it. A few researchers have proposed novel approaches to improve the authentication system. Still, to the best of our knowledge, no research has suggested methods that address the authentication system during the initial login process while also providing a robust combination of Machine Learning (ML) and statistical approaches. Hence, a novel method is proposed for the RBA system during the initial login phase using a Hierarchical Sub-Feature Based Model (HSFBM) for different user categories. The False Acceptance Rate (FAR) is comparatively better in our proposed model than in the standard model, with minimal re-authentication requests for the user.
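The risk-scoring idea behind RBA can be illustrated with a minimal sketch (a generic illustration, not the HSFBM model; the sub-features, weights, and threshold are all hypothetical): each login sub-feature that deviates from the stored user profile adds weight to a risk score, and a score above the threshold triggers multi-factor re-authentication.

```python
def login_risk(features, profile, weights=None):
    """Score a login attempt by comparing sub-features against the user's
    stored profile; each mismatching sub-feature adds its weight."""
    weights = weights or {"country": 0.5, "device": 0.3,
                          "browser": 0.1, "hour_bucket": 0.1}
    return sum(w for f, w in weights.items()
               if features.get(f) != profile.get(f))

profile = {"country": "IN", "device": "laptop",
           "browser": "firefox", "hour_bucket": "evening"}
attempt = {"country": "US", "device": "laptop",
           "browser": "chrome", "hour_bucket": "evening"}
score = login_risk(attempt, profile)
needs_mfa = score >= 0.4   # hypothetical risk threshold
print(score, needs_mfa)
```

A real RBA system would learn the weights and thresholds per user category rather than hard-coding them, which is the gap the hierarchical model above targets.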
Design and development of an efficient model for handwritten MODI script recognition
Machine simulation of human reading has caught the attention of computer science researchers since the introduction of digital computers. Character recognition, a branch of pattern recognition and computer vision, is the process of identifying either printed or handwritten text in document images and converting it into machine-coded text. Character recognition has been successfully implemented for various scripts such as Latin and Chinese. In the case of Indian language scripts, the character recognition process is comparatively difficult due to various complexities, such as the presence of vowel modifiers and a large number of characters (classes). MODI script is a shorthand form of the Devanagari script and was used as an official script for writing Marathi until 1952. Presently the script is not used officially, but it has historical importance. MODI is a cursive script, and the character recognition task is difficult for various reasons, such as variations in the shapes of a character across individuals and the presence of identical-looking characters. MODI documents do not have any word demarcation symbols, which adds to the complexity of the task. The advances in various Machine Learning techniques have greatly contributed to the success of optical character recognition. The proposed work is aimed at exploring various Machine Learning techniques and methods that can be effectively used to recognize MODI script and at building a reliable and robust character recognition model for handwritten MODI script. This research work also aims at the development of a machine transliteration and text recognition system for MODI manuscripts.
Design and Development of an Artificial Intelligence-Based Knowledge Management System for Managing Software Security Vulnerabilities
Software development practices play a significant role in building the world's future. It is where exciting technological evolution begins. Exploring critical challenges in the area of software development plays a significant role in fueling the pace of technological progression in the industry. This work focuses on exploring important areas of software development practices and problems faced by the industry. Understanding the critical parts of the software development eco-system and the stakeholders associated with them is important. Customers of software development teams, the software development industry and knowledge sources, and the software development internal eco-system are the broad focus areas of the study. Leveraging the data already spread across the eco-system and facilitating easy access for practitioners as and when there is a need is one of the primary focuses. The software development landscape module, customer landscape module, and industry landscape module are the key modules explored in this work. The core aspiration of the work is to integrate all possible data across the industry, process it, and make it easily accessible to practitioners as and when needed. The process also makes the data smarter and more insightful over time.
Design and Development of Dual Fuzzy Technique to Optimize Job Scheduling and Execution Time in Cloud Environment
Cloud computing is a type of computing that relies on sharing a pool of computing resources rather than deploying local or personal hardware and software. It enables convenient, on-demand network access to a shared pool of configurable computing resources (e.g., applications, storage, networks, services, and servers) that can be swiftly provisioned and released with minimal management effort or interaction with the cloud service provider. The increasing demand for computing resources has made elasticity an important issue in the cloud. The ability to extend the resource pool provides the user with an effective alternative for deploying applications with high scalability and processing requirements. Providing a satisfactory Quality of Service (QoS) is an important objective in cloud data centers. QoS is measured in terms of response time, job completion time, and reliability. If user jobs cannot be executed under high load and a job crashes, the response time increases enormously and the job completion time is pushed up as well. Also, due to load, jobs may remain in the waiting queue, unable to find a resource on which to execute. In such a situation, the user notices a long response delay, which affects the QoS. Towards ensuring QoS, this research proposes the following solution: Dual Fuzzy Load Balancing for jobs. Dual Fuzzy Load Balancing balances the load in the data center with the overall goal of reducing response and execution time for tasks. The proposed solutions were simulated in the CloudSim simulator, and performance metrics in terms of job response time, job completion time, resource utilization, number of SLA violations, and cost were compared to existing load balancing algorithms. The proposed solutions were also implemented in a real cloud environment, and the effectiveness of the solution was evaluated.
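The fuzzy-inference idea underlying such a load balancer can be sketched briefly (generic Mamdani-style fuzzification, not the thesis's Dual Fuzzy algorithm; the membership ranges and VM figures are invented): each VM's CPU load and queue length are fuzzified, and a job is routed to the VM with the highest combined suitability.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def suitability(cpu_load, queue_len):
    """Fuzzy rule: a VM is suitable when load is LOW and queue is SHORT.
    Fuzzy AND is taken as min (Mamdani style)."""
    low_load = tri(cpu_load, -0.01, 0.0, 0.6)       # LOW-load membership
    short_queue = tri(queue_len, -0.01, 0.0, 10.0)  # SHORT-queue membership
    return min(low_load, short_queue)

# hypothetical VMs: (cpu utilisation, jobs waiting)
vms = {"vm1": (0.2, 3), "vm2": (0.7, 1), "vm3": (0.1, 8)}
best = max(vms, key=lambda v: suitability(*vms[v]))
print(best)
```

The fuzzy combination rewards VMs that are good on both axes at once, unlike a single-metric picker that might choose a lightly loaded VM with a long queue.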
Design and development of a load balancing algorithm to enhance cloud computing performance
Software applications have taken a leadership position in the field of Information Technology in reducing the human workload. In the case of distributed applications, the scalability of the application is a matter of concern in the present dynamic scenario. The fast developments in computing resources have reduced the cost of hardware and increased the processing capability of systems remarkably. Still, hosting a distributed application on a higher-end system is not recommended for several reasons. Firstly, when there is a massive demand on the application that is beyond the limit of the system, there is no way to scale it. Secondly, when the system usage of the application is minimal, the entire infrastructure dedicated to the targeted application remains idle. Due to the wide acceptance of cloud computing in industry, a variety of applications are designed to target the cloud platform, which is one of the challenges for efficient load balancing in the cloud environment. A fair distribution of workload among the available resources is mandatory to improve the efficiency of the cloud platform. To share the workload, a useful load balancing strategy, as well as a timely invocation of the plan, is essential. Invocation of the approach, known as the triggering policy, can differ in centralised and distributed scenarios. Since cloud applications run in a distributed setting, through this research work the researcher puts forward a complete framework for balancing the load for different types of requests generated on the Infrastructure as a Service (IaaS) platform. As a progressive model, this research work continuously focuses on improving the performance of the load balancer on the IaaS platform. Since cloud data centres are spread across the globe, a centralised monitoring system to monitor and analyse resource utilisation in different data centres is an essential requirement for observing load fluctuations in different clusters.
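The triggering policy mentioned above can be illustrated with a minimal sketch (hypothetical utilisation figures and threshold, not the framework proposed in the thesis): a centralised monitor compares cluster utilisations and plans a migration only when the imbalance exceeds a threshold.

```python
def needs_rebalancing(cluster_loads, threshold=0.25):
    """Triggering policy: rebalance only when the gap between the most-
    and least-utilised clusters exceeds the threshold."""
    gap = max(cluster_loads.values()) - min(cluster_loads.values())
    return gap > threshold

def plan_migration(cluster_loads):
    """Move work from the most-loaded cluster to the least-loaded one."""
    src = max(cluster_loads, key=cluster_loads.get)
    dst = min(cluster_loads, key=cluster_loads.get)
    return src, dst

# hypothetical per-cluster CPU utilisation reported to the central monitor
loads = {"eu-dc": 0.82, "us-dc": 0.41, "ap-dc": 0.55}
if needs_rebalancing(loads):
    print(plan_migration(loads))
```

Gating the migration behind a threshold is what keeps a triggering policy from thrashing: small, transient fluctuations never start a costly rebalance.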
Design and Implementation of Low Complexity Multiplier-Less Reconfigurable Band Tuning Filter Structure with Sharp Sub-Bands
Digital filter banks are extensively used for channelization in communication systems. Reconfigurable non-uniform multi-channels with sharp transition widths are necessary for channelization in digital channelizers and for spectrum sensing in wireless communication networks. The aim of this research work is to design reconfigurable filter structures featuring non-uniform, sharp-transition-width channels with a reduced number of filter coefficients. Four different filter structures are proposed in this research for achieving a low-complexity reconfigurable structure for the design of multiple non-uniform, sharp-transition-width, arbitrary-bandwidth channels. The foundational element of this research is the design of a prototype filter, which serves as the basis for developing the various reconfigurable filter structures. Based on the prototype filter's bandwidth characteristics, these structures are categorized into two main groups: narrow band prototype filters and wide band prototype filters. The narrow band prototype filter category comprises structures capable of designing a single finite impulse response (FIR) filter with a narrow passband characterized by sharp transition widths. In contrast, the wide band prototype filter category includes structures capable of designing a single FIR filter with a wide passband, also characterized by sharp transition widths. Novel filter structures are designed with the help of interpolated finite impulse response, cosine modulation, complex exponential modulation, and frequency response masking techniques. The proposed method is evaluated using MATLAB R2019b, where the linear-phase FIR filter coefficients are computed based on the Parks-McClellan algorithm. Examples are employed to illustrate the efficient operation of the proposed designs. The results show that the proposed designs have lower multiplier complexity than existing cutting-edge techniques.
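For flavour, a linear-phase lowpass FIR prototype filter of the kind such structures build on can be sketched with a windowed-sinc design (a simpler stand-in for the Parks-McClellan algorithm used in the thesis; the tap count and cutoff below are arbitrary):

```python
import math

def lowpass_fir(num_taps, cutoff):
    """Linear-phase lowpass FIR via windowed-sinc with a Hamming window.
    `cutoff` is normalised to the sampling rate (0 < cutoff < 0.5)."""
    m = num_taps - 1
    h = []
    for n in range(num_taps):
        k = n - m / 2  # distance from the centre tap
        ideal = (2 * cutoff if k == 0
                 else math.sin(2 * math.pi * cutoff * k) / (math.pi * k))
        window = 0.54 - 0.46 * math.cos(2 * math.pi * n / m)
        h.append(ideal * window)
    s = sum(h)                     # normalise for unit DC gain
    return [c / s for c in h]

taps = lowpass_fir(31, 0.1)
print(len(taps), round(sum(taps), 6))
```

The symmetric impulse response gives the linear phase that channelizers require; sharper transition widths demand more taps, which is exactly the multiplier-complexity cost the proposed structures aim to reduce.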
Design of a new access control structure for provenance based on secret sharing
Access control is one of the important elements in providing confidentiality to secured data. Access specifiers help us understand the degree of rights given to users in utilizing data records in the right manner. Tampering with records by unauthorized parties is a high concern in secure communication, and tamper detection plays an important role in troubleshooting issues associated with network or host intrusion scenarios. The advances in information technology have driven the modern world to focus on the Web for digital information. People across the globe rely on the internet for everything from generic information to the distribution of personal data over heterogeneous networks. Technology has grown so wide that almost all financial transactions take place through online portals. On the other hand, there has been a sharp rise in security threats towards users' confidential data, which online users share while performing financial transactions on e-commerce portals. Various authentication techniques are available to maintain security mechanisms over untrusted networks. All these security procedures are said to be robust and adequate in context; yet, over a period of time, intruders find ways to break into systems. Data theft and intrusion into information systems will increase daily if defensive measures are not in place. In this connection, a new mechanism for securing data using the majority voting concept is proposed, and majority voting is applied to secure provenance data. Provenance data is data of origin, genesis data, and is often regarded as sensitive. Provenance is a record of the events, timestamps, versions, and transformations that have occurred for the data of interest. Elicitation of an entity's genesis is termed provenance; for data objects and their relationships, this understanding is called data provenance.
In most cases provenance data tends to be sensitive, and a small variation or regulation paves the way for changes to the entire lineage of the connected data. This origin needs to be protected, and only approved parties should have the right to use the data. A specific description or classification of an instance's historical record or data object is called data provenance. It has many implications in different disciplines in terms of its significance for the acquisition of data flow mechanisms. Personal control of data privacy is a common scenario, and various solutions towards security exist. In this regard, a unique model is proposed where control of the data and its related allies is held by several bodies rather than one; if access is to be permitted for a cause based on context, all the entities holding their respective keys must agree and share them on a common platform for accessing the data. Combining these shares in a particular pattern grants access to the data. The method of allocating controls to multiple bodies and granting access based on the combination of stakes is called a secret sharing mechanism. Share separation may be derived from the concept of visual cryptography. We integrate the concepts of secret sharing and provenance to provide an indigenous solution for the parameters of information security, namely confidentiality, integrity, and availability. With the availability of widespread wireless internet access in mobile scenarios, user and usage data have grown massive with respect to media. For example, financial operations carried out over online platforms by users were in many ways found to be insecure and unauthenticated. Procedures with appropriate algorithms are available for safe data communication in various modes, but they fail to attain high accuracy and performance with regard to the basic goals of security (confidentiality, integrity, and availability) at a significant level.
Security is the main aspect of any communication among untrusted networks in the current world. Many researchers have made tremendous contributions to effective security algorithms despite the various threats that compromise computer systems' vulnerabilities. The origin of the data, i.e., by whom the transaction thread was created, is the pertinent question to be answered when a financial operation is finalized. This notion of 'data antiquity' has received strong interest from researchers in different fields for many decades and is often termed data provenance. Security in provenance has made some progress with recent research, particularly in the field of cyber security. This study emphasizes the security characteristics of data provenance with a distinctive cryptographic approach. The combination of these principles produces unique results for safeguarding the genesis data.
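The share-combination idea described above is classically realised by Shamir's secret sharing, where any k of n shares reconstruct the secret and fewer reveal nothing. A minimal field-arithmetic sketch (illustrative only; the thesis's actual scheme and parameters may differ):

```python
import random

PRIME = 2**127 - 1  # Mersenne prime; all arithmetic is in GF(PRIME)

def make_shares(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it.
    Shares are points on a random degree-(k-1) polynomial with
    constant term equal to the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # den^(PRIME-2) is the modular inverse by Fermat's little theorem
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, k=3, n=5)
print(reconstruct(shares[:3]) == 123456789)
```

Any three of the five shareholders can jointly grant access to the protected provenance record, while any two learn nothing about it, which matches the multi-body access control model described above.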
Design, Analysis and Validation of Electric Vehicle Control and Safety for Different Path Profiles and Braking Conditions
Energy conservation and environmental pollution are two major challenges for our society today. Currently, utilization of the latest technology to reduce energy consumption and harmful emissions from vehicles is gaining significance in contexts related to the automobile, energy, and power industries. Consideration of these contexts enables us to form a more realistic perspective and a need for developing fuel-efficient, comfortable, and affordable electric vehicles. The importance of the design and development of the electric vehicle (EV) is better perceived through its major impact on our future society due to (i) the energy-saving aspect, both for individual customer expenditure and from the national economy viewpoint, and (ii) the huge benefit of reduced emissions from internal combustion engines using fossil fuels. The EV offers the best solution, which not only avoids emissions but also overcomes the dependency on petroleum resources. Due to fewer moving parts, monitoring and controlling an EV is also smooth and relatively much easier. The embedded control techniques used in EVs also contribute to a better controllable, observable, predictable, and efficient vehicle drive. This research work focuses mainly on electric vehicle mobility and control aspects for a deeper study, addressing topics related to mathematical modelling and simulation studies for the design and analysis of EV control and safety. Validations of the several case studies done during this research are supported by software tools, namely MATLAB/Simulink and the IPG CarMaker virtual driving simulation platform. Starting from modelling, throughout the various stages of this work, realistic vehicle parameters and specifications are considered. The different levels of testing, validation, and trial runs of the model-based designs are also validated by software-in-the-loop and hardware-in-the-loop approaches. Automotive Safety Integrity Level B/C hardware was used for the implementation.
Design, Synthesis, and Applications of Carbon Dots with Controlled Physicochemical Properties
Modification of carbon dots (CDs) is essential to enhance their photophysical properties and applicability. Physical (e.g., composite material blending, core-shell architecture) and chemical (e.g., doping, surface passivation) methods exist to modify CDs. Different precursors can impart varied functionalities and heteroatomic dopants to CDs. Despite several modification strategies, the reproducibility and scalability of CDs still need to be improved. Newer approaches for modifying CDs are thus essential to ensure lab-to-lab and batch-to-batch consistency. Our study focused on developing novel strategies for the physicochemical modification of CDs. The theoretical simulation we performed for surface-functionalised CDs, with the aid of density functional theory and time-dependent density functional theory, helped to predict the mechanism of photoluminescence (PL) and to analyse the effect of hydrogen bonding on the properties of CDs (Chapter 3). We have developed a novel and general method for preparing amine-functionalized CDs from modified paper precursors (Chapter 4). This strategy allows us to prepare CDs with customized functionalities, alleviating the need for post-synthesis modification. A novel ion-imprinting strategy involving CDs synthesised from modified paper precursors was also developed through our research (Chapter 5). In another work, we utilized silk fibres as a matrix for immobilising CDs (Chapter 6). CDs prepared from mulberry leaves were fed to silkworms to produce CD-embedded silk fibres, which could be used to detect dopamine. In addition, we prepared CDs from an unreported natural source (frankincense), which were used to detect lead ions (Chapter 7). We demonstrated the heavy metal sensing application of these CDs in combination with a UV-LED chip and a smartphone, which is relevant in resource-limited areas. The research results presented in the thesis are expected to inspire further investigations and applications related to CDs.
Design, Training, and Implementation of a New Individualized Education Plan (IEP) Format for Special Educators and Students with Intellectual Disabilities at Selected Special Schools
An Individualized Education Plan (IEP) is a multidisciplinary, team-developed plan required for every child receiving special education services. The researcher delved into concerns surrounding IEPs for students with intellectual disabilities. Two significant hurdles were discovered: existing IEPs lacked essential intervention areas, and special education teachers felt inadequately equipped to construct effective plans. The study tackled these concerns through a multi-pronged approach. Firstly, a meticulous analysis of existing IEPs revealed crucial sections missing from intervention plans, hindering their effectiveness. This analysis served as the blueprint for crafting a more comprehensive IEP format that addressed the identified gaps and provided a robust framework for intervention. Next, the study focused on empowering special education teachers. Sixty special education teachers, certified by the Rehabilitation Council of India, participated in training sessions on the new format, undergoing a vital skills and knowledge upgrade in IEP development. This equipped them with the tools and understanding necessary to create more effective plans tailored to individual student needs. The theory then transitioned to practice: students with intellectual disabilities were included in interventions based on the improved IEPs, with their progress closely tracked and evaluated. The results were highly promising. Teachers demonstrated a tangible improvement in knowledge, translating into an ability to create more effective IEPs. More importantly, students thrived with the enhanced format: those involved in interventions using the improved IEPs exhibited significant progress in various domains, highlighting the positive impact of the new approach. The study culminated in key recommendations for further improvement, including ongoing teacher training sessions to ensure teachers remain updated on best practices and evolving methodologies.
Designing a New Encryption-then-Compression System for Grayscale Images Utilizing Entropy Encryption
In the digital era, images and video sequences have increased dramatically because of the rapid growth of the Internet and the widespread use of multimedia systems. Advances in technology enable faster data transmission; however, the communication channel is an untrusted medium. The proposed research focuses on the secure transmission of grayscale images over a social networking site (SNS) provider, treated here as the untrusted channel. Rigorous research has been conducted on the secure transmission of images, and different models have been proposed, namely Compression-then-Encryption (CtE) systems and Encryption-then-Compression (EtC) systems. In EtC, the encrypted information is transmitted over the channel; however, the channel compresses the information to reduce overall traffic, and this compression may cause the decryption process to fail on the receiver side. Constructing an efficient EtC model that performs as well as the standard compression algorithms will address this gap in the research. Four objectives were formulated, and schemes were proposed for each objective to address the problem. Two schemes were developed for the first objective, eliminating noise incurred during transmission through the channel: the first eliminates noise using a two-pass hybrid mean and median filter, while the second applies a supervised curve-fitting linear regression model with a mean filter. To secure the transmission of images over the untrusted channel, objectives two and three address the scrambling and encryption of images. A hybrid of an improved Arnold transform and ElGamal encryption is experimented with in the first scheme: block-wise scrambling is applied to the image, followed by pixel-wise scrambling within each block and then Arnold's transform, and the outcome is given to ElGamal encryption. -
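The abstract does not give the exact scrambling equations, so the following is only a generic sketch of the classical Arnold cat map stage, one ingredient of the hybrid scheme described in the abstract, without the block-wise step or the ElGamal encryption; the function names and the square-image assumption are illustrative:

```python
import numpy as np

def arnold_scramble(img, iterations=1):
    """Scramble a square grayscale image with the Arnold cat map:
    pixel (x, y) moves to ((x + y) mod N, (x + 2y) mod N)."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "Arnold map requires a square image"
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

def arnold_unscramble(img, iterations=1):
    """Invert the map: the pixel now at ((x + y) mod N, (x + 2y) mod N)
    is moved back to (x, y)."""
    n = img.shape[0]
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[x, y] = out[(x + y) % n, (x + 2 * y) % n]
        out = nxt
    return out
```

Because the map is a bijection on the pixel grid, applying the inverse the same number of times recovers the original image exactly, which is what makes it usable as a lossless scrambling stage before encryption.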
Destination Resilience and Smart Tourism Ecosystem: A Destination Management Framework for Competitiveness
Over the past several decades, the travel and tourism industry has been at the forefront of adapting to new changes and adopting the latest technologies. Today's travelers are sophisticated and knowledgeable, as they have all the information easily available to them, which contributes to fast decision making. The world is gradually changing into a much more intelligent and advanced platform that makes it possible to employ techniques like augmented reality, virtual reality, and artificial intelligence. This has proven to be very successful in a variety of fields, including education, healthcare, marketing, and communication. The current study focuses on incorporating smart tourism strategies to build a sustainable ecosystem at destinations, which enhances the competitiveness of the destination and makes it easier for value co-creation among the different stakeholders. Research suggests that although industry-led and government-initiated projects seem to prioritize the use of smart applications in destinations in theory, practical implementation appears to lag behind. Less research has been done in India on gamification, smart wearable technology at travel destinations, and the practical application of AR and VR tools. The study revolves around the South Indian state of Kerala, which has been a pioneer in tourism promotion in the country. In addition to proposing a framework for destination management and tourism competitiveness with smart tourism applications, this study aims to investigate the practical implications of smart tourism tools and technologies at destinations. To shed more light on the findings, a mixed-methodology approach is used to analyze the data, combining quantitative and qualitative methods. The study's conclusions have significant ramifications for destination management, strategic planning, and the application of smart technologies at travel locations. -
Determinants and Impacts of Mergers and Acquisitions in the Drugs and Pharmaceutical Industry in India
Mergers and Acquisitions (M&As) are inorganic growth strategies adopted by firms to achieve long-term growth maximization. Compared with other inorganic growth strategies such as joint ventures and strategic alliances, M&As offer deeper restructuring opportunities and better long-term control over the business. During the third wave of globalization, which started in the early 1990s, M&As became a popular strategy for firms to expand their businesses beyond national boundaries. The Indian economy has been witnessing bustling activity in the M&A landscape. A sectoral analysis of M&A trends identifies the pharmaceutical sector as one of the top five sectors with the highest M&A deal values during the period 2013-2016. Though the pharma sector has witnessed a decline in deal values in a few recent years, its resilience is visible in its ability to bounce back with record-breaking deal values. Due to the continuous regulatory changes occurring in domestic and foreign markets, pharma companies have to constantly change their strategies to survive and grow in the industry, and M&As enable them to adapt to these changes quickly. This study explores how firms in the pharmaceutical sector use M&A as a strategy to navigate the dynamic competitive landscape. The objectives of this research are threefold: developing an understanding of the motives behind the M&A decisions of pharma firms, identifying the firm-level determinants of acquisition probability, and assessing the impact of M&As. This study uses qualitative content analysis to identify M&A motives. The firm-level determinants of acquisition probability have been explored using Random Effects Logistic (REL) regression on panel data. A case study approach has been employed to assess M&A impact by comparing the M&A motives with the post-M&A outcomes. -
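The abstract names Random Effects Logistic regression on panel data but does not specify the model; as a hedged illustration of the underlying idea only, the sketch below simplifies it to a pooled logit without firm-level random effects, fitted by gradient ascent on synthetic data, where the function names, the single covariate, and the synthetic data are all illustrative assumptions:

```python
import numpy as np

def fit_logit(X, y, lr=0.1, steps=5000):
    """Fit a plain pooled logistic regression by gradient ascent on the
    log-likelihood (a simplification of a random-effects logit)."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))      # predicted P(acquired)
        w += lr * Xb.T @ (y - p) / len(y)      # average score (gradient)
    return w

def predict_proba(X, w):
    """Predicted acquisition probability for new firm-level covariates."""
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# Illustrative synthetic data: one firm-level covariate (say, scaled firm
# size) that raises the probability of being an acquisition target.
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 1))
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-2.0 * x[:, 0]))).astype(float)
w = fit_logit(x, y)
```

A true random-effects specification would add a firm-specific intercept term integrated out of the likelihood; the fixed-effects sketch above only conveys how covariates map to an acquisition probability.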
Determinants of Consumer Product Return Behavior with Respect to Online Shopping of Apparel
This research aims to identify the determinants of consumer product return behavior with respect to online shopping of apparel in Bangalore city. The study was administered to 600 respondents, of whom 465 responded. The convenience sampling method was used to collect samples across Bangalore city. Product return behavior was measured using a five-point Likert scale with 34 items. An extensive literature review was conducted, covering both the Indian and international contexts, and this research is designed to address the literature gaps. Several hypotheses were proposed in the thesis and examined using structural equation modeling; they were tested with the software AMOS 25 and SPSS 25 to fulfill the research objectives. Confirmatory factor analysis was performed on the data to confirm the instrument's reliability and validity and to verify the constructs developed from the detailed literature review. An ANOVA post hoc test was conducted to check the relationships among the demographic variables, and descriptive statistics were used to interpret the data. With the help of structural equation modeling, the causal relationships between the dependent variable and the independent variables were identified. The study on the determinants of product return behavior has provided many insights. Customer attitude has a significant and negative impact on product return behavior: customers with a positive attitude towards online apparel purchases are less likely to return products. Previous customer experience and consumption pattern also have a significant and negative impact on product return behavior: customers with a bad experience of buying apparel online tend to return their products more. The perceived risk of online apparel purchases has a significant and positive impact on product return behavior. 
The customers with a high perceived risk of online apparel purchases will be more likely to return their products.
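Reliability checks of the kind reported for a multi-item Likert instrument commonly include Cronbach's alpha alongside confirmatory factor analysis; the abstract does not state which coefficient was computed, so the following is only a generic sketch of the standard alpha formula, with the function name and the tiny example matrix being illustrative assumptions:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) Likert matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative: three perfectly correlated five-point items give alpha = 1.
scores = np.array([1, 2, 3, 4, 5], dtype=float)
perfect = np.column_stack([scores, scores, scores])
```

Values near 1 indicate high internal consistency; in practice a threshold around 0.7 is often cited before proceeding to factor analysis.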