Browse Items (11809 total)
Secure authentication framework for cloud
The growing popularity of cloud-based services is prompting organizations to consider shifting applications and data onto the cloud. However, organizations dealing with highly sensitive information are apprehensive about moving their applications and data to a public cloud owing to concerns about the security of their information. It is hence incumbent on service providers to ensure that only legitimate users access their services and resources in the cloud. Verifying the authenticity of remote users is a necessary prerequisite in a cloud environment before allowing access to secure resources, services, and applications. The simplest and most commonly used user authentication mechanism is password-based authentication. However, users tend to choose easy-to-remember passwords, and many a time reuse the same password for multiple accounts, which often makes passwords the weakest link in security. Furthermore, service providers that authenticate users on the basis of a password store password verification information in their databases, and such authentication schemes with a verification table are known to be vulnerable to various attacks. From the perspective of authentication requirements, service providers in a cloud environment can be broadly categorized into two groups. Service providers dealing with highly sensitive information and working in a regulated environment, such as those offering services for sectors like health care and finance, form the first category. These providers require a strong and secure mechanism to authenticate their users, without any additional functionality. The second category consists of service providers that also deal with secure information but operate in a collaborative environment, such as providers whose applications are bundled through a web portal. To give users a seamless authentication experience while accessing multiple services during a session, this second category of service providers prefers to have Single Sign-On functionality. Two-factor authentication technology overcomes the limitations of password authentication and decreases the probability that the claimant is presenting false evidence of its identity to the verifier. If different service providers set up their own two-factor authentication services, users have to repeat the registration and login process for each one. Also, users accessing multiple cloud services may be required to hold multiple authentication tokens associated with the various service providers. Authentication factors such as crypto-tokens and smart cards with cryptographic capabilities have been widely used as a second authentication factor. However, users are required to always carry these authentication tokens, which makes them cumbersome from a practical usability perspective. Their usage also involves cost, restricting their adoption to corporate environments. The authentication process can be made more user-convenient if the chosen authentication factor is one commonly used by all types of users. Leveraging the mobile phone as an authentication factor can address the issue of user convenience at no extra cost while improving the security of authentication schemes. Although there has been an increasing focus on strengthening the authentication methods of cloud service users, there is no significant work that discusses an authentication scheme that can be adopted by both categories of cloud service providers.
Taking cognizance of the aforesaid issues related to secure authentication in a cloud environment, this research focuses on designing secure two-factor authentication schemes that can be adopted by the two categories of service providers. The research, carried out at different levels, proposes authentication architectures and protocols for both categories. At the first level, it proposes a Direct Authentication architecture for cloud service providers who prefer to authenticate their users with a strong authentication mechanism and do not require Single Sign-On (SSO) functionality. For providers who prefer to offer their users SSO functionality, the research proposes a Brokered Authentication architecture. The next level of the research focuses on user authentication protocols for both Direct Authentication Service Providers (DASPs) and Brokered Authentication Service Providers (BASPs). The research proposes strong two-factor authentication protocols that do not require a verifier table. The suggested protocols give users the flexibility of using a password together with either a crypto-token or a mobile-token to authenticate with service providers. The proposed approach eliminates the need for users to remember multiple identities to access multiple services and provides a higher level of security on account of the second authentication factor and the absence of a verifier table at the server. Access to services offered by multiple service providers using a single authentication token requires interoperability between providers, and the service providers must also address the task of issuing the second authentication factor to users. The research therefore proposes using the two-factor authentication scheme within an environment that includes a trusted entity called an Identity Provider (IdP), with whom users and service providers are registered. The IdP is responsible for issuing and managing the second authentication factor. In brokered authentication, the IdP, playing the role of an authentication broker, also provides Single Sign-On functionality. The Security Assertion Markup Language (SAML) is used by BASPs and the IdP to exchange authentication information about users. A major objective of this research is to propose an authentication model that can be adopted by both categories of service providers. Hence, the research proposes an authentication framework for the cloud that supports an integrated authentication architecture, giving service providers the flexibility to choose between direct and brokered authentication. The integrated two-factor authentication protocol supported by the framework, which does not require the server to maintain a verifier table, allows users to register once and access the services of both direct and brokered authentication service providers using the same crypto-token or mobile-token. To verify the claims about the security strengths of the proposed protocols, a security analysis is carried out using theoretical intuition. The proposed protocols are found to offer desirable security features such as resistance to replay attacks, stolen-verifier attacks, guessing attacks, and user impersonation attacks. To verify the efficiency of the proposed protocols, their communication and computation costs are compared with similar schemes and found to be comparable.
To validate the resistance of the protocols to authentication attacks, they are analyzed using the automated verification tool "Scyther", and the protocol strength is confirmed by its "no attacks" results.
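For readers who want a concrete picture of what "two-factor authentication without a verifier table" can look like, the sketch below shows one common pattern (illustrative only, not the thesis' actual protocol): the server keeps a single master secret, per-user verification data lives on the user's token masked by a password hash, and login is a challenge-response over a fresh nonce. All function and variable names are hypothetical.

```python
# Illustrative sketch of a verifier-table-free, two-factor login exchange.
# The server keeps only one master secret; per-user verification data lives
# on the user's token, masked by the password hash. Hypothetical names.
import hashlib, hmac, os

SERVER_MASTER_KEY = os.urandom(32)          # known only to the server

def derive_user_key(user_id: str) -> bytes:
    """Server-side: recompute the user key on the fly (no per-user table)."""
    return hmac.new(SERVER_MASTER_KEY, user_id.encode(), hashlib.sha256).digest()

def register(user_id: str, password: str) -> bytes:
    """Issue token data: the user key masked with the password hash."""
    pwd_hash = hashlib.sha256(password.encode()).digest()
    user_key = derive_user_key(user_id)
    return bytes(a ^ b for a, b in zip(user_key, pwd_hash))   # stored on the token

def client_response(token_data: bytes, password: str, nonce: bytes) -> bytes:
    """Client-side: unmask the key with the password, answer the challenge."""
    pwd_hash = hashlib.sha256(password.encode()).digest()
    user_key = bytes(a ^ b for a, b in zip(token_data, pwd_hash))
    return hmac.new(user_key, nonce, hashlib.sha256).digest()

def server_verify(user_id: str, nonce: bytes, response: bytes) -> bool:
    expected = hmac.new(derive_user_key(user_id), nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Both the password and the token data are needed to answer the challenge.
token = register("alice", "correct horse")
nonce = os.urandom(16)
assert server_verify("alice", nonce, client_response(token, "correct horse", nonce))
assert not server_verify("alice", nonce, client_response(token, "wrong pass", nonce))
```
-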
Secure Authenticated Communication Via Digital Signature And Clear List In VANETs
Vehicular ad hoc networks (VANETs) play a vital role in the intelligent transportation system (ITS). When a vehicle receives a message through the network, the certificate revocation list (CRL) checking process runs before certificate and signature verification. After successful authentication, a CRL is created and used to decide whether a vehicle node is permitted to communicate in the VANET. However, using a CRL requires a large amount of storage space and checking time. We therefore propose a method that replaces the CRL with a key management list, overcoming the large storage space and checking time and also reducing access delay. For access permission, authentication is performed using a digital novel signature authentication (DNSA) scheme for each vehicle in the VANET, either with the RSU or with other participant vehicles in the communication according to the topology, so that efficient and secure communication can be achieved in the VANET. The Electrochemical Society
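For context, the signature step this abstract relies on is ordinary digital-signature verification of vehicle messages. The sketch below shows a plain ECDSA sign/verify round trip with the Python cryptography package; it is not the paper's DNSA scheme or its key-management list, and the message fields are made up.

```python
# Minimal ECDSA sign/verify round trip for a V2V safety message (illustrative only;
# this is generic ECDSA, not the paper's DNSA scheme).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

vehicle_key = ec.generate_private_key(ec.SECP256R1())      # per-vehicle signing key
message = b"vehicle_id=KA01AB1234;speed=42;lat=12.97;lon=77.59;t=1700000000"

signature = vehicle_key.sign(message, ec.ECDSA(hashes.SHA256()))

# An RSU or a neighbouring vehicle verifies with the sender's public key.
public_key = vehicle_key.public_key()
try:
    public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("signature valid: message accepted")
except InvalidSignature:
    print("signature invalid: message dropped")
```
-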
Secure approach to sharing digitized medical data in a cloud environment
Without proper security mechanisms, medical records stored electronically can be accessed more easily than physical files. Patient health information is scattered throughout the hospital environment, including laboratories, pharmacies, and daily medical status reports. The electronic format of medical reports ensures that all information is available in a single place. However, it is difficult to store and manage large amounts of data. Dedicated servers and a data center are needed to store and manage patient data. However, self-managed data centers are expensive for hospitals. Storing data in a cloud is a cheaper alternative. The advantage of storing data in a cloud is that it can be retrieved anywhere and anytime using any device connected to the Internet. Therefore, doctors can easily access the medical history of a patient and diagnose diseases according to the context. It also helps prescribe the correct medicine to a patient in an appropriate way. The systematic storage of medical records could help reduce medical errors in hospitals. The challenge is to store medical records on a third-party cloud server while addressing privacy and security concerns. These servers are often semi-trusted. Thus, sensitive medical information must be protected. Open access to records and modifications performed on the information in those records may even cause patient fatalities. Patient-centric health-record security is a major concern. End-to-end file encryption before outsourcing data to a third-party cloud server ensures security. This paper presents a method that combines the advanced encryption standard with the elliptic-curve Diffie-Hellman method, designed to increase the efficiency of medical record security for users. Comparisons of existing and proposed techniques are presented at the end of the article, with a focus on analyzing the security approaches of the elliptic-curve and secret-sharing methods. This study aims to provide a high level of security for patient health records. 2023 Xi'an Jiaotong University
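A minimal sketch of the general AES-plus-ECDH pattern described above, under the assumption that the two parties exchange elliptic-curve public keys, derive a shared AES key, and encrypt the record with AES-GCM. It uses the Python cryptography package and illustrative field names; it is not the paper's exact construction.

```python
# Hybrid encryption sketch: ECDH key agreement + AES-GCM for a medical record.
# Illustrative only; parameter choices differ from the paper's scheme.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each party holds an EC key pair (say, the hospital and an authorized doctor).
hospital_key = ec.generate_private_key(ec.SECP256R1())
doctor_key = ec.generate_private_key(ec.SECP256R1())

def derive_aes_key(own_private, peer_public) -> bytes:
    """ECDH shared secret, stretched into a 256-bit AES key via HKDF."""
    shared = own_private.exchange(ec.ECDH(), peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"medical-record-encryption").derive(shared)

record = b'{"patient_id": "P-1042", "diagnosis": "hypertension"}'

# Hospital encrypts with the key derived from its private key + doctor's public key.
key = derive_aes_key(hospital_key, doctor_key.public_key())
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, record, None)

# Doctor derives the same key from the other side and decrypts.
key2 = derive_aes_key(doctor_key, hospital_key.public_key())
assert AESGCM(key2).decrypt(nonce, ciphertext, None) == record
```
-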
Secure and Private Federated Learning through Encrypted Parameter Aggregation
This chapter is dedicated to cross-silo private parameter aggregation. ML/DL has demonstrated promising results in a variety of application domains, especially when vast volumes of data are collected in one location, such as a data center or a cloud service. The goal of FL is to improve the quality of ML/DL models while minimizing their drawbacks. Participating devices in an FL task can range in size from a single smartphone or watch to a global corporation housing multiple data centers. It was originally believed that only a small amount of information about the original training data would be carried over into subsequent model updates as FL interactions occurred. The differential privacy framework is concerned with restricting the release of private information while sharing the outcomes of computations or queries performed on a dataset. Recently, many researchers have begun to employ differential privacy while training models in a federated setting. 2024 Saravanan Krishnan, A. Jose Anand, R. Srinivasan, R. Kavitha and S. Suresh.
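As a toy illustration of the differential-privacy idea mentioned above (and not the chapter's encrypted parameter aggregation protocol), each client clips its model update and adds Gaussian noise before the server averages the updates; the clipping norm and noise scale below are arbitrary.

```python
# Toy federated round with differentially private client updates (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def privatize(update: np.ndarray, clip_norm: float = 1.0, noise_std: float = 0.1) -> np.ndarray:
    """Clip the update to a maximum L2 norm, then add Gaussian noise."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, noise_std, size=update.shape)

# Pretend three clients computed local model updates of dimension 4.
client_updates = [rng.normal(size=4) for _ in range(3)]

# Each client privatizes locally; the server only ever sees noisy, clipped updates.
noisy_updates = [privatize(u) for u in client_updates]
global_update = np.mean(noisy_updates, axis=0)
print("aggregated update:", global_update)
```
-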
Secure and automated driving inspector powered by Arduino Raspberry Pi using deep learning /
Patent Number: 202141038074, Applicant: Raghunandan K R.
The product will be placed in the car and will read the car's diagnostic information such as acceleration, clutch, gear and braking. During a driving session, data from one or more sensors will be used to monitor driving test events. Each driving session and the weights related to these events will be described in detail. The weights are based on a rating system for the driving as well as the amount of time spent driving. A driving session report will be generated by delivering a driver feedback ranking. -
Secure and analytical agile framework for software continuous delivery /
Patent Number: 201741040903, Applicant: Anwin Varghese.
The IT world thrives on quality software products that help businesses sustain and grow. These software products help reduce effort and time and bring better economic efficiency, making them highly depended upon. The challenge of delivering quality product releases frequently is not as easy as it sounds. To meet this challenge of producing reliable software products, IT managers and leads need a team of dedicated developers, system programmers and testers, along with a highly efficient process in place. -
Secure & real-time status notification based dynamic messaging method using in-built message tool without using internet /
Patent Number: 201911045703, Applicant: Asik Rahaman Jamader.
The present invention disclosure relates to a secure, real-time, status-notification-based dynamic messaging method that uses an in-built message tool without using the internet. An advanced system, in-built tool, handy graphical user interface, and technique are delivered for advanced electronic messaging communication with real-time status notification. A representation of the typing is shown on the destination device, with an indicator whose appearance changes when the other member views the text. -
Sectoral correlations and interlinkages: NSE
An efficient portfolio is a well-diversified portfolio that gives the investor opportunities to earn money and provides cover against risks. Understanding the intersectoral linkages and correlations among various sectors in a stock market helps an investor diversify the portfolio and reduce risk efficiently. This study aims at examining the underlying linkages and correlations among eight sectors in the Indian National Stock Exchange (NSE) using a Granger causality test in a VAR framework. The results of the study, based on nine years' data from 2009 to 2018, show that an effective portfolio can have two classifications: stocks from Pharma and Media as group one (defensive stocks) and picks from the IT, Bank, Financial Services, Realty, Auto and FMCG sectors as group two (somewhat cyclical). The study further proves that the usual definitions of cyclical and defensive sectors have undergone some profound changes. 2020 SCMS Group of Educational Institutions. All rights reserved.
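For readers unfamiliar with the test used in the study, the sketch below runs a pairwise Granger-causality check with statsmodels on synthetic return series standing in for two NSE sector indices; the lag order and data are illustrative only.

```python
# Pairwise Granger causality on synthetic "sector return" series (illustrative only;
# the study uses nine years of NSE sectoral index data within a VAR framework).
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(42)
n = 500
bank = rng.normal(size=n)
# Make the "IT" series partly depend on lagged "Bank" so the test has something to find.
it = 0.5 * np.roll(bank, 1) + rng.normal(scale=0.5, size=n)
data = pd.DataFrame({"IT": it[1:], "Bank": bank[1:]})

# grangercausalitytests prints F-test and chi-square p-values for each lag;
# the null hypothesis is that "Bank" does NOT Granger-cause "IT".
grangercausalitytests(data[["IT", "Bank"]], maxlag=2)
```
-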
Second Order Parallel Tensor on Almost Kenmotsu Manifolds
Let M be an almost Kenmotsu manifold of dimension 2n + 1 having nonvanishing ξ-sectional curvature such that tr ℓ > -2n - 2. We prove that any second order parallel tensor on M is a constant multiple of the associated metric tensor and obtain some consequences of this. Vector fields keeping the curvature tensor invariant are characterized on M. Kyungpook Mathematical Journal
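For readers outside differential geometry, the statement can be written compactly as follows (standard definitions, not notation specific to this paper):

```latex
% A second order parallel tensor is a (0,2)-tensor field \alpha whose covariant
% derivative with respect to the Levi-Civita connection \nabla of the metric g
% vanishes. The theorem above says that, under the stated curvature hypothesis,
\[
  \nabla \alpha = 0 \quad \Longrightarrow \quad \alpha = c\, g
  \quad \text{for some constant } c \in \mathbb{R}.
\]
```
-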
Second Language Learners' problems in acquiring Reading and Writing skills in English: A Study at the Higher Secondary Level
This dissertation is the result of a study that aimed at identifying the problems in reading and writing skills in English faced by students at the higher secondary level. The study, by way of a series of tests administered to students from three schools in Bangalore, questionnaires administered to the teachers, and interviews of the faculty, found that students faced problems in acquiring language skills, and that teachers faced certain challenges in imparting these skills to students. The sample consisted of 110 participants from three schools. The survey employed three types of research instruments for data collection: four tests, a questionnaire and interviews. By listening to the teachers through the interviews, evaluating their calibre as teachers through the questionnaire, and examining the results of the tests conducted for the students, the researcher was able to gather information that made the dissertation more comprehensive by identifying the specific areas in which students face problems and the reasons behind these problems. Based on this information, the researcher was able to propose a few suggestions for the learners, teachers and the management. The objective of the study was to motivate students and to develop an interest in English language learning so that they will be better equipped to cope with the challenges of society in the future. The results impelled the researcher to conclude that one cannot generalize that students from English-medium schools would perform better. Students from the regional medium marginally outperformed those of the English medium, and this was a surprising element of the study. This helped the researcher understand that the role of the teacher in imparting skills is very important. It was noticed that, if provided with better learning facilities and adequate motivation, appreciation and encouragement, students from regional-medium schools can do well on English language tests. -
Seasonal Variation of Physicochemical Parameters and Their Impact on the Algal Flora of Chimmony Wildlife Sanctuary
Background and Objective: The lack of biodiversity knowledge and biodiversity loss are two inevitable truths around us. Algae are among the most crucial organisms in our entire biodiversity, and the seasonal variation of algal diversity can be used to monitor environmental changes in a freshwater ecosystem. The present study was conducted because the seasonal changes of algal diversity in Chimmony Wildlife Sanctuary were utterly unknown. Materials and Methods: Algal samples were collected and preserved from ten stations for three seasons (pre-monsoon, monsoon, post-monsoon). The physicochemical parameters of water, namely temperature, pH, total dissolved solids, dissolved oxygen, total alkalinity and light intensity, were recorded at the sampling stations. Results: The study revealed that the seasonal variation of physicochemical parameters provoked a change in the diversity of algae. Chimmony Wildlife Sanctuary has its highest algal diversity during the pre-monsoon season. Chlorophycean algae were dominant during the pre-monsoon season, while cyanophycean algae were dominant during the monsoon season. The two-way ANOVA showed no significant difference between stations but a considerable difference between seasons for dissolved oxygen, alkalinity, temperature and total dissolved solids; for pH it showed no significant difference between either seasons or stations, while for light intensity it showed a substantial difference between both stations and seasons. A negative correlation was observed between algal species and seasons, and temperature and dissolved oxygen also showed a negative correlation. Conclusion: The physicochemical parameters changed according to the seasonal variation. Since algae act as biological pollution indicators for all water resources, studying the algal flora according to seasonal variation is crucial. 2022 Joel Jose and Jobi Xavier.
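As an illustration of the two-way ANOVA design described above (station as one factor, season as the other), the sketch below uses synthetic dissolved-oxygen readings, not the sanctuary's measurements.

```python
# Two-way ANOVA (station x season) on synthetic dissolved-oxygen readings;
# illustrative only, the study's actual data are not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(7)
stations = [f"S{i}" for i in range(1, 11)]
seasons = ["pre-monsoon", "monsoon", "post-monsoon"]
season_effect = {"pre-monsoon": 0.8, "monsoon": -0.5, "post-monsoon": 0.1}

rows = [{"station": st, "season": se,
         "dissolved_oxygen": 6 + season_effect[se] + rng.normal(scale=0.3)}
        for st in stations for se in seasons]
df = pd.DataFrame(rows)

# Additive model with both factors; anova_lm reports an F-test per factor.
model = ols("dissolved_oxygen ~ C(station) + C(season)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```
-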
Seasonal study on the Aquatic and Terrestrial Habitat of Edayar region, Ernakulam, Kerala, India
This study examines the plant diversity and physicochemical characteristics of both aquatic and terrestrial ecosystems in the industrialized region of Edayar, Kadungalloor, Ernakulam, Kerala, India. The research was conducted seasonally, encompassing the four seasons of Kerala: southwest monsoon, northeast monsoon, winter and summer. Edayar is home to approximately 400 industries. The main objective of this study is to assess plant diversity, with a specific focus on herb and macrophyte diversity, in the Edayar region, along with analyzing the physicochemical properties of soil and water. Random sampling using quadrat techniques was employed to collect data on species diversity. Diversity indices, such as the Simpson index and the Shannon-Wiener index, were utilized to analyze the recorded species diversity. Scoparia dulcis L. among herb species and Eichhornia crassipes (Mart.) Solms among macrophytes were found to dominate in all seasons. The results of the physico-chemical analysis of water and soil were found to approach the threshold of standard limits. The findings provide valuable insights into the plant diversity and ecological dynamics of the Edayar region, which has been significantly impacted by industrial activities. The outcomes serve as a basis for the development and implementation of effective conservation and management strategies to mitigate potential ecological risks associated with industrial activities in the region. 2024 World Researchers Associations. All rights reserved.
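The two diversity indices named above have simple closed forms; the short sketch below computes both from made-up species counts, shown only for reference.

```python
# Shannon-Wiener and Simpson diversity indices from raw species counts
# (hypothetical abundances, shown only to illustrate the indices used in the study).
import math

def shannon_wiener(counts):
    """H' = -sum(p_i * ln p_i) over species proportions p_i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def simpson(counts):
    """Simpson's index of diversity, 1 - sum(p_i^2): higher means more diverse."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

quadrat_counts = [55, 20, 12, 8, 5]   # hypothetical abundances of five herb species
print(f"H' = {shannon_wiener(quadrat_counts):.3f}, 1-D = {simpson(quadrat_counts):.3f}")
```
-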
Search for low-mass objects in the globular cluster M4. I. Detection of variable stars
With every new discovery of an extrasolar planet, the absence of planets in globular clusters (GCs) becomes more and more conspicuous. The null detection of transiting hot Jupiters in the GCs 47 Tuc, ω Cen, and NGC 6397 presents an important puzzle, raising questions about the role played by cluster metallicity and environment in the formation and survival of planetary systems in densely populated stellar clusters. GCs were postulated to have many free-floating planets, for which microlensing (ML) is an established detection tool. Dense environments, well-constrained distances and kinematics of lenses and sources, and photometry of thousands of stars simultaneously make GCs ideal targets to search for ML. We present first results of a multisite, 69-night-long campaign to search for ML signatures of low-mass objects in the GC M4, which was chosen because of its proximity, location, and the actual existence of a planet. M4 was observed in the R and I bands by two telescopes, the 1 m T40 and the 18-inch C18, of the Wise Observatory, Tel Aviv, Israel, from 2011 April to July. Observations on the 1 m telescope were carried out in service mode, gathering 12 to 48 20 s exposures per night for a total of 69 nights. C18 observations were done for about 4 hr a night for six nights in 2011 May. We employ a semiautomated pipeline, which our group is developing for this purpose, to calibrate and reduce the images to light curves; it includes the differential photometry package DIAPL, written by Wozniak and modified by W. Pych. Several different diagnostics are employed to search for variability/transients. While no high-significance ML event was found in this observational run, we have detected more than 20 new variables and variable candidates in the M4 field, which we present here. 2016. The American Astronomical Society. All rights reserved. -
Search for brown dwarfs in IC 1396 with Subaru HSC: interpreting the impact of environmental factors on substellar population
Young stellar clusters are predominantly the hub of star formation and hence, ideal to perform comprehensive studies over the least explored substellar regime. Various unanswered questions like the mass distribution in brown dwarf regime and the effect of diverse cluster environment on brown dwarf formation efficiency still plague the scientific community. The nearby young cluster, IC 1396 with its feedback-driven environment, is ideal to conduct such study. In this paper, we adopt a multiwavelength approach, using deep Subaru HSC along with other data sets and machine learning techniques to identify the cluster members complete down to ~0.03 M⊙ in the central 22 arcmin area of IC 1396. We identify 458 cluster members including 62 brown dwarfs which are used to determine mass distribution in the region. We obtain a star-to-brown dwarf ratio of ~6 for a stellar mass range 0.03-1 M⊙ in the studied cluster. The brown dwarf fraction is observed to increase across the cluster as radial distance from the central OB-stars increases. This study also compiles 15 young stellar clusters to check the variation of star-to-brown dwarf ratio relative to stellar density and ultraviolet (UV) flux ranging within 4-2500 stars pc⁻² and 0.7-7.3 G0, respectively. The brown dwarf fraction is observed to increase with stellar density but the results about the influence of incident UV flux are inconclusive within this range. This is the deepest study of IC 1396 as of yet and it will pave the way to understand various aspects of brown dwarfs using spectroscopic observations in future. 2024 The Author(s). -
Search Engine Optimization for Digital Marketing to Raise the Rank, Traffic, and Usability of the Website
According to the Content Marketing Institute, 93% of online experiences start with a search. That is why search marketing is a crucial activity for all organizations seeking to improve and grow their businesses, and why marketers and clients who pay for advertisements have begun analyzing SEO and SEM. Search engine marketing increases the visibility of websites through SEO or through paid advertising, with the aim of increasing traffic to the site. SEM refers to all marketing activities that use search engine technology for promotional purposes. These include SEO, paid listings and advertisements, and other search-engine-related services and capabilities that increase the reach and exposure of the website, resulting in greater traffic. 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. -
Search and analysis of giant radio galaxies with associated nuclei (SAGAN): III. New insights into giant radio quasars
Giant radio quasars (GRQs) are radio-loud active galactic nuclei (AGN) that propel megaparsec-scale jets. In order to understand GRQs and their properties, we have compiled all known GRQs (the GRQ catalogue) and a subset of small (size < 700 kpc) radio quasars (SRQs) from the literature. In the process, we have found ten new Fanaroff-Riley type-II GRQs in the redshift range of 0.66 < z < 1.72, which we include in the GRQ catalogue. Using the above samples, we have carried out a systematic comparative study of GRQs and SRQs using optical and radio data. Our results show that the GRQs and SRQs statistically have similar spectral index and black hole mass distributions. However, SRQs have a higher radio core power, core dominance factor, total radio power, jet kinetic power, and Eddington ratio compared to GRQs. On the other hand, when compared to giant radio galaxies (GRGs), GRQs have a higher black hole mass and Eddington ratio. The high core dominance factor of SRQs is an indicator of them lying closer to the line of sight than GRQs. We also find a correlation between the accretion disc luminosity and the radio core and jet power of GRQs, which provides evidence for disc-jet coupling. Lastly, we find the distributions of Eddington ratios of GRGs and GRQs to be bi-modal, similar to that found in small radio galaxies (SRGs) and SRQs, which indicates that size is not strongly dependent on the accretion state. Using all of this, we provide a basic model for the growth of SRQs to GRQs. ESO 2022. -
SCSLnO-SqueezeNet: Sine Cosine-Sea Lion Optimization enabled SqueezeNet for intrusion detection in IoT
Security and privacy are regarded as the greatest priorities in any real-world smart ecosystem built on the Internet of Things (IoT) paradigm. In this study, a SqueezeNet model for IoT threat detection is built using Sine Cosine Sea Lion Optimization (SCSLnO). The Base Station (BS) carries out intrusion detection, and the Hausdorff distance is used to determine which features are important. Attack detection is carried out using the SqueezeNet model, and the network classifier is trained using SCSLnO, which is developed by combining the Sine Cosine Algorithm (SCA) with Sea Lion Optimization (SLnO). The BoT-IoT and NSL-KDD datasets are used for the analysis. In comparison to the existing approaches PSO-KNN/SVM, Voting Ensemble Classifier, Deep NN, and deep learning, the accuracy value produced by the devised method for the BoT-IoT dataset is 10.75%, 8.45%, 6.36%, and 3.51% higher when the training percentage is 90. 2023 Informa UK Limited, trading as Taylor & Francis Group.
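For orientation, the position update at the heart of the Sine Cosine Algorithm that the hybrid optimizer builds on looks like the sketch below; this is the generic SCA on a toy objective, not the paper's SCSLnO hybrid or its SqueezeNet training.

```python
# One run of the plain Sine Cosine Algorithm on a toy objective
# (generic SCA only; the paper hybridizes it with Sea Lion Optimization).
import numpy as np

rng = np.random.default_rng(1)

def sca_step(positions, best, t, t_max, a=2.0):
    """Move each candidate toward/around the current best using sine/cosine terms."""
    r1 = a - t * (a / t_max)                       # exploration weight, shrinks over time
    r2 = rng.uniform(0, 2 * np.pi, positions.shape)
    r3 = rng.uniform(0, 2, positions.shape)
    r4 = rng.uniform(0, 1, positions.shape)
    sin_move = positions + r1 * np.sin(r2) * np.abs(r3 * best - positions)
    cos_move = positions + r1 * np.cos(r2) * np.abs(r3 * best - positions)
    return np.where(r4 < 0.5, sin_move, cos_move)

objective = lambda x: np.sum(x ** 2, axis=1)       # toy objective: sphere function
pop = rng.uniform(-5, 5, size=(20, 4))             # 20 candidate solutions, 4 dimensions
t_max = 100
for t in range(t_max):
    best = pop[np.argmin(objective(pop))]
    pop = sca_step(pop, best, t, t_max)
print("best fitness found:", objective(pop).min())
```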