Browse Items (2150 total)
Message efficient ring leader election in distributed systems
Leader election, not only in distributed systems but in any communication network, is an essential matter for discussion. A tremendous amount of research on election is under way in the community, as network protocols need a coordinator process for the smooth running of the system. These coordinator processes are responsible for the synchronization of the system; without them, the system loses its reliability. Furthermore, if the leader process crashes, a new leader process should take charge as early as possible. The new leader is the currently running process with the highest process id. In this paper we present a modified version of the ring algorithm. Our work involves substantial modifications of the existing ring election algorithm and a comparison of its message complexity with that of the original algorithm. Simulation results show that our algorithm minimizes the number of messages even in the worst-case scenario. © 2013 Springer Science+Business Media.
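The classical ring election that the paper modifies can be sketched as follows: a token carrying a candidate id circulates around the ring, each process keeps the larger id, and the highest id becomes leader. The function below counts only the discovery pass (the hop count the authors' message-complexity analysis would start from); all names are illustrative, not the authors' implementation.

```python
def ring_election(ids, starter=0):
    """Run an election on a ring of process ids; return (leader, messages)."""
    n = len(ids)
    messages = 0
    token = ids[starter]                  # election token carries a candidate id
    pos = (starter + 1) % n
    while pos != starter:
        messages += 1                     # token forwarded one hop
        token = max(token, ids[pos])      # each process keeps the larger id
        pos = (pos + 1) % n
    messages += 1                         # final hop back to the initiator
    return token, messages                # highest id wins the election

print(ring_election([3, 7, 2, 5]))        # leader 7 after 4 hops
```

A coordinator-announcement pass (another n messages) would follow in practice; reducing such passes is exactly where modified ring algorithms save messages.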
Mapping extinction using GALEX and SDSS photometric observations
The primary objective of this work is to create an all-sky extinction map of the Milky Way galaxy. We have cross-matched the Sloan Digital Sky Survey (SDSS data release 8) photometric observations with those of the Galaxy Evolution Explorer (GALEX data release 6). This provides a wide wavelength coverage from the far ultraviolet through the optical spectrum and gives one unique SDSS source for every GALEX source. We discuss a sample of ~32,000 objects near the north galactic pole (latitude ≥ 75°) from this combined database. The Castelli and Kurucz Atlas was fitted to the photometric observations of each star, the best fit being determined using a chi-square test. The best-fit parameters provide the spectral type and extinction towards each of the objects. The shift in magnitude obtained during the fit can be used to determine the distance to each of the stars. With these data, a comprehensive extinction map can be made for the high-latitude objects and later extended to all-sky. © 2013 AIP Publishing LLC.
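As a hedged illustration of the fitting step described above (not the authors' pipeline), the magnitude shift that minimizes chi-square for a fixed model SED has a closed form: it is the inverse-variance-weighted mean of the observed-minus-model residuals. Interpreted as a distance modulus mu, it gives a distance d = 10^((mu + 5)/5) parsecs.

```python
import math

def best_shift(obs, model, sigma):
    """Magnitude shift minimizing chi-square, plus the chi-square at that shift.

    obs, model, sigma: per-band observed magnitudes, model magnitudes,
    and measurement uncertainties (all illustrative inputs).
    """
    w = [1.0 / s ** 2 for s in sigma]                     # inverse-variance weights
    s_hat = sum(wi * (o - m) for wi, o, m in zip(w, obs, model)) / sum(w)
    chi2 = sum(wi * (o - m - s_hat) ** 2 for wi, o, m in zip(w, obs, model))
    return s_hat, chi2

def distance_pc(mu):
    """Distance in parsecs if the shift is a distance modulus."""
    return 10 ** ((mu + 5) / 5)
```

Scanning model atmospheres and keeping the one with the smallest `chi2` yields the spectral type; the winning shift yields the distance.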
Verification and validation of Parallel Support Vector Machine algorithm based on MapReduce Program model on Hadoop cluster
In recent years, data volumes have been growing ever larger. It is difficult to measure the total volume of structured and unstructured data, which requires machine-based systems and technologies in order to be fully analyzed. Efficient implementation techniques are the key to meeting the scalability and performance requirements entailed in such scientific data analysis. To that end, this paper analyzes the sequential Support Vector Machine in WEKA and various MapReduce programs, including a Parallel Support Vector Machine, on a Hadoop cluster; in this way the algorithms are verified and validated on the Hadoop cluster using the MapReduce concept. The performance of the above applications is reported with respect to execution/training time and number of nodes. Experimental results show that as the number of nodes increases, the execution time decreases. This experiment is essentially a research study of the above MapReduce applications. © 2013 IEEE.
User profiling based on keyword clusters for improved recommendations
Recommender Systems (RS) have risen in popularity over the years, and their ability to ease decision-making for the user in various domains has made them ubiquitous. However, the sparsity of data continues to be one of the biggest shortcomings of the suggestions offered. Recommendation algorithms typically model user preferences in the form of a profile, which is then used to match user preferences to items of their interest. Consequently, the quality of recommendations is directly related to the level of detail contained in these profiles. Several attempts at enriching user profiles by leveraging both user preference data and item content details have been explored in the past. We propose a method of constructing a user profile, specifically for the movie domain, based on user preference for keyword clusters, which indirectly captures preferences for various narrative styles. These profiles are then utilized to perform both content-based (CB) filtering and collaborative filtering (CF). The proposed approach outperforms direct keyword-matching, genre-based user profiling and traditional CF methods under sparse-data scenarios, as established by various experiments. It has the advantage of a compact user model representation, while at the same time capturing the essence of the styles or genres preferred by the user. The identification of implicit genres is captured effectively through clustering, without requiring labeled data for training. © 2014 Springer International Publishing Switzerland.
An effective dynamic scheduler for reconfigurable high speed computing system
High speed computing is a promising technology that meets ever-increasing real-time computational demands by leveraging flexibility and parallelism. This paper introduces a reconfigurable fabric named the Reconfigurable High Speed Computing System (RHSCS), which offers a high degree of flexibility and parallelism. RHSCS contains a Field Programmable Gate Array (FPGA) as a Processing Element (PE), so RHSCS is made to share the FPGA resources among the tasks within a single application. In this paper an efficient dynamic scheduler is proposed to take full advantage of hardware utilization and to speed up application execution. The proposed scheduler distributes the tasks of an application to the resources of the RHSCS platform based on a cost function called Minimum Laxity First (MLF). Finally, a comparative study has been made between the designed scheduling technique and existing techniques. The proposed RHSCS platform and the scheduler with MLF as the cost function enhance the speed of an application by up to 80.30%. © 2014 IEEE.
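A minimal sketch of a Minimum Laxity First selection rule, assuming each task carries a deadline and a remaining execution time; the task model and field names are illustrative, not the paper's RHSCS implementation.

```python
def pick_next(tasks, now):
    """Minimum Laxity First: choose the task with the smallest slack.

    Laxity = deadline - current time - remaining execution time;
    the task that can least afford to wait is dispatched first.
    """
    def laxity(task):
        return task["deadline"] - now - task["remaining"]
    return min(tasks, key=laxity)

ready = [
    {"name": "a", "deadline": 10, "remaining": 3},   # laxity 7 at t=0
    {"name": "b", "deadline": 6,  "remaining": 2},   # laxity 4 at t=0
]
print(pick_next(ready, now=0)["name"])               # task "b" is most urgent
```

A dynamic scheduler would re-evaluate this choice whenever a task arrives, finishes, or a PE frees up.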
An approach for document pre-processing and K Means algorithm implementation
Web mining is a cutting-edge technology that involves gathering and classifying information over the web. This paper puts forth the concepts of document pre-processing, achieved by extracting keywords from documents fetched from the web, processing them, and generating a term-document matrix with TF-IDF (term frequency-inverse document frequency) weights under several TF-IDF variants for each document. The last step is the clustering of these results through the K-Means algorithm, comparing the performance of each variant used. The algorithm is realized on an x64 architecture and coded on the Java and Matlab platforms. The results are tabulated. © 2014 IEEE.
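The term-weighting stage can be sketched with the basic TF-IDF variant, raw term frequency times log(N/df); the resulting per-document weight vectors are what K-Means would then cluster. This is a generic illustration, not the paper's exact code.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Per-document TF-IDF weights: tf(t, d) * log(N / df(t))."""
    N = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()                        # document frequency of each term
    for toks in tokenized:
        df.update(set(toks))
    weights = []
    for toks in tokenized:
        tf = Counter(toks)                # raw term frequency in this document
        weights.append({t: tf[t] * math.log(N / df[t]) for t in tf})
    return weights

vectors = tf_idf(["web mining web", "data mining"])
# "mining" appears in every document, so its weight is 0 in both vectors,
# while document-specific terms like "web" get positive weight.
```

Other TF-IDF variants (log-scaled tf, smoothed idf, length normalization) change only the two factors in the dictionary comprehension.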
GPR based subsurface geotechnical exploration
The Seismic refraction technique (SRT) and the Electrical resistivity technique (ERT) have long been in use in geotechnical exploration. A relatively recent technique is Ground penetrating radar (GPR). The study presented in this paper is on GPR-aided geotechnical subsurface exploration. The usual method of exploration is drilling, which gives much-needed site-specific information but is expensive and restricted to a few point locations. The possibilities of non-invasive investigation offered by GPR make it useful for supplementing geotechnical investigations. The present work describes a GPR survey at a construction site in Mumbai. The objective was to derive subsurface logs from GPR signals. Conventionally, subsurface logging is done using boreholes. First, the extracted soil and rock samples are examined visually. Second, additional information such as Core recovery ratios (CRR), Rock quality designation (RQD) and Standard penetration test (SPT) N values is collected and strata are demarcated. In comparison, the amplitude variations of GPR signals may not correspond directly to variations of these physical properties with depth. However, the study shows that fairly good correlations do exist between the subsurface stratification and the transformed signals.
Depletion studies in the interstellar medium
We report interstellar Si depletion and dust-phase column densities of Si along 131 Galactic sight lines using previously reported gas-phase Si II column densities, after correcting for the differences in oscillator strengths. With our large sample, we could reproduce the previously reported correlations between the depletion of Si and the average density of hydrogen along the line of sight, as well as the molecular fraction of hydrogen, f(H2). We have also studied the variation of the amount of Si incorporated in dust with respect to different extinction parameters. Within the limitations imposed by the quality of the data, we find a strong relation between the Si dust and extinction. While we cannot predict the density-dependent size distribution of Si grains, we discuss the large depletion fraction of Si and the larger size of the silicate grains. © 2013 AIP Publishing LLC.
Cataloging of happy facial affect using a radial basis function neural network
This paper develops an affect recognition system for identifying the happy affect from faces using a radial basis function (RBF) neural network. The methodology adopted is a four-step process: image preprocessing, marking of the region of interest, feature extraction and a classification network. Emotion recognition has been a momentous field in human-computer interaction. Although making a system intelligent enough to identify and understand human emotions is a considerable challenge, it serves various vital purposes, e.g. security, society and entertainment, and much research has been done and is ongoing in order to produce an accurate and effective emotion recognition system. Emotion recognition systems can be classified into facial emotion recognition and speech emotion recognition. This work is on facial emotion recognition and identifies one of the seven basic emotions, the happy affect. This is carried out by extracting unique facial expression features, calculating Euclidean distances, and building the feature vector. For classification, a radial basis function neural network is used. The deployment was done in Matlab. The happy affect recognition system gave satisfactory results. © 2013 Springer.
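A minimal sketch of the classification idea, assuming one Gaussian RBF unit per affect class centred on a prototype feature vector; the prototypes, feature dimensionality and single-unit-per-class layout are illustrative simplifications of a full RBF network, not the paper's trained model.

```python
import math

def rbf_classify(x, prototypes, sigma=1.0):
    """Score each class with a Gaussian RBF of the Euclidean distance
    from feature vector x to that class's prototype; return the best class."""
    def activation(center):
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))  # squared distance
        return math.exp(-d2 / (2 * sigma ** 2))                # Gaussian kernel
    scores = {label: activation(center) for label, center in prototypes.items()}
    return max(scores, key=scores.get)

# Illustrative 2-D feature space with one prototype per class.
protos = {"happy": [1.0, 1.0], "neutral": [0.0, 0.0]}
print(rbf_classify([0.9, 1.1], protos))   # closest to the "happy" prototype
```

In a full RBF network several hidden units per class feed a trained linear output layer; here the nearest prototype simply wins.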
Load balancing with availability checker and load reporters (LB-ACLRs) for improved performance in distributed systems
A distributed system has quite a lot of servers to attain increased availability of service and fault tolerance. Balancing the load among these servers is an important task for achieving better performance. There are various hardware- and software-based load balancing solutions available. However, there is always an overhead on the servers and the load balancer while they communicate with each other and share their availability and current load status information. The load balancer is always busy listening to clients' requests and redirecting them. It also needs to collect the servers' availability status frequently to keep itself up to date. Servers are busy not only providing service to clients but also sharing their current load information with load balancing algorithms. In this paper we propose and discuss the concept and system model for a software-based load balancer with Availability-Checker and Load Reporters (LB-ACLRs), which reduces the overhead on the servers and the load balancer. We also describe the architectural components with their roles and responsibilities, and present a detailed analysis showing how our proposed Availability Checker significantly increases the performance of the system. © 2014 IEEE.
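The routing decision such a balancer makes can be sketched as picking the least-loaded server among those the availability checker reports as up. The data model below is an assumption for illustration, not the paper's system model.

```python
def route(servers):
    """Pick the least-loaded available server.

    Each server record carries an availability flag (from the availability
    checker) and a load figure (from its load reporter); both names are
    illustrative.
    """
    available = [s for s in servers if s["up"]]
    if not available:
        raise RuntimeError("no server available")
    return min(available, key=lambda s: s["load"])

pool = [
    {"name": "s1", "up": True,  "load": 0.7},
    {"name": "s2", "up": False, "load": 0.1},   # down: ignored despite low load
    {"name": "s3", "up": True,  "load": 0.3},
]
print(route(pool)["name"])                      # "s3"
```

The point of the LB-ACLR design is that the flags and load figures arrive via dedicated reporters instead of the balancer polling every server on every request.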
Radon transform processed neural network for lung X-ray image based diagnosis
A novel method for image diagnosis with artificial learning is presented: X-ray images of tuberculosis patients are subjected to neural network learning for prediction of the diagnosis. X-ray images of lungs are normally difficult to diagnose because of their similarity to lung cancer. Under- and over-diagnosis of lung X-ray images is a difficult medical problem to resolve. In the present work the radon transform of the X-ray images is fed to a back-propagation neural network trained with the Levenberg algorithm. The present methodology gives sharp results, distinguishing the normal and abnormal images. © 2014 IEEE.
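A radon transform reduces an image to line integrals along a set of angles. As a minimal, hedged illustration of the feature-extraction idea (not the paper's extractor, which would use many angles), the two axis-aligned projections of an image are just its row sums and column sums:

```python
def projections(image):
    """Projections of a 2-D image (list of rows) at 0 and 90 degrees.

    A full radon transform integrates along lines at many angles; the two
    axis-aligned angles here are enough to show the kind of compact feature
    vector that can be fed to a neural network.
    """
    rows = [sum(r) for r in image]            # 0-degree projection
    cols = [sum(c) for c in zip(*image)]      # 90-degree projection
    return rows + cols

print(projections([[1, 2], [3, 4]]))          # [3, 7, 4, 6]
```

Projections at intermediate angles require resampling the image along rotated lines, which libraries such as scikit-image provide.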
A Single Sign on based secure remote user authentication scheme for Multi-Server Environments
A Multi-Server Architecture comprises a server environment having many different servers, which gives the user the flexibility of accessing resources from multiple Service Providing Servers using the same credential. The primary objective of a Multi-Server Environment (MSE) is to provide the services of different Service Providers (SPs) without repeating registration at each SP server, and to get a unique single credential for all the servers in the MSE. However, the conventional MSEs proposed by various researchers provide individual authentication by each SP on its respective server using the credential issued by the Registration Authority of the MSE. That mechanism requires the user to access each SP by keying the same credentials for every SP separately. Single Sign On (SSO) is an authentication mechanism that enables a user to sign on once and access the services of various SPs in the same session; SAML is generally used as a Single Sign-On protocol. This work analyzes the smart card based authentication scheme for Multi-Server Environments proposed by Li et al. and discusses various security attacks on the said scheme. The paper also proposes a Secure Dynamic-ID based scheme using smart cards or crypto cards which does not require a verifier table and implements the Single Sign On feature using the SAML protocol, thus allowing the user to enjoy all the features of an MSE along with SSO. © 2014 IEEE.
Convective instability in a horizontal porous layer saturated with a chemically reacting Maxwell fluid
The problem of the onset of convective instability in a horizontal inert porous layer saturated with a Maxwell viscoelastic fluid subject to a zero-order chemical reaction is investigated by linear stability analysis. The modified Darcy-Maxwell model is used to describe the fluid motion. The horizontal porous layer is cooled from the upper boundary, while an isothermal boundary condition is imposed at the lower boundary. A closed-form solution pertaining to the basic quiescent state is obtained. The resulting eigenvalue problem is solved approximately using the Galerkin method. The Rayleigh number, characterizing the stability of the system, is calculated as a function of the viscoelastic parameter, the Darcy-Prandtl number, the normalized porosity, and the Frank-Kamenetskii number. The possibility of oscillatory instability is discussed. © 2013 AIP Publishing LLC.
Convergent replicated data structures that tolerate eventual consistency in NoSQL databases
Eventual consistency is a database approach that has emerged in the field of NoSQL; it provides tremendous benefits over traditional databases, as it allows one to scale an application to new levels. The consistency models used by NoSQL databases are explained in this paper. It elaborates how eventually consistent data structures ensure consistency on a storage system with multiple independent components which replicate data with loose coordination. The paper also discusses the eventually consistent model that ensures that all updates are applied in a consistent manner to all replicas. © 2013 IEEE.
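A standard example of a convergent replicated data type is the grow-only counter (G-Counter): each replica increments only its own slot, and merge takes the element-wise maximum, which is commutative, associative and idempotent, so all replicas converge regardless of message order or duplication. This is a generic CRDT sketch, not the paper's specific model.

```python
class GCounter:
    """Grow-only counter CRDT with one slot per replica."""

    def __init__(self, replica):
        self.replica = replica
        self.counts = {}                 # replica id -> local count

    def increment(self, n=1):
        """Only the owning replica's slot is ever incremented locally."""
        self.counts[self.replica] = self.counts.get(self.replica, 0) + n

    def value(self):
        """The counter's value is the sum over all replica slots."""
        return sum(self.counts.values())

    def merge(self, other):
        """Element-wise max: safe to apply in any order, any number of times."""
        for r, c in other.counts.items():
            self.counts[r] = max(self.counts.get(r, 0), c)

a, b = GCounter("a"), GCounter("b")
a.increment(2); b.increment(3)
a.merge(b)
print(a.value())                         # 5, and re-merging changes nothing
```

Because merges are idempotent, replicas can gossip state lazily and still agree, which is exactly the loose coordination the abstract describes.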
DNA for information security: A Survey on DNA computing and a pseudo DNA method based on central dogma of molecular biology
Biology is a life science with high significance for the quality of life, and information security is an aspect of social edification on which human beings will never compromise. Both are subjects of high relevance and are inevitable for mankind, so an amalgamation of the two naturally turns up as a utility technology, for either security or data storage, known as bio-computing. The secure transfer of information has been a major concern since ancient civilizations. Various techniques have been proposed to maintain the security of data, so that no one other than the sender and the intended recipient is able to read the message. These practices became more significant with the introduction of the Internet. Information varies from big data to a single word, but every piece of information requires proper storage and protection, which is a major concern. Cryptography is an art or science of secrecy which protects information from unauthorized access. Various techniques have evolved through the years for information protection, including ciphers, cryptography, steganography, biometrics and, recently, DNA for security. DNA cryptography was a major breakthrough in the field of security; it uses bio-molecular concepts and gives us new hope of unbreakable algorithms. This paper discusses various DNA-based cryptographic methods proposed till now. It also proposes a DNA symmetric algorithm based on pseudo DNA cryptography and the central dogma of molecular biology. The suggested algorithm uses splicing and padding techniques along with complementary rules, which make the algorithm more secure as it adds a layer of security beyond conventional cryptographic techniques. © 2014 IEEE.
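A toy version of the encoding idea, assuming the common 2-bits-per-base mapping (00→A, 01→C, 10→G, 11→T) and Watson-Crick complementary rules; the paper's actual algorithm adds splicing and padding on top of such a layer, which are omitted here, and this sketch by itself is an encoding, not encryption.

```python
# Illustrative 2-bit mapping and complement rules (A<->T, C<->G).
BASES = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS = {v: k for k, v in BASES.items()}
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def encode(text):
    """Text -> bit string -> DNA strand -> complementary strand."""
    bits = "".join(f"{byte:08b}" for byte in text.encode())
    strand = "".join(BASES[bits[i:i + 2]] for i in range(0, len(bits), 2))
    return "".join(COMPLEMENT[b] for b in strand)   # transmit the complement

def decode(strand):
    """Invert the complement, then map base pairs back to bytes."""
    original = "".join(COMPLEMENT[b] for b in strand)
    bits = "".join(BITS[b] for b in original)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode()

print(decode(encode("hi")))   # round-trips to "hi"
```

A real pseudo-DNA cipher would combine such a mapping with a secret key governing the splicing, padding and complement schedule.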
Upgradation of business applications with autonomic computing
Autonomic computing has come a long way since its inception a decade ago and has been positioned as a venerable and value-adding technology for producing and sustaining self-managing, real-time, and resilient systems for the future. A series of concerted efforts by multiple IT companies and academic research laboratories across the world has brought a number of advancements to this discipline through vigorous study and research. A variety of proven and potential mathematical and computational concepts have been selected and synchronized to arrive at noteworthy improvements in autonomic systems design, development, deployment, and delivery methods. Having understood the unique value proposition and the significant achievements in the autonomic computing space, business executives and IT experts are consciously embracing the autonomic idea, which is generic enough to be easily embedded in any kind of business or IT system. However, the penetration of this technology into both IT and business applications has not been as its creators originally envisaged, for various reasons. The business environment is still filled and saturated with large-scale packaged and monolithic applications. If autonomic capabilities are innately squeezed into business and IT applications, there can be major differentiators in how those applications seamlessly and spontaneously automate business operations. Both existing and emerging applications can be targeted to become autonomic in their operations, outputs, and outlooks. In this paper, we describe how the leading enterprise packages (ERP, CRM, SCM, and so on) can be enabled to be adaptive, highly available, secure, and scalable in their actions and reactions. The well-known enterprise applications such as CRM, online retail, and marketing, with a focus on self-optimization characteristics, are described here. A detailed analysis of a Discount Manager in an online retail scenario is also explained. The simulation results obtained clearly show how close embedded autonomic capability comes to human thinking and decision-making ability. © 2013 ACM.
Comparative analysis of Histogram Equalization techniques
Histogram Equalization (HE) is one of the techniques used for image enhancement. This paper presents comparative studies of Global Histogram Equalization, Local Histogram Equalization and Fast Quadratic Dynamic Histogram Equalization based on execution time, mean squared error and Peak Signal to Noise Ratio (PSNR). Experimental results for these three methods are shown with graphical representation. © 2014 IEEE.
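Global histogram equalization, the simplest of the three methods compared, maps each grey level through the normalized cumulative histogram so the output levels spread over the full dynamic range. A minimal sketch on a flat list of pixel values (illustrative, not the paper's implementation):

```python
def equalize(pixels, levels=256):
    """Global histogram equalization via the CDF mapping."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1                      # grey-level histogram
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)                 # cumulative histogram
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:                      # constant image: nothing to stretch
        return list(pixels)
    # Standard normalization of the CDF onto [0, levels - 1].
    return [round((cdf[p] - cdf_min) * (levels - 1) / (n - cdf_min))
            for p in pixels]
```

Local HE applies the same mapping per window, and dynamic variants reshape the histogram before computing the CDF; both reuse this core step.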
Structural characterization of graphene layers in various Indian coals by X-Ray Diffraction technique
The results of the structural investigation of three Indian coals showed that structural parameters like fa and Lc increased, whereas the interlayer spacing d002 decreased, with increasing carbon content, aromaticity and coal rank. These structural parameters vary in the opposite sense with increasing volatile matter content. Considering the 'turbostratic' structure for coals, the minimum separation between aromatic lamellae was found to vary between 3.34 and 3.61 Å for these coals. As the aromaticity increased, the interlayer spacing decreased, an indication of greater graphitization of the sample. Volatile matter and carbon content had a strong influence on the aromaticity, interlayer spacing and stacking height of the samples. The average number of carbon atoms per aromatic lamella and the number of layers in the lamellae were found to be 16-21 and 7-8, respectively, for all the samples. Published under licence by IOP Publishing Ltd.
Application of artificial neural networks in optimizing MPPT control for standalone solar PV system
The increasing demand for power and the limited nature of fossil fuels have led the world to focus on renewable energy resources. Solar photovoltaic (PV) energy, being the most easily available source, is considered to have the potential to meet the ever-increasing energy demand. This paper proposes an intelligent system based on Artificial Neural Networks (ANN) to track the Maximum Power Point (MPP) of a PV array. The system adopts the Radial Basis Function Network (RBFN) architecture to optimize the control of Maximum Power Point Tracking (MPPT) for PV systems. A PV array has non-linear output characteristics due to insolation and temperature variations, and the optimum operating point needs to be tracked in order to draw maximum power from the system. The output of the intelligent MPPT controller can be used to control the DC/DC converters to achieve maximum efficiency. © 2014 IEEE.
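The point the controller must track can be illustrated by scanning a sampled I-V curve for the voltage that maximizes P = V·I; an RBFN-based MPPT controller learns to steer the converter toward this point without an exhaustive scan. The sample curve below is invented for illustration and is not from the paper.

```python
def max_power_point(v_samples, i_samples):
    """Scan a sampled I-V curve and return (voltage, power) at maximum power."""
    best_v, best_p = None, float("-inf")
    for v, i in zip(v_samples, i_samples):
        p = v * i                          # instantaneous power at this sample
        if p > best_p:
            best_v, best_p = v, p
    return best_v, best_p

# Illustrative curve: current sags as voltage rises toward open circuit.
volts = [0, 10, 20, 30]
amps  = [5, 4.5, 3, 0]
print(max_power_point(volts, amps))        # the knee of the P-V curve
```

As insolation or temperature shifts the curve, the MPP moves, which is why a trained controller beats a fixed operating point.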
Cloud Computing with Machine Learning Could Help Us in the Early Diagnosis of Breast Cancer
The purpose of this study is to develop tools which could help clinicians in primary care hospitals with the early diagnosis of breast cancer. Breast cancer is one of the leading forms of cancer in developing countries and often gets detected at the later stages. Detection of cancer at later stages results not only in pain and agony for the patients but also places a heavy financial burden on the caregivers. In this work, we present the preliminary results of the project code-named BCDM (Breast Cancer Diagnosis using Machine Learning), developed using Matlab. The algorithm developed in this research work is based on adaptive resonance theory; we show how an ART1 network can help in the classification of breast cancer data. The aim of the project is to eventually run the algorithm on a cloud computer, so that a clinician at a primary healthcare centre can use the system for the early diagnosis of patients through a web-based interface from anywhere in the world. © 2015 IEEE.