An approach for document pre-processing and K Means algorithm implementation
Web mining is a cutting-edge technology that involves gathering and classifying information from the web. This paper presents a document pre-processing pipeline: keywords are extracted from documents fetched from the web and processed to generate a term-document matrix, with TF-IDF (term frequency-inverse document frequency) weights computed in several variant formulations for each document. The final step clusters these results with the K-means algorithm, comparing the performance of each weighting approach. The algorithm is implemented on an x64 architecture in Java and Matlab, and the results are tabulated. © 2014 IEEE.
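As an illustration of the pre-processing step described above, here is a minimal pure-Python sketch (not the authors' implementation, which was in Java and Matlab) that builds a term-document matrix with the classic TF-IDF weighting; the resulting rows could then be fed to K-means:

```python
import math

# Toy corpus standing in for documents fetched from the web; the real
# pipeline would first extract keywords from crawled pages.
docs = [
    "web mining gathers information from the web",
    "clustering groups similar documents together",
    "k means clustering partitions documents into groups",
]
tokenized = [d.split() for d in docs]
vocab = sorted({w for doc in tokenized for w in doc})

def tf_idf(doc):
    """Classic TF-IDF: raw term frequency times log inverse document frequency."""
    n = len(tokenized)
    vec = []
    for term in vocab:
        tf = doc.count(term)                          # raw term frequency
        df = sum(1 for d in tokenized if term in d)   # document frequency
        idf = math.log(n / df) if df else 0.0
        vec.append(tf * idf)
    return vec

# Term-document matrix, one row per document; K-means would cluster these rows.
matrix = [tf_idf(d) for d in tokenized]
```

Variant TF-IDF formulations of the kind the paper compares (e.g. log-scaled or normalized term frequency) only change the `tf` and `idf` expressions above.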
GPR based subsurface geotechnical exploration
The seismic refraction technique (SRT) and the electrical resistivity technique (ERT) have long been used in geotechnical exploration; ground penetrating radar (GPR) is a relatively recent addition. The study presented in this paper concerns GPR-aided subsurface geotechnical exploration. The usual method of exploration is drilling, which gives much-needed site-specific information but is expensive and restricted to a few point locations. The non-invasive investigation offered by GPR makes it useful for supplementing geotechnical investigations. The present work describes a GPR survey at a construction site in Mumbai, with the objective of deriving subsurface logs from GPR signals. Conventionally, subsurface logging is done using boreholes: first, the extracted soil and rock samples are examined visually; second, additional information such as core recovery ratio (CRR), rock quality designation (RQD), and standard penetration test (SPT) N values are collected and strata are demarcated. The amplitude variations of GPR signals may not correspond directly to variations of these physical properties with depth. However, the study shows that fairly good correlations do exist between the subsurface stratification and the transformed signals.
Depletion studies in the interstellar medium
We report interstellar Si depletion and dust-phase column densities of Si along 131 Galactic sight lines using previously reported gas-phase Si II column densities, after correcting for differences in oscillator strengths. With our large sample, we reproduce the previously reported correlations between the depletion of Si and both the average density of hydrogen along the line of sight and the molecular fraction of hydrogen, f(H2). We have also studied the variation of the amount of Si incorporated in dust with respect to different extinction parameters. Within the limitations of the data quality, we find a strong relation between Si dust and extinction. While we cannot predict the density-dependent size distribution of Si grains, we discuss the large depletion fraction of Si and the larger size of the silicate grains. © 2013 AIP Publishing LLC.
Cataloging of happy facial affect using a radial basis function neural network
This paper develops an affect-recognition system that identifies the happy affect from faces using a radial basis function (RBF) neural network. The methodology is a four-step process: image preprocessing, marking of the region of interest, feature extraction, and classification. Emotion recognition has become a momentous field in human-computer interaction. Although making a system intelligent enough to identify and understand human emotions is considerably challenging, much research has been done, and is ongoing, toward accurate and effective emotion recognition for vital purposes such as security, social applications, and entertainment. Emotion recognition systems can be classified into facial emotion recognition and speech emotion recognition. This work is on facial emotion recognition and identifies one of the seven basic emotions, the happy affect. This is carried out by extracting unique facial-expression features, calculating Euclidean distances, and building the feature vector; an RBF neural network is used for classification. The system was deployed in Matlab and gave satisfactory results. © 2013 Springer.
Load balancing with availability checker and load reporters (LB-ACLRs) for improved performance in distributed systems
A distributed system has many servers to attain increased availability of service and fault tolerance. Balancing the load among these servers is an important task in achieving better performance. Various hardware- and software-based load balancing solutions are available; however, there is always an overhead on the servers and the load balancer while they communicate with each other and share availability and current load-status information. The load balancer is busy listening to clients' requests and redirecting them, and it also needs to collect the servers' availability status frequently to keep itself up to date. The servers are busy not only serving clients but also sharing their current load information with the load balancing algorithm. In this paper we propose and discuss the concept and system model of a software-based load balancer with an Availability Checker and Load Reporters (LB-ACLRs), which reduces the overhead on the servers and the load balancer. We describe the architectural components with their roles and responsibilities, and present a detailed analysis showing how the proposed Availability Checker significantly increases the performance of the system. © 2014 IEEE.
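A minimal sketch of the dispatch decision such a design implies (hypothetical class and field names, not the paper's system model): the balancer consults availability and load fields that an availability checker and per-server load reporters keep up to date, rather than polling the servers itself on every request:

```python
class Server:
    """A backend server whose status is maintained externally."""
    def __init__(self, name):
        self.name = name
        self.available = True  # set by the availability-checker probe
        self.load = 0.0        # pushed by the server's load reporter

class LoadBalancer:
    """Dispatch reads pre-recorded status, avoiding per-request polling overhead."""
    def __init__(self, servers):
        self.servers = servers

    def pick(self):
        # Choose the least-loaded server among those marked available.
        candidates = [s for s in self.servers if s.available]
        if not candidates:
            raise RuntimeError("no server available")
        return min(candidates, key=lambda s: s.load)

servers = [Server("s1"), Server("s2"), Server("s3")]
servers[0].load, servers[1].load, servers[2].load = 0.7, 0.2, 0.9
servers[2].available = False   # availability checker has marked s3 down
lb = LoadBalancer(servers)
chosen = lb.pick()             # least-loaded available server
```

The design choice being illustrated is the separation of concerns: status collection happens asynchronously, so the request path is a cheap in-memory lookup.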
Radon transform processed neural network for lung X-ray image based diagnosis
A novel method for image diagnosis with artificial learning is presented: X-ray images of tuberculosis patients are subjected to neural network learning for prediction of the diagnosis. Lung X-ray images are normally difficult to diagnose because of their similarity to lung cancer images, and under- and over-diagnosis of lung X-rays is a difficult medical problem to resolve. In the present work, the Radon transform of the X-ray images is fed to a back-propagation neural network trained with the Levenberg algorithm. The methodology gives sharp results, distinguishing normal from abnormal images. © 2014 IEEE.
A Single Sign on based secure remote user authentication scheme for Multi-Server Environments
A multi-server architecture comprises a server environment with many different servers, giving the user the flexibility of accessing resources from multiple service-providing servers using the same credential. The primary objective of a Multi-Server Environment (MSE) is to provide the services of different Service Providers (SPs) without repeating registration at each SP server, using a single credential valid for all the servers in the MSE. However, conventional MSEs proposed by various researchers rely on individual authentication by each SP on its own server using the credential issued by the Registration Authority of the MSE; this mechanism requires the user to access each SP by keying in the same credentials for every SP separately. Single Sign On (SSO) is an authentication mechanism that enables a user to sign on once and access the services of various SPs in the same session; SAML is generally used as the SSO protocol. This work analyzes the smart-card-based authentication scheme for multi-server environments proposed by Li et al. and discusses various security attacks on that scheme. The paper also proposes a secure dynamic-ID-based scheme using smart cards or crypto cards which does not require a verifier table and implements SSO using the SAML protocol, thus allowing the user to enjoy all the features of an MSE along with SSO. © 2014 IEEE.
Convective instability in a horizontal porous layer saturated with a chemically reacting Maxwell fluid
The problem of onset of convective instability in a horizontal inert porous layer saturated with a Maxwell viscoelastic fluid subject to zero-order chemical reaction is investigated by linear stability analysis. Modified Darcy-Maxwell model is used to describe the fluid motion. The horizontal porous layer is cooled from the upper boundary while an isothermal boundary condition is imposed at the lower boundary. Closed form solution pertaining to the basic quiescent state is obtained. The resulting eigenvalue problem is solved approximately using the Galerkin method. The Rayleigh number, characterizing the stability of the system, is calculated as a function of viscoelastic parameter, Darcy-Prandtl number, normalized porosity, and the Frank-Kamenetskii number. The possibility of oscillatory instability is discussed. © 2013 AIP Publishing LLC.
Convergent replicated data structures that tolerate eventual consistency in NoSQL databases
Eventual consistency is a database consistency model that has emerged in the field of NoSQL; it provides tremendous benefits over traditional databases by allowing an application to scale to new levels. This paper explains the consistency models used by NoSQL databases. It elaborates how eventually consistent data structures ensure consistency in a storage system whose multiple independent components replicate data with loose coordination. The paper also discusses the eventually consistent model that ensures all updates are applied in a consistent manner to all replicas. © 2013 IEEE.
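A classic example of a convergent replicated data type of the kind discussed here is the grow-only counter (G-Counter); the sketch below (a generic textbook construction, not code from the paper) shows why replicas converge regardless of the order in which their states are merged:

```python
class GCounter:
    """Grow-only counter CRDT: each replica increments only its own slot,
    and merge takes the element-wise maximum, so merging is commutative,
    associative, and idempotent -- the algebraic basis of convergence."""
    def __init__(self, replica_id, counts=None):
        self.replica_id = replica_id
        self.counts = dict(counts or {})

    def increment(self, n=1):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def merge(self, other):
        # Element-wise max over both replicas' slots.
        merged = dict(self.counts)
        for rid, c in other.counts.items():
            merged[rid] = max(merged.get(rid, 0), c)
        return GCounter(self.replica_id, merged)

    def value(self):
        return sum(self.counts.values())

# Two replicas update independently with loose coordination...
a = GCounter("A"); b = GCounter("B")
a.increment(3); b.increment(2)
# ...and merging in either order yields the same state: eventual consistency.
```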
DNA for information security: A Survey on DNA computing and a pseudo DNA method based on central dogma of molecular biology
Biology is a life science with high significance for the quality of life, and information security is an aspect of social edification that human beings will never compromise on. Both are subjects of high relevance and are inevitable for mankind, so an amalgamation of the two naturally turns up as a utility technology, for security or for data storage, known as bio-computing. The secure transfer of information has been a major concern since ancient civilizations, and various techniques have been proposed to maintain the security of data so that only the intended recipient, and nobody else, can read the message. These practices became more significant with the introduction of the Internet. Information varies from big data to a single word, but every piece of information requires proper storage and protection. Cryptography is the art and science of secrecy that protects information from unauthorized access. Various techniques have evolved over the years for information protection, including ciphers, cryptography, steganography, biometrics, and, recently, DNA for security. DNA cryptography was a major breakthrough in the field of security: it uses bio-molecular concepts and gives us new hope of unbreakable algorithms. This paper discusses the various DNA-based cryptographic methods proposed so far. It also proposes a DNA symmetric algorithm based on pseudo-DNA cryptography and the central dogma of molecular biology. The suggested algorithm uses splicing and padding techniques along with complementary rules, which make the algorithm more secure by adding a layer of security on top of conventional cryptographic techniques. © 2014 IEEE.
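To make the flavour of such schemes concrete, here is a hypothetical toy sketch (not the proposed algorithm, and not cryptographically secure): plaintext bytes are transcribed two bits at a time into nucleotides, and the Watson-Crick complementary rule (A-T, C-G) is applied as an extra transformation layer:

```python
# Toy DNA-encoding sketch -- illustrative only, NOT a secure cipher.
BASES = "ACGT"                                   # 2 bits per nucleotide
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def to_dna(data: bytes) -> str:
    """Transcribe each byte into four nucleotides, high bits first."""
    strand = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            strand.append(BASES[(byte >> shift) & 0b11])
    return "".join(strand)

def complement(strand: str) -> str:
    """Watson-Crick complementary rule; applying it twice is the identity."""
    return "".join(COMPLEMENT[b] for b in strand)

def from_dna(strand: str) -> bytes:
    """Reverse transcription: four nucleotides back into one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for b in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(b)
        out.append(byte)
    return bytes(out)

cipher = complement(to_dna(b"hi"))
plain = from_dna(complement(cipher))  # complement is self-inverse
```

A real scheme of the kind the paper proposes would add splicing, padding, and a keyed component on top of this encoding layer.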
Upgradation of business applications with autonomic computing
Autonomic computing has come a long way since its inception a decade ago and has been positioned as a venerable, value-adding technology for producing and sustaining self-managing, real-time, and resilient systems for the future. A series of concerted efforts by multiple IT companies and academic research laboratories across the world has brought a number of advancements to this discipline through vigorous study and research. A variety of proven and potential mathematical and computational concepts have been selected and synchronized to arrive at noteworthy improvements in autonomic systems design, development, deployment, and delivery methods. Having understood the unique value proposition and the significant achievements in the autonomic computing space, business executives and IT experts are consciously embracing the autonomic idea, which is generic enough to be embedded in any kind of business or IT system. However, the penetration of this technology into both IT and business applications has not been as its creators originally envisaged, for various reasons; the business environment is still saturated with large-scale packaged and monolithic applications. If autonomic capabilities are built into business and IT applications, they can be major differentiators in how those applications seamlessly and spontaneously automate business operations. Both existing and emerging applications can be targeted to become autonomic in their operations, outputs, and outlooks. In this paper, we describe how leading enterprise packages (ERP, CRM, SCM, and so on) can be enabled to be adaptive, highly available, secure, and scalable in their actions and reactions. Well-known enterprise applications such as CRM, online retail, and marketing are described with a focus on self-optimization characteristics. A detailed analysis of a Discount Manager in an online retail scenario is also presented. The simulation results obtained clearly show how embedded autonomic capability comes close to human thinking and decision-making ability. © 2013 ACM.
Comparative analysis of Histogram Equalization techniques
Histogram Equalization (HE) is a technique used for image enhancement. This paper presents a comparative study of Global Histogram Equalization, Local Histogram Equalization, and Fast Quadratic Dynamic Histogram Equalization based on execution time, mean squared error, and peak signal-to-noise ratio (PSNR), with experimental results for the three methods shown graphically. © 2014 IEEE.
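For reference, global histogram equalization and the PSNR metric used in the comparison can be sketched in a few lines of pure Python (a generic textbook formulation, not the paper's code):

```python
import math

def equalize(img, levels=256):
    """Global HE: remap gray levels through the normalized cumulative histogram."""
    flat = [p for row in img for p in row]
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(flat)
    # Lookup table stretching the CDF over the full gray-level range.
    lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1)) for c in cdf]
    return [[lut[p] for p in row] for row in img]

def mse(a, b):
    """Mean squared error between two images of the same size."""
    diffs = [(p - q) ** 2 for ra, rb in zip(a, b) for p, q in zip(ra, rb)]
    return sum(diffs) / len(diffs)

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * math.log10(peak * peak / m)

img = [[50, 50, 60], [60, 70, 70], [80, 90, 100]]
out = equalize(img)  # low-contrast input stretched to the full 0-255 range
```

Local and dynamic variants differ in applying the same remapping per neighbourhood or per histogram partition rather than globally.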
Structural characterization of graphene layers in various Indian coals by X-Ray Diffraction technique
The results of the structural investigation of three Indian coals showed that structural parameters such as fa and Lc increased, whereas the interlayer spacing d002 decreased, with increasing carbon content, aromaticity, and coal rank; these parameters change in the opposite direction with increasing volatile matter content. Assuming the 'turbostratic' structure for coals, the minimum separation between aromatic lamellae was found to vary between 3.34 and 3.61 Å for these coals. As the aromaticity increased, the interlayer spacing decreased, an indication of greater graphitization of the sample. Volatile matter and carbon content had a strong influence on the aromaticity, interlayer spacing, and stacking height of the samples. The average number of carbon atoms per aromatic lamella and the number of layers per lamella were found to be 16-21 and 7-8, respectively, for all the samples. Published under licence by IOP Publishing Ltd.
Application of artificial neural networks in optimizing MPPT control for standalone solar PV system
The increasing demand for power and the limited nature of fossil fuels have led the world to focus on renewable energy resources. Solar photovoltaic (PV) energy, being the most easily available, is considered to have the potential to meet the ever-increasing energy demand. This paper proposes an intelligent system based on Artificial Neural Networks (ANN) to track the Maximum Power Point (MPP) of a PV array. The system adopts a Radial Basis Function Network (RBFN) architecture to optimize Maximum Power Point Tracking (MPPT) control for PV systems. A PV array has non-linear output characteristics due to insolation and temperature variations, so the optimum operating point needs to be tracked in order to draw maximum power from the system. The output of the intelligent MPPT controller can be used to control the DC/DC converter to achieve maximum efficiency. © 2014 IEEE.
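The forward pass of an RBFN of the kind adopted here can be sketched as follows; all centers, widths, and weights below are illustrative placeholders (in practice they would be trained on PV data), and the interpretation of the output as a converter control signal is an assumption, not the paper's specification:

```python
import math

def rbfn(x, centers, widths, weights, bias=0.0):
    """RBFN forward pass: a weighted sum of Gaussian basis functions,
    each centered on a prototype input vector."""
    out = bias
    for c, s, w in zip(centers, widths, weights):
        dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        out += w * math.exp(-dist2 / (2 * s ** 2))
    return out

# Hypothetical prototypes of (irradiance W/m^2, temperature C) conditions.
centers = [(800.0, 25.0), (400.0, 40.0)]
widths = [200.0, 200.0]
weights = [0.6, 0.4]        # would be trained offline in practice

# Input matching the first prototype activates its basis function fully.
duty = rbfn((800.0, 25.0), centers, widths, weights)
```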
Cloud Computing with Machine Learning Could Help Us in the Early Diagnosis of Breast Cancer
The purpose of this study is to develop tools that could help clinicians in primary-care hospitals with the early diagnosis of breast cancer. Breast cancer is one of the leading forms of cancer in developing countries and often gets detected at late stages. Detection of cancer at later stages results not only in pain and agony for the patients but also puts a heavy financial burden on the caregivers. In this work, we present the preliminary results of a project code-named BCDM (Breast Cancer Diagnosis using Machine Learning), developed in Matlab. The algorithm developed in this research work is based on adaptive resonance theory, and we show how an ART1 network can help in the classification of breast cancer. The aim of the project is to eventually run the algorithm on a cloud computer so that a clinician at a primary healthcare centre can use the system for early diagnosis of patients through a web-based interface from anywhere in the world. © 2015 IEEE.
Safe cloud: Secure and usable authentication framework for cloud environment
Cloud computing, an emerging computing model with its roots in grid and utility computing, is gaining increasing attention from both industry and laymen. The ready availability of storage, compute, and infrastructure services provides a potentially attractive option for business enterprises to process and store data without investing in computing infrastructure. The attractions of the Cloud are accompanied by many concerns, among which data security is the one that requires immediate attention. Strong user-authentication mechanisms that prevent illegal access to Cloud services and resources are a core requirement for secure access. This paper proposes a user-authentication framework for the Cloud which facilitates authentication by individual service providers as well as by a third-party identity provider. The proposed two-factor authentication protocols use a password as the first factor and a smart card or mobile phone as the second factor, and are resistant to various known security attacks. © Springer India 2016.
Multi-view video summarization
Video summarization is an important video content service that gives a short, condensed representation of the whole video content and facilitates the browsing, mining, and storage of the original videos. Multi-view video summaries present only the most vital events, in more detail than the less salient ones, allowing the user to get the important information from different perspectives of the multi-view videos without watching the whole video. This paper focuses on a series of approaches for summarizing video content to obtain a compact, succinct visual summary that encapsulates the key components of the video. The main advantage is that video summarization can turn hours of video into a short summary that a viewer can watch in just a few seconds. © Springer India 2016.
Extraction of Web News from Web Pages Using a Ternary Tree Approach
With the spread of information available on the World Wide Web, the pursuit of quality data appears effortless and simple, but it remains a significant matter of concern. Various extractor and wrapper systems with advanced techniques have been studied that retrieve the desired data from a collection of web pages. In this paper we propose a method for extracting news content from multiple news web sites by exploiting the occurrence of similar patterns in their representation, such as the date, place, and content of the news; this overcomes the cost and space constraints observed in previous studies, which work on a single web document at a time. The method is an unsupervised web-extraction technique which builds a pattern representing the structure of the pages, using extraction rules learned from the web pages by creating a ternary tree that expands when a series of common tags is found in the pages. The pattern can then be used to extract news from other news web pages. The analysis and results on real-time web sites validate the effectiveness of our approach. © 2015 IEEE.
A parallel approach for region-growing segmentation
Image segmentation plays a central role in areas such as computer vision and image processing due to its broad usage and immense applications, and a number of algorithms and approaches have accordingly been proposed. In this work we parallelize image segmentation using a region-growing algorithm. The primary goal is to speed up image segmentation on large image data sets, i.e. very high resolution (VHR) images. In order to take full advantage of GPU computing, the workload is spread equally among the available threads: threads assigned to individual pixels iteratively merge them with adjacent segments, always ensuring that the heterogeneity of the image objects is minimized. An experimental analysis on different orbital sensor images has been carried out in order to assess the quality of the results. © 2015 IEEE.
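The merge criterion at the heart of region growing can be sketched serially as follows (a generic formulation; the paper's contribution is the GPU parallelization across threads, which this serial sketch does not show):

```python
from collections import deque

def region_grow(img, seed, threshold):
    """Serial region growing: breadth-first search from the seed, merging
    4-connected pixels whose intensity differs from the current region
    mean by at most `threshold` (a simple heterogeneity criterion)."""
    h, w = len(img), len(img[0])
    region = {seed}
    total = img[seed[0]][seed[1]]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in region:
                mean = total / len(region)
                if abs(img[nr][nc] - mean) <= threshold:
                    region.add((nr, nc))
                    total += img[nr][nc]
                    queue.append((nr, nc))
    return region

img = [[10, 11, 50],
       [12, 10, 52],
       [13, 11, 51]]
segment = region_grow(img, (0, 0), threshold=5)  # grows over the low-intensity block
```

A GPU version would instead let each pixel's thread test the same criterion against its neighbours in parallel, iterating until no further merges occur.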
Performance analysis of OFF-GRID solar photo voltaic system
The demand for electrical energy is increasing day by day, and we cannot rely on conventional energy sources to meet it, as they are depleting; it is therefore necessary to find alternative ways to harness energy. Solar energy generation is a promising technology for this dilemma: it is environmentally friendly and a virtually inexhaustible source of energy. Photovoltaic systems can be broadly classified into two types: on-grid and off-grid. The energy generated by a solar PV system depends on several factors such as irradiance, the type of solar PV used, and temperature. Analyzing the efficiency of an existing system is of prime importance for characterizing problems and making improvements. This study deals with the performance analysis of an on-grid and an off-grid system, carried out by modeling an existing, operational system in MATLAB/SIMULINK; it can be extended to analyze grid stability. The study aims to quantify various performance parameters such as power output, system losses, system efficiency, and total energy transfer. © 2015 IEEE.