Browse Items (2150 total)
Parallelizing keyframe extraction for video summarization
In the current era, most information is captured using multimedia techniques, most commonly images and videos. Processing a video involves a large amount of data, and many frames may contain near-identical information, which causes unnecessary delay in gathering the required information. Video summarization can speed up video processing, and several techniques exist for it. In this paper, key frames are used for summarization and are extracted using discrete wavelet transforms. Two HD videos, with 356 frames and 7293 frames, were used as test videos; the runtimes were 17 seconds and 98 seconds respectively on the CPU, and 11 seconds and 53 seconds respectively on the GPU. © 2015 IEEE.
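The abstract does not spell out how the wavelet coefficients drive frame selection, so the following is a minimal sketch under assumptions: a 1-level Haar transform per grayscale frame, with a new keyframe kept whenever the detail-energy signature jumps past a threshold. The function names and the energy criterion are illustrative, not the paper's.

```python
# Hedged sketch of DWT-based keyframe selection (assumed variant of the
# paper's approach; threshold criterion and names are illustrative).

def haar_1d(signal):
    """One level of the 1-D Haar transform: (approximation, detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def frame_signature(frame):
    """Detail-coefficient energy of a flattened grayscale frame."""
    flat = [p for row in frame for p in row]
    _, detail = haar_1d(flat)
    return sum(d * d for d in detail)

def select_keyframes(frames, threshold):
    """Keep frame 0, then any frame whose signature differs from the
    last keyframe's signature by more than `threshold`."""
    keys = [0]
    last_sig = frame_signature(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        sig = frame_signature(frame)
        if abs(sig - last_sig) > threshold:
            keys.append(i)
            last_sig = sig
    return keys
```

Because each frame's signature is independent of the others, the per-frame signature computation is exactly the part that parallelizes naturally on a GPU, which is the speedup the runtimes above reflect.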
P-phase picker using virtual cloud-based Wireless Sensor Networks
Wireless Sensor Networks, generally regarded as numerous resource-limited nodes linked via low-bandwidth connections, have been intensively deployed for active volcano monitoring during the past few years. This paper studies the problem of primary (P) waves received by seismic wireless sensors that suffer from limited bandwidth, processing capacity, battery life and memory. To address these challenges, a new P-phase picking approach is proposed, in which sensors are virtualized using a cloud computing architecture and complemented by a novel in-network signal processing algorithm. The two principal merits of this paper are the clear demonstration that the cloud computing model is a good fit for the dynamic computational requirements of volcano monitoring, and the novel signal processing algorithm for accurate P-phase picking. The proposed model has been evaluated on Mount Nyiragongo using Eucalyptus/OpenStack with Orchestra-Juju for a private sensor cloud, and then on well-known public clouds such as Amazon EC2, ThingSpeak, SensorCloud and Pachube. The testing was successful in 75% of cases. Future work should improve the effectiveness of virtual sensors by applying optimization techniques and other methods. © 2015 IEEE.
Characterization and comparison studies of Bentonite and Flyash for electrical grounding
Earthing or grounding is an electrical system consisting of electrodes which serve as an electrical connection from an electric circuit to the earth or ground. Traditional earthing, in which charcoal and salt are mixed into the pit, offers low resistance to the fault currents developed at low operating voltages. Since operating voltages are higher nowadays, short-circuit currents have also increased, and the traditional method of earthing has been replaced by chemical earthing. Bentonite, the material mainly used in chemical earthing, meets the requirement of low-resistance earthing pits and also has the property of retaining moisture. In this paper an attempt has been made to assess the use of fly ash in the grounding pit, and the paper discusses characterization, comparison and field studies on an earthing pit constructed with bentonite and fly ash layers. © 2015 IEEE.
Stock price forecasting using ANN method
The ability to predict stock price direction accurately is essential for investors seeking to maximize their wealth. Neural networks, as a highly effective data mining method, have been used in many complex pattern recognition problems, including stock market prediction. However, existing ways of applying neural networks to the dynamic and volatile behavior of stock markets have not produced sufficiently accurate results. In this paper, we propose methods that improve accuracy through hidden-layer data processing and decision tree methods for stock market prediction in volatile markets. We also compare our proposed method against a three-layer feed-forward neural network on the accuracy of predicted market direction. The analysis shows that our way of applying neural networks improves prediction accuracy. © Springer India 2016.
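To make the direction-prediction task concrete, here is a deliberately minimal illustration, not the paper's model: a single-layer perceptron trained on lagged returns to classify the next move as up (+1) or down (-1). All names and the feature choice are assumptions for the sketch.

```python
# Illustrative direction classifier (NOT the paper's hidden-layer/decision-
# tree method): perceptron over the last `lags` price changes.

def make_samples(prices, lags=2):
    """Feature = the last `lags` returns, label = sign of the next return."""
    returns = [prices[i + 1] - prices[i] for i in range(len(prices) - 1)]
    samples = []
    for i in range(lags, len(returns)):
        x = returns[i - lags:i]
        y = 1 if returns[i] > 0 else -1
        samples.append((x, y))
    return samples

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0] * len(samples[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # classic perceptron update, only on a mistake
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

A linear model like this is exactly what struggles on volatile markets, which motivates the paper's richer hidden-layer and decision-tree processing.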
A multi-Threshold triggering and QoS aware vertical handover management in heterogeneous wireless networks
Vertical handover management provides seamless connectivity in heterogeneous wireless networks, but several challenges still need to be addressed, including inappropriate network selection and wrong-cell handover. Therefore, in this article, we propose a handover management scheme based on the data rate and QoS of the available networks. Handover triggering is based on the data rate required by the running applications. Similarly, network selection considers the cost and data rate of the available networks and the energy consumed by the mobile interface. The proposed scheme is simulated in different mobility scenarios, with a random number of applications running on varying numbers of mobile nodes. The simulation results show that the proposed scheme requires less energy during the scanning and selection of available networks. © 2015 IEEE.
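A hedged sketch of the triggering-and-selection idea described above; the weights, metric names and scoring function are assumptions for illustration, not the article's algorithm. Handover fires when the current rate drops below what the applications require, and candidates are then scored on data rate against cost and energy penalties.

```python
# Illustrative handover trigger + network selection (weights are assumed).

def needs_handover(current_rate, required_rate):
    """Trigger when the serving network can no longer meet the demand."""
    return current_rate < required_rate

def select_network(candidates, required_rate, w_rate=0.6, w_cost=0.2, w_energy=0.2):
    """candidates: dicts with 'name', 'rate', 'cost', 'energy'.
    Returns the best-scoring network that meets the required data rate,
    or None if no candidate qualifies."""
    eligible = [c for c in candidates if c["rate"] >= required_rate]
    if not eligible:
        return None
    def score(c):
        # Higher rate is rewarded; monetary cost and interface energy
        # consumption are penalties.
        return w_rate * c["rate"] - w_cost * c["cost"] - w_energy * c["energy"]
    return max(eligible, key=score)["name"]
```

Filtering before scoring keeps ineligible networks out of the ranking entirely, which is one simple way to avoid the "wrong cell" selections the article mentions.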
Performance analysis of OFF-GRID solar photo voltaic system
The demand for electrical energy is increasing day by day. We cannot rely on conventional energy sources to meet this increasing demand, as they are depleting, so it is necessary to find alternative ways to harness energy. Solar energy generation is a promising technology for this purpose: it is environmentally friendly and draws on a practically inexhaustible source. Photovoltaic systems can be broadly classified into two types: on-grid and off-grid. The energy generated by a solar PV system depends on several factors, such as irradiance, the type of solar PV used and temperature. Analyzing the efficiency of an existing system is of prime importance for characterizing problems and making improvements. This study deals with the performance analysis of on-grid and off-grid systems. The analysis is carried out by modeling an existing system, already in operation, in MATLAB/Simulink, and it can be extended to analyze grid stability. The study aims to quantify performance parameters such as power output, system losses, system efficiency and total energy transfer. © 2015 IEEE.
A parallel approach for region-growing segmentation
Image segmentation plays a major role in areas such as computer vision and image processing due to its broad usage and immense range of applications. Because of this importance, a number of algorithms have been proposed and different approaches adopted. In this work we parallelize image segmentation using a region-growing algorithm. The primary goal is to speed up image segmentation on large image data sets, i.e. very high resolution (VHR) images. In order to take full advantage of GPU computing, the workload is spread equally among the available threads. Threads assigned to individual pixels iteratively merge them with adjacent segments, always ensuring that the heterogeneity of the image objects is minimized. An experimental analysis on different orbital sensor images has been carried out in order to assess the quality of the results. © 2015 IEEE.
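The paper's contribution is the GPU parallelization; the serial core it parallelizes can be sketched as below, under assumptions: a single region grown from a seed pixel, absorbing 4-connected neighbors whose intensity stays within a tolerance of the running region mean. The homogeneity test stands in for the paper's heterogeneity criterion.

```python
# Minimal serial region-growing sketch (assumed homogeneity criterion:
# absolute deviation from the region mean within `tol`).
from collections import deque

def region_grow(image, seed, tol):
    rows, cols = len(image), len(image[0])
    region = {seed}
    total = image[seed[0]][seed[1]]        # running intensity sum
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region:
                mean = total / len(region)
                if abs(image[nr][nc] - mean) <= tol:   # homogeneity test
                    region.add((nr, nc))
                    total += image[nr][nc]
                    queue.append((nr, nc))
    return region
```

In the GPU version described above, many such merges proceed concurrently, one thread per pixel, rather than from a single seed queue.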
Extraction of Web News from Web Pages Using a Ternary Tree Approach
With the spread of information available on the World Wide Web, the pursuit of quality data appears effortless and simple, yet it remains a significant matter of concern. Various extractor and wrapper systems with advanced techniques have been studied that retrieve the desired data from collections of web pages. In this paper we propose a method for extracting news content from multiple news web sites, exploiting the occurrence of similar patterns in their representation, such as the date, place and content of the news; it overcomes the cost and space constraints observed in previous studies, which work on a single web document at a time. The method is an unsupervised web extraction technique that builds a pattern representing the structure of the pages from extraction rules learned from the web pages, by creating a ternary tree which expands when a series of common tags is found across the pages. The pattern can then be used to extract news from other, unseen news web pages. The analysis and results on real-world web sites validate the effectiveness of our approach. © 2015 IEEE.
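The ternary tree itself is not specified in the abstract, so the following reduces the pattern-learning idea to its simplest form, as an assumption-laden sketch: token positions where sample pages agree become fixed template tokens, and the rest become data slots (`"*"`) from which content is later extracted.

```python
# Simplified template induction (the paper's ternary tree is reduced here
# to positional alignment of equal-length token sequences).

def learn_pattern(pages):
    """pages: lists of tokens (tags/text) of equal length in this sketch.
    Positions where all pages agree are kept literally; others become '*'."""
    pattern = []
    for tokens in zip(*pages):
        pattern.append(tokens[0] if len(set(tokens)) == 1 else "*")
    return pattern

def extract(pattern, page):
    """Return the tokens of `page` that fall in the '*' data slots."""
    return [tok for slot, tok in zip(pattern, page) if slot == "*"]
```

Once learned from a few sample pages, the pattern is reused on every new page from the same site, which is the cost saving over per-document wrappers noted above.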
Multi-view video summarization
Video summarization is an important video content service which gives a short and condensed representation of the whole video content, and it facilitates the browsing, mining and storage of the original videos. A multi-view video summary retains only the most vital events, with more detail for salient events than for less salient ones. As such, it allows the user to get the important information from the different perspectives of the multi-view videos without watching them in full. In this paper, we focus on a series of approaches for summarizing video content to obtain a compact and succinct visual summary that encapsulates the key components of the video. The main advantage is that video summarization can turn hours of video into a short summary that a viewer can watch in just a few seconds. © Springer India 2016.
Safe cloud: Secure and usable authentication framework for cloud environment
Cloud computing, an emerging computing model with its roots in grid and utility computing, is gaining increasing attention from both industry and laymen. The ready availability of storage, compute, and infrastructure services provides a potentially attractive option for business enterprises to process and store data without investing in computing infrastructure. The attractions of the Cloud are accompanied by many concerns, among which data security is one that requires immediate attention. Strong user authentication mechanisms that prevent illegal access to Cloud services and resources are a core requirement for secure access. This paper proposes a user authentication framework for the Cloud which facilitates authentication by individual service providers as well as by a third-party identity provider. The proposed two-factor authentication protocols use a password as the first factor and a smart card or mobile phone as the second factor, and are resistant to various known security attacks. © Springer India 2016.
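To make the two-factor idea concrete, here is an illustrative handshake, with names and message flow that are assumptions rather than the paper's protocol: factor one is a salted password hash, factor two is a challenge-response HMAC over a secret held on the smart card or phone.

```python
# Illustrative two-factor verification (NOT the paper's protocol).
import hashlib
import hmac
import os

def register(password, device_secret):
    """Server-side record created at enrollment."""
    salt = os.urandom(16)
    pwd_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return {"salt": salt, "pwd_hash": pwd_hash, "device_secret": device_secret}

def authenticate(record, password, device_secret, challenge):
    # Factor 1: salted password check, constant-time comparison.
    pwd_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), record["salt"], 100_000)
    if not hmac.compare_digest(pwd_hash, record["pwd_hash"]):
        return False
    # Factor 2: the response a genuine card/phone would compute over the
    # server's fresh challenge.
    expected = hmac.new(record["device_secret"], challenge, "sha256").digest()
    response = hmac.new(device_secret, challenge, "sha256").digest()
    return hmac.compare_digest(expected, response)
```

A fresh random challenge per session keeps the second factor resistant to replay, one of the known attacks such schemes must withstand.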
Cloud Computing with Machine Learning Could Help Us in the Early Diagnosis of Breast Cancer
The purpose of this study is to develop tools which could help clinicians in primary care hospitals with the early diagnosis of breast cancer. Breast cancer is one of the leading forms of cancer in developing countries and often gets detected only at later stages. Detection at later stages results not only in pain and agony for the patients but also places a large financial burden on the caregivers. In this work, we present the preliminary results of the project code-named BCDM (Breast Cancer Diagnosis using Machine Learning), developed using MATLAB. The algorithm developed in this research work is based on adaptive resonance theory, and we show how an ART-1 network can help in the classification of breast cancer. The aim of the project is to eventually run the algorithm on a cloud computer, so that a clinician at a primary healthcare facility can use the system for early diagnosis of patients through a web-based interface from anywhere in the world. © 2015 IEEE.
Application of artificial neural networks in optimizing MPPT control for standalone solar PV system
The increasing demand for power and the limited nature of fossil fuels have led the world to focus on renewable energy resources. Solar photovoltaic (PV) energy, being the most easily available source, is considered to have the potential to meet the ever-increasing energy demand. This paper proposes an intelligent system based on Artificial Neural Networks (ANN) to track the Maximum Power Point (MPP) of a PV array. The system adopts a Radial Basis Function Network (RBFN) architecture to optimize the control of Maximum Power Point Tracking (MPPT) for PV systems. A PV array has non-linear output characteristics due to insolation and temperature variations, and the optimum operating point needs to be tracked in order to draw maximum power from the system. The output of the intelligent MPPT controller can be used to control the DC/DC converter to achieve maximum efficiency. © 2014 IEEE.
Structural characterization of graphene layers in various Indian coals by X-Ray Diffraction technique
The results of the structural investigation of three Indian coals showed that structural parameters such as fa and Lc increased, whereas the interlayer spacing d002 decreased, with increasing carbon content, aromaticity and coal rank. These structural parameters change in the opposite direction with increasing volatile matter content. Considering the 'turbostratic' structure of coals, the minimum separation between aromatic lamellae was found to vary between 3.34 and 3.61 Å for these coals. As the aromaticity increased, the interlayer spacing decreased, an indication of greater graphitization of the sample. Volatile matter and carbon content had a strong influence on the aromaticity, interlayer spacing and stacking height of the samples. The average number of carbon atoms per aromatic lamella was found to be 16-21, and the number of layers per lamella 7-8, for all the samples. Published under licence by IOP Publishing Ltd.
Comparative analysis of Histogram Equalization techniques
Histogram Equalization (HE) is one of the techniques used for image enhancement. This paper presents comparative studies of Global Histogram Equalization, Local Histogram Equalization and Fast Quadratic Dynamic Histogram Equalization based on execution time, mean squared error and Peak Signal to Noise Ratio (PSNR), and reports experimental results for the three methods with graphical representation. © 2014 IEEE.
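Of the three techniques compared, Global Histogram Equalization is the simplest and can be sketched directly: remap each gray level through the normalized cumulative histogram. This is the textbook algorithm, shown here for an 8-bit grayscale image as a list of rows.

```python
# Global Histogram Equalization for an 8-bit grayscale image.

def global_he(image, levels=256):
    flat = [p for row in image for p in row]
    # Intensity histogram.
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution function.
    cdf, running = [], 0
    for h in hist:
        running += h
        cdf.append(running)
    # Lookup table: stretch the CDF over the full intensity range.
    n = len(flat)
    lut = [round((levels - 1) * c / n) for c in cdf]
    return [[lut[p] for p in row] for row in image]
```

The local and dynamic variants in the comparison apply the same remapping per window or per histogram partition instead of globally, trading the extra execution time measured in the paper for better local contrast.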
Upgradation of business applications with autonomic computing
Autonomic computing has come a long way since its inception a decade ago and has been positioned as a venerable, value-adding technology for producing and sustaining self-managing, real-time, and resilient systems for the future. A series of concerted efforts by multiple IT companies and academic research laboratories across the world, with vigorous study and research, has brought a number of advancements to this discipline. A variety of proven and potential mathematical and computational concepts have been selected and synchronized to arrive at noteworthy improvements in autonomic systems design, development, deployment, and delivery methods. Having understood the unique value proposition and the significant achievements in the autonomic computing space, business executives and IT experts are consciously embracing the autonomic idea, which is generic enough to be easily embedded in any kind of business or IT system. However, the penetration of this technology into both IT and business applications has not been as its creators originally envisaged, for various reasons. The business environment is still saturated with large-scale packaged and monolithic applications. If autonomic capabilities are natively built into business and IT applications, they can be major differentiators in how those applications seamlessly and spontaneously automate business operations. Both existing and emerging applications can be targeted to become autonomic in their operations, outputs, and outlooks. In this paper, we describe how leading enterprise packages (ERP, CRM, SCM, and so on) can be enabled to be adaptive, highly available, secure, and scalable in their actions and reactions. Well-known enterprise applications such as CRM, online retail, and marketing are described with a focus on self-optimization characteristics. A detailed analysis of a Discount Manager in an online retail scenario is also presented. The simulation results obtained clearly show how embedded autonomic capability comes close to human thinking and decision-making ability. © 2013 ACM.
DNA for information security: A Survey on DNA computing and a pseudo DNA method based on central dogma of molecular biology
Biology is a life science of high significance to the quality of life, and information security is an aspect of social edification that human beings will never compromise; both are highly relevant and inevitable for mankind. An amalgamation of these subjects therefore turns up as a utility technology, for either security or data storage, known as bio-computing. The secure transfer of information has been a major concern since ancient civilizations, and various techniques have been proposed to maintain the security of data so that only the intended recipient can read a message. These practices became more significant with the introduction of the Internet. Information varies from big data to a single word, but every piece of information requires proper storage and protection. Cryptography is the art and science of secrecy, protecting information from unauthorized access. Various techniques have evolved through the years for information protection, including ciphers, cryptography, steganography, biometrics and, recently, DNA-based security. DNA cryptography was a major breakthrough in the field of security; it uses bio-molecular concepts and gives new hope of unbreakable algorithms. This paper discusses the DNA-based cryptographic methods proposed to date. It also proposes a DNA symmetric algorithm based on pseudo-DNA cryptography and the central dogma of molecular biology. The suggested algorithm uses splicing and padding techniques along with complementary rules, which make the algorithm more secure as an additional layer of security over conventional cryptographic techniques. © 2014 IEEE.
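Two of the building blocks mentioned above can be shown in a toy sketch: binary-to-DNA encoding (a common convention maps 00→A, 01→C, 10→G, 11→T) and the complementary rule (A↔T, C↔G). The splicing and padding steps of the proposed algorithm, and its key handling, are not reproduced here.

```python
# Toy DNA encoding + complementary rule (building blocks only, not the
# paper's full symmetric algorithm; the 2-bit mapping is one common choice).

ENCODE = {"00": "A", "01": "C", "10": "G", "11": "T"}
DECODE = {v: k for k, v in ENCODE.items()}
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def to_dna(data: bytes) -> str:
    """Map each pair of bits to one nucleotide."""
    bits = "".join(f"{b:08b}" for b in data)
    return "".join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def complement(strand: str) -> str:
    """Apply Watson-Crick base pairing to a strand."""
    return "".join(COMPLEMENT[base] for base in strand)

def from_dna(strand: str) -> bytes:
    bits = "".join(DECODE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
```

On their own these transforms are mere encodings, not encryption; the security of the proposed scheme comes from the keyed splicing, padding and complementary steps layered on top of conventional cryptography.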
Convergent replicated data structures that tolerate eventual consistency in NoSQL databases
Eventual consistency is a consistency model that has emerged in the field of NoSQL; it provides tremendous benefits over traditional databases, as it allows an application to scale to new levels. The consistency models used by NoSQL databases are explained in this paper. It elaborates how eventually consistent data structures ensure consistency in a storage system with multiple independent components that replicate data with loose coordination. The paper also discusses the eventually consistent model that ensures all updates are applied to all replicas in a consistent manner. © 2013 IEEE.
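The simplest convergent replicated data type, a grow-only counter (G-Counter), illustrates how such structures let loosely coordinated replicas agree; this is a standard textbook construction, offered as an example rather than anything specific from the paper. Each replica increments only its own slot, and merge takes the element-wise maximum.

```python
# G-Counter: a convergent (state-based) replicated counter.

class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}              # replica_id -> that replica's count

    def increment(self, n=1):
        """A replica only ever advances its own slot."""
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def value(self):
        return sum(self.counts.values())

    def merge(self, other):
        # Element-wise max is commutative, associative and idempotent, so
        # replicas converge no matter the order or repetition of merges.
        for rid, c in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), c)
```

Because merging is order-insensitive and idempotent, replicas can exchange state whenever connectivity allows and still converge, which is exactly the loose coordination the paper describes.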
Convective instability in a horizontal porous layer saturated with a chemically reacting Maxwell fluid
The problem of the onset of convective instability in a horizontal inert porous layer saturated with a Maxwell viscoelastic fluid subject to a zero-order chemical reaction is investigated by linear stability analysis. The modified Darcy-Maxwell model is used to describe the fluid motion. The horizontal porous layer is cooled from the upper boundary, while an isothermal boundary condition is imposed at the lower boundary. A closed-form solution for the basic quiescent state is obtained. The resulting eigenvalue problem is solved approximately using the Galerkin method. The Rayleigh number, characterizing the stability of the system, is calculated as a function of the viscoelastic parameter, the Darcy-Prandtl number, the normalized porosity, and the Frank-Kamenetskii number. The possibility of oscillatory instability is discussed. © 2013 AIP Publishing LLC.
A Single Sign on based secure remote user authentication scheme for Multi-Server Environments
A Multi-Server Architecture comprises a server environment with many different servers, giving the user the flexibility of accessing resources from multiple Service Providing servers with the same credential. The primary objective of a Multi-Server Environment (MSE) is to provide the services of different Service Providers (SPs) without repeating registration at each SP server, using a single credential valid for all the servers in the MSE. However, conventional MSEs proposed by various researchers provide individual authentication by each SP on its own server using the credential issued by the Registration Authority of the MSE; this mechanism requires the user to access each SP by keying in the same credentials for every SP separately. Single Sign-On (SSO) is an authentication mechanism that enables a user to sign on once and access the services of various SPs in the same session, with SAML generally used as the SSO protocol. This work analyzes the smart-card-based authentication scheme for Multi-Server Environments proposed by Li et al. and discusses various security attacks on that scheme. The paper also proposes a secure dynamic-ID-based scheme using smart cards or crypto cards which does not require a verifier table and implements Single Sign-On using the SAML protocol, thus allowing the user to enjoy all the features of an MSE along with SSO. © 2014 IEEE.
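The core of the dynamic-ID idea can be sketched as follows; the concrete scheme in the paper differs, and the function names and message contents here are assumptions. Instead of transmitting a fixed identity, the card sends DID = H(ID || nonce) with a fresh nonce, so an eavesdropper cannot link two sessions to the same user, while a server that knows ID can verify the binding.

```python
# Illustrative dynamic-ID computation (NOT the paper's full scheme).
import hashlib

def make_dynamic_id(identity: str, nonce: bytes) -> bytes:
    """Session-specific pseudonym: changes with every fresh nonce."""
    return hashlib.sha256(identity.encode() + nonce).digest()

def server_verify(identity: str, nonce: bytes, did: bytes) -> bool:
    """A server knowing `identity` recomputes and compares the pseudonym."""
    return make_dynamic_id(identity, nonce) == did
```

Because the server can recompute the pseudonym from the claimed identity and nonce, no verifier table of stored passwords is needed for this check, matching the no-verifier-table goal stated above.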
Radon transform processed neural network for lung X-ray image based diagnosis
A novel method for image diagnosis with artificial learning is presented: X-ray images of tuberculosis patients are subjected to neural network learning for prediction of the diagnosis. X-ray images of lungs are normally difficult to diagnose, owing to their similarity to those of lung cancer, and under- and over-diagnosis of lung X-ray images is a difficult medical problem to resolve. In the present work, the radon transform of the X-ray images is fed to a back-propagation neural network trained with the Levenberg algorithm. The methodology gives sharp results, distinguishing the normal and abnormal images. © 2014 IEEE.