Volume-2, Issue-4

July-August, 2016

1. The Art of Scheduling in Cloud Computing

Cloud computing is one of the fastest growing technologies and represents a major paradigm shift in computing. It provides highly scalable, virtualised resources over the Internet. In cloud computing, many jobs must be executed by the available resources while achieving the best performance: minimal total completion time, shortest response time, effective resource utilisation, and so on. Achieving these objectives requires the design and development of suitable scheduling algorithms. In this paper we survey various scheduling techniques in cloud computing and the issues related to them. We also survey existing algorithms to assess their suitability to particular needs, along with their shortcomings.

Published by: Harshita Vashishth, Kamal Prakash
Research Area: Information Technology Department

Organisation: Maharishi Markandeshwar University, Mullana, Haryana
Keywords: Cloud Computing, Scheduling, Algorithm, Quality of Service, Job Scheduling

2. A Closer Overview on Blur Detection-A Review

Image blurring is caused by motion or out-of-focus capture, and blur can be classified as global or local. The most challenging case is spatially varying blur, for which several detection schemes have been proposed. Blur detection techniques for digital images rely on a range of classifiers to decide whether, and where, an image is blurred. In this paper we review different SVM- and DCT-based blur detection methods.

Published by: Ravi Saini, Sarita Bajaj
Research Area: Electronics and Communication

Organisation: DIET, Karnal, Haryana
Keywords: Blur detection, SVM and DCT, motion blur

3. Study of Different Techniques for Human Identification using Finger Knuckle Approach

There are different biometric modalities used to identify a person, including palmprint, face, fingerprint, iris and hand geometry. Apart from these, the finger knuckle print is also used as a cost-effective biometric identifier. The finger knuckle print is formed on the back side of the fingers, where there are three joints: the metacarpophalangeal (MCP) joint, the proximal interphalangeal (PIP) joint and the distal interphalangeal (DIP) joint. The joint that connects the hand with the fingers is the MCP joint, and the pattern generated on it is referred to as the second minor finger knuckle print. The joint in the middle of the finger is the PIP joint, and the pattern generated on it is referred to as the major finger knuckle print. The joint at the tip of the finger is the DIP joint, and the pattern generated on it is referred to as the minor finger knuckle print.

Published by: Sanjna Singla, Supreet Kaur
Research Area: Computer Science and Engineering

Organisation: Punjabi University Regional Centre for Information Technology and Management, Mohali, Punjab, India
Keywords: Matching, Biometrics, Knuckles, Accuracy

4. Non-Probabilistic K-Nearest Neighbor for Automatic News Classification Model with K-Means Clustering

News classification is a branch of text classification, or text mining. Researchers have already done a great deal of work on text classification models with different approaches. News items have to be classified into categories such as sports, politics, technology, business, science, health, regional news and many other similar categories. Both supervised and unsupervised methods have been applied to news classification, with supervised models proving more efficient. In this work, the k-means algorithm is used to cluster keywords into multiple groups, and the k-nearest neighbour (kNN) algorithm then estimates the category of the news item being processed. The proposed model achieved an average accuracy of 93.28% across all test cases, which is higher than the previous best performers, the naïve Bayes and SVM based news classifiers, which posted roughly 83.5% accuracy on news data. The proposed model was tested at 91%, 95%, 90% and 97% accuracy over the input test cases S1, S2, S3 and S4 respectively, higher than all of the existing models. Hence the proposed model can be declared a better solution than the previous classification models.
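
As an illustration of the pipeline described above, the following is a minimal sketch (not the author's code) using scikit-learn, with a two-document toy corpus standing in for the real news data:

```python
# Minimal sketch: TF-IDF features, k-means keyword grouping, and a
# k-nearest-neighbour classifier (scikit-learn; toy corpus only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

docs = ["stock markets rallied today", "the team won the final match"]
labels = ["business", "sports"]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)

# k-means groups the TF-IDF vectors; cluster ids can serve as extra evidence
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# kNN assigns the final category from the labelled training vectors
knn = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
print(knn.predict(vec.transform(["the match ended in a draw"])))
```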

Published by: Akanksha Gupta
Research Area: Text Mining

Organisation: Shaheed Udham Singh College of Engineering & Technology, Tangori, Punjab
Keywords: News classification, k-nearest neighbor, k-means clustering, support vector machine, N-gram analysis

5. MRI Fuzzy Segmentation of Brain Tumor with Fuzzy Level Set Optimization

Image segmentation is a fundamental task in many image processing and computer vision applications. Due to noise, low contrast and intensity inhomogeneity, it remains a difficult problem in the majority of applications. One of the first steps towards understanding images is to segment them in order to locate the different objects inside them. In real images such as MRI scans, however, noise corrupts the image information, or the image consists of textured sections. The images produced by MRI scans are typically grey-scale images with intensities spanning the grey-level range. An MRI image of the brain comprises the cortex, which lines the outer surface of the brain, and the grey nuclei deep inside the brain, including the thalami and basal ganglia. Since cancer is a leading cause of death and its underlying causes remain unknown, early detection and diagnosis is one of the keys to cancer control: it increases the success of treatment, saves lives and reduces costs. Medical imaging is among the most frequently used diagnostic tools for detecting and classifying defects, and computer-aided diagnosis systems are valuable for removing operator dependence and increasing the precision of tumour detection and classification. Segmentation techniques based on grey levels, such as thresholding and region-based methods, are the simplest but find only restricted application; their performance can, however, be improved by combining them with hybrid clustering. Methods based on textural characteristics that use an atlas as a look-up table can give very good results on medical image segmentation, but they require expertise in the construction of the atlas. The limitation of atlas-based techniques is that, in some circumstances, it becomes difficult to correctly choose and label information, and they struggle to segment complex structures of variable shape, size and properties; in such circumstances it is better to use unsupervised methods such as fuzzy algorithms. In this work we propose a novel fuzzy MRI image segmentation algorithm. Fuzzy segmentation divides data points into homogeneous classes or clusters so that items within the same class are as similar as possible and items in different classes are as dissimilar as possible.

Published by: Poonam Khokher, Kiran Jain
Research Area: Department of Computer Science and Engineering

Organisation: DVIET, Karnal, Haryana
Keywords: Fuzzy segmentation, MRI, segmentation techniques

6. Tumor Segmentation and Automated Training for Liver Cancer Isolation

Image segmentation is the process of subdividing an image into its constituent parts and is considered one of the most difficult tasks in image processing. It plays a vital role in almost any application, and an application's success depends on the effective implementation of the segmentation technique. For numerous applications, segmentation reduces to locating an object in an image, which involves partitioning the image into two classes: object and background. In the human visual system, segmentation happens naturally: we are experts at detecting patterns, lines, edges and shapes, and at making decisions based upon visual information. At the same time, we are overwhelmed by the quantity of image data captured by technology, as it is not feasible to process all such images manually. Automatic segmentation of the tumour fraction of medical images is difficult because of the size, shape and location of the tumour and the presence of other objects of exactly the same intensity in the image. Cancerous segments of the liver therefore cannot easily be segmented accurately from medical scans using traditional approaches. This research examines the performance of an ANN in classifying liver tumours. An approach for segmenting the tumour and liver from medical images, used mainly for computer-aided diagnosis of the liver, is required. The method uses contour detection with an optimised threshold algorithm; the liver region is segmented using a technique that closes efficiently around the liver tumours. The whole process is supervised learning: the classifier requires a segmented training data set, and the final classifier is evaluated on a test set by calculating the total error in tumour segmentation of the liver. The algorithm is based on segmenting abnormal regions in the liver; the categorisation of the regions can then be carried out based on shape and many other features, using methods such as artificial neural networks.

Published by: Shikha Mandhan, Kiran Jain
Research Area: Computer Science and Engineering

Organisation: DVIET, Karnal, Haryana
Keywords:

7. Implementation of OLSR Protocol in MANET

Mobile ad hoc networks (MANETs) are autonomously self-organized networks without infrastructure support. In a mobile ad hoc network, nodes move arbitrarily, so the network may experience rapid and unpredictable topology changes. Because nodes in a MANET normally have limited transmission ranges, some nodes cannot communicate directly with each other; hence, routing paths in mobile ad hoc networks potentially contain multiple hops, and every node in the network has the responsibility to act as a router. In this paper, we implement the OLSR protocol in a MANET to measure how much data OLSR sends, in bits/sec.

Published by: Rohit Katoch, Anuj Gupta
Research Area: Computer Science Engineering

Organisation: Sri Sai University, Palampur (H.P.)
Keywords: Mobile ad hoc networks, Routing Protocols, OLSR


8. Performance Analysis of Multi-Hop Parallel Free-Space Optical Systems over Exponentiated Weibull Fading Channels Optimize by Particle Swarm Optimization

The performance of multihop parallel free-space optical (FSO) communication systems with the decode-and-forward (DF) protocol over exponentiated Weibull (EW) fading channels has been investigated. The average bit error rate (ABER) and outage probability are analysed under different turbulence conditions, receiver aperture sizes and structure parameters (R, C). The ABER and outage probability for the FSO system are derived based on PSO, and the ABER performance of the considered systems is investigated systematically in combination with Monte Carlo (MC) simulations. The comparison between the EW fading model and the PSO-based EW fading channel demonstrates that the performance of both systems can be enhanced by large aperture sizes together with the structure parameters R and C. With particle swarm optimization (PSO) used to optimise path selection, fast ABER and outage probability reduction are achieved.
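
The EW channel model lends itself to a quick Monte Carlo check. The sketch below (illustrative only, not the paper's analytical derivation) samples EW fading gains by inverse transform from the CDF F(h) = [1 - exp(-(h/η)^β)]^α and averages the conditional OOK bit error rate; the shape parameters α, β and the scale η are assumed values:

```python
# Monte Carlo ABER sketch over exponentiated Weibull fading (NumPy/SciPy).
import numpy as np
from scipy.special import erfc

alpha, beta, eta = 2.0, 1.5, 1.0      # assumed EW shape/scale parameters
snr = 10 ** (20.0 / 10)               # 20 dB electrical SNR

rng = np.random.default_rng(0)
u = rng.uniform(size=1_000_000)
# inverse-transform sampling from F(h) = [1 - exp(-(h/eta)**beta)]**alpha
h = eta * (-np.log(1 - u ** (1 / alpha))) ** (1 / beta)

# conditional BER of OOK ~ Q(h*sqrt(snr)), with Q(x) = 0.5*erfc(x/sqrt(2))
aber = np.mean(0.5 * erfc(h * np.sqrt(snr) / np.sqrt(2)))
print(f"ABER ~= {aber:.3e}")
```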

Published by: Babita, Dr. Manjit Singh Bhamrah
Research Area: Electronics and Communication

Organisation: Punjabi University, Patiala, Punjab
Keywords: Multihop parallel relay, free-space optical communication, exponentiated Weibull distribution, particle swarm optimization

9. A Hybrid Approach for Enhancing Security in RFID Networks

RFID (Radio-Frequency Identification) is a technology for the automatic identification of objects and people. Human beings are skilful at identifying things under many different challenging circumstances: a bleary-eyed person can quickly pick out a coffee cup on a cluttered breakfast table in the morning, for example. Computer vision, though, performs such tasks poorly. RFID may be considered a means of explicitly labelling objects to facilitate their "perception" by computing devices. An RFID device, frequently just called an RFID tag, is a small microchip designed for wireless data transmission, generally mounted on an antenna in a package that resembles an ordinary adhesive sticker. We use the word "RFID" to denote any RF device whose main function is the identification of an object or person. This definition excludes simple devices such as retail stock tags, which merely indicate their presence and on/off state at the standard end of the functional range; it also excludes portable devices such as smart phones, which do much more than merely identify themselves or their bearers. Many cryptographic models of security fail to capture crucial features of RFID systems: a straightforward cryptographic design, for example, captures only the top-layer communication protocol between a tag and a reader, while at the lower layers sit anti-collision protocols and other basic RF mechanisms. We therefore enumerate the security problems present at multiple communication layers in RFID systems. This work proposes a new hybrid AES-based encryption mechanism for RFID applications.
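
As an indication of the AES side of such a scheme, here is a hedged sketch using the pycryptodome library; the tag identifier, key handling and choice of EAX mode are illustrative assumptions, not the paper's protocol:

```python
# Hedged sketch: authenticated AES encryption of an RFID tag identifier.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)              # 128-bit shared reader/tag key
tag_id = b"EPC-0123456789AB"            # example tag identifier (hypothetical)

cipher = AES.new(key, AES.MODE_EAX)     # authenticated encryption mode
ciphertext, mac = cipher.encrypt_and_digest(tag_id)

# reader side: decrypt and verify the tag's response
verifier = AES.new(key, AES.MODE_EAX, nonce=cipher.nonce)
assert verifier.decrypt_and_verify(ciphertext, mac) == tag_id
```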

Published by: Bhawna Sharma, Dr. R.K. Chauhan
Research Area: Computer Science and Engineering

Organisation: DCSA, KUK, Haryana
Keywords:

10. Credit Card Fraud Detection and False Alarms Reduction using Support Vector Machines

In day-to-day life, credit cards are used for purchasing goods and services, with a virtual card for online transactions or a physical card for offline transactions. In a physical-card purchase, the cardholder presents the card physically to a merchant for payment; to carry out a fraudulent transaction in this kind of purchase, an attacker has to steal the credit card. To commit fraud in online purchases, a fraudster simply needs to know the card details; most of the time, the genuine cardholder is not aware that someone else has seen or stolen the card information. The only way to detect this kind of fraud is to analyse the spending patterns on every card and to flag any inconsistency with respect to the "usual" spending patterns. Fraud detection based on the analysis of a cardholder's existing purchase data is a promising way to reduce the rate of successful credit card fraud. As manually processing credit card transactions is a time-consuming and resource-demanding task, credit card issuers search for high-performing and efficient algorithms that automatically look for anomalies in the set of incoming transactions. Data mining is a well-known and often suitable solution to big-data problems involving risk, such as credit risk modelling, churn prediction and survival analysis. Nevertheless, fraud detection in general is an atypical prediction task that requires a tailored approach to address and predict future fraud. Though most fraud detection systems show good results in detecting fraudulent transactions, they also generate too many false alarms. This matters especially in the domain of credit card fraud detection, where a credit card company needs to minimise its losses but, at the same time, does not wish cardholders to feel restricted too often. In this work, we propose a novel credit card fraud detection system based on the integration of support vector machines.
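
A minimal sketch of the SVM component follows, using scikit-learn with synthetic stand-in transactions (the real feature set and tuning are not specified in the abstract); class weighting is one common way to trade detection rate against false alarms:

```python
# SVM fraud-detection sketch; 0 = genuine, 1 = fraud (synthetic data).
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from sklearn.datasets import make_classification

# stand-in for real transaction features (amount, time, merchant codes, ...)
X, y = make_classification(n_samples=2000, weights=[0.97], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" counters the rarity of fraud; the RBF kernel
# captures non-linear spending patterns
clf = SVC(kernel="rbf", class_weight="balanced").fit(Xtr, ytr)
print(classification_report(yte, clf.predict(Xte),
                            target_names=["genuine", "fraud"]))
```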

Published by: Mehak Kamboj, Shankey Gupta
Research Area: Computer Science and Engineering

Organisation: DVIET, Karnal, Haryana
Keywords:

11. Extensive Labview based Power Quality Monitoring and Protection System

Power quality issues and mitigation techniques became hot research topics soon after the introduction of solid-state devices in power systems. Equipment of a non-linear nature introduces power quality issues such as harmonics, reduction in power factor, voltage unbalance and transients, and causes malfunction or damage of power system equipment. In this paper, harmonics, noise and reactive power are considered the major issues. There is an ever-increasing need for power quality monitoring systems due to the growing number of sources of disturbance in AC power systems, and monitoring of power quality is essential to maintain the proper functioning of utilities, customer services and equipment. The authors surveyed the different existing methods of power quality monitoring in use and available in the literature, and concluded that an improved and affordable power quality monitoring system is the need of the hour. This paper presents the development of a simple power quality measurement system built from virtual instruments designed in LabVIEW software. The real-time data from the hardware are acquired and fed to the software using an Arduino for interfacing with LabVIEW. All power quality parameters are also measured by a Fluke power analyzer for validation. Observations taken from the hardware under test show the importance of power quality monitoring, as well as the accuracy and precision of the developed system. The testing results and analysis indicate that the proposed method is feasible and practical for analyzing power quality disturbances.

Published by: Anurag Verma, Mrs. Shimi S.L
Research Area: Mechanical Engineering

Organisation: NITTTR, Chandigarh
Keywords: Power quality monitoring, power quality software, power quality parameters

12. Arduino based Low Cost Power Protection System

In this paper, harmonics, noise, reactive power etc. are considered the major concerns. The paper presents the development of simple power quality software for protecting a system under fault conditions. Virtual instruments are designed in LabVIEW software, and the real-time data from the hardware are fed to the software using an Arduino for interfacing with LabVIEW. The software recognizes the different types of fault conditions based on preset values, indicates the type of fault occurring in the system, and disconnects the equipment on the load side. Testing results and analysis indicate that the proposed method is feasible and practical for protecting the system during fault conditions.

Published by: Anurag Verma, Mrs. Shimi S.L
Research Area: Mechanical Engineering

Organisation: NITTTR, Chandigarh
Keywords: Power quality monitoring, power quality software, power quality parameters

13. A Malicious Data Prevention Mechanism to Improve Intruders in Cloud Environment

We propose a new model presenting an improved key management architecture, called the multi-level complex key exchange and authorizing model (Multi-Level CK-EAM), for cloud computing, to enable comprehensive, trustworthy, user-verifiable and cost-effective key management. In this research, we develop the proposed Multi-Level CK-EAM scheme as a corporate key management technique adaptable to cloud computing platforms, making them integral and confidential. To add further security, a next step includes a Captcha: the user has to fill in the given Captcha correctly, which eliminates the possibility of robots, botnets etc. In addition, the scheme has to be designed to work efficiently with cloud nodes, which means it must use little computational power on the cloud computing platform.

Published by: Tajinder Kaur
Research Area: Cloud Computing

Organisation: Sainik Institute of Management & Technology, Ropar, Punjab
Keywords: Cloud computing, cloud environment, problem formulation

14. A Robust Cryptographic Approach using Multilevel Key Sharing Paradigm

Cloud computing is a popular technology that provides services to users on demand, on a pay-per-usage basis: they pay only for what they use, when they need it. With the vast growth in the use of mobile phone applications, users are relying on their phones for their personal as well as professional work and suffering from many problems (storage, processing, security etc.). To overcome these limitations, and with the growth in the use of cloud applications, a new development area has emerged recently, called mobile cloud computing. Mobile cloud computing is an integration of three technologies, cloud computing, mobile computing and the internet, enabling users to access services at any time and from any place. Mobile phones are sensitive devices, and personal data stored on the cloud is not secure and can easily be attacked by an unauthorized person. This paper presents two-level encryption through a mobile application that encrypts the data before moving it to the cloud, ensuring security and user authentication.

Published by: Tajinder Kaur
Research Area: Cryptography

Organisation: Sainik Institute of Management & Technology, Ropar, Punjab
Keywords: Cloud Computing, IAAS, PAAS, SAAS, mobile cloud computing, data security, Ghost, AES encryption


15. Authentication using Finger Knuckle Print Techniques

In this paper, a new approach is proposed for personal authentication using the patterns generated on the dorsal surface of the finger. The texture pattern produced by the finger knuckle is highly unique and makes the surface a distinctive biometric identifier. An important part of knuckle matching is handling the variation in the number of features that arise in the form of texture patterns. Here, the emphasis is on key-point and texture feature extraction: key-point features are extracted as SIFT features, and texture features as Gabor and GLCM features. For the SIFT and GLCM features, matching is done by Hamming distance; for the Gabor features, matching is done by correlation. A database of 40 different subjects was acquired by touchless imaging using a digital camera. The authentication system extracts features from the image and stores the template for later authentication. The experimental results are very promising for the recognition of the second minor finger knuckle pattern.
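
For flavour, the sketch below shows GLCM texture features and a simple Hamming distance over binarised Gabor responses, using scikit-image on random stand-in knuckle regions; it illustrates the feature types named above, not the authors' pipeline:

```python
# GLCM features and a Gabor-code Hamming distance (scikit-image sketch).
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.filters import gabor

def glcm_features(img):
    g = graycomatrix(img, distances=[1], angles=[0], levels=256, symmetric=True)
    return [graycoprops(g, p)[0, 0] for p in ("contrast", "homogeneity", "energy")]

def gabor_code(img):
    real, _ = gabor(img, frequency=0.2)   # Gabor filter response
    return (real > real.mean()).ravel()   # binarised texture code

a = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in knuckle ROIs
b = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

print("GLCM features:", glcm_features(a))
print("Hamming distance:", np.mean(gabor_code(a) != gabor_code(b)))
```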

Published by: Sanjna Singla, Supreet Kaur
Research Area: Computer Science and Engineering

Organisation: Punjabi University Regional Centre for Information Technology and Management, Mohali, Punjab
Keywords: Finger knuckle recognition, texture features, biometrics

16. A Novel Approach for Detection of Traffic Congestion in NS2

Traffic congestion is caused by many factors; some, like road construction, rush hour or bottlenecks, are predictable. Drivers unaware of congestion ahead eventually join it and increase its severity, and the more severe the congestion, the more time it takes to clear. In order to provide drivers with useful information about traffic ahead, a system must identify the congestion, its location, severity and boundaries, and relay this information to drivers within the congestion and those heading towards it. To form a picture of the congestion, vehicles need to collaborate using vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2I) communication. Once a clear picture of the congestion has formed, this information needs to be relayed to vehicles away from the congestion, so that vehicles heading towards it can take evasive action, avoiding further escalation of its severity. Initially, a source vehicle initiates a number of queries, which are routed by VANETs along different paths towards its destination. During query forwarding, the real-time road traffic information for each road segment is aggregated from multiple participating vehicles and returned to the source after the query reaches the destination. This information enables the source to calculate the shortest-time path. By exchanging data about route choices, congestion and traffic alerts between vehicles, a vehicle can decide on the best course of action.
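
Once per-segment travel times have been aggregated, the source's shortest-time computation is a standard single-source shortest path. A small sketch over a hypothetical road graph (the travel times, in seconds, are invented):

```python
# Dijkstra shortest-time path over aggregated per-segment travel times.
import heapq

def shortest_time(graph, src, dst):
    # graph: {node: [(neighbour, travel_time_seconds), ...]}
    dist, heap = {src: 0.0}, [(0.0, src)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == dst:
            return t
        if t > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph.get(u, []):
            if t + w < dist.get(v, float("inf")):
                dist[v] = t + w
                heapq.heappush(heap, (t + w, v))
    return float("inf")

roads = {"A": [("B", 120), ("C", 300)], "B": [("D", 600)], "C": [("D", 60)]}
print(shortest_time(roads, "A", "D"))     # 360 s via the uncongested C route
```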

Published by: Arun Sharma, Kapil Kapoor, Bodh Raj, Divya Jyoti
Research Area: Computer Science and Engineering

Organisation: Abhilashi Group of Institutions, School of Pharmacy and Engineering and Technology (H.P.)
Keywords: VANET, PDR, AODV, NS2, end-to-end delay

17. Consumer Trend Prediction using Efficient Item-Set Mining of Big Data

Consumer trends are habits or behaviours presently prevalent among customers of goods or services. They track more than simply what people buy and how much they spend: data collected on trends may also include how customers use a product and how they talk about a brand within their social network. Understanding consumer trends and the drivers of behaviour provides an overview of the marketplace, analysing market data, demographic consumption patterns within the group, and the key consumer trends steering consumption. The report highlights innovative new product developments that efficiently target the most relevant consumer demand states, and offers crucial recommendations to capitalise on evolving consumer landscapes.
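
Trend analysis of this kind typically rests on frequent item-set mining over purchase baskets. A toy Apriori-style sketch follows (the baskets and support threshold are illustrative only):

```python
# Level-wise (Apriori-style) frequent item-set mining over toy baskets.
from itertools import combinations

def frequent_itemsets(baskets, min_support=2):
    items = {i for b in baskets for i in b}
    level, result = [frozenset([i]) for i in sorted(items)], {}
    while level:
        counts = {s: sum(s <= b for b in baskets) for s in level}
        frequent = {s: c for s, c in counts.items() if c >= min_support}
        result.update(frequent)
        # next level: unions of frequent sets that are exactly one item larger
        level = list({a | b for a, b in combinations(frequent, 2)
                      if len(a | b) == len(a) + 1})
    return result

baskets = [frozenset(b) for b in (["milk", "bread"], ["milk", "bread", "jam"],
                                  ["bread", "jam"], ["milk", "jam"])]
print(frequent_itemsets(baskets))
```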

Published by: Yukti Chawla, Parikshit Singla
Research Area: Computer Science and Engineering

Organisation: DVIET, Karnal, Haryana
Keywords: Consumer trend analysis, data mining, item set mining, big data

18. Robust data compression model for linear signal data in the Wireless Sensor Networks

Data compression is one of the popular power-efficiency methods for improving the lifetime of sensor networks. Methods such as wavelet-based signal decomposition, entropy encoding and arithmetic encoding are used for compression in sensor networks to prolong the lifetime of the wireless sensor network. The proposed method combines wavelet decomposition of the signal with the Huffman entropy encoding method to compress the sensed data on the sensor nodes. The compressed (reduced-size) data consumes less energy in small packets than non-compressed packets, which directly improves lifetime. The proposed model achieved a compression ratio of more than 70%, considerably higher than existing models, and was also evaluated for post-compression signal quality and elapsed time; in both of the latter parameters the proposed model was found efficient. Hence, the effectiveness of the proposed model is established by the experimental results.
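
A compact sketch of the proposed chain, wavelet decomposition followed by Huffman coding of quantised coefficients, is shown below using PyWavelets; the wavelet, decomposition level and quantiser are assumed parameters, not the paper's exact settings:

```python
# Wavelet decomposition + Huffman code-length estimate (PyWavelets sketch).
import heapq, collections
import numpy as np
import pywt

signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + 25.0   # stand-in sensor trace
coeffs = pywt.wavedec(signal, "db4", level=3)            # multilevel DWT
symbols = np.concatenate(coeffs).round().astype(int)     # lossy quantisation

def huffman_lengths(data):
    # returns {symbol: codeword length} by building the Huffman tree
    heap = [(w, [s]) for s, w in collections.Counter(data).items()]
    lengths = collections.Counter()
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, s1 = heapq.heappop(heap)
        w2, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1                               # one level deeper
        heapq.heappush(heap, (w1 + w2, s1 + s2))
    return lengths

lengths = huffman_lengths(symbols.tolist())
bits = sum(lengths[s] for s in symbols.tolist())
print(f"compression ratio ~ {len(signal) * 64 / bits:.1f}x vs raw float64")
```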

Published by: Sukhcharn Sandhu
Research Area: Network Security

Organisation: Gurukul Vidyapeeth Group of Institutions, Banur
Keywords: WSN efficiency, data compression, multi-objective compression, linear compression

19. A Review on ACO based Scheduling Algorithm in Cloud Computing

Task scheduling plays a key role in cloud computing systems. Scheduling of tasks cannot be done on the basis of a single criterion, but only under a set of rules and regulations that we can term an agreement between the users and providers of the cloud. This agreement is nothing but the quality of service that the user wants from the providers. Providing good quality of service to users according to the agreement is a decisive task for providers, as a large number of tasks run on the provider's side at the same time. In this paper we perform a comparative study of the different algorithms for their suitability, feasibility and adaptability in the context of the cloud scenario.

Published by: Meena Patel, Rahul Kadiyan
Research Area: Cloud Computing

Organisation: CBS Group of Institutions, Jhajjar, Haryana
Keywords: Forgery Detection Techniques, SVM Classifier, HOG Classifier and SASI Classifier

20. Novel Approach for Heart Disease using Data Mining Techniques

Data mining is the process of analyzing large data sets and extracting meaning from the data. It helps in predicting future trends and patterns, assisting businesses in decision making. Various algorithms are presently available for clustering data; the existing work used k-means clustering, the C4.5 algorithm and MAFIA (the Maximal Frequent Itemset Algorithm) for a heart disease prediction system, achieving an accuracy of 89%. Since there is considerable scope for improvement, in this paper we implement several other algorithms for clustering and classifying the data, aiming to achieve higher accuracy than the present algorithm. Several parameters have been proposed for heart disease prediction systems, but there is always a need for better parameters or algorithms to improve performance.

Published by: Era Singh Kajal, Ms. Nishika
Research Area: Data Mining

Organisation: CBS Group of Institutions, Jhajjar, Haryana
Keywords: Data mining, heart diseases, classification, k-nearest neighbor, LDA, SVM

21. Novel Approach for Image Forgery Detection Technique based on Colour Illumination using Machine Learning Approach

With the advancement of high-resolution digital cameras and photo-editing software featuring ever more advanced capabilities, the chances of image forgery have increased: images can now be altered and manipulated easily, so image trustworthiness is in greater demand. Images used as courtroom evidence, images in newspapers and magazines, and digital images used by doctors are a few cases that demand images with no manipulation. Forgeries produced by copying portions and moving them within the same image to "cover up" something are called copy-move forgeries. In previous years, authors have used various methods such as Principal Component Analysis (PCA), the Discrete Wavelet Transform (DWT) and Singular Value Decomposition (SVD), which are time consuming. Many of these algorithms have repeatedly failed to detect forged images, because a single feature-extraction algorithm is not capable of capturing all the specific features of an image. To overcome the limitations of the existing algorithms, we use a meta-fusion of HOG and SASI features and, to overcome the limitations of the SVM classifier, logistic regression, which should be able to classify forged images more precisely.
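
As a sketch of the proposed direction, the snippet below feeds HOG descriptors to a logistic-regression classifier using scikit-image and scikit-learn; the SASI features and the meta-fusion step are omitted, and the random patches merely stand in for real authentic/forged data:

```python
# HOG features + logistic regression on stand-in image patches.
import numpy as np
from skimage.feature import hog
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
images = rng.random((40, 64, 64))      # stand-ins for image patches
labels = np.repeat([0, 1], 20)         # 0 = authentic, 1 = forged (toy labels)

X = np.array([hog(img, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
              for img in images])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("train accuracy:", clf.score(X, labels))
```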

Published by: K. Sharath Chandra Reddy, Tarun Dalal
Research Area: Digital Image Processing

Organisation: CBS Group of Institutions, Jhajjar, Haryana
Keywords: Forgery Detection Techniques, SVM Classifier, HOG Classifier and SASI Classifier

22. Indian Coin Detection by ANN and SVM

Most of the available systems recognise coins by considering physical properties such as radius and thickness, so these systems can be fooled easily. To remove this weakness, the features, drawings and numerals printed on the coin can be used as patterns on which a support vector machine is trained, so that more accurate recognition results can be obtained. Previous techniques placed little emphasis on the classifier function, which is why classification accuracy did not improve; to solve this problem, classifier techniques can be used.

Published by: Sneha Kalra, Kapil Dewan
Research Area: Neural Networks

Organisation: PCTE, Ludhiana
Keywords: Coin detection, Machine learning

23. Automated Checking of PCB Circuits using Labview Vision Toolkit

LabVIEW provides very strong vision intelligence software. The investigator has taken a very useful industrial problem and provided a solution. Every PCB fabrication and electrical/electronic assembly organisation manually checks each PCB after completion of the process to confirm that all components are present; if any component is missing, the board is sent back for correction. All of these PCB companies perform this procedure manually. As the production of finished PCBs is huge (in the range of thousands to lakhs of pieces per month), enormous labour and time are required to check every PCB. It generally takes 5-20 minutes to check each PCB, depending on its complexity, so manually checking 1 lakh PCBs requires approximately 5-20 lakh minutes. This is a huge problem for the electrical and electronics industries, and one of the biggest challenges and hurdles in the PCB manufacturing industry today.

Published by: Manoj Kumar, Mrs. Shimi S.L
Research Area: PCB Circuits

Organisation: NITTTR, Chandigarh
Keywords: Vision intelligence, PCB component checking

24. Automated Supervision of PCB Circuits

Machine vision intelligence (MVI) is the capacity of a computer to "see" and take appropriate decisions. A machine-vision system uses one or more video cameras, analog-to-digital conversion (ADC) and digital signal processing (DSP); the resulting data goes to a computer or robot controller. Two critical specifications in any vision system are sensitivity and resolution. Sensitivity is the ability of a machine to see in dim light, or to detect weak impulses at invisible wavelengths; resolution is the extent to which a machine can distinguish between objects. In general, the better the resolution, the more limited the field of vision. Sensitivity and resolution are interrelated: all other variables held constant, increasing the sensitivity decreases the resolution, and improving the resolution lessens the sensitivity.

Published by: Manoj Kumar, Mrs. Shimi S.L
Research Area: PCB Circuits

Organisation: NITTTR, Chandigarh
Keywords: Power quality monitoring, power quality software, power quality improvement

25. An Experimental Study on Performance of Jatropha Biodiesel using Exhaust Gas Recirculation

Today the world faces the twin crises of fuel depletion and environmental degradation. Excessive demand and the indiscriminate extraction and consumption of fossil fuels have led to a reduction in petroleum reserves, and developing countries such as India depend heavily on oil imports. With diesel the main transport fuel in India, finding a suitable alternative to diesel is an urgent need of the hour. Jatropha-based biodiesel (JBD) is a non-edible, renewable fuel suitable for diesel engines, with the potential for large-scale cultivation on wasteland with relatively low environmental degradation. Jatropha oil is free from sulphur, still exhibits excellent lubricity, and is a much safer fuel than diesel because of its higher flash and fire points. Performance parameters, including brake thermal efficiency (η) and brake specific fuel consumption (BSFC) under varying loading conditions, showed Jatropha biodiesel to be an effective alternative on a four-stroke, single-cylinder compression-ignition engine. The effect of exhaust gas recirculation (EGR) at 10% recirculation also showed Jatropha to be an effective fuel, since the inherent oxygen present in the biodiesel structure compensates for oxygen-deficient operation under EGR.

Published by: Kiranjot Kaur
Research Area: Mechanical Engineering

Organisation: Rayat Bahra University, Mohali
Keywords: Jatropha, exhaust gas recirculation, brake specific fuel consumption, polycyclic aromatic hydrocarbon, brake thermal efficiency

26. Review Data De-Duplication by Encryption Method

Data deduplication is a technique to improve storage utilization. De-duplication technologies can be designed to work on primary storage as well as on secondary storage. In chunking-based de-duplication, data passed through the de-duplication engine is chunked into smaller units and assigned identities using cryptographic hash functions; two chunks of data are then compared to ascertain whether they have the same identity. Chunking for de-duplication can be frequency based or content based. Frequency-based chunking identifies data chunks with high frequencies of occurrence, and the algorithm uses this frequency information to enhance the deduplication gain.
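
A minimal sketch of hash-identified chunk de-duplication follows, using fixed-size chunks and SHA-1 (per the paper's keywords); content-defined chunking would instead vary the chunk boundaries:

```python
# Fixed-size chunking with SHA-1 identities; duplicates are stored once.
import hashlib

def dedup_store(data, chunk_size=4096):
    store = {}                                   # identity -> unique chunk
    recipe = []                                  # ordered chunk identities
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha1(chunk).hexdigest()
        store.setdefault(digest, chunk)          # keep only the first copy
        recipe.append(digest)
    return store, recipe

data = b"A" * 8192 + b"B" * 4096                 # two duplicate "A" chunks
store, recipe = dedup_store(data)
print(len(recipe), "chunks referenced,", len(store), "stored")  # 3 vs 2
```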

Published by: Sonam Bhardwaj, Poonam Dabas
Research Area: Computer Science Engineering

Organisation: UIET, Kurukshetra
Keywords: Deduplication, Cloud, SHA1

27. Pharmacological Studies on Hypnea Musciformis (Wulfen) Lamouroux

Hypnea musciformis is an alga belonging to the family Rhodophyceae, genus Hypnea. To the best of our knowledge, this is the first phytochemical study of Hypnea musciformis covering physico-chemical analysis, elemental study and metal analysis. The different extracts underwent preliminary phytochemical analysis for the identification of various phytoconstituents, testing positive for alkaloids, carbohydrates, glycosides, tannins, proteins, amino acids and steroids. Pharmacological screening showed that the methanol extract produced the maximum inhibition of arthritis. The methanolic extract was then subjected to column chromatography to isolate the active compound, which was identified by TLC and confirmed by spectral studies as the flavonoids astaxanthin and hesperidin, responsible for the reduction of arthritic activity and of free radicals such as nitric oxide and DPPH. In histopathological studies, the methanolic extract of Hypnea musciformis was effective in curing synovial damage compared with the arthritic control. Our results show that the methanol extract and the isolated compound possess significant anti-rheumatoid activity, possibly due to the presence of phenolic and carotenoid/terpene constituents. From the above results it can be concluded, subject to clinical trials, that Hypnea musciformis can be used as a novel drug in the treatment of rheumatoid arthritis. The chemistry of marine natural products is a new area of potential resources for discovering new therapeutic leads.

Published by: B. Lavanya, N. Narayanan, A. Maheshwaran
Research Area: Anti-rheumatoid activity

Organisation: Jaya College of Pharmacy, Thiruninravur
Keywords: Hypnea musciformis, methanolic extract, anti-rheumatoid activity, antioxidant activity, phytoconstituents

28. Classification through Artificial Neural Network and Support Vector Machine of Breast Masses Mammograms

Breast cancer is one of the most common types of cancer among women. It occurs inside the breast cells due to an excessive increase in cell production, and most often can cause death if not cured in time. There are many techniques to detect breast cancer and its various abnormalities; in this research, mammography is used to deal with one abnormality type: breast masses. The mammograms (X-ray images) of breast masses come from the standard mini-MIAS/DDSM databases. To find the region of interest, two methods are applied: segmentation and noise removal, using neural segmentation and thresholding respectively. After extraction of the abnormal part, or region of interest, feature extraction is done using three feature families, GLCM, GLDM and geometrical features, with feature selection applied to obtain higher accuracy. After calculating the value of each and every feature, classification is done using an ANN (artificial neural network), with 40 mammograms used to evaluate the true positive, true negative, false positive and false negative counts via a confusion matrix. Using these confusion matrices, the system can determine the stage of each case. Performance evaluation shows how effective and beneficial the new research is; hence, the ANN's performance is evaluated in terms of accuracy (precision), sensitivity and specificity, and the results are compared with the existing SVM classification technique.
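
The evaluation step reduces to reading TP/TN/FP/FN off the confusion matrix. A small sketch with illustrative labels (1 = abnormal mass, 0 = normal):

```python
# Sensitivity, specificity and precision from a binary confusion matrix.
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]   # toy ground truth
y_pred = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]   # toy classifier output

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("sensitivity:", tp / (tp + fn))      # true-positive rate
print("specificity:", tn / (tn + fp))      # true-negative rate
print("precision:  ", tp / (tp + fp))
```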

Published by: Kamaldeep Kaur, Er. Pooja
Research Area: Computer Science and Engineering

Organisation: Patiala Institute of Engineering and Technology, Punjab
Keywords: Artificial neural network, feature extraction, mammograms, segmentation, support vector machine

29. Review of Copy Move Forgery with Key Point Features

The reviewed copy-move detection approach involves the following steps: first, establish a Gaussian scale space; second, extract the oriented FAST key points and the ORB features in each scale; third, map the coordinates of the oriented FAST key points back to the original image and match the ORB features between every two different key points using the Hamming distance; finally, remove the falsely matched key points using the RANSAC algorithm and then detect the resulting copy-move regions. The experimental results indicate that the new algorithm is effective under geometric transformations such as scaling and rotation, and exhibits high robustness even when an image is distorted by Gaussian blur, Gaussian white noise or JPEG recompression.
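
A rough OpenCV sketch of the matching stages is given below; the Gaussian scale space and the final region detection are omitted, the input filename is hypothetical, and the distance threshold is an assumed value:

```python
# ORB self-matching with Hamming distance, filtered by RANSAC.
import cv2
import numpy as np

img = cv2.imread("suspect.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
kp, des = cv2.ORB_create(nfeatures=2000).detectAndCompute(img, None)

# 2-NN self-matching: each descriptor's nearest neighbour is itself, so the
# second-nearest is the candidate duplicated region
pairs = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(des, des, k=2)
matches = [m for _, m in pairs if m.distance < 40]

src = np.float32([kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC keeps only matches consistent with one copied-region transform
_, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
print("matches surviving RANSAC:", int(inliers.sum()))
```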

Published by: Mrs. Nisha, Mr. Mohit Kumar
Research Area: Department of Computer Science

Organisation: ASRA College, Sangrur
Keywords: Copy Forgery, ORB, SIFT, SURF

30. Analytical Review of the News Data Classification Methods with Multivariate Classification Attributes

News classification has emerged as an important sub-branch of data mining. A great deal of work has already been done on news classification with a variety of classifiers and feature descriptors, and a number of news classification projects run as real-time systems today. News classification is an important part of online news portals, which are growing every year and adding more and more users. News classification is a branch of text classification, or text mining, and researchers have built text classification models with many different approaches. News items have to be classified into categories such as sports, politics, technology, business, science, health, regional news and many other similar categories. Both supervised and unsupervised methods have been applied to news classification, with supervised models proving more efficient. The major goal of news classification research is to improve accuracy while decreasing elapsed time. Our news classification model proposes the use of k-means and lexicon analysis of the news data, with a nearest-neighbour algorithm for the final classification. The k-means clustering algorithm is used primarily to produce text-data clusters containing the important information; lexicon analysis is then performed over the given text data, and the final classification of the news is done using k-nearest neighbours. The results are reported in terms of accuracy, elapsed time and related parameters.

Published by: Mandeep Kaur
Research Area: Digital Image Processing

Organisation: C.G.C, Jhanjheri, Punjab
Keywords: Classification, categorization, support vector machine, SVM classification

31. Robustness against Sharp and Blur Attack in Proposed Visual Cryptography Scheme

The fundamental reason for the invention of watermarking was to protect the originality of an image message from outside attack. The quality of a scheme depends on its ability to survive various kinds of attacks that try to remove or destroy this originality; conversely, attempting to remove or destroy the message should produce a noticeable degradation in image quality. Robustness is the factor that plays an important role in testing and verifying whether an algorithm withstands these attacks. In this paper the robustness of the algorithm proposed in [15] for secret image shares in a Visual Cryptography Scheme is examined: the robustness of the image against various attacks, specifically the image blur attack and the image sharpen attack, is tested. The calculated PSNR values signify that the proposed algorithm successfully withstands these attacks.
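
The robustness metric used is PSNR, defined as 10·log10(MAX²/MSE). A small NumPy sketch (not the paper's test harness) for 8-bit images:

```python
# PSNR between an original image and its attacked version.
import numpy as np

def psnr(original, attacked, max_val=255.0):
    mse = np.mean((original.astype(float) - attacked.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(max_val ** 2 / mse)

img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)   # stand-in share
blurred = np.clip(img + np.random.normal(0, 5, img.shape), 0, 255)
print(f"PSNR after attack: {psnr(img, blurred):.2f} dB")
```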

Published by: Dhirendra Bagri, R. K. Kapoor
Research Area: Computer Science and Engineering

Organisation: NITTTR, Bhopal
Keywords: DHCOD, PSNR, MSE, SLSB, wavelet transform, cryptography, watermarking, revealed image

32. Review of Different Approaches in Mammography

Breast cancer screening remains a subject of intense and, at times, passionate debate. Mammography has long been the mainstay of breast cancer detection and is the only screening test proven to reduce mortality. Although it remains the gold standard of breast cancer screening, there is increasing awareness of subpopulations of women for whom mammography has reduced sensitivity. Mammography has also come under increased scrutiny for false positives and excessive biopsies, which increase radiation dose, cost and patient anxiety. In response to these challenges, new technologies for breast cancer screening have been developed, including low-dose mammography.

Published by: Prabhjot Kaur, Amardeep Kaur
Research Area: Digital Image Processing

Organisation: Punjabi University Regional Centre for IT and Management, Mohali, Punjab
Keywords: Breast Cancer, Mammography

33. Ethanol: A Clean Fuel

Interest in producing ethanol from biomass is driven by the search for sustainable transportation. Ethanol is a colourless, slightly odoured and nontoxic liquid produced from plants, formed by the fermentation of carbohydrates in the presence of yeast. It is also prepared from sorghum, corn, potato wastes, rice straw, corn fibre and wheat. A biofuel releases less greenhouse gas when burned than conventional fuels, and is a substitute for fossil fuels that allows fuel safety and security for the many countries with small oil reserves. It is made from plants and other agricultural products through a biological process, rather than the geological process involved in the formation of coal and petroleum. Biofuels are widely used as transportation fuels, and ethanol in particular is widely used in countries such as the U.S. and Brazil. In this study, we measured the rising temperature of ethanol, diesel and kerosene at a fixed point in time and found that ethanol has the highest rising temperature. It was also observed that ethanol produces no smoke while burning, unlike diesel and kerosene, which makes it an excellent alternative and clean fuel.

Published by: Samarth Bhardwaj
Research Area: Future Fuel

Organisation: SGGS School, Chandigarh
Keywords: Biofuel, bio-ethanol, bio-mass, carbon dioxide, carbon monoxide, environment

34. Review of Brain Tumour Segmentation Approaches

Brain image segmentation is one of the most important parts of clinical diagnostic tools. Brain images mostly contain noise, inhomogeneity and sometimes deviation; therefore, accurate segmentation of brain images is a very difficult task. However, accurate segmentation of these images is crucial for a correct diagnosis by clinical tools. We present a review of the methods used in brain segmentation. Reproducible segmentation and characterisation of abnormalities are not straightforward, and in the past many researchers in the fields of medical imaging and soft computing have conducted significant surveys of brain tumour segmentation.

Published by: Nagampreet Kaur, Natasha Sharma
Research Area: Image Processing

Organisation: I. K. Gujral Punjab Technical University, Jalandhar, Punjab
Keywords: Medical Image, Segmentation, Brain, MRI

35. A Survey over the Critical Performance Analytical Study of the MANET Routing Protocols (AODV & TORA)

The mobile ad-hoc network (MANET) is an ad-hoc technology for the automatic connectivity of nodes in a network cluster. MANETs are considered an infrastructure-less technology, using a peer-to-peer mechanism to establish links between network nodes. MANET data is propagated over paths established by routing algorithms, of which there are several, primarily segmented into two major groups: reactive and proactive. Reactive protocols query a path when it is required, whereas proactive routing protocols construct a pre-computed routing table, used to propagate data over pre-derived links/routes. In this paper, two major routing protocols, the ad hoc on-demand distance vector (AODV) protocol and the temporally ordered routing algorithm (TORA), both considered among the best protocols, are evaluated for their performance under distributed denial of service (DDoS) attacks. The paper focuses on assessing which routing protocol performs best under a DDoS attack over MANETs. The security and vulnerability analysis of routing protocols plays a vital role in their security enhancement; the security evaluation of the targeted protocols is based on various factors.

Published by: Manju, Mrs. Maninder Kaur
Research Area: Network Security

Organisation: Doaba Institute of Engineering and Technology, Mohali
Keywords: Security analysis, vulnerability analysis, distributed denial of service (DDoS) attack, MANET routing

36. Lexicon Analysis based Automatic News Classification Approach

The news classification approach is the primary mechanism for online news portals that source news data from various providers, and such portals receive and accept many types of data. Lexicon analysis plays the key role in categorising news automatically: the news category is recognised by analysing the keyword data extracted from the input news data. An N-gram analysis approach is used for keyword extraction, and the extracted keywords then undergo support vector classification. The support vector machine based classification engine analyses the extracted keywords against the training keyword data and returns a final decision on the detected category. The proposed model aims at improving the overall performance of existing models, measured on the basis of precision, recall and related metrics.

Published by: Kamaldeep Kaur, Maninder Kaur
Research Area: Data Mining

Organisation: Doaba Institute of Engineering & Technology, Kharar
Keywords: News classification, regression, probabilistic classifier, automatic categorization, multi-domain news analysis

37. Design of Low Area and Secure Encryption System using Combined Watermarking and Mix-column Approach

Lately, the significance of security in information technology has increased fundamentally. This paper presents a new efficient architecture for a high-speed, low-area Advanced Encryption Standard (AES) implementation using a splitting method. The proposed architecture is implemented on a Field Programmable Gate Array (FPGA).
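
For context, the MixColumns transformation named in the keywords multiplies each state column by a fixed matrix in GF(2^8). The pure-Python reference sketch below (checked against the FIPS-197 test vector) illustrates the arithmetic; the paper's FPGA splitting/LUT design is not reproduced here:

```python
# AES MixColumns for one state column, in GF(2^8).
def xtime(a):
    # multiply by x (i.e. by 2) modulo the AES polynomial x^8+x^4+x^3+x+1
    a <<= 1
    return (a ^ 0x1B) & 0xFF if a & 0x100 else a

def mix_column(col):
    a = col
    b = [xtime(v) for v in col]               # 2*v in GF(2^8)
    return [
        b[0] ^ a[1] ^ b[1] ^ a[2] ^ a[3],     # 2a0 ^ 3a1 ^ a2 ^ a3
        a[0] ^ b[1] ^ a[2] ^ b[2] ^ a[3],     # a0 ^ 2a1 ^ 3a2 ^ a3
        a[0] ^ a[1] ^ b[2] ^ a[3] ^ b[3],     # a0 ^ a1 ^ 2a2 ^ 3a3
        a[0] ^ b[0] ^ a[1] ^ a[2] ^ b[3],     # 3a0 ^ a1 ^ a2 ^ 2a3
    ]

# FIPS-197 test column: db 13 53 45 -> 8e 4d a1 bc
print([hex(v) for v in mix_column([0xDB, 0x13, 0x53, 0x45])])
```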

Published by: Swati Sharma, Dr. M. Levy
Research Area: Image Processing

Organisation: Sambhram Institute of Technology, Bengaluru
Keywords: Advanced Encryption Standard (AES), Mix-column, LUT (look-up table) approach, GF (Galois Field), splitting method

38. Security Enhancement of the Telemedicine and Remote Health Monitoring Models

Telemedicine applications are used for the remote monitoring and health assessment of people living in remote areas. Networks of doctors and field workers use various kinds of healthcare sensors to check people's health, collecting readings and transmitting them over the internet for the treatment of those affected. The information propagated through the internet between the healthcare sensors and the online server model is always prone to several forms of attack. In this paper, a model is proposed to improve the level of security over the telemedicine network, using robust encryption with a highly scrambled authentication key. The performance of the proposed model is assessed under various performance parameters describing both network health and security level.

Published by: Nishu Dhiman, Tejpal Sharma
Research Area: Network Security

Organisation: Chandigarh Group of Colleges, Mohali, Punjab
Keywords: Robust authentication, telemedicine security, highly scrambled authentication data, paired key based authentication

39. Railway Bridge & Track Condition Monitoring System

As railway bridges and tracks are critical infrastructure with a direct effect on railway transportation, their safety is the utmost priority for the railway industry. This project aims at monitoring the tracks on bridges, along with the structural health of the bridge, to reduce accidents. In this paper we introduce a railway track and bridge monitoring system using wireless sensor networks based on an ARM processor. We designed the system, including the sensor-node arrangement, data collection, the transmission method and the emergency signal processing mechanism of the wireless sensor network. The proposed system reduces human intervention by collecting and transmitting data automatically. Its purpose is to monitor railway infrastructure for accident reduction and safety.

Published by: Vinod Bolle, Santhosh Kumar Banoth
Research Area: Electronics

Organisation: Wainganga College of Engineering and Management, Dongargaon, Nagpur
Keywords: Railway bridge and track, base station, sensor nodes, ARM controller

40. Flow Past a Rotating Circular Grooved Cylinder

CFD simulations of two-dimensional steady-state flow past a rotating grooved circular cylinder are analysed in this study. A cylinder of diameter 0.1 m with 8 grooves of 0.01 m was examined at various Reynolds numbers (0.1 to 50) and angular velocities (0 to 100 RPS). The incompressible Navier-Stokes equations in Ansys Fluent 14.0 were used to model the flow, and pressure and velocity contours were generated for various Reynolds numbers. The results suggest that the flow remains attached to the surface of the cylinder up to a Reynolds number of 4-5; that at Reynolds numbers of 45-46 the flow pattern is independent of angular velocity and the cylinder behaves like a stationary cylinder; and that above these Reynolds numbers the flow is still two-dimensional but no longer steady.
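
The governing parameter here is the Reynolds number, Re = ρUD/μ. The short sketch below back-computes the free-stream velocities corresponding to the study's Re range for the 0.1 m cylinder, assuming air properties (the paper's working fluid is not stated):

```python
# Free-stream velocity for a target Reynolds number, Re = rho*U*D/mu.
rho = 1.225      # air density, kg/m^3 (assumed)
mu = 1.81e-5     # dynamic viscosity of air, Pa.s (assumed)
D = 0.1          # cylinder diameter, m (from the study)

for Re in (0.1, 5, 50):
    U = Re * mu / (rho * D)      # velocity giving this Reynolds number
    print(f"Re = {Re:>5}: U = {U:.2e} m/s")
```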

Published by: Ashish Kumar Saroj, Bharath Reddy, Gundu Jayadhar, D. Gokul
Research Area: CFD

Organisation: XPLOCC Technologies, Lucknow
Keywords: Grooved cylinder, angular velocity, Reynolds number, laminar, incompressible fluid, wake region

41. A Comparison of Different Techniques used to Detect and Mitigate Black Hole Attack in AODV Routing Protocol based on MANET

A mobile ad hoc network (MANET) is a self-organized system without any pre-defined network infrastructure, in which mobile devices are connected by wireless links. Hence, a MANET can be constructed quickly and at low cost, as it does not rely on existing network infrastructure. This paper presents a review of different techniques used to detect and mitigate the black hole attack in MANETs, covering both the single black hole and the cooperative black hole attack, which are serious threats to ad hoc network security. In a cooperative black hole attack, multiple nodes collude to hide the malicious activity of other nodes, making such attacks more difficult to detect. A comparison of the various detection and mitigation techniques proposed in the literature is presented.

Published by: Shivani, Pooja Rani, Pritpal Singh
Research Area: MANET

Organisation: Rayat Bahra University, Patiala, Punjab
Keywords: Mobile Ad Hoc network, single black hole attack, cooperative black hole attack, AODV routing protocol

42. Phylogenetic and Evolutionary Studies of Flavivirus

Viruses of the genus Flavivirus are the causative agents of many common and devastating diseases, including yellow fever and dengue fever, so the development of efficient antiviral pharmaceutical strategies requires a proper classification of the viruses of this group. To generate the most diverse phylogenetic datasets for the flaviviruses to date, we analysed the whole genomic sequences and phylogenetic relationships of 44 flaviviruses using various bioinformatics tools (MEGA, Clustal W, PHYLIP). We analysed these data to understand the evolutionary relationships between classified and unclassified viruses, and to propose the reclassification of unclassified viruses that show sequence similarity, and a similar mode of transmission, to classified viruses.

Published by: Meenu Priya Kontu, Dr. Sweta Prakash
Research Area: Bioinformatics

Organisation: Department of Bioinformatics, Govt. Kamla Raja Post Graduate (Autonomous) College, Gwalior
Keywords: Viruses, Flavivirus, Phylogenetic

43. Analyze the Effect of Base Station and Node Failure and Recovery on the Performance of Wimax

In this paper the effect of base station and node failure and recovery on the performance of WiMAX is analysed using different modulation techniques in a network. OPNET Modeler is used for the analysis, and performance is compared in terms of delay, throughput and load. The results show that when a base station fails the performance decreases, and when a node fails the performance increases. The results also show that when different modulation techniques are used in different cells of the same network, there is no change in performance.

Published by: Kanika, Er. Amardeep Singh Virk
Research Area: WiMAX

Organisation: Adesh Institute of Engineering and Technology, Faridkot
Keywords: WiMAX, OPNET, wireless network, IEEE 802.16, IPTV