1. Diabetes Prediction by Supervised and Unsupervised Learning with Feature Selection
Two approaches to building models that predict the onset of Type 1 (juvenile) diabetes mellitus were examined. A set of tests performed immediately before diagnosis was used to build classifiers predicting whether a subject would be diagnosed with juvenile diabetes; a modified training set consisting of differences between test results taken at different times was also used for the same purpose. Supervised classifiers (decision trees) were compared with unsupervised classifiers on both types of training set. The system recommends the test most likely to confirm a diagnosis based on the pre-test probability computed from the patient's information, including symptoms and the results of previous tests. If the patient's post-test disease probability is higher than the treatment threshold, a diagnostic decision is made; if it is low enough, the disease is ruled out. Otherwise, the patient needs more tests to help make a decision, and the system recommends the next optimal test and repeats the process. This thesis investigates which approach performs better on a diabetes dataset in the Weka framework, and also applies feature selection techniques, which reduce the number of features and the complexity of the process.
Published by: Rabina, Er. Anshu Chopra
Research Area: Machine Learning
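The threshold-based testing loop described in the abstract above can be sketched as follows; the odds-form Bayes update, the function names and the example threshold values are illustrative assumptions, not details taken from the thesis.

```python
def post_test_probability(pre_prob, likelihood_ratio):
    # Bayes' rule in odds form: post-odds = pre-odds * likelihood ratio of the test
    pre_odds = pre_prob / (1.0 - pre_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

def decide(pre_prob, likelihood_ratio, treat_threshold=0.8, test_threshold=0.2):
    # Compare the updated disease probability against the two decision thresholds
    p = post_test_probability(pre_prob, likelihood_ratio)
    if p >= treat_threshold:
        return "treat", p          # post-test probability above the treatment threshold
    if p <= test_threshold:
        return "rule out", p       # low enough to rule the disease out
    return "order next test", p    # inconclusive: recommend the next optimal test
```

In a sequential setting, `decide` would be called once per test, feeding each post-test probability back in as the next pre-test probability.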
2. Review on Grey-Hole Attack Detection and Prevention
— Grey Hole attacks pose a serious security threat to routing services by attacking reactive routing protocols, resulting in a drastic drop of data packets. AODV (Ad hoc On-demand Distance Vector) routing, being one of many such protocols, often becomes an easy victim to these attacks. The survey gives up-to-date information on all the work that has been done in this area. Besides the security issues, the authors also describe the layered architecture of MANETs, their applications and a brief summary of the works proposed to secure the network from Grey Hole attacks.
Published by: Suman Brar, Mohit Angurala
Research Area: Network Security
3. Review of Image Watermarking Technique for Medical Images
In this article, we focus on the complementary role of watermarking with respect to medical information security (integrity, authenticity …) and management. We review sample cases where watermarking has been deployed and conclude that watermarking has found a niche role in healthcare systems, as an instrument for the protection of medical information and for the secure sharing and handling of medical images. The concern of medical experts about preserving the diagnostic integrity of documents remains paramount. Medical image watermarking is an appropriate method for enhancing the security and authentication of medical data, which is crucial and used for further diagnosis and reference. This paper discusses the available medical image watermarking methods for protecting and authenticating medical data, focusing on algorithms that apply the watermark to the Region of Non-Interest (RONI) of the medical image while preserving the Region of Interest (ROI).
Published by: Kamalpreet Kaur, Er. Suppandeep Kaur
Research Area: Image Processing
4. Review on Rank Base Data Routing Scheme with Grey Hole Detection & Prevention in MANETS
MANET (Mobile Ad Hoc Network) is a type of ad hoc network that can change locations and configure itself, because its nodes are moving. As MANETs are mobile in nature, they use wireless connections to connect various networks without infrastructure or any centralized administration. While the nodes communicate with each other, they assist by forwarding data packets to other nodes in the network; thus the nodes discover a path to the destination node using routing protocols. The gray hole attack is among the different types of attacks possible in a MANET. It is a type of active attack in which packets are dropped during transmission along the route from source to destination. In this paper, we simulate a gray hole attack detection and prevention technique using AODV and AODV+PSO. Performance is measured in terms of packets dropped (packet loss), end-to-end delay and average throughput; the performance analysis has been done using the ns-2 simulation tool, which is the main infrastructure.
Published by: Reena Kumari, Neha Goyal
Research Area: Network Security
5. Comparative Analysis of PCF, DCF and EDCF over IEEE 802.11 WLANs
With the enhancement of wireless networks, QoS has become a major research area. The IEEE 802.11 standard has two MAC-sublayer coordination functions: the Distributed Coordination Function (DCF) and the Point Coordination Function (PCF). The medium access coordination function basically implements DCF and PCF, which support only best-effort service and provide limited QoS. A newer standard, the Enhanced Distributed Coordination Function (EDCF), is reported: IEEE 802.11e (EDCF) defines the MAC procedures to support QoS requirements and specifies a distributed access scheme for the shared wireless medium. In this paper, the protocols are tested under realistic conditions to evaluate the coordination functions. Various parameters such as load, network load, media access delay and data dropped are tested in the wireless network. Furthermore, simulation observations are reported at a data rate of 66 Mbps using the IEEE 802.11n physical layer protocol, to find the best one to implement with EDCF to achieve improved QoS.
Published by: Jagdish Singh, Joykaran Singh
Research Area: Wireless Communication
6. Call Admission Control (CAC) with Load Balancing Approach for the WLAN Networks
— Cell migrations take place between different network operators and require significant information exchange between the operators to handle the migratory users. New user registration requires pre-shared information from the user's equipment, which enables user recognition before registering the new user on the network. In this thesis, the proposed model is aimed at the development of a new call admission control mechanism with sub-channel assignment. Its basic purpose is to increase the number of users over the given cell units, which is realized by assigning sub-channels to the users of the network; the model addresses the issue by assigning dual sub-channels over a single communication channel. The model also handles minimum-resource users by incorporating a load balancing approach over the given network segment: the load of an overloaded cell is shared with the cell with the lowest resource utilization. The model's performance has been evaluated in various scenarios and over all of the BTS nodes, with results obtained in the form of resource utilization, network load, transmission delay, consumed bandwidth and data loss. The results show the efficiency obtained by using the proposed call admission control (CAC) along with the new load balancing mechanism, and the robustness of the model in handling cell overloading factors.
Published by: Neetika Lalotra, Devasheesh Sharma
Research Area: Computer Networks
7. A Study on Friction Stir Welding of Various Aluminum Alloys
The comprehensive body of knowledge that has built up with respect to the friction stir welding (FSW) of aluminum alloys since the technique was invented in 1991 is reviewed in this paper. The basic principles of FSW are described, including metal flow and thermal history, before discussing how process parameters affect the weld microstructure and the likelihood of defects. Finally, the range of mechanical properties that can be achieved is discussed. It is demonstrated that FSW of aluminum is becoming an increasingly mature technology with numerous commercial applications. Keywords - Friction stir welding, metal flow, process parameters, mechanical properties
Published by: Neeraj Kumar, Virender Kumar
Research Area: Mechanical Engineering
8. Image Encryption using Huffman Coding for Steganography, Elliptic Curve Cryptography and DWT for Compression
Images can be encrypted in several ways, using different techniques and different encryption methods. In this paper, the Huffman coding method is used for image steganography, Elliptic Curve Cryptography for image encryption and the Discrete Wavelet Transform for image compression. In this work, steganography, encryption and compression are applied together on the image data; applying all these techniques results in an encryption method that is highly secure. For the implementation of the proposed work, Matlab software is used.
Published by: Lavisha Sharma, Anuj Gupta
Research Area: Image Encryption
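As a rough illustration of the Huffman coding stage mentioned above, here is a minimal sketch of building a Huffman code table; the paper's actual steganographic embedding, ECC and DWT steps are not reproduced, and all names are illustrative.

```python
import heapq
from collections import Counter

def huffman_codes(data):
    # Build a Huffman code table for the symbols in `data`.
    freq = Counter(data)
    # Heap entries: (frequency, tie-breaker, tree); a tree is a symbol or a pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:                      # degenerate case: one symbol gets code "0"
        return {heap[0][2]: "0"}
    while len(heap) > 1:                # repeatedly merge the two rarest subtrees
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def walk(node, prefix):             # assign "0"/"1" along tree edges
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes
```

More frequent symbols receive shorter codes, which is what makes the table useful as a compression or embedding dictionary.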
9. Mammogram Image Nucleus Segmentation and Classification using Convolution Neural Network Classifier
Breast cancer is one of the dangerous diseases leading to deaths among women. It is due to the presence of cancerous cells produced in extra proportion, which can replace the neighbouring non-cancerous cells or spread throughout the body. As breast cancer mostly concerns women around the age of 40, they are asked to attend regular mammographic screening, since mammography is the most reliable method for cancer detection at early stages. The mammogram is the most common method used for breast imaging: it helps in examining the presence of cancer at early stages and helps reduce the mortality rate by 25-30% in screened women. There are many different signs of breast cancer, such as masses, microcalcification clusters, architectural distortion and asymmetric breast tissue. This dissertation addresses the mass-detection problem and deals with shape and texture features for classification. Various techniques and methodologies are present in mammography that help find the presence of cancer, and there are multiple ways to detect it at an early stage so that the affected patient does not die. Mammography is the most common, safe and inexpensive methodology, whose standard image databases can be used for training the learning machine. In this dissertation, nucleus segmentation is used to find the region of interest (ROI). The ROI is then used for extracting valuable shape and textural features, using geometrical features, GLCM and GLDM, for classifying the cancer through a machine learning approach, namely a CNN (Convolutional Neural Network). The CNN removes the overlapping of features obtained after segmentation. Hence, the CNN is used to evaluate performance in terms of accuracy, precision and recall, and the results are compared with existing logistic regression and neural network classification techniques.
Published by: Prabhjot kaur
Research Area: Digital Image Processing
10. Hybrid Algorithm for Cluster Head Selection Based on Energy in MANET
— In the existing system, the cluster head is found using a cluster-based routing protocol algorithm that does not take the energy level of nodes into account when selecting the cluster head. The cluster head communicates with other cluster heads, member nodes and gateways; when its energy level is low, it can no longer communicate with other nodes, congestion occurs, packets cannot be transferred, and transmissions take more time to complete. The proposed method is a routing protocol that uses a number of connections in a group or cluster: every cluster has a cluster head, the base connection is located at an equal distance within a cluster, and it communicates directly with the cluster connections. The results of the proposed method are compared with the existing LEACH protocol. In LEACH, the energy level of a node is not considered when a cluster head is selected; the proposed method considers it, which increases the lifetime of the network. Comparing LEACH and the proposed method, we have noticed that the proposed method has better energy usage, longer lifetime, less delay, better transmission and less consumed time. Since the LEACH protocol is cluster-based, the proposed model is designed cluster-based as well, to allow comparison on the same parameters; clustering uses a number of groups to increase performance. The advantages are as follows: the existing number of groups is low, while more groups can be analysed here, and the energy state of every group is checked in this research. The proposed routing protocol gives better results than the LEACH protocol. When a cluster head dies, the chain is rebuilt to bypass the dead node, so the initial topology is not affected; the head node receives all the aggregated data and transmits it onward to the cluster head.
Published by: Updesh Gangwar, Ravi Shankar Shukla
Research Area: Computer Science and Engineering
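The energy-based head selection idea in the abstract above can be sketched as follows; the helper names, the data layout and the threshold-based rotation rule are illustrative assumptions rather than the paper's actual protocol.

```python
def select_cluster_heads(clusters, energy):
    # Elect the node with the highest residual energy in each cluster as its head.
    # clusters: {cluster_id: [node, ...]}; energy: {node: residual_energy}
    return {cid: max(nodes, key=lambda n: energy[n])
            for cid, nodes in clusters.items()}

def maybe_rotate(nodes, energy, current_head, threshold):
    # Re-elect the head when its residual energy drops below the threshold,
    # so a depleted head does not stall packet forwarding for the cluster.
    if energy[current_head] < threshold:
        return max(nodes, key=lambda n: energy[n])
    return current_head
```

Rotating the role this way spreads the forwarding load, which is the mechanism the abstract credits for the longer network lifetime compared with plain LEACH.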
11. Railway Bridge Track Surveying System for Accident Reduction
Abstract - As railroad bridges and tracks are very important infrastructure with a direct effect on railway transportation, their safety is of utmost priority for the railway industry. This project aims at monitoring the tracks on bridges, along with the structural health condition of the bridge, for accident reduction. In this paper we introduce a railway track and bridge monitoring system using wireless sensor networks based on an ARM processor. We designed the system, including the sensor node arrangement, data collection, transmission method and emergency signal processing mechanism of the wireless sensor network. The proposed system reduces human intervention in collecting and transmitting data. The desired purpose of the proposed system is to monitor railway infrastructure for accident reduction and safety.
Published by: Vinod Bolle, Santhosh kumar banoth, Puja Khangar
Research Area: Electronics Engineering
12. A Novel Algorithm in 2 Level Aggregations for WSN in Multi Interface Multichannel Routing Protocol
Energy efficiency is an important metric in resource-constrained wireless sensor networks (WSN). Multiple approaches, such as duty cycling, energy-optimal scheduling, energy-aware routing and data aggregation, can be employed to reduce energy consumption throughout the network. This thesis addresses data aggregation during routing, since the energy expended in transmitting a single data bit is several orders of magnitude higher than that required for a single 32-bit computation. Therefore, in the first paper, a novel nonlinear adaptive pulse coded modulation-based compression (NADPCMC) scheme is proposed for data aggregation. A rigorous analytical development of the proposed scheme is presented using Lyapunov theory. Satisfactory performance of the proposed scheme, compared to the available compression schemes, is demonstrated in the NS-2 environment on several data sets. Data aggregation is achieved by iteratively applying the proposed compression scheme at the cluster heads. The second paper, on the other hand, deals with the hardware verification of the proposed data aggregation scheme in the presence of a Multi-interface Multi-Channel Routing protocol (MMCR). Since sensor nodes are equipped with radios that can operate on multiple non-interfering channels, bandwidth availability on each channel is used to determine the appropriate channel for data transmission, thus increasing the throughput. MMCR uses a metric defined by throughput, end-to-end delay and energy utilization to select Multi-Point Relay (MPR) nodes to forward data packets in each channel while minimizing packet losses due to interference. Further, the proposed compression and aggregation are performed to further improve energy savings and network lifetime.
Published by: Luv Kumar Pal, Anil Panday
Research Area: Computer Science and Engineering
13. A Review on Traffic Classification Methods in WSN
In a wireless network it is very important to provide network security and quality of service, and to achieve these there must be proper traffic classification in the network. Many algorithms are used: port-number matching and deep packet inspection were the earlier methods, and nowadays KISS, the nearest cluster based classifier (NCC) and SVM methods are used to classify the traffic and improve the network security and quality of service of a network.
Published by: Jaskirat Singh, Harpreet Kaur Saini
Research Area: WSN
14. Comparison of Gray Hole Attack in Manet in OLSR Protocol
In this era of wireless devices, the Mobile Ad-hoc Network (MANET) has become an indivisible part of communication for mobile devices; therefore, interest in research on Mobile Ad-hoc Networks has been growing over the last few years. In this paper we discuss the gray hole attack on the OLSR routing protocol in MANETs. Security is a big issue in MANETs as they are infrastructure-less and autonomous. The main objective of this paper is to apply the gray hole attack in a MANET and study how it affects the MANET environment. This article would be a great help for people conducting research on real-world problems in MANET security.
Published by: Rohit Katoch, Anuj Gupta
Research Area: Computer Science and Engineering
15. Sentimental Analysis of Twitter Data using Text Mining and Hybrid Classification Approach
Opinion mining is an important concept in today's world, and with the advent of social media it has gained a huge source of data. Since almost everybody in the modern era is involved with some social media platform, the public mood is strongly reflected in social media today. This thesis proposes to utilize this source of information and predict the sentiment of the public towards a particular topic; the food price crisis is studied here, and public opinion on the topic is predicted. Twitter data is used for this purpose: live tweets of Indian origin are extracted using the Twitter API via 'tweepy', with OAuth as the handler, and the tweets are filtered for specific keywords and for location using latitude-longitude data, then saved into a database. They are first preprocessed to remove spam, special characters, URLs, short words, etc. The tweets are then stemmed and tokenized, and a TF-IDF score is calculated for all the keywords. Feature selection is applied using Chi-square and information gain, and a term-document matrix (TDM) is created, which is fed to the classifiers for classification. Two classifiers have been analysed in this thesis, KNN and Naïve Bayes, and a hybrid has been made from them. The results of both classifiers have been found to be satisfactory, while the hybrid KNN outperforms the Naïve Bayes classifier in terms of accuracy. Thus a novel method is designed for opinion mining of Indian tweets regarding the food price crisis.
Published by: Shubham Goyal
Research Area: Data Mining
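The TF-IDF scoring step in the pipeline above can be sketched as follows; the function name and data layout are illustrative, and the spam removal, stemming and classification stages are not reproduced.

```python
import math
from collections import Counter

def tf_idf(docs):
    # docs: list of tokenised documents (each a list of terms).
    # Returns one {term: score} dict per document (a sparse term-document matrix).
    n = len(docs)
    df = Counter()                      # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    matrix = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        # tf-idf = (relative term frequency) * log(N / document frequency)
        matrix.append({t: (c / total) * math.log(n / df[t])
                       for t, c in tf.items()})
    return matrix
```

A term appearing in every document scores zero, so the matrix naturally down-weights uninformative words before feature selection.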
16. MY Portfolio: An Intelligent Platform
The paper discusses the intelligent working of a website, where Portfolio is a tool for creating online portfolio websites used by professionals and creative people like photographers, designers, architects, stylists and models. A portfolio gives a compact display of your profile, both personal and professional. The authors have tried to solve a problem: unlike other professional networks, here the user has a single platform for both personal and professional profiles. Along with many of its unique features, the portfolio is a very useful tool for employers as well as job seekers. It is a place where users can keep all their personal and professional details: hobbies, pictures, a resume if they are job seekers, their requirements if they are employers, along with all their social media links and much more. Users can search the profiles of existing users and will be notified of who has visited their profile. People can also get in touch with others by visiting their profiles and chatting with each other.
Published by: Vinu Lata, Mrs. Aaradhna Singh
Research Area: Software Engineering
17. Image Enhancement by Adaptive Filter with Ant Colony Optimization
The principal aim of image enhancement is to modify the attributes of an image to make it more suitable for a given task and specific purpose; during the enhancement process, the number of attributes to be modified varies from one to many. Digital image enhancement techniques provide a wide range of choices for improving the visual quality of an image, and the suitable choice of technique is influenced by the imaging equipment, the task in hand and the viewing conditions. In this paper we use an approach to image enhancement based on a Wiener filter optimized by a metaheuristic, ant colony optimization (ACO). The results show significant improvement: PSNR improves up to 28.56 dB and MSE reduces to 2.54.
Published by: Kanchan Rani, Er. Gurjot kaur
Research Area: Image Processing
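The PSNR and MSE figures quoted above are standard quality metrics; a minimal pure-Python sketch of their computation (nested-list greyscale images, illustrative names) is:

```python
import math

def mse(original, processed):
    # Mean squared error over all pixels of two equal-sized greyscale images.
    flat_o = [p for row in original for p in row]
    flat_p = [p for row in processed for p in row]
    return sum((a - b) ** 2 for a, b in zip(flat_o, flat_p)) / len(flat_o)

def psnr(original, processed, max_val=255):
    # Peak signal-to-noise ratio in decibels; higher means closer to the original.
    err = mse(original, processed)
    if err == 0:
        return float("inf")             # identical images
    return 10.0 * math.log10(max_val ** 2 / err)
```

An optimizer such as ACO tuning filter parameters would score each candidate by calling `psnr` against a reference image and keeping the parameters that maximize it.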
18. An Improved Visual Recognition of Letters of English Language using Lip Reading Technique
Two phenomena have determined the emergence of a new research field. First, the drop in the costs involved in electronically collecting and storing interpretations of the world has brought the need for sophisticated techniques to handle the resulting data collections. The mapping from acoustic to visual information is the focus of this part of the thesis. The challenge is to produce adequately precise movements, in order to convey useful information to the listener in a real-time system with low latency.
Published by: Ishu Garg, Amandeep Verma
Research Area: Image Processing
19. Vehicular Adhoc Network Routing Improved Throughput by Flower Pollination Optimization Algorithm
— VANET (Vehicular Ad-hoc Network) is a new technology which has received enormous attention in recent years. Rapid topology changes and frequent disconnections make it difficult to design an efficient routing protocol for routing data among vehicles, called V2V or vehicle-to-vehicle communication, and between vehicles and roadside infrastructure, called V2I. It is an autonomous, self-organizing wireless communication network, where the nodes in a VANET involve themselves as servers and/or clients for exchanging and sharing information.
Published by: Karan Sharma, Shelly
Research Area: Wireless Network
20. Web Usage Mining Tools & Techniques: A survey
The quest for knowledge has led to new discoveries and inventions, which in turn led to the amelioration of various technologies. As years passed, the World Wide Web became overloaded with information, and it became hard to retrieve data according to one's need. Web mining emerged to provide a solution to this problem. Web usage mining is a category of web mining that mainly deals with the discovery and analysis of usage patterns in order to serve the needs of web-based applications. Web usage mining consists of three stages: data preprocessing, pattern discovery and pattern analysis. This paper is focused on the study of different tools and techniques for web usage mining.
Published by: Satya Prakash Singh, Meenu
Research Area: Data Mining
21. Graph Coloring and Its Implementation
Graph coloring is an important concept in graph theory. It is a special kind of problem in which we have to assign colors to certain elements of a graph subject to certain constraints. Suppose we are given K colors; we have to color the vertices in such a way that no two adjacent vertices of the graph have the same color. This is known as vertex coloring; similarly, we have edge coloring and face coloring. The coloring problem has a huge number of applications in modern computer science, such as timetable scheduling, Sudoku, bipartite graphs, map coloring, data mining and networking. In this paper we focus on certain applications like final exam timetabling, aircraft scheduling and guarding an art gallery.
Published by: Ridhi Jindal, Meena Rani
Research Area: Graphs
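A classic heuristic for the vertex coloring problem described above is greedy coloring; this minimal sketch (names illustrative) assigns each vertex the smallest color not used by its already-colored neighbours.

```python
def greedy_coloring(adj):
    # adj: dict mapping each vertex to a list of its neighbours.
    # Returns {vertex: color index}; adjacent vertices never share a color.
    colors = {}
    for v in adj:                       # visiting order affects how many colors are used
        used = {colors[u] for u in adj[v] if u in colors}
        c = 0
        while c in used:                # smallest color absent from colored neighbours
            c += 1
        colors[v] = c
    return colors
```

Greedy coloring is not optimal in general, but it never uses more than (maximum degree + 1) colors, which is often good enough for timetabling-style applications.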
22. Introduction to Wireless Sensor
Recently, underwater sensor networks have attracted the great attention of many researchers. Here, determining the location of a sensor node is very important: the sensed information is not useful if the sensing nodes are not localized. There are various methods for localization in sensor networks, but they differ between terrestrial and underwater sensor networks. This paper explores some of the localization schemes for underwater sensor networks and compares them so that they can be chosen on the basis of application requirements.
Published by: Smridhi
Research Area: WIRELESS SENSOR
23. Natural Language Processing
Language is a way of communicating: it helps in understanding the world, gives us better insight into it, and lets speakers be as vague or as precise as they like. NLP stands for natural language processing. Natural languages are the languages spoken by people. Natural language processing encompasses everything a computer needs to understand natural language and also to generate it. Natural language processing (NLP) is a field of computer science, artificial intelligence and linguistics that mainly focuses on the interactions between computers and human (natural) languages; NLP is centred on the area of human-computer interaction. The need for natural language processing was also felt because there is a wide store of information recorded in natural language that could be made accessible via computers. Information is constantly generated in the form of books, news, business and government reports, and scientific papers, many of which are available online. A system requiring a great deal of information must be able to process natural language to retrieve much of the information available on computers. Natural language processing is an interesting and difficult field in which we have to develop and evaluate representation and reasoning theories. All of the problems of AI arise in this domain; solving "the natural language problem" is as difficult as solving "the AI problem", because anything can be expressed in natural language.
Published by: Aparna Priyadarsini Khadanga, Suvendu Kumar Nayak
Research Area: Artificial Intelligence
24. An IDS by Correlation & KPCA with Neural Network Optimized By Genetic Algorithm
— An Intrusion Detection System (IDS) is an application used for monitoring the network and protecting it from intruders. An intrusion is a set of actions aimed at compromising security goals such as confidentiality, integrity and availability, which are also important for data mining and extraction. This research studies the performance measures of an IDS, which are important for security purposes. The KDD 99 dataset has 41 features; earlier IDS approaches use all of them, whereas in this research features are selected and extracted instead. Feature selection is done with the help of correlation, and feature extraction with the help of KPCA, features being chosen according to their eigenvalues. A neural network, a data mining technique, is used together with a genetic algorithm as the classifier: the network runs many times and changes its weights according to the error.
Published by: Harpreet Kaur, Gaganpreet Kaur Bhalla
Research Area: Security
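The correlation-based feature selection step mentioned above can be sketched as follows; the function names, the threshold value and the use of Pearson correlation against the class label are illustrative assumptions (the KPCA extraction and neural network stages are not reproduced).

```python
import math

def pearson(x, y):
    # Pearson correlation coefficient between two equal-length vectors.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    if sx == 0 or sy == 0:              # a constant feature carries no information
        return 0.0
    return cov / (sx * sy)

def select_features(feature_columns, labels, threshold=0.5):
    # Keep the indices of features strongly correlated with the class label.
    return [i for i, col in enumerate(feature_columns)
            if abs(pearson(col, labels)) > threshold]
```

Dropping weakly correlated columns before classification shrinks the 41-feature input and reduces training time, which is the motivation the abstract gives for feature selection.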
25. Implementing Multiple Security in the Cloud Environment
Cloud computing is continuously evolving and is considered the next-generation architecture for computing. Typically, cloud computing is a combination of computing resources accessible via the internet. Historically, clients or organizations store data in data centers with firewalls and other security techniques to protect data against intruders. In cloud computing, however, since the data may be stored anywhere across the globe, the client organizations have less control over the stored data. To build the trust needed for the growth of cloud computing, cloud providers must protect user data from unauthorized access and disclosure. In this work, a hybrid approach combining encryption techniques and the storage of data in the cloud system is considered. The main advantage of the hybrid scheme is that it provides more security in the cloud.
Published by: Anuradha, Dr. Suman Sangwan
Research Area: Cloud Computing
26. A Comparative Study of Lakshanas and Samprapti of BhasmakRog w.s.r to Hyperthyroidism
Agni is a fundamental concept of Ayurveda, described as an important factor in digestion and metabolism in our body. Agni converts food into energy, which is responsible for all the vital functions of our body. According to Ayurveda, रोगाः सर्वेपि मंदाग्नौः.............॥ all diseases occur due to Mandagni, except Bhasmaka Rog. It occurs due to Agni vruddhi, which leads to Kshudda vriddhi and Dhatu ksheenta with various Pitta prakop Lakshanas; hence Bhasmaka Rog directly affects metabolism. In the human body, the thyroxine hormone also plays an important role in metabolism: if the level of this hormone is increased, it results in increased appetite, sweating, etc. This high level of thyroxine is called hyperthyroidism, and its symptoms are the same as the Pitta Prakopa Lakshanas. So the question arises: is there any correlation between Bhasmaka Rog and hyperthyroidism? What are the Lakshanas and Samprapti of both conditions? In the present article, we try to study the Lakshana and Samprapti of Bhasmaka Rog with special reference to hyperthyroidism.
Published by: Dr. Ankita U. Mandpe, Dr. G.H. Kodwani, Dr. Meera .A. Aurangabadkar
Research Area: Hyperthyroidism
27. Review on Routing Approaches of MANET with Opportunistic Network
VANET (Vehicular Ad-hoc Network) is a new technology which has received enormous attention in recent years. Rapid topology changes and frequent disconnections make it difficult to design an efficient routing protocol for routing data among vehicles, called V2V or vehicle-to-vehicle communication, and between vehicles and roadside infrastructure, called V2I. It is an autonomous, self-organizing wireless communication network, where the nodes in a VANET involve themselves as servers and/or clients for exchanging and sharing information.
Published by: Karan Sharma, Shelly Bhalla
Research Area: VANET
28. Review on CSTR PLANT Error Optimization
The controller attempts to minimize the error over time by adjusting a control variable, such as the position of a control valve, a damper, or the power supplied to a heating element, to a new value determined by a weighted sum: u(t) = Kp·e(t) + Ki·∫e(τ)dτ + Kd·de(t)/dt, where Kp, Ki and Kd, all non-negative, denote the coefficients for the proportional, integral and derivative terms. Dead times produce a decrease in the system phase and also give rise to a non-rational transfer function of the system, making it more difficult to analyse and control. Because of these characteristics, dead-time control problems have attracted the attention of engineers and researchers, who have developed special types of controllers such as PID controllers, the Smith predictor (DTC), MPC and various algorithms to control dead times.
Published by: Ramandeep Kaur, Jaspreet Kaur
Research Area: Optimization
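The weighted sum described above has a simple discrete-time form; this sketch is a generic PID update loop (class layout and sample-time handling are illustrative assumptions, not the paper's tuning).

```python
class PID:
    # Discrete PID: u = Kp*e + Ki*sum(e)*dt + Kd*(e - e_prev)/dt
    def __init__(self, kp, ki, kd, dt=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0             # running sum approximating the integral term
        self.prev_error = None          # previous error for the derivative term

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None \
            else (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

`update` would be called once per sampling period with the CSTR setpoint and the measured process value, its return value driving the actuator (e.g. a valve position).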
29. Gateway Discovery in MANET: A Survey
MANET (Mobile ad-hoc network) mitigates the requirement of a common controller or access point for forming a transient network. The extensive growth of wired and wireless devices due to rapid enhancements in technology has placed more emphasis on this network. The tremendous use of mobile devices forms temporary connections between them, which constitute a MANET. This network is very effective in the sense that it does not need any prior infrastructure, but it has limitations too, like low bandwidth, frequent disconnection, limited features and low battery. A MANET also allows internet connection with the help of Internet Gateways, which are bridges between infrastructure-based and MANET-based networks. Several mechanisms have been proposed for gateway discovery in ad-hoc networks. This paper tries to analyze several gateway discovery mechanisms proposed earlier and gives a critical evaluation of those techniques. The survey concludes with the future scope in this area.
Published by: Rahul Mishra, Rakesh Kumar
Research Area: Gateway Discovery in MANET
30. Big Data: Challenges and Opportunities
Big Data has the potential to revolutionize not just research, but also education. There are a few defined phases through which the data is to be processed. There are challenges like heterogeneity and incompleteness, scale, timeliness, privacy and human consideration. Through better analysis of the large volumes of data that are becoming available, there is the potential for making faster advances in many scientific disciplines and improving the profitability and success of many enterprises.
Published by: Meena Rani, Ridhi Jindal
Research Area: Big Data
31. Review on Agile Method with Text Mining
Working software measures the progress. Basically, the Agile method involves interleaving specification, implementation, design and testing. A series of versions is developed with the involvement of, and evaluation by, the stakeholders in each version. Agile methods aim at reducing software process overheads (like documentation) and concentrate more on code rather than design. Customer involvement, incremental delivery, freedom of developers to evolve new working methods, change management and, last but not least, simplicity are the basic essence of Agile development. Agile methodologies are well suited for small as well as medium-sized projects.
Published by: Parveen kaur
Research Area: Software Engineering
32. Performance Comparison of Ad-hoc Routing Protocols
Ad-hoc networks are characterized by many details such as multi-hop wireless connectivity, frequently changing network topology and the need for efficient dynamic routing protocols, which play an essential part. This paper presents a performance comparison between two reactive routing protocols for mobile ad-hoc networks: Dynamic Source Routing (DSR) and Ad Hoc On-demand Distance Vector (AODV). Both protocols were simulated using the tool ns-2 and were compared in terms of packet loss ratio and end-to-end delay, with mobile nodes varying in number and speed. Simulation revealed that although DSR scales well to small networks with low node speeds, AODV is preferred because of its more efficient use of bandwidth.
Published by: Sheetal
Research Area: Ad-hoc Routing Protocols
33. Efficient Fingerprint Recognition using Wavelet Transforms
Fingerprints have long been used as a reliable biometric feature for personal identification. Fingerprint classification refers to the problem of assigning fingerprints to one of several prespecified classes. Automatic classification can be used as a pre-processing step for fingerprint matching, reducing matching time and complexity by narrowing the search space to a subset of a typically large database. Automatic fingerprint identification is one of the most important biometric technologies. In order to efficiently match fingerprints in a large database, an indexing scheme is necessary. Fingerprint classification, which refers to assigning a fingerprint image to one of a number of pre-specified classes, provides such an indexing mechanism. In practice, however, large intra-class and small inter-class variations in global pattern configuration and the poor quality of fingerprint images make the classification problem very difficult. A fingerprint classification algorithm requires a robust feature extractor that should be able to reliably extract salient features from input images.
Published by: Jaspreet Kaur, Navleen Kaur
Research Area: Information Technology
34. Hazards Reporting based on Real-Time Field Data Collection using Personal Mobile Phone
A hazard is a situation or thing that has the potential to harm people, property or the environment. Hazardous areas affect many people's health, so we must guard against them. We have developed a hazard reporting system to address this problem. An important task of reporting is data collection. Geo-spatial data is used to represent the data along with its geographic component. This means that the data sets have location information tied to them, such as geographical data in the form of coordinates, address, city, or ZIP code. Users report to the organization using this data, and the organization resolves the reported problem.
Published by: Jayanti Khutwad, Bindu Konde, Ashvini Deokate, Prof A.A.Kadam
Research Area: GPS
35. Review on Detection of Gray Hole Attack in MANET
Mobile ad-hoc network (MANET) is a wireless network without a fixed infrastructure. Mobile nodes can be used to form a MANET, and an arbitrary topology can be formed by connecting nodes with each other randomly. When a source wants to transfer packets to a destination, a path is discovered for the transmission. Sometimes packets get dropped along the path due to a malicious node; an attack by such a malicious node is called a gray hole attack. In this paper we detect the gray hole attack in the MANET. The detection and removal of the malicious node depend on the calculated probability of each node.
Published by: Geetanjali, Anupama Kumari
Research Area: Network Security
36. Review on Encrypt the Text by MD5 and RSA in Client Cloud Approach
Cloud computing is one of the emerging technologies showing continuous advancement in the field of networking. Cloud computing is defined by the National Institute of Standards and Technology (NIST) as a model for enabling ubiquitous, on-demand network access to a shared pool of configurable computing resources (e.g. computer networks, servers, storage, applications and services) which can be rapidly provisioned and released with minimal management effort. It is gaining popularity in all areas, but cloud computing adoption still lags behind expectations because of security concerns (unauthorized access, modification or denial of service, etc.). In this research paper, the proposed work plan is to address these security concerns using a cryptographic algorithm and a hashing algorithm.
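The combination named in the title, MD5 hashing plus RSA encryption, can be sketched as follows. This is textbook RSA with tiny hard-coded primes for illustration only; real deployments use key sizes of roughly 2048 bits, and the message below is an invented example.

```python
# Hedged sketch: hash the plaintext with MD5 for an integrity check,
# then encrypt it with toy textbook RSA (NOT secure key sizes).
import hashlib

p, q = 61, 53                 # toy primes; real keys use ~2048-bit moduli
n = p * q                     # RSA modulus (3233)
phi = (p - 1) * (q - 1)       # Euler totient (3120)
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent via modular inverse

def rsa_encrypt(text):
    # per-character encryption, purely for illustration
    return [pow(ord(ch), e, n) for ch in text]

def rsa_decrypt(cipher):
    return "".join(chr(pow(c, d, n)) for c in cipher)

message = "cloud data"
digest = hashlib.md5(message.encode()).hexdigest()   # integrity fingerprint
cipher = rsa_encrypt(message)
plain = rsa_decrypt(cipher)

# Receiver recomputes the MD5 digest to verify integrity
assert plain == message
assert hashlib.md5(plain.encode()).hexdigest() == digest
```

Note that MD5 is considered cryptographically broken for collision resistance; it serves here only as the integrity check the abstract describes.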
Published by: Adviti Chauhan, Jyoti Gupta
Research Area: Cryptography
37. Real Time Sign Language Recognition Systems: A Review
Gestures offer an attractive, easy-to-understand alternative to using hardware devices like a keyboard, mouse, or joystick for human-computer interaction (HCI). Accordingly, the major goal of gesture recognition research is to build a framework that can recognize and understand specific human motions automatically and use them either to convey information (i.e., communicative use, as in sign language) or to control devices (i.e., manipulative use, as in controlling robots without any physical contact). Perhaps one of the most important requirements for sign language recognition is that natural signing be supported by the recognition engine, so that a person can interact with the framework without restrictions. Since a sequence of signs is usually mixed with co-articulation and unintentional movements, these non-gestural movements must be removed from an input video before each gesture in the sequence can be identified.
Published by: Sumit Sandhu, Sonia khatri
Research Area: Electronics and Communication
38. Survey of Various X-RAY Bone Image Segmentation Approaches
Image segmentation is an important research subject since it plays a major role in image analysis and understanding. Segmenting an image is among the most difficult and challenging tasks because different objects exist with large variations between them, which resists a common framework. Thresholding is one of the simplest segmentation techniques. The drawback of thresholding methods is that they can only be applied to a single-band image, such as a gray-scale image or a single band of a multi-band image. Region-based methods have been shown to be very useful and effective segmentation methods in image processing; nevertheless, they have an over-segmentation tendency, require manual initialization and are sensitive to noise. Clustering methods can be used for multi-band images, but the number of clusters must be specified first. Classification-based algorithms require a training phase. Deformable models are less sensitive to noise than the other methods presented in this paper, which makes them suitable for difficult medical image segmentation problems. Atlas-based approaches use prior knowledge in order to perform segmentation, but they are time-consuming.
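The simplest technique in the survey, global thresholding of a single-band image, can be sketched in a few lines. The 4x4 intensity grid and the threshold value below are invented for illustration.

```python
# Global thresholding sketch on a hand-made 4x4 gray-scale "image".
image = [
    [ 12,  30, 200, 220],
    [ 25,  40, 210, 215],
    [ 20, 190, 205,  35],
    [ 15,  22,  28,  18],
]

def threshold(img, t):
    # pixels brighter than t become foreground (1), the rest background (0)
    return [[1 if px > t else 0 for px in row] for row in img]

mask = threshold(image, 128)
for row in mask:
    print(row)
```

This also makes the stated drawback concrete: the comparison `px > t` only makes sense for a scalar pixel value, i.e. a single band.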
Published by: Sarita, Vikas Sandhu
Research Area: Electronics and Communication
39. Extracting News from the Web Pages by using Concept of Clustering with Neural Genetic Approach
Web news extraction is a research area that has been widely explored. It has resulted in systems with good extraction capabilities and little or no human involvement. Existing systems look at web news from a single web site with a consistent format, and the approach is commonly not as effective when multiple web news pages from different sites are considered. This work proposes a web extraction layout that is broadly applicable to most web news. The purpose of web news extraction is to enhance information retrieval, which supplies news articles related to a particular event for competitive business analysis. Research in this area has produced many methods, each differing from the others, and the extractor should be chosen based on the requirement. Previous work uses unsupervised learning for extracting news from the web, but it compares every news pattern extracted so far, and it does not exploit the text patterns in web pages, which provide important information for classifying and analysing news. In previous work, extracting news is not a complex process, but classification of the news takes more processing time, and the number of features grows exponentially under unsupervised learning. We reduce the complexity and increase the accuracy of web news extraction by using text from the web classified by cluster-based supervised learning. The objectives are: to study and analyse text mining and classifiers on different parameters; to propose and implement pre-processing of web pages by text mining followed by cluster-based supervised learning; and to evaluate the proposed methodology by precision, recall, accuracy and F1 measure. Given the amount of information available on the World Wide Web, the discovery of quality data appears easy and simple, but it has been an important matter of concern, and text mining remains a field of active research.
Online news classification has been a continuous challenge in terms of manual operation. Data mining is the procedure of discovering interesting knowledge, such as patterns, associations, changes, anomalies and significant structures, from large amounts of data stored in databases, data warehouses, or other information sources. Owing to the wide availability of massive amounts of data in electronic form, and the pressing need to turn such data into useful information and knowledge for broad applications including market analysis, business administration and decision support, document mining has attracted a great deal of attention in the information industry in recent years.
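The cluster-based supervised classification idea above can be sketched with bag-of-words vectors, one summed centroid per class, and nearest-centroid prediction by cosine similarity. The snippets and labels are invented for illustration; the thesis's actual features and classifier are not specified here.

```python
# Toy nearest-centroid classification of news snippets (invented data).
from collections import Counter
import math

train = [
    ("stocks fall as markets react to rates",  "business"),
    ("company profits rise on strong sales",   "business"),
    ("team wins final after penalty shootout", "sports"),
    ("player scores twice in league match",    "sports"),
]

def vec(text):
    return Counter(text.lower().split())   # bag-of-words term counts

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Build one centroid (summed bag-of-words) per class label
centroids = {}
for text, label in train:
    centroids.setdefault(label, Counter()).update(vec(text))

def classify(text):
    v = vec(text)
    return max(centroids, key=lambda lab: cosine(v, centroids[lab]))

print(classify("profits fall as company sales slow"))   # -> business
```

Precision, recall and F1 would then be computed over such predictions against held-out labels, as the objectives describe.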
Published by: Nishan Singh Saklani, Saurabh Sharma
Research Area: Computer Science Engineering
40. Smart Wifi Dustbin System
We realize that garbage causes damage to local ecosystems and is a threat to plant and human life. To avoid such situations we are implementing a project called IoT-Based Smart Garbage. When somebody dumps trash into a dustbin, the bin flashes a unique code, which can be used to gain access to free Wi-Fi. A sensor checks whether the garbage fills the dustbin or not, and a router provides Wi-Fi to the user. A major part of our project depends upon the working of the Wi-Fi module, which is essential for its implementation. The main aim of this project is to further the smart-city vision.
Published by: Akshay Bandal, Pranay Nate, Rohan Mankar, Rahul Powar
Research Area: Computer Science
41. Twitter Stream Analysis for Traffic Detection in Real Time
Nowadays, social networks such as Twitter and Facebook are very popular, and they can be used for event detection in real time. Real-time events include traffic detection and earthquake monitoring. In this paper, we use Twitter for real-time traffic event detection. First, the system extracts tweets from Twitter and applies text mining techniques to them: tokenization, stop-word removal and stemming. The tweets are then classified on the basis of a class label, i.e. traffic event or no traffic event. We present an online method for detection of real-time traffic events in Twitter data.
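The preprocessing pipeline named above (tokenization, stop-word removal, stemming) followed by a binary label can be sketched as below. The stop-word list, the naive suffix stemmer and the traffic keywords are illustrative assumptions, not the paper's actual resources.

```python
# Sketch of tweet preprocessing + keyword-based traffic labelling.
import re

STOPWORDS = {"the", "a", "an", "is", "on", "in", "at", "of", "and", "to"}
TRAFFIC_STEMS = {"traffic", "jam", "accident", "congestion", "block"}

def tokenize(tweet):
    return re.findall(r"[a-z]+", tweet.lower())

def stem(word):
    # naive stemmer: strip a few common English suffixes
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def preprocess(tweet):
    return [stem(t) for t in tokenize(tweet) if t not in STOPWORDS]

def label(tweet):
    terms = set(preprocess(tweet))
    return "traffic event" if terms & TRAFFIC_STEMS else "no traffic event"

print(label("Huge traffic jam on the highway near the bridge"))  # -> traffic event
print(label("Lovely weather in the park today"))                 # -> no traffic event
```

A real system would replace the keyword match with a trained classifier over these preprocessed tokens.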
Published by: Rucha Kulkarni, Sayali Dhanawade, Shraddha Raut, Prof.D.S.Lavhkarer
Research Area: Data Mining
42. Novel Approach for Routing in MANET by Network Connectivity with Meta Heuristic
A popular example of opportunistic routing is “delay tolerant” forwarding in a VANET when a direct path to the destination does not exist. The evaluation of this work is twofold. We implemented two prototypes on off-the-shelf hardware to show the technical feasibility of our opportunistic network concepts, and the prototypes were used to carry out a number of runtime measurements. We then developed a novel two-step simulation method for opportunistic data dissemination. The simulation combines real-world user traces with artificial user mobility models in order to model user movements more realistically. We investigate our opportunistic data dissemination process under various settings, including different communication ranges and user behaviour patterns. Conventional routing in this case would simply drop the packet; with opportunistic routing, a node acts upon the available information. This thesis optimizes routing using centrality information, refined by an ant colony metaheuristic, and validates the approach on parameters such as overhead and throughput.
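The centrality step mentioned above can be sketched on a toy topology: compute degree centrality and prefer the most central neighbour as the next hop when no direct path exists. The graph is invented for illustration, and the thesis's actual ant colony refinement is not shown.

```python
# Hedged sketch: degree centrality guiding next-hop choice in a toy MANET.
graph = {
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B", "D", "E"],
    "D": ["B", "C", "E"],
    "E": ["C", "D"],
}

def degree_centrality(g):
    # degree normalized by the maximum possible number of neighbours
    n = len(g) - 1
    return {node: len(neigh) / n for node, neigh in g.items()}

def next_hop(g, current, destination):
    # forward directly if possible, otherwise to the most central neighbour
    if destination in g[current]:
        return destination
    cent = degree_centrality(g)
    return max(g[current], key=lambda v: cent[v])

print(next_hop(graph, "A", "E"))   # -> C (A's highest-degree neighbour)
```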
Published by: Ramninder Kaur, Harpreet Kaur
Research Area: Wireless Networks
43. Advanced E-voting System using NFC
Electronic voting is a technology with which citizens can vote using a smartphone; it gives users the ability to vote from an Android mobile. E-voting techniques have advantages over the traditional voting framework, such as less manpower, time savings, accuracy, transparency and fast results. E-voting also has many security challenges associated with voting, mainly assimilation and verification to keep voted data secure. To overcome these challenges we propose a new e-voting framework in which an NFC tag is used to bring more accuracy and transparency to the voting framework. The NFC tag stores voter information to verify the voter and the voter's vote in the application. The e-polling technique has three phases: the first involves analysis and verification of the user; in the second phase the user receives an OTP and uses it to vote in the framework; in the third stage the administrator counts and sorts the votes and declares the result of voting in the application.
Published by: Pratiksha Bhosale, Sayali Mokashi, Priyanka Wadkar, Prof P.V.Mahadik
Research Area: Computer Science
44. Distance Sensing with Ultrasonic Sensor and Arduino
A sensor is a device that converts one type of energy to another. Arduino is a small microcontroller board with a USB plug to connect to the computer. The Arduino board senses the environment by receiving input from a variety of sensors and can affect its surroundings by controlling an LCD, speakers, motors and a GSM module. Ultrasonic sensors measure the distance of target objects or materials through the air using “non-contact” technology. They measure distance without damage and are easy to use. The output signals received by the sensor are in analog form; the output is digitally formatted and processed by the microcontroller. In the present work, the sensor is used to detect an obstacle and report its exact distance. The internal analog-to-digital converter is calibrated to get an almost accurate distance measurement, which is also displayed on an LCD screen.
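The distance calculation behind such a sensor is simple: the echo's round-trip time is halved and multiplied by the speed of sound. The sketch below shows the arithmetic; the echo time is an assumed sample value, not real sensor data.

```python
# Distance-from-echo-time sketch for an ultrasonic ranging sensor.
SPEED_OF_SOUND_CM_PER_US = 0.0343   # ~343 m/s at room temperature

def distance_cm(echo_time_us):
    # the pulse travels to the obstacle and back, so divide by 2
    return (echo_time_us * SPEED_OF_SOUND_CM_PER_US) / 2

print(round(distance_cm(1166), 1))   # ~20.0 cm for a 1166 us echo
```

On the microcontroller the same formula is applied to the measured echo pulse width before the result is written to the LCD.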
Published by: N. Anju Latha, B. Rama Murthy, K. Bharat Kumar
Research Area: Embedded Systems