Volume-3, Issue-4

July-August, 2017

1. Simulation Implementation of Different Types of Loads for High Gain Single Stage Boosting Inverter

This paper presents a simulation implementation of different types of loads in a high-gain single-stage boosting inverter (SSBI) for photovoltaic applications. A two-stage micro-inverter first performs dc-dc voltage step-up and then converts dc to ac, whereas the single-stage topology must perform both the dc-dc voltage step-up and the dc-ac inversion in one stage. The SSBI system employs a tapped inductor to attain a high input-voltage step-up and thus allows operation from a low dc input. The traditional two-stage approach is costly and has a complex topology, whereas the SSBI has a simpler topology and a lower component count. The SSBI can achieve high dc input-voltage boosting, good dc-ac power decoupling, good quality of the ac output waveform, and good conversion efficiency. Simulation results for linear R (resistive) and RL (inductive) loads, as well as nonlinear RC loads and nonlinear resistor and saturable-inductor loads, are presented in this paper.

Published by: Pravin Appasaheb Mali

Research Area: Single Stage Boosting Inverters

2. Analysing EOT (extension of time) claim procedure in the Indian construction industry along with a case study

A project is a sequence of activities with a fixed duration. A delay in any activity that lies on the critical path will push back the project completion date and lead to time overrun. Time overrun is an important factor in judging the success of a project. It may be due to the contractor's fault, the employer's fault, an interface contractor's fault, or force majeure. The cause of delay therefore needs to be analysed, and if the delay is not the contractor's fault, the contractor is entitled to an EOT. In India, contracts are framed according to CPWD and Military Engineering Services norms, so the approach to claiming an EOT differs from that of FIDIC and other international bodies. Claiming an EOT requires the involvement of different departments. This paper discusses the protocols followed in claiming an EOT as per Indian law and suggests amendments to eliminate disputes in the EOT claim process.

Published by: Ayush Kushwaha, Anutosh Kushwaha

Research Area: Civil Engineering

3. Modified snack recipe Varagu Tikki for Diabetes Mellitus

Varagu (bhagar) is a well-known millet in India which is generally consumed only during traditional fasting. Varagu tikki (cutlet) is a nutritional modification of the traditional aloo tikki, designed especially for people suffering from diabetes mellitus. The main ingredient used to lower its glycemic index and increase its fiber content is varagu. Additional nutritional benefits are provided by carrot, whole bengal gram, and functional foods such as flax seeds, curd, and coriander leaves. It is a diabetes-friendly snack in which varagu provides the high fiber known to lower blood glucose levels. Carrot provides a good amount of antioxidants. Flax seeds are a good source of omega-3 fatty acids and soluble fiber, which help prevent fluctuations in blood glucose. Curd acts as a good binding agent as well as a probiotic. In this recipe the traditional ingredients were replaced with varagu rice, bengal gram flour, and carrot; the change could therefore be associated with a lower risk of diabetes and might be an appropriate component of recommendations for an overall healthy diet. Based on these facts, we conclude that this modified varagu tikki is nutrient-dense, low-calorie, low in glycemic index, handy, palatable, and attractive. An additional benefit of this snack is its cost-effectiveness; it can be consumed by all classes of the population and all age groups.

Published by: Shruti Bhavsar, Sneha Patil, Swapana Nerkar, Swati Sawant, Dr. Rupali Sengupta

Research Area: Nutrition


4. Novel Framework for Software Defect Classification by Hybridization of Sampling and Classifier Algorithms with Kernel Principal Component Analysis

Machine learning approaches are effective at tackling problems for which limited data are available. In many cases, software-domain problems can be framed as a learning process that depends on varying conditions and changes. A predictive model is built using machine learning approaches to classify modules as faulty or non-faulty. Machine learning methods help developers recover useful information after deployment, enable them to examine data from alternative perspectives, and have proved valuable for software bug prediction. In this paper, prediction is performed by an SVM with Gaussian and polynomial kernels, and the SVM is combined with adaptive boosting (AdaBoost) as a component-based ensemble learner.
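
As a companion sketch to the abstract above, the two kernel functions it names can be written in a few lines; the gamma, c, and degree values and the toy feature vectors below are illustrative assumptions, not taken from the paper.

```python
import math

def gaussian_kernel(x, y, gamma=0.5):
    """RBF (Gaussian) kernel: exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def polynomial_kernel(x, y, c=1.0, degree=3):
    """Polynomial kernel: (x . y + c) ** degree."""
    dot = sum(a * b for a, b in zip(x, y))
    return (dot + c) ** degree

# A kernel compares two feature vectors (e.g. metrics of two software modules).
a, b = [1.0, 2.0], [1.0, 2.0]
print(gaussian_kernel(a, a))                      # identical vectors -> 1.0
print(polynomial_kernel(a, b, c=1.0, degree=2))   # (1*1 + 2*2 + 1)^2 = 36.0
```

An SVM built on either kernel differs only in which of these functions fills its Gram matrix.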

Published by: Deepak, Maninder Kaur

Research Area: Software Prediction


5. An Optimization Technique to Detect the Forgery in Digital Images by Using Ant Colony Optimization

In our society, digital images are a powerful and widely used communication medium, with an important impact on communication and the IT industry. In the past few years, research has turned to detecting and classifying copy-move forgery in images for forensic requirements, so detection is a very important challenge in forensic science. The proposed adaptive over-segmentation algorithm divides the host image into non-overlapping, irregular blocks. Feature points are then extracted from each block as block features, and the block features are matched with one another to locate the labelled feature points; this approach can approximately indicate the suspected forgery regions. In this paper, detection and classification use point-based and block-based features (SIFT and SURF, respectively), with ant colony optimization applied in the matching and feature-selection phases. The SIFT features and the proposed SIFT-with-ACO features are also used for classification with a support vector machine with Gaussian and polynomial kernels.

Published by: Neha Jain, Er. Sushil Bansal

Research Area: Copy Move Forgery

6. The Postwar Novel as Postmodern: Billy Pilgrim’s Imagination and the Critical Tendency Towards Teleology, Slaughterhouse-Five

This paper is an exploration of Slaughterhouse-Five (1969), an early document of American postmodern literature. In particular, this paper attempts to present the critical discussions surrounding this novel as equidistant to the broader theoretical discussions surrounding the concept of postmodernism. My contention is that, in discussing this novel, due in part to the natural teleological and linear tendencies of literary criticism, and despite the professed openness of postmodern thought to conflict, diligent efforts must be made to periodically reassert collapsed possibilities in literature. With this in mind, I approach Slaughterhouse-Five in an effort to, first, demonstrate how critics have diminished the potential meaning of the novel by imposing their own notions of a literary-historical trajectory, and, second, to show how readings of marginalized characters in this novel can reveal untapped potential for further exploration of the broadest definitions of the project of postmodernism.

Published by: Suman Rajest .S, Anbarasi

Research Area: English Literature

7. Wavelet Analytical Study of Solar Wind Proton Density

The solar wind is a plasma, i.e., an ionized gas that fills the solar system. It results from the supersonic expansion of the solar corona. The solar wind consists primarily of electrons and protons, with a smattering of alpha particles and other ionic species at low abundance levels. The structure of the heliosphere is strongly affected by the protons produced by the photo-ionization of interstellar neutral hydrogen and by the charge exchange of hydrogen atoms. The wavelet is a relatively new analytical tool in the physics community for turbulent or chaotic data; it allows the detection and characterization of short-lived structures in turbulence. Proton density fluctuations are studied using discrete wavelet transforms.
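
One level of the discrete wavelet transform mentioned above can be sketched with the simplest wavelet, the Haar; the wavelet studies in this issue use Daubechies and Symlet families, but the decomposition step is analogous. The signal values below are made-up numbers, not solar-wind data.

```python
import math

def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform on an
    even-length signal. Returns (approximation, detail) coefficients:
    scaled pairwise sums and pairwise differences."""
    s = math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

# A short-lived spike in an otherwise flat series shows up as one
# large detail coefficient, which is how the transform localizes
# transient structures in time.
density = [4.0, 4.0, 4.0, 4.0, 9.0, 1.0, 4.0, 4.0]
approx, detail = haar_dwt(density)
```

Applying the same step recursively to the approximation coefficients yields the multiresolution decomposition used in these analyses.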

Published by: Anil Kumar

Research Area: Plasma Physics

8. On Almost Contra (b, µ)-Continuous Functions

The purpose of this paper is to introduce the notion of almost contra (b, µ)-continuous functions. The relationships between almost contra (b, µ)-continuous functions and other forms of continuity are also investigated.

Published by: C. Rajan

Research Area: Topology

9. Speech Recognition in Noisy Environment: An Implementation on MATLAB

Speech is one of our natural means of expression, so it can be used to communicate with machines. In this work, an isolated-word recognizer is implemented using MATLAB as a platform. Speech signals are distorted by many kinds of noise, so the noise contained in the speech signal must be reduced; this is called speech enhancement. Speech enhancement aims at improving the intelligibility of speech. Noise has been removed using spectral subtraction with an over-subtraction technique. Feature extraction is carried out using MFCC, and feature matching is achieved using an HMM.
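
The core of the over-subtraction rule can be sketched on magnitude spectra assumed to come from an FFT of one speech frame; the over-subtraction factor alpha, spectral floor beta, and the toy spectra below are illustrative assumptions, not values from the paper.

```python
def spectral_subtract(noisy_mag, noise_mag, alpha=2.0, beta=0.1):
    """Over-subtraction: remove alpha times the noise estimate from
    each frequency bin, flooring the result at beta times the noise
    (the spectral floor), which prevents negative magnitudes and
    suppresses 'musical noise' artifacts."""
    return [max(y - alpha * n, beta * n)
            for y, n in zip(noisy_mag, noise_mag)]

noisy = [5.0, 3.0, 1.0, 0.5]   # magnitude spectrum of a noisy frame
noise = [1.0, 1.0, 0.5, 0.4]   # noise estimate from a silent frame
clean = spectral_subtract(noisy, noise)
```

The cleaned magnitudes would then be recombined with the noisy phase and inverse-transformed back to a time-domain frame.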

Published by: Nishitha Danthi, Dr. A. R Aswatha

Research Area: Speech Processing

10. Conversion of RF Signal to Optical Signal Using MZM

In the previously proposed system, the radio-frequency signal was transmitted through an RF system, but that system had disadvantages such as non-coherence and inefficiency. To overcome these drawbacks, an RF-over-fiber system is proposed. In the proposed system, the RF signal is the message signal, used to modulate the optical signal that is transmitted over the optical cable. Mach-Zehnder modulators (MZMs) convert the RF signal to an optical signal when the light source cannot be directly modulated at high speed; here a lithium-niobate Mach-Zehnder modulator (LN-MZM) is used. A continuous-wave laser diode is used as the source, and a PIN diode as the optical detector. Polarization-maintaining optical fiber (PMF or PM fiber) is used to transmit the optical signal, which is detected at the receiver by the optical detector. RF-over-fiber systems are thus small in size, flexible, and very low-loss, using intensity modulation to transmit the RF signal.

Published by: Yakin Y. Malekar, Dr. A. R Aswatha

Research Area: Optical Communication

11. Recovery of RF Signal from Modulated Optical Signal

In the previously proposed system, the radio-frequency signal was transmitted through an RF system, but that system had disadvantages such as non-coherence and inefficiency. This was overcome by using an RF-to-optical-domain converter, but the signal then has to be recovered by converting the optical signal back to an RF signal. This is done by the present system, an optical-to-RF converter comprising a photodetector, a transimpedance amplifier (TIA), and a low-noise amplifier (LNA). The modulated optical signal is detected by the photodetector, which converts the light signal to a current signal. This current signal is converted to a voltage signal, and the required amplification is then provided by the LNA. The result is a compact, efficient system providing high-speed optical-to-RF conversion.

Published by: Akshata Gudihal, Saravana Kumar

Research Area: Communication


12. Review on PID Controller with Intelligent System

This paper presents a survey of the tuning of Proportional-Integral-Derivative (PID) controllers for speed control of a CSTR plant using soft computing techniques. The CSTR plant drive is widely used in industry even though its maintenance cost is higher than that of an induction motor. Speed control of the CSTR plant has attracted extensive research, and several techniques have been proposed. The PID controller is the most widely used compensating controller for nonlinear systems, applied in a wide range of areas such as aerospace, process control, manufacturing, and automation. Tuning the PID parameters is very difficult, and various soft computing techniques are used to tune the PID controller for speed control of the CSTR plant. Tuning of the PID parameters is essential because these parameters greatly affect the stability and performance of the control system.
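
The PID law under discussion can be sketched in discrete time; the first-order plant below is a generic stand-in, not a CSTR model, and the gains, time constant, and setpoint are illustrative assumptions.

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, tau=1.0, dt=0.05, steps=400):
    """Discrete PID loop driving a first-order plant dy/dt = (u - y)/tau
    (a crude stand-in for the controlled process). Returns the final
    plant output after `steps` Euler steps."""
    y, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt                  # I term accumulates error
        deriv = (err - prev_err) / dt         # D term reacts to error rate
        u = kp * err + ki * integral + kd * deriv
        prev_err = err
        y += dt * (u - y) / tau               # Euler step of the plant
    return y

# With these (hand-picked, untuned) gains the loop settles at the setpoint;
# soft-computing tuners search this gain space automatically.
final = simulate_pid(kp=2.0, ki=1.0, kd=0.0)
```

A tuning algorithm would wrap `simulate_pid` in a cost function (e.g. integrated absolute error) and search over (kp, ki, kd).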

Published by: Jasvir Kaur, Gursewak Singh Brar

Research Area: Control System

13. Convex optimization based Adaptive PID Controller in CSTR Plant with Deadline Constraint

Process control by error feedback is an effective method, but it does not give proper feedback when the plant exhibits nonlinear behaviour, so optimization techniques that optimize effectively are used instead. Optimization algorithms, however, can take a long time to converge, so an algorithm with good convergence-time complexity must be selected. We propose human dynamics optimization, which shows a significant improvement.

Published by: Jasvir Kaur, Gursewak Singh Brar

Research Area: Control System

14. Wavelet Analytical Study of Sulphur Dioxide as an Air Pollutant

Most of the sulphur dioxide in the atmosphere is an anthropogenic by-product, and it is one of the basic causes of acid rain worldwide. The NAAQS set the permissible level of sulphur dioxide in the atmosphere for the protection of human health and the environment. Breathing sulphur dioxide causes many respiratory diseases such as bronchitis and asthma. From the perspective of human health and environmental protection, continuous monitoring and analysis have become very important. The wavelet is a tool for analyzing non-stationary signals, and wavelet transforms therefore provide excellent analysis of the non-stationary SO2 time series and extract important information. The Daubechies-4 wavelet is orthogonal and compactly supported and is therefore useful for multiresolution analysis of SO2 data. Wavelet transforms provide a simple and accurate framework for modelling the statistical behaviour of SO2 variation in the interest of public health and environmental protection.

Published by: Dr. Anil Kumar

Research Area: Environmental Physics

15. Size and Color Based Quality Assessment of Maize Grain

This paper presents a simple methodology for size- and colour-based prediction of pure-quality batches of maize yield using the image processing toolbox of MATLAB. Image processing is a field of science with numerous applications in technical domains that deal with images; in agriculture, for example, it is used to filter impurities of different sizes out of the pure yield and to assess the quality of different types of yield. Image processing provides simple algorithms, implemented with MATLAB commands, for separating objects of a fixed size from objects of other sizes in an image, with real-life applications such as face detection and size-based grading of fruits and vegetables. This paper therefore presents a principled approach to extracting objects of the desired size and colour from an image.

Published by: Amarpal Singh, Jobanpreet Singh, Shadab Ahmed Khan

Research Area: Image Processing

16. Structural Analysis, Material Optimization using FEA and Experimentation of Centrifugal Pump Impeller

In general, the efficiency of a centrifugal pump is ηo = mechanical efficiency (ηm) × volumetric efficiency (ηv). Most studies have focused on improving hydraulic efficiency, but overall efficiency depends on both the hydraulic and mechanical factors. Mechanical components (for example, the impeller's weight and structure) produce a mechanical loss that reduces the power transferred from the motor shaft to the pump or fan impeller. The strength of the pump is also reduced by stress-corrosion problems in the impeller, which can be minimized by using an alternative material of equal or greater strength. The impeller is modelled in the solid-modelling software CATIA. Meshing and boundary-condition application are carried out in HyperMesh, which produces good, optimal meshing of the impeller for accurate results, and the analysis is performed in ANSYS. A static analysis of a 3 HP pump impeller is carried out to examine the stresses and displacements of the centrifugal impeller. The conventional MS material is replaced with a glass-fiber composite material. After safe results are obtained from the analysis, the model will be fabricated and tested on a UTM.

Published by: Ghanshyam G. Iratkar, Prof. A. U. Gandigude

Research Area: FEA

17. Between Closed Sets and -Closed Sets in Topological Spaces

Sheik John [24] (Veera Kumar [26]) introduced the notion of -closed sets (= -closed sets). Many variations of -closed sets have been introduced and investigated. In this paper, we introduce the notion of mω-closed sets and obtain unified characterizations for certain families of subsets between closed sets and -closed sets.

Published by: C. Rajan

Research Area: Topology

18. Internet of Things Based Real Time Transformer Health Monitoring System

The transformer is one of the most important pieces of electrical equipment used in a power system. Monitoring transformers for problems before they occur can prevent faults that are costly to repair and that result in a loss of electricity. The main aim of this paper is to acquire real-time transformer data remotely over the internet, falling under the category of the Internet of Things (IoT). For this real-time aspect, one temperature sensor, one potential transformer, and one current transformer monitor the T, V, and I data of the transformer and send them to a remote location. These three analog values are taken in multiplexing mode and connected to a programmable microcontroller of the 8051 family through an ADC0808. The microcontroller then sends the values of all the sensors sequentially, at the multiplexing frequency of the ADC, through a Wi-Fi module under the TCP/IP protocol to a dedicated IP, which displays the data as three real-time charts on any web-connected PC or laptop. The real-time data are also shown on an LCD display interfaced with the microcontroller at the sending end.

Published by: Rahul

Research Area: Power System

19. Resource Allocation Utilizing Enthalpy Based Krill Herd Optimization Algorithm in Cloud Computing

Cloud computing is a form of internet-based computing that provides distributed processing resources, data storage, and other devices on demand. The prior approach to resource allocation in cloud computing used a queuing-theory-based cuckoo search algorithm, which has several limitations, including processing failures and the knapsack problem; moreover, its scheduling energy consumption and computational cost are high. Our proposed resource allocation through workflow scheduling is carried out in two stages. The first stage comprises two phases: for every available task we measure task reward, delay, transmission probability, communication cost, and reputation, and from these task measures we compute enthalpy values. In the second stage, we employ an enthalpy-based krill herd optimization algorithm to allocate resources so as to reduce makespan and minimize resource usage. It also reduces computational complexity by enhancing the computing efficiency of the processing elements. The implementation of the proposed algorithm mitigates the knapsack problem across energy consumption, VM usage, PM usage, computational time, task migration, and resource utilization, leading to cost reduction.

Published by: Chenni Kumaran .J, M. Aramudhan

Research Area: Cloud Computing

20. Combination of Urethrotomy, Urinary Bladder Repair, and Tube Cystostomy in Clinical Cases of Urethral Obstruction in Bullocks

A combination of urethrotomy, bladder repair, and tube cystostomy was performed in 48 clinical cases of bullocks suffering from urethral obstruction. Tube cystostomy using either a Foley or a PVC catheter, along with urethrotomy and bladder repair, was performed in adult bullocks for both intact and ruptured bladders, with a high success rate of 70.83%. Surgical tube cystostomy in addition to urethrotomy and bladder repair was found to be superior to percutaneous tube cystostomy with urethrotomy, and tube cystostomy with a Foley catheter was found to be superior to one with a PVC catheter.

Published by: Basavaraj R. Balappanavar, B. V Shivaprakash, S. M Usturge, D. Dilipkumar, Ashok Pawar, Vivek R. Kasaralikar, Vinay P. Tikare

Research Area: Veterinary Surgery

21. Wavelet Analysis of Air Pollution due to Carbon Monoxide

Carbon monoxide is the most abundant of the criteria pollutants. High concentrations of carbon monoxide generally occur in areas with heavy traffic congestion, and high atmospheric CO concentrations have adverse effects on human health; metropolitan cities face such problems acutely. In a given signal (daily average CO data observed by the DPCC at Anand Vihar, Delhi), the trend is the most important and also the slowest-varying part of the signal; in wavelet-analysis terms, this corresponds to the greatest scale value. The Symlet wavelet is orthogonal and compactly supported and is therefore useful for multiresolution analysis of CO data. Approximation and detail coefficients have been determined using Sym4 wavelet transforms.

Published by: Anil Kumar

Research Area: Environmental Physics

22. Flood Frequency Analysis of Stream Flow in Pakistan Using L-moments and TL-moments

Determining the most appropriate probability distribution for stream-flow data at different locations in Pakistan using statistical models is an important aspect of hydrologic design. Statistical model building for flood frequency analysis is used to characterize the behaviour of stream flow. Developing methods that provide an appropriate expectation of hydrologic events has always fascinated both hydrologists and statisticians because of its significance in designing hydrologic structures and water-resource management programs. L-moments and TL-moments were used to find the appropriate distribution for stream-flow data in Pakistan for flood frequency analysis. The basic purpose of flood frequency analysis is to find the best probability distribution using the methods of L-moments and TL-moments, and the parameters of several distributions were estimated by both approaches. TL-moments are moments trimmed symmetrically by one conceptual sample value. The most appropriate distribution was determined according to several goodness-of-fit methods; in addition, L-moment and TL-moment ratio diagrams give a graphical clue to the best-fit distribution. The goodness-of-fit tests show that GPA is the most appropriate distribution for eight sites, while GLO, GNO, and GEV fit four, three, and two sites, respectively. The L-moment method of estimation is found to be more suitable for most of the sites for identifying the best probability distribution.
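
The first two sample L-moments used in such analyses can be computed from probability-weighted moments; the sketch below shows the standard unbiased estimators, with a made-up five-value sample (TL-moments, which trim the extremes, are not shown).

```python
def sample_l_moments(data):
    """First two sample L-moments via probability-weighted moments:
    on the ascending-sorted sample x_(1)..x_(n),
      b0 = mean(x),
      b1 = (1/n) * sum_i ((i-1)/(n-1)) * x_(i),
    and then l1 = b0 (location), l2 = 2*b1 - b0 (scale)."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n
    return b0, 2 * b1 - b0

# Toy sample (not stream-flow data): for 1..5 the L-location is the
# mean, 3, and the L-scale works out to 1.
l1, l2 = sample_l_moments([1.0, 2.0, 3.0, 4.0, 5.0])
```

Higher-order L-moment ratios (L-skewness, L-kurtosis) built the same way are what the ratio diagrams mentioned above plot.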

Published by: Zakia Batool

Research Area: Environmental Factors

23. Optimize Cloud Resources Framework for Workflow Scheduling By Swarm Intelligence

To fully exploit the applications of the cloud, various challenges must be addressed, scheduling being one of them. Although extensive research has been done on workflow scheduling, there are very few approaches tailored for cloud environments, and for some essential properties of the cloud, such as elasticity and heterogeneity, existing work fails to reach an optimal arrangement. Our work therefore concentrates on scheduling techniques for scientific workflows on IaaS clouds. We present an algorithm based on metaheuristic optimization in which the best of two algorithms, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO), are merged to search both locally and globally, minimizing the overall workflow time (makespan) and reducing cost. Our heuristic is evaluated using CloudSim and several well-known scientific workflows of various sizes. The results show that our approach performs better than the PSO algorithm alone.
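
The PSO half of the hybrid can be sketched minimally; the objective below is a toy surrogate (squared distance from an optimum), not a workflow makespan, and the swarm parameters are common textbook defaults, not the paper's settings. The ACO merge and CloudSim model are not reproduced.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal Particle Swarm Optimization minimizing f over R^dim.
    Each particle tracks its personal best; the swarm shares a global
    best; velocities blend inertia, cognitive, and social pulls."""
    random.seed(42)  # deterministic run for the example
    pos = [[random.uniform(-5, 5) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

# Toy "makespan" surrogate: the swarm should drive this near zero.
best = pso(lambda x: sum(v * v for v in x))
```

In a scheduling setting, f would decode a particle's position into a task-to-VM mapping and return its simulated makespan plus cost.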

Published by: Harjot Kaur, Sharvan Kumar

Research Area: Cloud Computing


24. Data Security in Cloud Computing Based On Blowfish With MD5 Method

This paper addresses cloud-based security, which is essential nowadays. In today's scenario, high computation speed and storage are requirements of industry and organizations, for which they use the cloud; but the cloud is accessed by any number of users, so data should be secured at minimum cost, where cost means storage and time. The proposed work reduces storage and time significantly.
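
Blowfish is not in the Python standard library (it would need a third-party package such as PyCryptodome), so only the MD5 integrity-check half of the title's scheme is sketched here with hashlib; note that MD5 serves only as a checksum, as it is no longer collision-resistant.

```python
import hashlib

def md5_digest(data: bytes) -> str:
    """MD5 fingerprint stored alongside the (encrypted) data."""
    return hashlib.md5(data).hexdigest()

def verify(data: bytes, stored_digest: str) -> bool:
    """Recompute the digest after retrieval and compare: any change
    to the stored bytes changes the digest."""
    return md5_digest(data) == stored_digest

tag = md5_digest(b"hello")       # '5d41402abc4b2a76b9719d911017c592'
print(verify(b"hello", tag))     # intact data passes
print(verify(b"hell0", tag))     # tampered data fails
```

In the combined scheme, the ciphertext produced by Blowfish would be the `data` whose digest is stored and re-checked on download.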

Published by: Pooja Devi, Amit Verma

Research Area: Cloud Computing

25. Spectral Analysis of Particulate Matter in the Atmosphere using Wavelet Transforms

Particulate matter consists of solid or liquid particles that are transported and dispersed in the atmosphere. The study of particulate matter has attracted great interest among scientists due to its effect on human health and its major role in climate change. Reduced lung function, the development of respiratory diseases, and premature death are effects of long-term exposure to particle pollution. Wavelet transforms provide excellent analysis of non-stationary time series and extract important information. The Daubechies-4 wavelet is orthogonal and compactly supported and is therefore useful for multiresolution analysis of PM data. The skewness and kurtosis parameters describe the lack of symmetry of the data, and the correlation coefficient between PM10 and PM2.5 describes the linear relation between them.

Published by: Dr. Anil Kumar

Research Area: Environmental Physics

26. Indexing Hydrological parameters of Narmada River influenced by Socio biological activities

In an effort to determine the water quality of the Narmada River at Moretakka, the National Sanitation Foundation water quality index was selected for this work. In support of the work, indexing of river water collected from the centre and from both the left and right catchment areas was conducted between September 2010 and September 2012. The National Sanitation Foundation (NSF) Water Quality Index (WQI) was one of the analytical tools used to summarize the data. Essentially, the WQI converts the concentration data for nine analytes into one of five water quality classes, ranging from "very bad" to "excellent". Based on the WQI values, water quality was typically in the "good" range for 2010-11, while it decreased to "medium" in the following year, 2011-12. The sites nearest the centre had the highest water quality rating, with significant decreases in water quality occurring in the catchment areas, particularly the left catchment area. Water quality was also significantly impacted by socio-biological activities. High total coliform levels (>1600 microorganisms/100 ml) are of particular concern at all the sampling sites.
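
The NSF WQI aggregation over the nine analytes can be sketched as a weighted sum; the weights below are the commonly cited published values (they should be verified against the NSF tables), and in practice each sub-index q is read off the analyte's rating curve rather than supplied directly.

```python
# Commonly cited NSF WQI weights for the nine analytes (sum to 1.0).
WEIGHTS = {
    "dissolved_oxygen": 0.17, "fecal_coliform": 0.16, "pH": 0.11,
    "BOD": 0.11, "temperature_change": 0.10, "total_phosphate": 0.10,
    "nitrates": 0.10, "turbidity": 0.08, "total_solids": 0.07,
}

def nsf_wqi(q_values):
    """Additive NSF WQI: each analyte's sub-index q (0-100, from its
    rating curve) times its weight, summed over the nine analytes."""
    return sum(WEIGHTS[k] * q for k, q in q_values.items())

# Sanity check: if every sub-index were a perfect 100, the WQI is 100.
perfect = nsf_wqi({k: 100.0 for k in WEIGHTS})
```

The resulting 0-100 score is then bucketed into the five classes ("very bad" through "excellent") used in the study above.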

Published by: Dr. Taniya Sengupta Rathore

Research Area: Fresh Water Pollution

27. Role of Public Relations in Image Management of an Organization

The paper seeks to establish the importance of maintaining good and meaningful relations with all the publics who interact with a corporate organization. It is the people who form an opinion about a firm over a period of time who give the organization its reputation, so maintaining good relations with all the concerned publics is important for an organization. Public relations, as the name suggests, is all about maintaining relations with the public. Corporate organizations feel the need to maintain, enhance, and foster good relations with their prospective customers (the public) in order to succeed. The role of public relations in this respect becomes very important.

Published by: Neha Singh, Dr. A. Ram Pandey

Research Area: Image Management

28. Modified Load Balancing Technique to Improve Performance of MANETs

Mobile ad hoc networks (MANETs) are autonomous, decentralized wireless frameworks comprising mobile nodes that are free to move within the network. Load balancing is one of the areas of mobile ad hoc networking that must be addressed to improve network performance. If the same path is over-utilized for a period, its bandwidth can be exhausted or the energy of its nodes drained, resulting in data loss if transmission continues over that path. This paper takes network load balancing into consideration. The load is monitored using the available bandwidth and the packet delivery ratio of the current path being used for data transmission; if the load exceeds a threshold, a new path is used for packet transmission. Network performance was analyzed using four parameters: packet delivery ratio, throughput, routing overhead, and remaining energy in the network.

Published by: Harpreet Rupra, Rasbir Singh

Research Area: Computer Science

29. Energy Preservation through Rest Planning Algorithm for Wireless Sensor Networks

Wireless sensor network applications often need to be changed after deployment for a variety of reasons, such as reconfiguring a set of parameters, patching security holes, or modifying the tasks of individual nodes. Wireless reprogramming is a crucial technique for addressing such challenges, and code dissemination is its basic building block. We present a link-quality-aware sleep scheduling algorithm that leverages 1-hop link-quality information. It has the following salient features compared to prior work: first, it supports dynamically configurable packet sizes; second, it employs an accurate sender-selection procedure to mitigate transmission collisions and transmissions over poor links; third, it conserves energy while also keeping the completion time short.

Published by: Vinayak S. Korishettar, Manisha Tapale

Research Area: Wireless Sensor Networks

30. Intersection Movement Assistance for Vehicles

Connected-vehicle wireless communication technology has been evolving rapidly in recent years, and a host vehicle (HV) can transmit to, or receive the basic safety message (BSM) from, remote vehicles (RVs). A collision warning is predicted for the connected automated vehicle, and the driver is alerted when the time to collision (TTC) is within a specified bound. The distance is calculated using the Haversine formula, the error statistics of the latitude and longitude estimates are analysed, and the driver is alerted. Using the information provided by the BSMs, a vehicle collision can be detected in real time and the potential conflict prevented. The goal of Intersection Movement Assistance (IMA) is to develop a vehicular communication system that alerts the driver to a potential collision at an intersection based on the BSMs from neighbouring vehicles in a V2V environment.
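
The Haversine distance between two BSM position fixes can be computed as below; the coordinates and the spherical Earth radius are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6371000.0):
    """Great-circle distance in metres between two GPS fixes given in
    degrees, on a sphere of the mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_m * math.asin(math.sqrt(a))

# Two vehicles one degree of latitude apart along a meridian:
# roughly 111 km on the spherical model.
d = haversine_m(0.0, 0.0, 1.0, 0.0)
```

Dividing such a distance by the closing speed derived from successive BSMs gives the TTC used for the warning threshold.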

Published by: Manoj Pai, Uma Mudenagudi, Gourav Sharma

Research Area: Automotive electronics and Embedded Systems

31. Performance Evaluation of Various Routing Protocols of Mobile Ad Hoc Networks

Mobile ad hoc wireless networks hold the promise of the future, with the potential to set up networks at any time, anywhere. Mobile ad hoc networks (MANETs) are collections of mobile nodes that dynamically form a temporary network without pre-existing network infrastructure or centralized administration. Lately, a great deal of research effort has focused on mobile ad hoc networks. The routing protocol plays a significant role when hosts wish to exchange packets but cannot communicate directly. All nodes are mobile and may be connected dynamically in an arbitrary manner; all nodes behave as routers and participate in the discovery and maintenance of routes to other nodes in the network. This situation becomes more difficult as further nodes are added to the network. An ad hoc routing protocol should be able to determine the best path between nodes, minimize the bandwidth overhead needed to maintain correct routing, and reduce the time required to converge after the topology changes.

Published by: Sheetal Kadyan, Gopal Singh

Research Area: MANET

32. An Interactive Interface for Natural Language Query Processing to Database Using Semantic Grammar

Over the years, the field of natural language query processing (NLQP) has seen enormous changes in both research and methodology. A trending issue in database management is providing a high-level interface for the layman. A major portion of the data generated in many organizations and companies is stored in databases, which play an important role in the IT field. Accessing this data requires knowledge of SQL, but since a layman has no knowledge of a formal language such as SQL, a proper interface is required. An interface for NLQP to a database using semantic grammar therefore deals with designing and developing a system that understands a natural language such as English and converts it into SQL queries. This paper presents an insight into how an NLQP system enables communication between humans and computers without the memorization of complex commands and procedures, and how it intelligently processes the user's request into a reasonable, human-usable format.
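A toy illustration of the semantic-grammar idea: a few English patterns mapped onto SQL templates. The rules themselves and the table and column names (`employees`, `salary`) are invented for illustration only, not taken from the paper's grammar.

```python
import re

# Toy semantic-grammar rules: each pattern maps a restricted English request
# to an SQL template. Real systems use a far richer grammar and a schema map.
RULES = [
    (re.compile(r"show all (\w+)", re.I),
     "SELECT * FROM {0};"),
    (re.compile(r"how many (\w+) are there", re.I),
     "SELECT COUNT(*) FROM {0};"),
    (re.compile(r"list (\w+) with (\w+) over (\d+)", re.I),
     "SELECT * FROM {0} WHERE {1} > {2};"),
]

def to_sql(question):
    """Translate a restricted English question into SQL, or None if no rule fits."""
    for pattern, template in RULES:
        m = pattern.search(question)
        if m:
            return template.format(*m.groups())
    return None
```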

Published by: Soumya M. D, B. A Patil

Research Area: Natural Language Query Processing

33. Comparing Three Neural Network Techniques in the Classification of Breast Cancer

Breast cancer is becoming a common disease among women. A tumour is a mass of cells growing uncontrollably in the body, and breast cancer has two stages: benign, an early stage, and malignant, a late stage. The survival rate is high at the benign stage given good treatment by a radiologist, whereas at the malignant stage the disease often leads directly to death. A neural network is a powerful classifier. Before classification, the neural network is trained on data or images collected from different datasets. After training on a large dataset, a new breast image is tested by extracting its features and classifying it as cancerous or non-cancerous. Finally, the results of three neural networks are compared: 1) radial basis function (RBF), 2) feed-forward neural network (FFNN), and 3) back-propagation neural network (BPNN). The calculated accuracies are 95% for RBF, 96% for FFNN, and 100% for BPNN.

Published by: Smitha Hallad, Prof. Roopa Hubballi

Research Area: Image Processing

34. Role of Staging Laparoscopy in Gastric Carcinoma

Laparoscopy is a minimally invasive procedure used as a diagnostic tool and surgical procedure that is performed to examine the abdominal and pelvic organs. Staging laparoscopy is minimally invasive surgery for the diagnosis of intra-abdominal diseases. Despite increasingly sophisticated radiological diagnostic equipment, many patients with gastric, hepatic, or pancreatic malignancy continue to have the diagnosis of unresectable or metastatic disease made at exploratory laparotomy. Staging laparoscopy may aid in the more accurate staging of gastric cancers and guide appropriate treatment without the morbidity associated with exploratory laparotomy.

Published by: Dr. Pratik Hire, Dr. K. B. Golhar

Research Area: Surgical Oncology

35. Isolation, Screening and Characterization of Cellulolytic Bacteria and Determination of Their Cellulolytic Potential

Cellulosic biomass is one of the most dominant waste materials from various industries. Cellulose degradation and utilization are important for the global carbon cycle. The value of cellulose as a renewable source of energy has made its hydrolysis a subject of research and industrial interest. The present investigation is based on the isolation, screening, and characterization of cellulolytic bacteria and the determination of their cellulolytic potential. The work also concentrates on the production of cellulase enzyme by submerged fermentation and its partial purification by centrifugation. The enzyme activities of selected isolates were determined by the DNS method, and finally the potential isolates were applied to the degradation of various natural cellulosic substrates.

Published by: Jeeva Raj, Hema .C, Haris .S, Hema Amulya, Dr. Snehalatha V.

Research Area: Microbiology

Research Paper

36. Shifted Histogram Using Optimal Shift Distance for Images with Entropy Value & Wavelet Decomposition Images

Histogram equalization is an image-processing method for contrast adjustment using the image's histogram. It automatically determines a transformation function that seeks to produce an output image with a uniform histogram. This is a good approach when automatic enhancement is desired, because the results are predictable and the method is simple to implement. The method used to generate a processed image with a specified histogram is called histogram matching or histogram specification. In this thesis an advanced algorithm was applied to enhance image quality, and the entropy value of the image was obtained; by changing the scaling factor K, different values and different shifted-histogram images were produced. The effect of the algorithm was analysed at different levels: by changing the scaling factor, different EME values were obtained and the image was enhanced significantly, allowing a great deal of important information to be recovered.
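The classical histogram-equalization mapping the abstract describes can be sketched as follows. This is the standard CDF-based implementation, not the thesis's shifted-histogram algorithm (whose scaling factor K is not specified here).

```python
import numpy as np

def equalize_histogram(img):
    """Histogram-equalize an 8-bit greyscale image (2-D uint8 array)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first non-zero CDF value
    # Classic mapping: scale the CDF to the full 0..255 output range so that
    # the output grey levels are spread as uniformly as possible.
    lut = np.round((cdf - cdf_min) / (img.size - cdf_min) * 255.0)
    lut = np.clip(lut, 0, 255).astype(np.uint8)
    return lut[img]
```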

Published by: Rekha, Vijay Nandal

Research Area: Image

37. Comparison Analysis of Modulation Techniques of UWB, OFDM and CDMA for Different Parameters

With the passage of time there has been tremendous development in communication, while the exponentially growing population gives rise to severe traffic situations, for example congestion, quality-of-service, and cost problems. Researchers always try to reduce system complexity: a system must be simple, cost-effective, and easy to use. The research area of this dissertation is to compare the characteristic parameters of digital modulation techniques so that a final conclusion can be drawn about which technique transmits data to its destination with the minimum bit error rate (BER). The base paper considers only CDMA, whereas this dissertation also considers UWB and OFDM. Simulation results are presented for CDMA, OFDM, and UWB with different band-pass techniques, for example BPSK, QPSK, QAM, 16-QAM, and 64-QAM, including the transmitted and received messages, bit error rate, signal-to-noise ratio (SNR), and MSE equalizer performance. In addition, the multiple-input multiple-output (MIMO) antenna concept is used: where earlier a single antenna was employed, an array of antennas now provides more efficient transmission and a better quality of service.
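The BER-versus-SNR evaluation described above can be illustrated for the simplest case, BPSK over AWGN; a sketch assuming the textbook BER formula 0.5·erfc(√(Eb/N0)), not the paper's full CDMA/OFDM/UWB simulator.

```python
import numpy as np
from math import erfc, sqrt

def bpsk_ber(snr_db, n_bits=200_000, seed=1):
    """Monte-Carlo BER of BPSK over AWGN at the given Eb/N0 (dB)."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2 * bits - 1                    # map 0 -> -1, 1 -> +1
    ebn0 = 10 ** (snr_db / 10)
    noise = rng.normal(0.0, sqrt(1 / (2 * ebn0)), n_bits)
    decided = (symbols + noise > 0).astype(int)   # hard decision at 0
    return np.mean(decided != bits)

def bpsk_ber_theory(snr_db):
    """Theoretical BPSK BER: 0.5 * erfc(sqrt(Eb/N0))."""
    return 0.5 * erfc(sqrt(10 ** (snr_db / 10)))
```

The simulated curve should track the theoretical one closely; the same Monte-Carlo skeleton extends to the higher-order constellations the dissertation compares.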

Published by: Pooja Budhwar, Dheeraj Kapoor, Nipin Gupta

Research Area: Communication

38. Survey Paper on Analysis of Modulation Techniques of UWB, OFDM and CDMA for Different Parameters

A brief history of the evolution of mobile communication throughout the world is useful in order to appreciate the enormous impact that cellular radio and PCS will have on all of us over the next several decades. Progressive involvement in technology development is crucial for any government that hopes to keep its country competitive in the rapidly growing field of wireless communication. As time has passed, we have developed advanced technology that was never possible before. After the second generation, a tremendous revolution occurred with third-generation communication technology, known as cdma2000, and various sub-technologies of cdma2000 subsequently came into existence, based on the Interim Standard 95 and Interim Standard 95B technologies. Similarly, for OFDM and UWB, a great deal of advanced technology has been developed to meet clients' needs. The main goals in developing these technologies are cost-effectiveness, low complexity, a maximum number of subscribers, and high quality of service. As we know, information is transmitted through electromagnetic waves with the help of antennas, and deciding which type of antenna is best suited to a particular technology is a tedious task. Nowadays smart antennas are used: arrays of antennas consisting of multiple transmitter antennas and multiple receiver antennas.

Published by: Pooja Budhwar, Dheeraj Kapoor, Nipin Gupta

Research Area: Communication

39. Removing Salt-And-Pepper Noise from Digital Image Using Unsymmetric Trimmed Median Filter

Every digital image has a two-dimensional mathematical representation. Digital images are made of pixels, i.e. picture elements. Each pixel represents the grey level, for black-and-white photographs, at a single point in the image, so a pixel can be represented by a small dot of a particular shade. Image restoration is the process of restoring degraded images that cannot be captured again, or for which re-acquisition would be too costly. Images can be restored given prior knowledge of the noise or disturbance that caused the degradation. Image restoration is done in two domains: the spatial domain, where the filtering operates directly on the pixels of the digital image, and the frequency domain. In this research work, different formats of the same image are processed at different noise levels to analyse which format performs best, and besides PSNR two more parameters, MSE and IEF, are also considered. The main objective is to remove salt-and-pepper noise from the image. The base paper removes 30% and 70% salt-and-pepper noise and reports the PSNR value, whereas this dissertation removes salt-and-pepper noise at 30%, 50%, 70%, and 75% and evaluates three parameters: PSNR, MSE, and IEF. After frequency-domain filtering, the image is remapped into the spatial domain by the inverse Fourier transform to obtain the restored image. Different noise models were studied, different filtering techniques in both the spatial and frequency domains were examined, and improved algorithms were written and simulated in MATLAB. Restoration efficiency was checked by taking the peak signal-to-noise ratio (PSNR) and mean square error (MSE) into consideration.
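A minimal sketch of the unsymmetric trimmed median idea, in the spirit of decision-based trimmed-median filters; details such as the fallback to the window mean are assumptions, not necessarily the authors' exact algorithm.

```python
import numpy as np

def trimmed_median_denoise(img):
    """Remove salt-and-pepper noise with an unsymmetric trimmed median filter.

    Pixels equal to 0 or 255 are treated as noise. For each noisy pixel the
    3x3 window is collected, the extreme values (0 and 255) are trimmed, and
    the pixel is replaced by the median of the remaining values; if every
    neighbour is extreme, the mean of the window is used instead.
    """
    padded = np.pad(img, 1, mode="edge").astype(float)
    out = img.astype(float).copy()
    noisy_rows, noisy_cols = np.where((img == 0) | (img == 255))
    for r, c in zip(noisy_rows, noisy_cols):
        window = padded[r:r + 3, c:c + 3].ravel()
        trimmed = window[(window != 0) & (window != 255)]
        out[r, c] = np.median(trimmed) if trimmed.size else window.mean()
    return out.astype(np.uint8)
```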

Published by: Minakshi, Suraj Rana

Research Area: Image

40. Identification and Characterization of Cellulose Degrading Bacteria and Estimation of Its Cellulolytic Capacity

Cellulosic biomass is one of the foreseeable sustainable sources of fuels and is also one of the dominant waste materials in nature resulting from human activities. In view of environmental problems such as the disposal of large volumes of cellulosic waste and the shortage of fossil fuel in the world, the main aim of the present investigation was to characterize and study the cellulolytic activity of the selected isolate on natural cellulosic substrates, viz. finely grated vegetable peels. The cellulose-degrading capacity of the isolate was confirmed by the Congo red test. By selecting efficient cellulolytic microorganisms and cost-effective operational techniques, the production of useful end products from the biodegradation of the enormous low-cost stock of cellulose in nature can be very beneficial.

Published by: Jeeva Raj

Research Area: Microbiology

41. Integration of Robust Different Hierarchical Routing Protocol of Wireless Sensor Network

Wireless sensor networks (WSNs) are nowadays a very popular and important field of research, because the world is switching rapidly from wired to wireless communication. In this research work, the routing protocols TEEN, PEGASIS, ECHERP, and LEACH are compared. WSNs are used in environmental monitoring, habitat monitoring, battlefields, etc. A WSN is made up of tiny sensor nodes that sense data and communicate it to the base station via other nodes. WSNs are data-centric rather than node-centric, so the main issues are the network's energy consumption, lifetime, delay, latency, and quality of service. The main challenge in WSNs is to design a routing protocol that is maximally energy-efficient, because sensor nodes are battery-powered: as time passes the nodes' batteries drain, and in turn the network lifetime decreases. There are many routing protocols, classified by how they work and by their application to different conditions. This paper gives brief information about these routing protocols, with the main focus on a comparison of hierarchical ones. In this dissertation, four routing protocols are compared: LEACH, PEGASIS, TEEN, and the proposed ECHERP. We conclude that, in terms of overall performance in a hierarchical network, ECHERP performs better than the other routing protocols in a WSN.
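For context, the cluster-head election rule of LEACH, one of the compared protocols, can be sketched as follows; the standard threshold formula is used, and the cluster-head fraction p and seed are example values.

```python
import random

def leach_threshold(p, round_no):
    """LEACH cluster-head election threshold T(n) = p / (1 - p*(r mod 1/p))
    for nodes that have not yet served as cluster head in the current epoch
    (p = desired fraction of cluster heads per round)."""
    return p / (1 - p * (round_no % int(1 / p)))

def elect_cluster_heads(node_ids, p, round_no, seed=42):
    """Each eligible node becomes cluster head when its random draw < T(n)."""
    rng = random.Random(seed)
    t = leach_threshold(p, round_no)
    return [n for n in node_ids if rng.random() < t]
```

The threshold rises as the epoch progresses, so nodes that have not yet served become increasingly likely to be elected, spreading the energy cost across the network.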

Published by: Nikita Balhara, Tajendar Malik

Research Area: Routing Protocol

42. Multipurpose Near Field Communication

Near field communication (NFC) is a short-range wireless protocol that allows users to connect devices and access content and services by simply holding enabled devices near each other. Many existing applications (ticketing, purchasing, device configuration, etc.) use NFC to transfer unique identifiers which then inform a larger system. An innovative method of designing a multipurpose near field communication system using an RF encoder and decoder is presented. A 4-bit encoder and decoder are used for wireless communication with a 434 MHz RF receiver and transmitter chip, allowing multiple devices to be controlled at a time. The RF encoder allows data to be stored; without the encoder, data cannot be sent.

Published by: K. Lakshmi, M. Sree Lekha, S. Prem Kumar

Research Area: Wireless Communication

43. Improvement of Coexistence of LTE Femtocell Network with Dynamic Resource Allocation

In the current scenario, interference is one of the most challenging problems in femtocell deployment under the coverage of an existing macrocell. Allocating resources between femtocells and the macrocell is essential to counter the effects of interference in dense femtocell networks. Advances in resource-management strategies have improved the control mechanisms for interference reduction at low node density, but most of them are ineffective at high node density. In this research work, a dynamic resource-allocation management algorithm for spectrum-shared hybrid access is implemented in a femtocell network. The base paper reports only throughput with the femtocell access point (FAP); here, in addition, a cognitive radio network (CRN) is set up and the power consumed by femtocell and macrocell users is compared, an important practical parameter, with the comparison showing that a femtocell user consumes far less power than a macrocell user. The throughput and power of a femtocell on different floors are also examined, and the results clearly show that a femtocell serving users on the same floor provides a larger coverage area than one serving users on a different floor.

Published by: Annu Kumari, Suraj Rana

Research Area: Communication

44. Simulink Model Design for FSO Communication System for Analysing Of Different Parameters

In this dissertation work, an approach has been made to analyse the effect of the free-space transfer function, considering various parameters such as the path-loss factor, atmospheric turbulence, and pointing errors, on the performance of a free-space optical (FSO) communication system. The performance of the proposed FSO communication system is studied by developing a MATLAB simulator. Two cases are considered, with binary codes generated by a Bernoulli generator at probabilities 0.5 and 0.4. The bit error rate (BER) and signal-to-noise ratio (SNR) performance of the proposed system is evaluated as the system parameters vary. The BER is highly degraded under severe atmospheric turbulence, even over a short free-space channel, and the path loss due to dense fog also severely degrades the BER even when the turbulence is mild and the free-space distance is short. After the Bernoulli generator, a spectrum analyser is used to view the result in the frequency domain; both signals are then convolved with Hadamard codes to achieve the orthogonality criterion, passed through an AWGN channel with 10 dB SNR added, and processed by the FSO circuit under different parameter conditions, with the BER finally shown on a display. FSO communication is a recent technology that faces many problems, an important one being range: if data is transmitted over a long distance, the BER increases and the communication becomes unreliable, so distance, transmitted power, and atmospheric conditions must be addressed for data to be transmitted successfully over long distances.
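The Hadamard-code step used to achieve orthogonality can be illustrated with Sylvester's construction; a sketch of the codes themselves, not the authors' MATLAB simulator.

```python
import numpy as np

def hadamard(n):
    """Build an n x n Hadamard matrix by Sylvester's construction (n a power of 2)."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

# Spreading two users' bits with different Hadamard rows keeps them separable:
# the rows are mutually orthogonal, so correlating a received sum of chips
# against one user's code nulls out the other user's contribution exactly.
```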

Published by: Usha, Manisha

Research Area: FSO Optical Fiber

Research Paper

45. Comparative Research Analysis on LTE Techniques to Reduce PAPR in Multi-Carrier Communication Systems

New technologies emerge day by day, with the main focus on providing a better quality of service at low cost. LTE has adopted the DFT-spread OFDMA technique as its uplink multiple-access scheme, which uses single-carrier modulation and frequency-domain equalization. In this research work we show the PAPR performance of the DFT-spreading technique, which depends on the number of subcarriers assigned to each user. In this thesis, a method for PAPR reduction in the LTE system based on DFT spreading has been introduced; it is further classified into two methods, LFDMA (localized FDMA) and IFDMA (interleaved FDMA). It is shown that interleaved FDMA and localized FDMA perform better than OFDMA in the uplink, where transmitter power efficiency is of great importance. LFDMA and IFDMA result in lower peak power values because OFDM and OFDMA map their input bits straight to frequency-domain symbols, whereas LFDMA and IFDMA map their input bits to time-domain symbols. We conclude that single-carrier FDMA is the better choice for uplink transmission in cellular systems, based on its better efficiency due to low PAPR and its lower sensitivity to frequency offset, since SC-FDMA has a maximum of two adjacent users. From the results it can also be concluded that the performance of IFDMA is far better than that of LFDMA.
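The PAPR comparison between plain OFDMA and DFT-spread (localized) SC-FDMA can be sketched as follows; the FFT size and subcarrier mapping are example choices, not the LTE-specified numerology.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband block, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

def ofdma_symbol(data, n_fft=256):
    """Plain OFDMA: map data symbols straight onto subcarriers, then IFFT."""
    freq = np.zeros(n_fft, dtype=complex)
    freq[:len(data)] = data
    return np.fft.ifft(freq)

def lfdma_symbol(data, n_fft=256):
    """Localized SC-FDMA (LFDMA): DFT-spread the data before subcarrier mapping."""
    freq = np.zeros(n_fft, dtype=complex)
    freq[:len(data)] = np.fft.fft(data)   # DFT precoding restores SC structure
    return np.fft.ifft(freq)
```

Averaged over many random QPSK blocks, the DFT-spread symbol shows a visibly lower PAPR than the plain OFDMA symbol, which is the effect the abstract describes.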

Published by: Ritu, Tajendar Malik

Research Area: Communication

46. Effectively Reconstructing the Routing Paths in Sensor Networks

In wireless sensor networks, sensor nodes are usually self-organized, delivering data to a central sink in a multi-hop manner. Reconstructing the per-packet routing path enables fine-grained diagnostic analysis and performance optimization of the network. The performance of existing path-reconstruction approaches such as MNT, however, degrades rapidly in large-scale networks with lossy or failed links. We present Pathfinder, a robust path-reconstruction method resilient to packet losses as well as routing dynamics. At the node side, Pathfinder exploits the temporal correlation between a set of packet paths and efficiently compresses the path information using path differences. At the sink side, Pathfinder infers packet paths from the compressed information and employs intelligent path speculation to reconstruct the packet paths with a high reconstruction ratio. We propose a novel analytical model to analyse the performance of Pathfinder, and we further evaluate it against the two most closely related approaches using traces from a large-scale deployment and extensive simulations. The results show that Pathfinder outperforms existing approaches, achieving both a high reconstruction ratio and a low transmission cost compared to MNT.
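The temporal-correlation compression idea can be caricatured as follows; a simplified sketch in which an unchanged path costs one token (Pathfinder's actual encoding and path speculation are more elaborate).

```python
def compress_paths(paths):
    """Compress a packet-path sequence by exploiting temporal correlation:
    consecutive packets usually follow the same route, so an unchanged path
    is stored as a single 'same' token instead of the full node list."""
    out = [("full", paths[0])]
    for prev, cur in zip(paths, paths[1:]):
        out.append(("same", None) if cur == prev else ("full", cur))
    return out

def decompress_paths(compressed):
    """Rebuild the full path sequence at the sink from the compressed tokens."""
    paths, prev = [], None
    for tag, payload in compressed:
        prev = prev if tag == "same" else payload
        paths.append(prev)
    return paths
```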

Published by: Mohammad Peer M. Shaikh, Prof. Anand D. Vaidya

Research Area: Wireless Sensor Networks

47. Non Destructive Method by Penetrant Testing

This paper presents results from a literature review of the defect characteristics essential for non-destructive testing (NDT), focusing on penetrant testing (PT), one of the major NDT methods. The study was performed by searching scientific databases. It is concluded that for penetrant testing the defect geometry and defect size are essential. A number of investigations address the relationships between defect parameters such as roller depth and surface defects; the phenomenon of electrical contact between the defect surfaces (for a crack) has also been studied. The defect parameters essential to the quality of penetrant testing are the defect's position in the object (including its depth), orientation, size, crack-surface roughness, closure, and tip radius. This investigation has focused on those parameters that are not easy to reconstruct, and only briefly discusses the influence of defect position, orientation, and size on the signal response.

Published by: Shyamji, Dr. Suresh Prasad

Research Area: Welding

Research Paper

48. Analysis of Air Pollution

This research paper is an attempt towards analyzing real time air pollution data collected by PAQS sensor devices from some key locations in Bangalore. Air pollution in most of the metropolitan cities in India is turning out to be a major threat to our environment and hazardous to our health. Many infections and diseases related to lungs and throat are caused by the polluted air we breathe. There is a growing need to conduct regular measurements of air quality data and analyze it.

Published by: Rajeshwari K. Rai

Research Area: Data Analytics

49. Classification of Leaf Disease Based On Multiclass SVM Classifier

In India the main source of income is agriculture, and farmers grow a variety of crops based on their requirements. Crop production decreases when plants suffer infections caused by several types of diseases on the leaf, fruit, and stem. Leaf diseases are mainly caused by bacteria, fungi, viruses, etc., and are often difficult to control. The disease must be diagnosed accurately and proper action taken at the appropriate time. Image processing is the trending technique for the detection and classification of plant leaf diseases. This work describes how to detect leaf diseases automatically; the given system provides a fast, spontaneous, precise, and very economical method for detecting and classifying them. This paper is intended to assist in detecting and classifying leaf diseases using the multiclass SVM classification technique. First the affected region is found by segmentation using K-means clustering, then features (colour and texture) are extracted, and lastly the classification technique is applied to detect the type of leaf disease. The proposed system effectively detects and classifies the disease with an accuracy of 92%.
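The K-means segmentation stage can be sketched as follows: a plain k-means on pixel colours with a deterministic initialization chosen for illustration (the multiclass SVM classification stage is not shown).

```python
import numpy as np

def kmeans(pixels, k=3, iters=20):
    """Plain k-means on an (N, 3) array of pixel colours; returns labels and centres.

    Initialization uses the first k distinct pixel values, which keeps the
    sketch deterministic; real implementations typically use random or
    k-means++ seeding.
    """
    uniq = np.unique(pixels, axis=0)
    centres = uniq[:k].astype(float)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        # Assign every pixel to its nearest centre.
        d = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centre to the mean of its assigned pixels.
        for j in range(len(centres)):
            if np.any(labels == j):
                centres[j] = pixels[labels == j].mean(axis=0)
    return labels, centres
```

The per-cluster pixel masks produced by `labels` are what the colour and texture features would then be extracted from.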

Published by: Pooja Kulinavar, Vidya I. Hadimani

Research Area: Image Processing

Research Paper

50. Novel Fuzzy Logic Controller Based Multivariable Energy Management Strategy for Standalone DC Microgrids

Due to substantial generation and demand fluctuations in standalone green-energy systems, power-control schemes are becoming crucial for the power-sharing and voltage-regulation functions. Classical power-management strategies employ maximum power point tracking (MPPT) algorithms and rely on batteries in case of a possible surplus or deficit of power. However, in order to realize the constant-current/constant-voltage (IU) charging regime and increase the life span of the batteries, power-management strategies need to be more flexible, with a power-curtailment feature. The paper proposes a battery-management method for a hybrid solar-photovoltaic and wind energy system in stand-alone applications. The battery-charging process is non-linear and time-varying with a significant time delay, so it is difficult to achieve the best energy-management performance with traditional control approaches. A fuzzy control approach for battery charging and discharging in a renewable power-generation system is analysed in the paper. To improve the life cycle of the battery, the fuzzy controller manages the desired state of charge (SOC). A fuzzy-logic-based controller for battery SOC control of the designed hybrid system is proposed and compared with a classical PI controller for performance validation. The complete system is modelled and simulated in the MATLAB/Simulink environment.

Published by: Navneet Singh Saini, Davesh Bindal

Research Area: Power

51. Authorized Deduplication of Files in Cloud Environment

Data deduplication is one of the most important data-compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique has been proposed to encrypt the data before outsourcing. To better protect data security, this paper makes the first attempt to formally address the problem of authorized data deduplication. Unlike traditional deduplication systems, the differential privileges of users are also considered in the duplicate check, in addition to the data itself. Several new deduplication constructions supporting authorized duplicate check in a hybrid cloud architecture are also presented. Security analysis demonstrates that the scheme is secure in terms of the definitions specified in the proposed security model. As a proof of concept, we implement a prototype of the proposed authorized duplicate-check scheme and conduct testbed experiments with it. We demonstrate that the proposed scheme incurs negligible overhead compared with normal operations.
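The core of convergent encryption, deriving the key from the content so that identical files deduplicate, can be sketched as follows; the XOR keystream stands in for a real block cipher and the tag scheme is a simplification, not the paper's authorized construction.

```python
import hashlib

def convergent_key(data: bytes) -> bytes:
    """Key derived from the content itself: equal plaintexts -> equal keys."""
    return hashlib.sha256(data).digest()

def keystream_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR with a hash-derived keystream (a stand-in for a real cipher);
    applying it twice with the same key decrypts."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class DedupStore:
    """Server stores one ciphertext per unique content, indexed by its tag."""
    def __init__(self):
        self.blobs = {}

    def put(self, data: bytes) -> str:
        key = convergent_key(data)
        ct = keystream_encrypt(data, key)
        tag = hashlib.sha256(ct).hexdigest()  # duplicate-check tag
        self.blobs.setdefault(tag, ct)        # duplicates stored only once
        return tag
```

Because the key depends only on the content, two users uploading the same file produce the same ciphertext, so the server deduplicates without ever seeing the plaintext; the paper's contribution layers user privileges on top of this duplicate check.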

Published by: Shrikrishna Kerur, Dr. Anand N. Diggikar

Research Area: Cloud Computing

52. Fuzzy Enhanced 3-Level DWT Image Compression

The development of higher-quality and cheaper image-acquisition devices has produced steady increases in both resolution and image size, with a consequent greater need for efficient compression techniques. Although storage capacity and transfer bandwidth have grown accordingly in recent years, many applications still require compression. Uncompressed multimedia (graphics, video, and audio) data requires considerable storage capacity and transmission bandwidth; despite continual progress in mass-storage density, processor speed, and the performance of digital communication systems, the demand for data-storage capacity and data-transfer bandwidth keeps growing. The amount of data associated with visual information is so large that its storage and transmission require large capacity and bandwidth, which can be very expensive. This research work presents a technique that combines the DWT technique with a fuzzy-logic enhancement function. The aim is to provide a better compression ratio while increasing the perceived visual quality of the image. Fuzzy-logic techniques have been used in various areas including clustering, data aggregation, and pattern deduction; here fuzzy logic is used to improve the quality of the compressed image produced by the DWT image-compression technique.
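One level of the 2-D Haar DWT that underlies such compression schemes can be sketched as follows; the paper uses three levels plus a fuzzy enhancement function, which are not reproduced here.

```python
import numpy as np

def haar_dwt2(img):
    """One level of the 2-D Haar DWT: returns (LL, LH, HL, HH) subbands."""
    a = img.astype(float)
    # Rows: average and difference of adjacent pixel pairs.
    lo = (a[:, 0::2] + a[:, 1::2]) / 2
    hi = (a[:, 0::2] - a[:, 1::2]) / 2
    # Columns: the same split applied to both row outputs.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2
    lh = (lo[0::2, :] - lo[1::2, :]) / 2
    hl = (hi[0::2, :] + hi[1::2, :]) / 2
    hh = (hi[0::2, :] - hi[1::2, :]) / 2
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Invert one Haar level (exact, since no quantisation is applied here)."""
    lo = np.empty((ll.shape[0] * 2, ll.shape[1]))
    hi = np.empty_like(lo)
    lo[0::2, :], lo[1::2, :] = ll + lh, ll - lh
    hi[0::2, :], hi[1::2, :] = hl + hh, hl - hh
    out = np.empty((lo.shape[0], lo.shape[1] * 2))
    out[:, 0::2], out[:, 1::2] = lo + hi, lo - hi
    return out
```

Zeroing or coarsely quantising the LH/HL/HH detail subbands before the inverse transform is the simplest form of the compression the abstract refers to; recursing on the LL subband gives the 3-level decomposition.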

Published by: Manish Mishra, Dr. Md. Sanawer Alam

Research Area: Image Compression

53. Supply Chain Management System

Leading industry analysts are projecting continued growth for business applications such as Enterprise Resource Planning, Customer Resource Management, and in particular Supply Chain Management. A survey of manufacturers was conducted that supports the analysts' projections. The survey was conducted to determine the extent to which supply-chain-management techniques and technologies are being used in different sectors, to understand the challenges faced while implementing supply chain systems, to identify key barriers to supply chain collaboration, and to understand the state of the market for SCM systems. The survey findings reveal that companies face a number of challenges while implementing supply chain systems in their organizations, and that there are identifiable key barriers to supply chain collaboration.

Published by: Arjun Singh, Gyanendu Sharma

Research Area: Industry

54. Optimization of Parameter for Surface Roughness by Using Taguchi Method

In order to produce any product of the desired quality by machining, proper selection of the process parameters is essential. This can be accomplished by the Taguchi approach. The aim of the present work is to investigate the effects of the process parameters on surface finish so as to obtain their optimal settings; regression analysis and the analysis of variance (ANOVA) are also used to analyse the influence of the cutting parameters during machining. The current investigation applies the Taguchi optimization technique to the most effective turning parameters, i.e. feed, cutting speed, and depth of cut, while machining SS 304 as the workpiece with a brazed cutting tool. Main-effect plots are generated and analysed to find the relationships between the parameters. The details of the experimentation and analysis are given in the following context.
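The Taguchi signal-to-noise computation for a smaller-the-better response such as surface roughness can be sketched as follows; the factor names and measurement values in the example are invented, not the paper's data.

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a smaller-the-better response
    (e.g. surface roughness Ra): SN = -10 * log10(mean(y^2))."""
    return -10 * math.log10(sum(v * v for v in values) / len(values))

def best_level(results):
    """Pick the factor level with the highest mean S/N ratio.

    `results` maps a level name to the list of response values measured at
    that level; the level names used in tests are hypothetical examples."""
    sn = {lvl: sn_smaller_is_better(vals) for lvl, vals in results.items()}
    return max(sn, key=sn.get)
```

Repeating `best_level` per factor (feed, cutting speed, depth of cut) over an orthogonal-array experiment yields the optimal parameter setting, which main-effect plots then visualize.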

Published by: Krishanveer Singh, Abhilash Karakoti

Research Area: Production

55. Design & Analysis of Polyethylene Terephthalate Based Sandwich Beam with ANSYS

The present work aims to study the design and analysis of a functionally graded layered sandwich beam with a viscoelastic core (polyethylene terephthalate) in a high-temperature environment. The top layer is a functionally graded material (ceramic Si3N4) with stainless steel on the bottom. The report is divided into five chapters. The first two chapters give an introduction and a literature survey of various viscoelastic cores with functionally graded materials (FGMs), which have unique properties different from any other material. The materials and methods used are given in chapter three. Chapter four provides details of the ANSYS software, in which the sandwich beam is subjected to axial dynamic loading conditions. The viscoelastic core layer is at high temperature while the other layers of the beam are at normal conditions, and the temperature variation within the viscoelastic core is non-linear. The effects of various system parameters such as the core-thickness ratio, compression, expansion, modal structure test, and thermal test are analysed using the ANSYS software. The thickness of the beam governs its displacement, and the boundaries between the stable and unstable regions are considered in this work. It is found that the buckling load of the beam decreases with an increase in the core-thickness parameter and in the temperature of the top FGM face, while the temperature variation along the thickness has a negligible effect on the buckling load as well as on the natural frequency. In this study, as the frequency of the beam increases, deformation starts first in the polyethylene terephthalate layer. The instability of the beam increases with an increase in the core-thickness parameter and in the temperature of the top layer.

Published by: Rohit Kumar Singh, Nidhi Sindhu, Sushil Kumar Singh

Research Area: Industrial

56. Comparison and Analysis of Multistoried R.C.C Building in Different Seismic Zones

The principal objective of this project is to analyse and design a multistoried building [G + 21 (3-dimensional frame)] using STAAD Pro, and to compare the results on the basis of (1) economy, (2) difference in material requirement, (3) degree of stability under different seismic zones, (4) level of supervision of construction, and (5) requirement of special tools and equipment. The four requirements, namely (1) utility, (2) safety, (3) economy, and (4) elegance, must be fulfilled for a structural design to be satisfactory. This project presents a comparative study on the design of a multistoried RCC building in different seismic zones of India by the limit state method. The design involves manual load calculations and analysis of the whole structure with STAAD Pro, based on limit state design conforming to the Indian Standard codes of practice IS: 456, IS: 1893, IS: 13920, and IS: 875. The structure was subjected to self-weight, dead load, live load, wind load, and seismic loads under the load-case details of STAAD Pro. The wind-load values were generated by the software considering the given wind intensities at different heights, strictly abiding by the specifications of IS 875, and the seismic-load calculations followed IS 1893-2005. The materials were specified, the cross-sections of the beam and column members were assigned, and the supports at the base of the structure were specified as fixed. The applicable codes of practice were also specified for design purposes, along with other important details. After completion of the design, the bending-moment and shear-force values can be studied with the generated diagrams, and the deflection of various members can be checked under the given loading combinations. The design of the building depends upon the minimum requirements prescribed in the Indian Standard codes.
The minimum requirements pertaining to the structural safety of buildings are covered by laying down the minimum design loads which have to be assumed for dead loads, imposed loads, and the other external loads the structure would be required to bear. Strict conformity to the loading standards recommended in this code, it is hoped, will ensure the structural safety of the buildings being designed. The structure and structural elements were designed by the limit state method.

Published by: Arvind Kumar Gupta, Mirza Aamir Baig

Research Area: Comparative Earthquake Analysis

Research Paper

57. A Novel Approach for Spectrum Sensing of the Primary User by Hybrid Swarm Intelligence in Cognitive Radios

In cognitive radio, spectrum sensing is a key challenge for the secondary user rather than the primary user: the primary user owns the bandwidth, and the secondary user opportunistically acquires it, so when two or more users occupy the same band the detection error increases. Reducing false detection while increasing usable bandwidth is the main objective of this paper. We compare particle swarm optimisation (PSO), a hybrid swarm, and the ant colony method; our experiments show that PSO combined with ant colony optimisation gives better throughput and a higher true detection rate.
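
The abstract compares PSO with an ant-colony hybrid but gives no algorithmic detail. As a minimal, hypothetical sketch of the PSO half (gbest topology, standard inertia and acceleration constants), with the sphere function standing in for a detection-error cost, since the paper's actual fitness function is not given:

```python
import random

def pso(fitness, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0):
    """Minimal particle swarm optimiser (gbest topology), minimising `fitness`."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)                              # reproducible toy run
# Toy objective standing in for a detection-error cost: the sphere function.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```

A hybrid such as the paper's would typically seed or refine these particle positions with ant-colony pheromone updates; that coupling is not reproduced here.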

Published by: Manpreet Kaur, Mayank Joshi

Research Area: Communication

58. Project Planning and Delay Analysis For 2x300 MW EMCO Thermal Power Plant

Proper planning and scheduling are very important in construction projects for reducing and controlling delays. Substantial amounts of time, money, and resources are wasted each year in the construction industry due to improper planning and scheduling. With globalisation, construction projects have become large; planning such projects requires a huge amount of paperwork, which can be reduced with the help of project planning software. The main objectives of this study are to plan, schedule, and track the project with the help of Primavera, to study the results generated, and to recommend measures to the organisation for enhancing its project planning skills on similar projects in future. Construction delay is a major problem in the large construction industry of India, which is vital to the country's development, economy, and progress. Delay to projects is one of the foremost concerns of the construction industry in India; it affects economies throughout the world and means the slowdown of development in all related fields. Construction project management is vital for accomplishing pre-determined objectives, yet despite its use most projects do not meet the original time schedule. This project is a study of approaches to avoiding and controlling time delay in construction in India: it reviews delay factors through a comprehensive literature review, which provides the background and history of project management in construction, and carries out scheduling through Primavera. The information from the literature review is used for planning and scheduling and will be a great support to the planning department in controlling delays in construction.

Published by: Mohammad Zeyauddin, Masoom Reza, Shahzeb Md. Danish

Research Area: Construction Management

Research Paper

59. Overview of Regulatory Guidelines for Medical Devices

Medical devices are becoming more important in the healthcare unit. The diversity and intricacy of medical devices have grown over the last two decades, and the regulation of these devices has advanced with the requirement for a steady regulatory perspective. One of the major issues for companies developing and producing medical devices is staying updated on the regulatory requirements and implementing them in the process. This thesis examines the regulatory requirements for medical devices in Australia, Brazil, India, Japan, Russia, and the MENA countries and compares them with the requirements in the European Union. The conclusion of this thesis is that most countries have similar requirements for the registration of medical devices and are striving to harmonise with the GHTF guidelines. With different regulations on medical devices now in force across countries and regions, there is a need to harmonise them in order to curtail regulatory hurdles and expedite access to high-quality, safe, and efficacious medical devices. Most countries are trying to harmonise their regulatory guidelines for medical devices through participation in the Global Harmonization Task Force (GHTF). Harmonised regulation of medical devices will lead to the availability of quality products.

Published by: Pakala Jayasree, Jyothshna Devi .K, Alagusundaram. M, Jayachandra Reddy .P

Research Area: Drug Regulatory Affairs

60. Pharmaceutical Market and Regulatory Issues for Export of Pharmaceutical Products to Latin American Countries

The Indian pharmaceutical industry is one of the world's largest and most developed, ranking fourth in terms of volume and thirteenth in terms of value. The country accounts for an estimated 10% of global production and 2% of world markets in pharmaceuticals. The regulatory process to obtain marketing authorizations (MAs) for drugs in Latin American (LATAM) countries, despite regional harmonisation efforts, is highly country-specific. Complex and evolving ad-hoc requests from reviewers must be proactively addressed to avoid costly delays or show-stoppers to the import of pharmaceutical products from India. This paper discusses the comforts and confrontations faced by pharma companies in India. It takes a case study approach and presents, in a nutshell, the Indian pharma export advantages, government initiatives toward the export market, the problems, the recent US trademark legislation issues, and the opportunities. Indian companies have focused on export prospects in both formal and informal markets globally. By maintaining high credibility in serving quality products in a complex market, India has gained a strong reputation globally.

Published by: Gopala Krishna Mukkamala, Sabareesh .M, Alagusundaram .M, Jayachandra Reddy .P

Research Area: Drug Regulatory Affairs

61. Seismic Analysis of Office Building with Prestressed Flat Slab

High-rise building structures are both a necessity and a matter of sophistication and pride for structural engineers. Buildings crossing 25 to 30 storeys are a common phenomenon these days. But what happens to a structure as it crosses these height limits? Forces of nature in the form of earthquakes and cyclones start playing brutal games with it: the higher the structure goes, the more seismic force it attracts. Seismic force, being predominantly an inertia force, depends on the mass of the structure. As the mass increases, the seismic forces also increase, requiring even heavier sections to counter them; these heavy sections further increase the mass, leading to still heavier seismic forces. Structural designers face a huge challenge in balancing these contradictory physical phenomena to make the structure safe. The structure can no longer afford to be rigid, which introduces the concept of ductility: structures are made ductile, allowing them to yield in order to dissipate the seismic forces. A framed structure can easily be made ductile by proper detailing of the reinforcement. But again, as the building height goes beyond a certain limit, the framed-structure sections (columns) get larger and larger, to the extent that they are no longer practically feasible. A flat slab is a one-way or two-way system with thickenings in the slab at the columns and load-bearing walls, called drop panels. Drop panels act as T-beams over the supports; they increase the shear capacity and the stiffness of the floor system under vertical loads, thus increasing the economical span range.
Here an attempt has been made to study the behaviour of different reinforced concrete structures with different prestressed systems. Studies have been carried out on sample model structures, with analysis performed in the ETABS software. Care has been taken that the sample models represent current practices in structural design and include different structural configurations.

Published by: Wamik U. R Rahman, Misbah Danish, Mirza Aamir Baig

Research Area: Structural Engineering

62. Design of Industrial Steel Building by Limit State Method

In this project work it is proposed to carry out the design of an industrial steel storage shed. The four requirements of utility, safety, economy, and elegance must be fulfilled for a structural design to be satisfactory. This project presents a study on the behaviour and economy of roof trusses, columns, and purlins through a comparison of the limit state and working stress methods. Roof trusses and purlins are integral parts of an industrial building; the study considers fink-type roof trusses and beam-section purlins, and involves the examination of theoretical investigations of specimens in series. The specimens were designed by both methods and compared for internal forces and economy, evaluating the co-existing moments and shear forces at the critical cross-section with the same configuration area while keeping all other parameters constant. The theoretical data are calculated using the Indian Standard codes IS 875-1987 (Parts I, II, and III), IS 800-2007 for the limit state method, and IS 800-1984 for the working stress method, with the section properties of the specimens obtained from steel tables. The project aims to establish which method is more economical and gives higher bending strength, greater load-carrying capacity, and higher flexural strength. The studies reveal that, in the theoretical investigations, the limit state method gives higher bending strength, higher load-carrying capacity, minimum deflection, and minimum local and distortional buckling compared to the working stress method.

Published by: Dinesh Kumar Gupta, Mirza Aamir Baig

Research Area: Design of Steel Structure

Research Paper

63. Design and Simulation Result Analysis of Data Aggregation in Ns2 for WSN with Security

Energy efficiency is an important metric in resource-constrained wireless sensor networks (WSNs). Multiple approaches such as duty cycling, energy-optimal scheduling, energy-aware routing, and data aggregation can be used to reduce energy consumption throughout the network. This thesis addresses data aggregation during routing, since the energy expended in transmitting a single data bit is several orders of magnitude higher than that required for a single 32-bit computation. In the first paper, therefore, a novel nonlinear adaptive pulse-coded-modulation-based compression (NADPCMC) scheme is proposed for data aggregation. A rigorous analytical development of the proposed scheme is presented using Lyapunov theory, and its satisfactory performance is demonstrated in comparison with available compression schemes in the NS-2 environment on several data sets. Data aggregation is achieved by iteratively applying the proposed compression scheme at the cluster heads. The second paper deals with the hardware verification of the proposed data aggregation scheme in the presence of a Multi-interface Multi-Channel Routing Protocol (MMCR). Since sensor nodes are equipped with radios that can operate on multiple non-interfering channels, the bandwidth available on each channel is used to determine the appropriate channel for data transmission, thus increasing throughput. MMCR uses a metric defined by throughput, end-to-end delay, and energy utilisation to select Multi-Point Relay (MPR) nodes to forward data packets in each channel while minimising packet losses due to interference. The proposed compression and aggregation are then applied to improve the energy savings and network lifetime further. Besides this, we also applied the RSA algorithm for encryption and decryption.
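
The final sentence mentions applying RSA for encryption and decryption. A textbook-RSA sketch with toy primes (for illustration only: real deployments need large keys and padding, neither of which is shown here):

```python
# Textbook RSA with toy primes -- illustration only (tiny key, no padding).
p, q = 61, 53
n = p * q                      # modulus
phi = (p - 1) * (q - 1)        # Euler totient
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent via modular inverse (Python 3.8+)

msg = 42                       # a message encoded as an integer < n
cipher = pow(msg, e, n)        # encryption: c = m^e mod n
plain = pow(cipher, d, n)      # decryption: m = c^d mod n
```

In the thesis's setting such encryption would be applied to the aggregated payloads before transmission; the key sizes and placement of the crypto step are not specified in the abstract.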

Published by: Rachna Kumari, Sunil Dalal

Research Area: WSN

64. Personality Based Indian Song Suggestions

Songs play a very important role in the lives of living beings. Dr. Jagdish Chandra Bose, a pioneer in studying the impact of music on plant growth, experimentally showed that even plants respond positively to pleasant music. People generally tend to hear only trending songs and listen to music under peer influence, but they should hear the kind of music suited to their personality traits in order to reap its health benefits. An experiment was carried out in which participants answered a personality questionnaire and were asked about their music preferences. The study shows that each of the five personality traits relates in some way to the Indian music genres. The app has been tested on over 70 people. Until now we knew that music has some impact; this study attempts to establish which music and song genres suit which kinds of personality.

Published by: Ela Gore

Research Area: Text Analytics, Psychology, Music

65. Robust Approach of Compressing Images Using DCT and Analysis of Parameters PSNR, CR

Image compression is one of the demanding tasks in the field of image transmission over the internet and storage in binary or digital form on computers and other storage devices. The channel bandwidth required depends on many aspects, such as the data and the size of the file to be transferred. Compressing an image is significantly different from compressing raw binary data: if a general-purpose or outdated technique is used to compress images, the result will not be as optimal as it could be, because images have definite statistical properties that can be exploited by encoders designed for them. In images we can give up some fine detail for the sake of saving bandwidth or storage space, which is the essence of lossy compression. In this dissertation, compression of digital images is done with the help of the DCT. Several encoding techniques have also been used together with the DCT to improve compression performance. A computational analysis of picture quality is also made with respect to compression ratio (CR) and PSNR.
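
The dissertation's pipeline (2-D DCT, discarding of high-frequency coefficients, PSNR evaluation) can be sketched as follows. The 8x8 block size, the smooth gradient test block, and the 4x4 coefficient mask are illustrative choices, not the author's settings:

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix.
    m = np.zeros((n, n))
    for k in range(n):
        for i in range(n):
            m[k, i] = np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0, :] *= np.sqrt(1.0 / n)
    m[1:, :] *= np.sqrt(2.0 / n)
    return m

def compress_block(block, keep):
    # 2-D DCT, zero all but the top-left keep x keep coefficients, inverse DCT.
    c = dct_matrix(block.shape[0])
    coeffs = c @ block @ c.T
    mask = np.zeros_like(coeffs)
    mask[:keep, :keep] = 1.0
    return c.T @ (coeffs * mask) @ c

def psnr(orig, recon, peak=255.0):
    mse = np.mean((orig - recon) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Smooth synthetic 8x8 "image" block (a gradient), typical of natural images.
x, y = np.meshgrid(np.arange(8), np.arange(8))
block = (10.0 * x + 5.0 * y).astype(float)
recon = compress_block(block, keep=4)   # keep 16 of 64 coefficients -> CR = 4
quality = psnr(block, recon)
```

Because most of a smooth block's energy sits in the low-frequency DCT coefficients, the PSNR stays high even at a 4:1 coefficient reduction, which is exactly the property lossy DCT coders exploit.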

Published by: Madhu, Sunil Dalal

Research Area: Image

66. Parametric Optimization of Hot Machining Process for AISI 4140 Material Using Grey Relational Technique

Industry needs materials with very high hardness and shear strength to satisfy its requirements, and many materials satisfying these properties are manufactured. Machining such materials by conventional methods has proved very costly, as they greatly reduce tool life. Hot machining can therefore be used to decrease tool wear and power consumption and to increase surface finish. An L9 orthogonal array of a Taguchi experiment is selected for four parameters (speed, feed rate, depth of cut, and temperature) at three levels (low, medium, and high) to optimise the hot machining turning parameters on a lathe.
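
The grey relational technique named in the title reduces the multi-response L9 results to a single grade per run, from which the best parameter combination is ranked. A sketch of the usual GRA steps; the response values below are invented, and both columns are treated as larger-is-better (e.g. Taguchi S/N ratios):

```python
import numpy as np

def grey_relational_grade(responses, zeta=0.5):
    """Grey relational analysis: runs x criteria matrix -> one grade per run.
    All criteria are treated as larger-is-better."""
    r = np.asarray(responses, dtype=float)
    # Step 1: normalise each criterion to [0, 1].
    norm = (r - r.min(axis=0)) / (r.max(axis=0) - r.min(axis=0))
    # Step 2: deviation from the ideal sequence (all ones).
    dev = 1.0 - norm
    # Step 3: grey relational coefficient; zeta is the distinguishing coefficient.
    coeff = (dev.min() + zeta * dev.max()) / (dev + zeta * dev.max())
    # Step 4: grade = mean coefficient across criteria.
    return coeff.mean(axis=1)

# Invented responses for the 9 runs of an L9 array (two criteria per run).
responses = [
    [62, 1.2], [66, 1.6], [70, 1.5],
    [72, 1.4], [80, 1.8], [75, 1.3],
    [68, 1.7], [74, 1.35], [78, 1.1],
]
grades = grey_relational_grade(responses)
best_run = int(np.argmax(grades)) + 1       # 1-indexed run number
```

The run with the highest grade (here run 5, which dominates both invented criteria) would be read off as the optimal speed/feed/depth/temperature combination.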

Published by: Jagjit Singh

Research Area: Mechanical

67. Challenges before Micro, Small and Medium Enterprises and Role of Government

The Micro, Small and Medium Enterprise (MSME) sector has been considered the most vibrant and dynamic sector of the Indian economy over the past five decades. The major characteristics of Indian MSMEs are a high contribution to domestic production, significant export earnings, low investment requirements, operational flexibility, location-wise mobility, the capacity to develop appropriate indigenous technology, import substitution, contribution towards defence production, technology-oriented industries, and competitiveness in domestic and export markets, all of which help them tap opportunities in various sectors. This sector is not only a source of large employment opportunities at a comparatively lower capital cost than large industries, but it also helps in the industrialisation of rural and backward areas, ultimately reducing regional imbalances and assuring a more equitable distribution of national income and wealth. MSMEs are complementary to large industries as ancillary units and contribute enormously to the socio-economic development of the country. Micro, small, and medium enterprises have played a significant role in the economic development of various countries, and India is certainly no exception. A sector with such a remarkable contribution to the economy also needs its other side examined, namely its challenges. India being a democratic country, a larger responsibility rests with the government. This article therefore discusses the challenges, the role of the government, and its effectiveness.

Published by: Dr. Shashikant Magar

Research Area: Commerce

Research Paper

68. Classification of Copy-Move Forgery and Normal Images by ORB Features and SVM Classifier

Today's technological age is characterised by the spread of digital images. They are the most common form of conveying information, whether through the internet, newspapers, magazines, or scientific journals, and they are used as strong proof against various crimes and as evidence for various purposes. Modifying, capturing, or creating an image has become easier with the emergence of image editing and processing tools. One of the most important and popular types of image forgery is copy-move forgery, in which a part of an image is copied and then pasted into the same image with the intention of hiding something important or showing a false scene. Because the important properties of the copied parts, such as brightness, noise, and texture, come from the same image, they remain compatible with the rest of it, which makes detecting and distinguishing the alteration more difficult for experts. Conventional copy-move forgery detection techniques usually suffer severely from being time-consuming. The evaluation of the improved method was done using 150 images selected from two different datasets, CoMoFoD and MICC-F2000. Experimental results show that the improved method can accurately and quickly reveal the duplicated regions of a tampered image, greatly reducing the processing time in comparison to the Khan algorithm while keeping accuracy at the same level. Owing to the availability and advancement of sophisticated image editing tools, there is an increasing loss of authenticity in digital images, which has led to the proposal of detection techniques that check whether a digital image is forged or authentic. Copy-move forgery in particular is a widely studied detection topic within digital image forensics.
In this thesis, an enhancement of copy-move image forgery classification is made by implementing hybrid features with classification algorithms: SIFT with SVM and EM, and ORB with SVM and EM. The technique works by first applying the DCT to an image and then extracting SIFT features from the resulting image. A supervised learning method is proposed for classifying copy-move image forgery in TIFF, JPEG, and BMP images. The process starts by reducing the colour of the photos. The achieved accuracy is more than 90%.
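
As a toy illustration of the copy-move principle described above (not the SIFT/ORB pipeline of the thesis), exact duplicate regions can be found by hashing raw patches; real detectors replace the raw pixels with descriptors that survive rotation, scaling, and compression:

```python
import numpy as np

def find_duplicate_blocks(img, block=8):
    """Flag exact duplicate block x block patches, the simplest copy-move cue.
    img is a 2-D uint8 array; returns [((y1, x1), (y2, x2)), ...] match pairs."""
    h, w = img.shape
    seen, dups = {}, []
    for y in range(0, h - block + 1):
        for x in range(0, w - block + 1):
            key = img[y:y + block, x:x + block].tobytes()
            if key in seen:
                py, px = seen[key]
                # Ignore trivially overlapping matches.
                if abs(py - y) >= block or abs(px - x) >= block:
                    dups.append(((py, px), (y, x)))
            else:
                seen[key] = (y, x)
    return dups

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
img[20:28, 20:28] = img[2:10, 2:10]        # simulate a copy-move forgery
matches = find_duplicate_blocks(img)
```

Exact matching breaks as soon as the pasted region is re-saved lossily or retouched, which is why the thesis's feature-based (SIFT/ORB) matching with a learned classifier is the practical route.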

Published by: Rekha Devi, Deepti Chauhan

Research Area: Copy Move Forgery

69. Review of Classification of Copy Move Forgery

Copy-move forgery is one of the important fields in forensic science for image processing. Image forgery takes different forms, such as the copy-move attack, image splicing, and image retouching. This paper reviews copy-move forged images and their detection methods.

Published by: Rekha Devi, Deepti Chauhan

Research Area: Copy Move Forgery

70. Parametric Study of Multi-Storey Buildings for Blast Load

An explosive is a mixture of compounds which, when initiated by heat, impact, friction, or shock, undergoes rapid decomposition, releasing tremendous amounts of energy in the form of heat and gas. Full or partial collapse of buildings and minor and major cracks are the most perceptible types of failure that may result from a blast load. The level of damage produced in a structure depends on the charge weight and the distance of the building from the point of explosion. This work studies the nature of blast loading and its effects on regular and irregular multi-storey buildings with and without shear wall openings.
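
The two parameters named above, charge weight and standoff distance, are conventionally combined into the Hopkinson-Cranz scaled distance Z = R / W^(1/3). A sketch using one commonly cited empirical fit for peak side-on overpressure (Mills, 1987); the 100 kg TNT at 30 m case is purely illustrative, and design work would follow the relevant code tables instead:

```python
def scaled_distance(standoff_m, charge_kg_tnt):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3), in m/kg^(1/3)."""
    return standoff_m / charge_kg_tnt ** (1.0 / 3.0)

def peak_overpressure_kpa(z):
    """Mills (1987) empirical fit for peak side-on overpressure, in kPa."""
    return 1772.0 / z**3 - 114.0 / z**2 + 108.0 / z

z = scaled_distance(standoff_m=30.0, charge_kg_tnt=100.0)   # ~6.46 m/kg^(1/3)
p = peak_overpressure_kpa(z)                                # ~20 kPa
```

The resulting overpressure-time history (idealised as a triangular pulse) is what gets applied storey by storey in the parametric models the paper analyses.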

Published by: Prajna, Deepthishree S. Aithal

Research Area: Structural Engineering

71. Universal Dependencies of Sanskrit

We present the first steps towards a treebank of Sanskrit within the Universal Dependencies framework. Our dataset is tiny at the moment, consisting of fewer than 200 sentences, the result of a summer internship project. Nevertheless, this seems to be, to the best of our knowledge, the first publicly available piece of syntactically annotated Sanskrit text. We also present a parsing experiment, with results surpassing delexicalized parsing.

Published by: Puneet Dwivedi, Easha Guha

Research Area: Natural Language Processing

Research Paper

72. Gateway Based Energy Efficient Routing: GEER

Wireless sensor networks comprise a large number of small, low-cost sensor nodes powered by small non-rechargeable batteries and equipped with various sensing devices. A WSN is often deployed in rough and inhospitable terrain; it is expected to become suddenly active to gather the required data whenever something is detected, and then to remain largely idle for long stretches of time. Researchers are therefore constantly motivated to design efficient energy-aware schemes and corresponding algorithms that make optimal use of battery power and enhance the network lifetime of WSNs. The lifetime of a wireless sensor network is enhanced by cluster-head placement and by balancing the network load among the clusters. In this research work a gateway-based technique has been studied. The nodes are divided into normal, intermediate, and advanced classes according to their energy provisioning, and each class has its own criterion for selection probability. The algorithm performs well in terms of the number of alive nodes, network lifetime, average energy, and so on. A comprehensive investigation of various graphical parameters is also presented in this work.
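
The class-based election described above can be sketched with a LEACH-style rotation threshold in which a node's volunteering probability scales with its residual energy. The threshold formula is the standard LEACH one; the base probability and the energy weighting are assumptions standing in for the paper's normal/intermediate/advanced criteria:

```python
import random

def ch_threshold(p, round_no):
    """LEACH-style cluster-head threshold for nodes not yet CH this epoch."""
    return p / (1.0 - p * (round_no % int(1.0 / p)))

def elect_cluster_heads(nodes, round_no, p_base=0.1):
    """nodes: list of (node_id, residual_energy). Energy-weighted election:
    nodes with more residual energy volunteer as cluster heads more often."""
    e_max = max(e for _, e in nodes)
    heads = []
    for node_id, energy in nodes:
        p = p_base * energy / e_max        # richer nodes get a higher probability
        if random.random() < ch_threshold(p, round_no):
            heads.append(node_id)
    return heads

random.seed(42)
nodes = [(i, random.uniform(0.2, 1.0)) for i in range(100)]
heads = elect_cluster_heads(nodes, round_no=0)
```

In the gateway-based scheme, elected heads would forward aggregated cluster data to the gateway rather than directly to the base station, which is where the lifetime gain comes from.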

Published by: Naziya Anjum, Masood Ahmad, Dr. Shafeeq Ahmad

Research Area: Wireless Sensor Networks

Review Paper

73. Crack Detection in Railway Track Using Image Processing

Computer vision can provide many potential advantages over manual methods of railway track inspection. High levels of performance can be achieved by automating inspection with computer vision systems, as they allow scalable, quick, and cost-effective solutions to tasks otherwise unsuited to humans. At a minimum, railway track components can be inspected objectively and quantitatively, since the system does not suffer from fatigue or the subjectivity inherent in human inspectors. With the digital data collection involved in a computer-vision-based method, archiving inspection results and trending the data become feasible, leading to more advanced failure prediction models for maintenance scheduling and a more thorough understanding of the railway track structure. In this research paper, a computer-vision-based method is presented. A system is suggested that periodically takes images of the railway tracks and compares them, on a continuous basis, with an existing database of non-faulty track images. If a fault arises in a track section, the system automatically detects it and the necessary actions can be taken to avoid any mishap.
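
The compare-against-reference scheme described above can be sketched as a per-pixel difference against a stored non-faulty image. The thresholds and the synthetic "crack" are illustrative; a real system would also register the images and compensate for lighting changes before differencing:

```python
import numpy as np

def detect_fault(reference, current, diff_thresh=30, area_thresh=20):
    """Compare a track image against a stored non-faulty reference:
    flag a fault when enough pixels deviate by more than diff_thresh."""
    diff = np.abs(current.astype(int) - reference.astype(int))
    fault_pixels = int((diff > diff_thresh).sum())
    return fault_pixels >= area_thresh, fault_pixels

rng = np.random.default_rng(7)
reference = rng.integers(100, 140, size=(64, 64), dtype=np.uint8)
current = reference.copy()
current[30:34, 10:60] = 0                 # simulate a dark crack across the rail
is_faulty, n_pixels = detect_fault(reference, current)
```

The area threshold is what keeps sensor noise from triggering alarms: isolated deviating pixels are ignored, while a connected crack-sized region trips the detector.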

Published by: Aliza Raza Rizvi, Pervez Rauf Khan, Dr. Shafeeq Ahmad

Research Area: Image Processing

Research Paper

74. Image Processing Based Disease Detection for Sugarcane Leaves

Sugarcane is one of the most important crops in India; the Indian sugar industry is the second largest agro-based industry, next only to textiles. But being a long-duration crop, sugarcane is prone to a number of diseases caused by pathogens, viz. fungi, bacteria, viruses, and phytoplasma-like organisms. Image processing techniques have been proved to be changing the scenario of agriculture in India, with a number of research works and applications such as automatic disease detection, drone-based pesticide and fertiliser dispensing, estimation of yield and vegetative growth, and fruit sorting. This research studies the effectiveness of image processing and computer vision techniques for detecting disease in sugarcane plants by observing the leaves. A few major diseases of the sugarcane plant, such as red rot, mosaic, and leaf scald, have been studied, and a detection algorithm for them has been implemented in this research work.

Published by: Arifa Khan, Manmohan Singh Yadav, Dr. Shafeeq Ahmad

Research Area: AIET, Lucknow

Research Paper

75. Experimental Investigation of Mechanical Properties of Luffa-Epoxy Composite

In recent years composites have attracted considerable attention as a potential operational material. Low cost, light weight, high specific modulus, renewability, and biodegradability are the most basic and attractive features of composites that make them useful for industrial applications. Luffa cylindrica, locally called "sponge gourd", is one such natural resource whose potential as fibre reinforcement in polymer composites has not been explored to date for tribological applications. In this research, twin-layer and triple-layer fibre composites were prepared and tested to study their mechanical properties.

Published by: Gurmeet Singh Arora, Dr. A. S Verma, Dr. Nitin Srivastava

Research Area: Composite

Research Paper

76. Flame Retardant Luffa Fiber Reinforced Composites with Epoxy Resin Matrices

In recent years composites have attracted considerable attention as a potential operational material. Low cost, light weight, high specific modulus, renewability, and biodegradability are the most basic and attractive features of composites that make them useful for industrial applications. Luffa cylindrica, locally called "sponge gourd", is one such natural resource whose potential as fibre reinforcement in polymer composites has not been explored to date for tribological applications. In this research, twin-layer and triple-layer fibre composites were prepared and tested to study their thermal properties.

Published by: Gurmeet Singh Arora, Dr. A. S Verma, Dr. Nitin Srivastava

Research Area: Composite

Review Paper

77. Review of Cognitive Spectrum Sensing of the Secondary User by Soft Computing

Cognitive radio technology is being used to provide a mechanism for using the spectrum more efficiently, and spectrum sensing is crucial to this application. The ability of cognitive radio systems to access spare zones of the radio spectrum, and to keep monitoring the spectrum to ensure that the cognitive radio system does not cause any undue interference, depends completely on the spectrum sensing elements of the system. For the overall system to work properly and to provide the required improvement in spectrum efficiency, the cognitive radio spectrum sensing system must be able to effectively detect other transmissions, identify what they are, and inform the central processing unit within the cognitive radio so that the required action can be taken.

Published by: Manpreet Kaur, Mayank Joshi

Research Area: Wireless Communication

78. Identity Crisis in V. S. Naipaul's A House for Mr. Biswas

This paper focuses on voice and identity in V. S. Naipaul's A House for Mr. Biswas. It deals with one man's struggle to make something valuable of his life, a struggle projected through his heroic effort to own his dream house, which is on the way to owning his own life. The plot follows a lone man's struggle to free himself from the oppressive force of his in-laws and from failing health. Naipaul also conveys that the struggles the man faces actually mould him to reach his dream. Mr. Biswas mostly lives in a series of houses that either do not belong to him or are houses unworthy of the name; each house he lives in is an attempt at solving a problem, and each is a wrong answer in a different way. The author projects the character of Mr. Biswas as smart and funny but also often petulant, mean, and unsympathetic. Mr. Biswas's enemies, who are mostly his relatives, are largely unlikable, but they too have admirable moments. A House for Mr. Biswas, published in 1961, was the first of Naipaul's works to receive positive criticism and acclaim worldwide. Naipaul drew simple elements from the life of his father and portrayed the struggles his father faced.

Published by: R. Vaishnavi, J. Kiruba Sharmila

Research Area: Identity

Research Paper

79. Experimental Investigation on Ferro-Cement with and Without Using Fibers

Ferro-cement is made up of a combination of cement mortar and multiple layers of closely spaced mesh. It is widely used owing to the advantages of its behaviour, such as its mechanical properties and impact strength. The main aim of this project is to investigate the strength characteristics of ferro-cement with and without fibres, and to compare the strength of ferro-cement made with three different types of fibre: polypropylene, polyester, and synthetic fibres. Mechanical tests were performed to check the effect of the fibres on improving the compressive and flexural/bending strength of ferro-cement. Since the fibre acts as secondary reinforcement, it prevents ferro-cement from micro-cracking and crack propagation, and increases the strength.

Published by: Sahana H. D, Deepthishree S. Aithal, Rajendra Rao Kalbhavi

Research Area: Ferro-Cement

80. Study on Diagrid Structures with Various Aspect Ratio under the Action of Wind

Multi-storey building construction is increasing day by day throughout the world. The design and construction of artificial infrastructure along the lines of bio-mimicking principles require the development of a highly advanced structural system that offers aesthetic expression, structural efficiency, and, most importantly, geometric versatility. Recently, the use of diagonal members for carrying gravity and lateral loads has increased; such members are known as a 'diagrid'. The unique geometrical configuration of the diagrid structural system has led to its use in tall buildings, providing structural efficiency and aesthetic potential. The study in this paper considers aspect ratios of 1:1, 1:2, 1:3, and 1:4 and diagrid angles of 33.69°, 53.13°, 63.43°, and 69.44° for a G+60-storey building. The behaviour of the structure under the action of wind load is studied, and the optimum angle for the prepared diagrid models is determined. The software used is ETABS 2015.

Published by: Denet Priya Mascarenhas, Deepthishree S. Aithal

Research Area: Structural Engineering

Dissertations

81. Pushover Analysis of Multistory Reinforced Concrete Building

The standard building codes define the significant design requirements to ensure the safety of buildings under sudden ground movement, which causes inertial forces in the building. Yet we witness post-earthquake damage in many buildings designed as per these codes; it is therefore important to analyse a building's performance before physically constructing it. Performance-based design gives the designer the means to check the storey drift, the displacement at roof level, and the capacity before the building fails under given ground motions; it ensures safety for the Design Basis Earthquake (DBE) and collapse prevention for the Maximum Considered Earthquake (MCE). For performance-based seismic design, the performance levels described in ASCE 41, Seismic Rehabilitation of Existing Buildings (2007), for both structural and non-structural systems are the most widely recognised characterisations: the Operational, Immediate Occupancy, Life Safety, and Collapse Prevention building performance levels. Nonlinear pushover analysis considers the nonlinear behaviour of the structure, which increases the load-taking capacity of the building, and addresses the ductility of the structure through plastic hinges. Pushover analysis is applicable to new and existing structures and can be a good method for retrofitting structures after their design life is over. It considers a target displacement and defined objectives; whenever the performance meets the objectives, the damage at that performance level is acceptable.

Published by: Dabeer Anwer Danish, Mirza Aamir Baig, Shahzeb Mohd. Danish

Research Area: RCC Structure Design.

82. A Study on Service Quality and Passenger Satisfaction on Air India Services

This study examines the underlying forces of service quality that influence passengers' satisfaction in air transport, and which dimensions have a positive influence on service quality. The findings are based on the analysis of a sample of 100 respondents. The results suggest that different factors of in-flight service quality matter according to the customer's seat class. The dimensionality of perceived service quality in domestic air travel was explored and dimensions were identified: in-flight service, in-flight digital service and back-office operations. The findings reveal that these dimensions are positively related to perceived service quality in international air travel; of these, the cuisine provided, seat comfort and safety are the most important dimensions of in-flight service quality. Personal entertainment is the most important dimension of in-flight digital service quality as perceived by airline passengers, and online ticket booking is a key dimension of back-office operations. In addition, the findings indicate that passengers' satisfaction with different airline companies varies on the basis of the services delivered.

Published by: Dr. P. Usha, E. Kusuma

Research Area: Management Studies

Others

83. Honour Killing: A Customary Killing

Honour killing is a negative aspect of society that has been practised in India over the years and has resulted in the loss of thousands of young, innocent lives. Registered cases in India increased nine-fold in 2015 compared with the previous year, 2014. Most of the time, the killing is carried out by the relatives or family of a young couple whom they believe has dishonoured them in society. It is mostly found among Hindu and Muslim families, specifically in northern states such as Haryana, Punjab and Uttar Pradesh, where people strongly oppose inter-caste and love marriages. Bigamy, adultery or even dressing against family culture has ended in the killing of the victims. The murders are sometimes committed publicly to teach a lesson to other members of the community. The practice is not confined to rural areas; it occurs in urban areas as well and has spread widely geographically.

Published by: Suraj Sharma

Research Area: Criminal Law

Research Paper

84. Maya Angelou’s Life and Works as an Inspirational Source to Women

Maya Angelou’s life experience is not just a history to be written; it is a source of positivity and a portrayal of life drawn from lifelessness that conveys valuable meaning to women all over the world and of any generation, especially black women. Her autobiographical fiction and poems have deep-rooted positive aspects, showing the life of a woman: how a young black girl dwelt in a dominant American society, and how her struggles of self-revelation before the nation won her an identity, first as a black woman, then as an artist and as a writer. Her writings are not merely a piece of literary work within American literature but a guideline for life that inspires, motivates, instructs and teaches every woman how to look at life optimistically despite the torments endured. Beyond her works, it is Angelou’s life story, as told in her autobiographies, that emphasizes and teaches the value of patience, hope, perseverance and the power of a positive attitude against pain.

Published by: Kotteeswari. R, Anbarasi .U

Research Area: English Literature

85. Review on Online Shopping For Visually Impaired People

For visually impaired people it is difficult to choose between different types of clothes during online shopping. We are developing a system that helps blind people recognize the colour of clothes along with other categories such as material, size and pattern. Our system converts speech into text and then announces the results for each category as speech, so that blind users can choose the clothes they want. Colour identification is based on the histogram of each image in HSI colour space, and multilevel clustering is used to identify items satisfying many local features. Deep learning methods are used for speech recognition. This approach is helpful for blind people as well as people with other disabilities.
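The colour-identification step described above can be sketched as follows. This is an illustrative example, not the authors' implementation: it labels the dominant colour of an image by binning pixel hues into a histogram. HSV hue is used here as a stand-in for the HSI hue channel, and the bin edges and saturation threshold are assumptions made for the sketch.

```python
import colorsys

# Hue ranges (degrees) mapped to colour names; these boundaries are
# illustrative assumptions, not values from the paper.
HUE_BINS = [(0, 30, "red"), (30, 90, "yellow"), (90, 150, "green"),
            (150, 210, "cyan"), (210, 270, "blue"), (270, 330, "magenta"),
            (330, 360, "red")]

def dominant_color(pixels):
    """pixels: iterable of (r, g, b) tuples in 0..255. Returns a colour name."""
    hist = {}
    for r, g, b in pixels:
        h, s, _v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if s < 0.2:                      # low saturation: grey/white/black
            name = "achromatic"
        else:
            deg = h * 360                # hue angle in [0, 360)
            name = next(n for lo, hi, n in HUE_BINS if lo <= deg < hi)
        hist[name] = hist.get(name, 0) + 1
    return max(hist, key=hist.get)       # most frequent colour label wins
```

In a full system this label would be synthesized to speech alongside the other clothing attributes.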

Published by: Kunal Mohadikar, Rahul Navkhare

Research Area: Data Mining

86. Honeypot for Detecting Behaviour & Exposing Attacker’s Identity for DoS and DDoS Attacks

Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks are a big threat to the internet. Several methods, techniques and proposals have been introduced to deal with these attacks, but none has given a fully successful result. In this project we aim to design a honeypot to deal with DoS and DDoS attacks. A honeypot is a recent technology in the area of computer network security: a computer or network segment on the internet that is set up to attract and trap people who attempt to penetrate other people’s computer systems. The project focuses on designing a honeypot that appears to be the original network and traps the attacker by attracting him. The honeypot will establish the identity of the attacker using a browser exploitation technique and also record the attacker’s pattern of attack. The advantages of the system are twofold: first, we can defend our operational network with high probability against known DoS and DDoS attacks and against new, future variants; second, we trap the attacker so that the recording of the compromise can support legal action against the attacker.

Published by: Rachana Khorjuwekar, Dilip Motwani

Research Area: Network and Security

87. Quest for Self-Identity: Psychoanalytic Theory and Writing Technique in Isabel Allende’s Maya’s Notebook

This paper deals with psychoanalytic feminism in Isabel Allende’s Maya’s Notebook, as the novel involves the life and mind of women. The whole novel is about Maya, a 19-year-old girl, and her sufferings, because she has to cope with desolation and loss at a very young age. Another woman who receives the author’s focus is Maya’s grandmother. The most important issue Allende brings out is the way women are abused for criminal activities and for money-making. The lyrical prose used to describe the delight of sea life is yet another aspect of the novel. Exile in this kind of landscape is more a search for inner peace, an escape from alienation, death and grief. Life in Chile is portrayed as a union with raw nature, unlike the mechanical life of California. Maya’s Notebook is very much rooted in contemporary global culture. Maya Vidal regains her identity after being shattered into pieces by circumstance; her exile and her diary bring her closer to herself and rescue her from disintegration.

Published by: K. R Veerapandian, N. Indhira Priya Dharshini

Research Area: Psychoanalytic Theory

88. An Automated Storage Area

The fast growth of data-intensive applications has caused a change in the traditional storage model. The server-to-disk approach is being replaced by storage area networks (SANs), which enable storage to be externalized from servers, thus allowing storage devices to be shared among multiple servers. A prominent technology for implementing SANs is iSCSI, owing to its suitability for storage networking. The iSCSI SAN is a block-level shared-storage alternative to Fibre Channel that is becoming increasingly popular among SME organisations. One reason iSCSI SANs are popular with smaller businesses, which often have fewer staff members with technical expertise, is that they are easy to set up and maintain compared with Fibre Channel SANs. A storage area network is a secure, high-speed data transfer network that provides access to consolidated block-level storage and makes a network of storage devices accessible to multiple servers; SANs are sometimes also referred to as SAN storage, a SAN network, a network SAN, etc. In this paper, we propose an automated system using the SAN technique with the iSCSI protocol as a storage solution, built on LVM volume sharing and shell scripting. We realize a SAN storage and encryption system that achieves static data encryption and a flexible security management strategy through a multiple-key, multiple-layer encryption scheme, protecting the physical resources through the iSCSI protocol stack and virtualization techniques.

Published by: Priyanka Khandekar, Dr. Kishor Kolhe

Research Area: Networking

89. Study on the impact of family’s Socio Economic status on employee’s level of satisfaction with organizations

Socioeconomic status (SES) is a combined economic and sociological measure of a person's work experience and of an individual's or family's economic and social position in relation to others, based on income, education and occupation. When viewed through a social-class lens, privilege, power and control are emphasized. Low SES and its correlates, such as lower education, poverty and poor health, ultimately affect society as a whole. Additionally, mortality differences within society are greater than indicated by social class based on occupation alone. Irrespective of social class, men with greater material assets have lower rates of mortality from all causes than less well-endowed men, independent of a wide range of lifestyle and biological factors. These findings suggest that mortality differences within our society are closely related to relative wealth. Inequities in wealth distribution, resource distribution and quality of life are increasing globally. Each employee may differ in at least one of the socioeconomic variables related to them; hence the differing satisfaction with the organization among different employees is analysed from the point of view of their socioeconomic variables. In this study, the factors that influence employee satisfaction with the organization among the selected respondents have been studied in terms of the social status of employees.

Published by: Shweta Tewari

Research Area: Human Resource

Review Paper

90. Review on Recent Techniques for Reducing Packet Drops in Dense and Sparse Vehicular Network Scenarios

Vehicular networks have been a trending topic of research over the past decade, which can be attributed to their enormous potential to enhance road safety and traffic efficiency and to furnish uninterrupted service to users over the course of mobility. Vehicular communications are perceived as an enabler for the driverless cars of the future. Automobile industries, governments and the research community across the globe are investing extensive effort and capital in the deployment of vehicular networks, owing to the gigantic potential envisaged in their applications. Vehicular networks represent a special sub-class of MANET that presents numerous research challenges due to distinct features such as hybrid network architectures, node movement characteristics and new application scenarios. Designing efficient routing protocols for VANETs remains one of the most prominent challenges. The major challenge associated with Delay Tolerant Network (DTN) protocols is ensuring fewer packet drops while tolerating delay. This paper reviews works that lessen packet drops by exploiting position, social and velocity information of nodes in dense and sparse vehicular networks.

Published by: Aditi Saini, Pritpal Singh

Research Area: VANET

Others

91. Dowry Prohibition Act: A Shelter or a Weapon

Though the system of dowry existed in India even before British rule, the format of this tradition was entirely different. It was a form of recourse for a woman in case of emergency, in which her father used to give the bride, not the groom, a part of his property in the shape of land, gifts, etc., to which she was entitled; the bride was the sole beneficiary of this property. Nowadays, however, dowry means goods given by the family of the bride to the bridegroom, his parents and relatives on their marriage, either on demand or without demand, which may include cash, ornaments, furniture, utensils, household goods or immovable property. In India dowry is prohibited, and whosoever demands dowry is punishable under Section 498-A IPC.

Published by: Suraj Sharma

Research Area:

Others

92. RTI: An Act Improving Governance

Abstract not visible on author's request.

Published by: Suraj Sharma

Research Area:

Research Paper

93. A Comparative Study of Fly Ash and Na-Fly Ash as Adsorbents for Removal of Cr(VI), Cu(II) and Fe(II) from Aqueous Solutions

Removal of Cr(VI), Cu(II) and Fe(II) from aqueous solutions has been studied using fly ash (FA) and Na-fly ash (Na-FA) as adsorbents. The process of adsorption has been established to be dependent on the concentration of the metal ions, the time of contact, the dosage of adsorbent and the temperature. The adsorptive behaviour has been found to fit the Freundlich adsorption isotherm. Ion exchange and complexation at the surface are the major mechanisms involved in the removal of metal ions from aqueous solution. Na-FA is found to have better adsorption capacity than FA, which is attributed to the increased content of Na ions on the surface that are readily available for exchange with metal ions. The observed results indicate that Na-FA has excellent adsorption capacity compared with FA and can be used as a potential cost-effective adsorbent for the removal of heavy metal ions from industrial effluents.
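The Freundlich fit mentioned above is usually performed on the linearized form of the isotherm, log q = log Kf + (1/n) log Ce. A minimal sketch of such a fit, using synthetic data rather than the paper's measurements:

```python
import numpy as np

# Fit the Freundlich isotherm q = Kf * Ce**(1/n) by least squares on
# its log-log linear form. Ce: equilibrium concentration; q: uptake
# of metal ion per gram of adsorbent.

def fit_freundlich(Ce, q):
    """Return (Kf, n) fitted from log q = log Kf + (1/n) log Ce."""
    slope, intercept = np.polyfit(np.log10(Ce), np.log10(q), 1)
    return 10 ** intercept, 1.0 / slope

# Synthetic equilibrium data generated from Kf = 2.0, n = 2.0
Ce = np.array([1.0, 4.0, 9.0, 16.0, 25.0])
q = 2.0 * Ce ** 0.5
Kf, n = fit_freundlich(Ce, q)
```

A fit of this kind recovers Kf (capacity) and n (intensity); a good linear fit on the log-log axes is what "fits the Freundlich isotherm" means in practice.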

Published by: S. Sheeba Thavamani, D. Karthika Navaneetha

Research Area: Waste Water Treatment

Research Paper

94. A Region Based Load-Balancing Approach in Mobile Cloud Computing Environment

Cloud computing permits the end user to access the required software or hardware structures on demand, which reduces the cost of installation and upkeep. Mobile Cloud Computing (MCC) was introduced to improve the end-user experience by providing services at their best. The development of cloud computing and virtualization techniques enables smartphones to overcome their resource limitations by offloading computation and transferring some parts of an application to capable cloud servers for execution. The proposed framework depends on the user's movement path and mobility: it anticipates the user's location in order to complete the procedure. The proposed framework reduces response time as well as improving load balancing.

Published by: Tejpal Singh, Karandeep Singh

Research Area: Mobile Cloud Computing

95. Adsorption Refrigeration using Zeolite-Water pair on Pro-e and MATLAB

The methods of producing cold by mechanical processes are quite recent. Long back, in 1748, William Cullen of Glasgow University produced refrigeration by creating a partial vacuum over ethyl ether. The first development took place in 1834, when Perkins proposed a hand-operated compressor machine working on ether. Then in 1851 came Gorrie's air refrigeration machine, and in 1856 Linde developed a machine working on ammonia. The pace of development was slow in the beginning, when steam engines were the only prime movers known to run the compressors. With the advent of electric motors and the consequent higher speed of compressors, the scope of refrigeration widened. The pace of development quickened considerably in the 1920s, when DuPont put on the market a family of new working substances, the fluoro-chloro derivatives of methane, ethane, etc., popularly known as chlorofluorocarbons or CFCs, under the trade name Freon. Recent developments involve finding alternatives or substitutes for Freons, since it has been found that the chlorine atoms in Freons are responsible for the depletion of the ozone layer in the upper atmosphere. Another noteworthy development was the ammonia-water vapour absorption machine of Carré. These developments account for the major commercial and industrial applications in the field of refrigeration. A phenomenon called the Peltier effect was discovered in 1834 and is still not commercialized. Advances in cryogenics, the field of very-low-temperature refrigeration, were registered with the liquefaction of oxygen by Pictet in 1877. Dewar made the famous Dewar flask in 1898 to store liquids at cryogenic temperatures. Then followed the liquefaction of other permanent gases, including helium in 1908 by Onnes, which led to the discovery of the phenomenon of superconductivity. Finally, in 1926, Giauque and Debye independently proposed adiabatic demagnetization of a paramagnetic salt to reach temperatures near absolute zero.
Here the main focus is on zeolite-water solar adsorption refrigeration. Environmental protection initiatives by environmental agencies are necessitating the replacement of chlorofluorocarbons with benign working fluids. One of the sensitive areas affected is refrigeration and heat pump technology, where new working pairs are being developed as an alternative to the traditional CFCs; these have less impact on the destruction of the ozone layer. In the design of adsorption refrigeration and heat pump systems, it is important to analyse the performance of the cycle precisely. This is based on an accurate determination of the adsorbent-adsorbate performance. Therefore, the thermodynamic behaviour of adsorbent materials has to be studied in detail using a number of widely accepted physical models. Various kinds of sorption systems have been developed, mostly using activated carbon-ammonia, activated carbon-methanol, silica gel-water and zeolite-water pairs. Nowadays, the refrigeration sector is one of the most important in the process industry. It was realised in the mid-1970s that CFCs allow ultraviolet radiation into the earth's atmosphere by destroying the protective ozone layer, while preventing infrared radiation from escaping the earth and thus contributing to the greenhouse effect. The discovery of the ozone-depleting properties of CFC and HCFC refrigerants, and of their global warming potential, led to the Montreal Protocol, which scheduled the phase-out of CFCs by the end of 1995 and of HCFCs by 2030. The production of these refrigerants has fallen dramatically in recent years. Researchers have recently focused on the development of new refrigerants to replace CFCs and HCFCs. These new working fluids are synthetic compounds, namely hydrofluorocarbons (HFCs). Although the ozone depletion potential of some of them is zero, their global warming potential related to the greenhouse effect can be large.
An alternative to HFCs is the use of naturally occurring substances (refrigerants) like ammonia, carbon dioxide, methanol, water and air. Consequently, from the 1970s, interest in solid-vapour adsorption systems was rekindled in view of their energy-saving potential in air conditioning and heat pump applications. Along with a consideration for energy efficiency, increasing attention was given to the use of waste heat and solar energy. Adsorption technologies have also been used extensively for the separation and purification of gases for the past few decades, but their exploitation for refrigeration is still limited. This has led to sorption technology receiving renewed attention due to environmental concerns. New classes of adsorbent-adsorbate pairs, using zeolite, silica gel or activated carbon, are gaining importance because they can replace CFC refrigerants. The advantages of such systems in comparison with conventional compression systems are:
• Adsorption systems are environmentally friendly.
• They can use heat rather than electricity as the primary energy source.
• They have no moving parts.
• They need no solution pumps.
• They are silent and easy to maintain.

Published by: Diwakar Srivastava

Research Area: HVAC

Review Paper

96. Analysis of Gait Recognition Using SVM and SURF Algorithms: A Review

Gait is a potential behavioural feature, and many studies have demonstrated that it has rich potential as a biometric for recognition. Vision-based posture recognition has the potential to be a natural and powerful tool supporting an efficient, intuitive interface for HCI. In this paper a survey of recent human gait recognition systems is presented; its purpose is to introduce the visual interpretation of gait recognition as a mechanism for identifying humans in biometric authentication applications. Simple feature selection with Hanavan's model reduces the computational cost significantly during training and recognition. These methods have been applied to frames of videos, some captured live and some from the ADSC-AWD database. In visual surveillance systems, human identification at a distance has recently gained more interest, and the advancement of computer vision techniques has further ensured that vision-based automatic gait analysis can be progressively achieved for training and testing purposes.

Published by: Chinu Sayal, Dr. Rajbir Kaur, Dr. Charanjit Singh

Research Area: Electronics and Communication

Research Paper

97. Task Scheduling in Cloud Computing

Cloud computing is the delivery of computing services—servers, storage, databases, networking, software, analytics and more—over the Internet (“the cloud”). Companies offering these computing services are called cloud providers and typically charge for cloud computing services based on usage, similar to how you are billed for water or electricity at home. You are probably using cloud computing right now, even if you don't realize it: if you use an online service to send email, edit documents, watch movies or TV, listen to music, play games or store pictures and other files, it is likely that cloud computing is making it all possible behind the scenes. Numerous applications that are very complex need parallel processing to execute jobs efficiently. Because of the synchronization and communication among processes that run in parallel, there is a reduction in the usage of CPU resources. So a number of jobs need to be executed with the available resources to achieve optimal performance, the least possible total completion time, lower processing cost and efficient utilization of resources. To accomplish these goals and achieve high performance, it is important to design and develop a multi-objective scheduling algorithm that schedules the tasks while satisfying the user's Quality of Service requirements. After studying and analysing the processing times of various low-level scheduling algorithms, an improved task scheduler is developed using the quality-of-service parameters of resource nodes and the priorities of the tasks. In order to achieve efficient consumption of cloud resources, the load-balancing problem is solved using an adaptive load-balancing algorithm. The evaluation parameters considered in the work include total processing cost, average waiting time and total processing time.
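The idea of combining task priorities with node QoS parameters can be sketched as follows. This is a hypothetical illustration in the spirit described above, not the paper's algorithm: the node and task fields and the QoS-weighted scoring rule are assumptions.

```python
# Priority- and QoS-aware greedy scheduling sketch: tasks are taken in
# order of user priority, and each is placed on the node whose
# QoS-weighted finish time is lowest.

class Node:
    def __init__(self, name, speed, qos):
        self.name, self.speed, self.qos = name, speed, qos
        self.busy_until = 0.0          # time at which this node becomes free

def schedule(tasks, nodes):
    """Assign tasks, highest user priority first.
    Returns a list of (task id, node name, finish time)."""
    plan = []
    for t in sorted(tasks, key=lambda t: -t["priority"]):
        # Pick the node minimising finish time divided by its QoS score.
        best = min(nodes,
                   key=lambda n: (n.busy_until + t["length"] / n.speed) / n.qos)
        finish = best.busy_until + t["length"] / best.speed
        best.busy_until = finish       # node stays occupied until the task ends
        plan.append((t["id"], best.name, finish))
    return plan
```

The makespan of the resulting plan is the maximum `busy_until` across nodes; an adaptive load balancer would additionally migrate queued tasks when one node's backlog grows.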

Published by: Mir Salim Ul Islam, Bhawana Rana

Research Area: Cloud Computing

Research Paper

98. A Study on Women Impact on Demonetization and Perspective of Economists, International Media

Demonetization refers to the withdrawal of a particular form of currency from circulation. On the eve of 8 November 2016, the government announced the demonetization of Rs. 500 and Rs. 1000 notes (almost 86% of the currency), which left everyone shocked. This sudden announcement led to chaos in every part of society. Though it was a bold decision taken by the Modi government, it affected many poor daily-wage workers. While some economists and politicians were in agreement with the move, others were against it. This paper reviews the implications of demonetization for women and rural people, along with the views of politicians, economists and the international media on demonetization.

Published by: D. Sai Sarika

Research Area: Demonetization

Research Paper

99. Comparative Analysis of Mobile Payment Applications

Cashless transactions have various benefits attached to them, such as a reduction in black money, an increase in the span of income tax and reduced crime rates. At the same time, concerns are being raised about the security and service of these applications. Various countries conduct more than 50% of their transactions through cashless means. Due to demonetization, already-existing mobile payment applications such as Paytm, Mobikwik and Freecharge have been brought into the limelight.

Published by: Sanchi Meena

Research Area: Cashless Economy

Research Paper

100. Automatic Detection of Diabetic Retinopathy using Deep Convolutional Neural Network

The purpose of this project is to design an automated and efficient solution that can detect the symptoms of DR from a retinal image within seconds and simplify the process of reviewing and examining images. Diabetic Retinopathy (DR) is a complication of diabetes caused by changes in the blood vessels of the retina, and it is one of the leading causes of blindness in the developed world. Currently, detecting DR symptoms is a manual and time-consuming process. Recently, fully connected and convolutional neural networks have been trained to achieve state-of-the-art performance on a wide variety of tasks such as speech recognition, image classification, natural language processing and bioinformatics. In our approach, we trained a deep convolutional neural network model on a large dataset consisting of around 35,000 images and used dropout layers to achieve higher accuracy.
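The two building blocks named above, convolution and dropout, can be illustrated with a minimal numpy sketch. This is not the project's network (which would stack many trained layers over retinal images); it only shows what one convolution pass and inverted dropout do.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation of a single-channel image:
    the output shrinks by (kernel size - 1) in each dimension."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise rectified linear activation."""
    return np.maximum(0.0, x)

def dropout(x, p, rng, train=True):
    """Inverted dropout: zero units with probability p during training and
    rescale survivors by 1/(1-p), so the expected activation is unchanged
    and no scaling is needed at test time."""
    if not train or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)
```

Randomly dropping activations this way during training is what discourages co-adaptation of features and improves accuracy on held-out images.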

Published by: Vishakha Chandore, Shivam Asati

Research Area: Deep Learning

Research Paper

101. Seismic Effect of Masonry Infill with Open and Shear Wall on Flat Slab Structures

In the design of tall structures it is essential that the structure be stiff enough to resist the lateral loads caused by wind and seismic motion. Lateral loads develop high stresses, produce sway movement or cause vibration. Therefore, the structure must have sufficient strength against vertical loads together with adequate stiffness to resist lateral forces. Much research has been carried out describing the suitability of various lateral-load-resisting systems against the deformation and shear exerted by seismic and wind forces. RC structures with shear walls and masonry-infill (MI) frames have been recognized as among the most efficient structural systems for this purpose. A flat slab is a typical type of construction in which a reinforced slab is built monolithically with the supporting columns and is reinforced in two or more directions, without any provision of beams. Flat slab structures in areas of low seismicity (Zones I and II) can be designed to resist both vertical and lateral loads as permitted by code IS: 1893(Part 1)-2002. However, for areas of high seismicity (Zones III, IV and V), the code does not permit flat slab construction without a lateral-load-resisting system. In a building having frames (without beams) and with shear walls, the frames are designed for at least 25% of the seismic force, and 75% is taken by the shear walls. If the effects of lateral load analysis and other design features are to be studied in flat slabs, punching shear is a matter of concern for any structural designer.

Published by: Janardanachar M. H, D. Prakash

Research Area: Civil Engineering

Research Paper

102. Human factors in Safe Construction of Mega Projects in India

Construction is a hazardous process responsible for severe and fatal accidents around the world. Global studies on Occupational Health and Safety (OHS) aspects of the construction industry have also reiterated the need to improve safety culture. Mega projects are characterized by vast capital, an unorganized workforce, technically complex construction and a significant impact on socioeconomics and community development; examples include highways, power projects, residential buildings, airports and industrial parks. The construction industry is mainly skill-oriented and labour-intensive. This paper studies the safety practices prevailing in the construction of infrastructure power projects in various regions of India over the past 15 years. The study mainly focuses on a survey of the human practices responsible for behaviour at work: unsafe acts and conditions at the workplace, when uncontrolled, lead to incidents and accidents. The elements identified in the construction process are factored, and a safety culture model is developed. The model is based on rock-bottom principles, rigorous approaches and rigid hazard management practices, and is suitable for the construction industry.

Published by: Samarth Ramprasad .K, Prabhat Kumar

Research Area: Management Sciences

Research Paper

103. Biosorption of Heavy Metal Using Bacteria Strain and Its Optimization

Soil and water pollution is becoming one of the major burdens of modern Indian society due to industrialization. Though there are many methods to remove heavy metals from polluted soil and water, biosorption is one of the best scientific methods for removing heavy metals from water samples by using biomolecules and bacteria. Biosorbents have the ability to bind heavy metals and can therefore remove them from polluted water. In this work, we took a water sample from Bellandur Lake, Bangalore, which is highly polluted by the industries beside the lake. The water sample was serially diluted to 10⁻⁷. The 10⁻⁴ and 10⁻⁵ diluted samples were allowed to stand for 24 hours on Tryptone Glucose Extract agar media mixed with different concentrations of lead acetate, and microflora growth was observed. We then cultured the isolates at different temperatures, pH values and ages of culture media. Finally, we carried out biochemical tests to identify the bacterial isolate to the genus level: it could be either Streptococcus sp. or Enterococcus sp.

Published by: Dhondup Namgyal, A. Chandra, G. Reddy, K. Kumar

Research Area: Toxicology

Research Paper

104. Synthesis of Zirconium Phosphosilicate and its Application as Inorganic Ion-Exchanger for Adsorption and Radiochemical Separation of Indium(III)

In the present investigation, zirconium phosphosilicate has been synthesized and characterized for its composition. A rapid and selective method has been developed for the radiochemical separation of In(III) from other elements by adsorbing it on zirconium phosphosilicate as an ion-exchanger in a batch process, with 114mIn used as a tracer. The optimum time of contact and pH for the adsorption of 1 mg of In(III) have been found to be 5.0 min and 4.0, respectively. The adsorption was found to be maximum (94.48 ± 2.8%) with 200 mg of the exchanger. The interference of various anions in the adsorption of In(III) under optimum adsorption conditions has been studied, as has the adsorption of various other cations in the presence of In(III). The distribution coefficient values and the decontamination factor have been evaluated.
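The batch-process quantities evaluated in such tracer studies can be sketched numerically. This is an illustrative example only: the activity counts and the 50 mL solution volume below are assumed numbers, not the paper's data; only the 94.48% figure and the 200 mg exchanger mass echo values quoted in the abstract.

```python
# Percent adsorption and distribution coefficient (Kd) for a batch
# radiotracer experiment, from activities measured before and after
# contact with the exchanger.

def percent_adsorption(A_initial, A_final):
    """Percentage of tracer activity removed from solution."""
    return 100.0 * (A_initial - A_final) / A_initial

def distribution_coefficient(A_initial, A_final, volume_ml, mass_g):
    """Kd = (activity taken up per g exchanger) / (activity left per mL)."""
    return ((A_initial - A_final) / mass_g) / (A_final / volume_ml)

# Example: 94.48 % adsorption with 200 mg (0.2 g) exchanger in an assumed 50 mL
pct = percent_adsorption(10000.0, 552.0)
Kd = distribution_coefficient(10000.0, 552.0, 50.0, 0.200)
```

High Kd values for In(III) relative to interfering ions are what make a rapid, selective separation possible.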

Published by: Dr. S. D Ajagekar

Research Area: Chemical Sciences

Research Paper

105. Effect of Bracings on Multistoried RCC Frame Structures under Dynamic Loading

Braced frames are known to be efficient structural systems for buildings under high lateral loads such as seismic or wind loading. A potential advantage of the bracing system is the comparatively small increase in mass associated with the retrofitting scheme, since added mass is a great problem for several retrofitting techniques. In this study, buildings are modelled as regular and vertically irregular, with and without bracing systems. Response spectrum analysis is carried out for each model with different types of bracing, and the results for the regular and irregular structures are compared in terms of displacement, storey drift and storey shear.

Published by: Rakshith K. L, Smitha

Research Area: Structural Engineering.

Research Paper

106. Detecting and Overcoming the Black Hole in MANET

The nodes in mobile ad hoc networks are prone to several attacks because these networks are decentralized and any node can join or leave the network. So if an attacker wants to steal information from the network, a malicious node can be deployed very easily. One of the many possible attacks is the black hole attack: the black hole node tells the source node that it has the shortest route to the destination even when it has none, so the source node forwards packets along that path, where they are dropped and never reach the destination node. The proposed scheme detects the black hole attack based on the maximum sequence number that should be received for each path. If, in any route reply message, the sequence number is greater than this maximum, the source node rejects the reply on that path. The existing and proposed schemes were implemented in Network Simulator 2.35, and the performance of the network was analysed on the basis of packet delivery ratio, throughput and remaining energy. These parameters showed improvement over the existing scheme.
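The sequence-number check described above can be sketched as follows. The class and method names are illustrative assumptions, not the authors' NS-2.35 simulation code; the sketch only shows the acceptance rule for route replies.

```python
# Black hole detection sketch: the source node tracks the maximum
# legitimate destination sequence number seen per path and rejects any
# route reply that advertises a fresher (larger) sequence number, since
# a black hole typically claims an inflated sequence number to win routes.

class SourceNode:
    def __init__(self):
        self.max_seq = {}              # path id -> highest legitimate seq no.

    def update_max_seq(self, path, seq):
        """Record sequence numbers learned from normal protocol traffic."""
        self.max_seq[path] = max(self.max_seq.get(path, 0), seq)

    def accept_reply(self, path, reply_seq):
        """Accept a route reply only if its sequence number is plausible."""
        return reply_seq <= self.max_seq.get(path, 0)

src = SourceNode()
src.update_max_seq("A-B-D", 42)
ok = src.accept_reply("A-B-D", 40)     # plausible reply: accepted
bad = src.accept_reply("A-B-D", 9999)  # suspiciously fresh: rejected
```

Rejected replies are simply discarded, so packets are never forwarded towards the suspected black hole node.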

Published by: Irfan Ahmad Wani, Pooja Garg

Research Area: Computer Science

107. A Comparative Study of Swalpa Masha Taila Nasya and Swalpa Masha Taila Uttarabaktika Snehapana in the Management of Avabahuka

Avabahuka is one among the Vatavyadhis, which hampers the normal function of the upper limbs due to vataprakopa; shoola and bahupraspandana are the main lakshanas of Avabahuka. Snehana (Bruhmana) chikitsa is the prime treatment for Vatavyadhi: "Vatasyopakramah snehah swedah sashodhanah mruduh" (A.H.Su. 13/1). Swalpamasha taila has a particular vatahara property because of its guru, ushna, and snigdha gunas. Acharya Charaka and Vagbhata have mentioned that Nasya and Uttarabhaktika snehapana are more effective in Avabahuka. The study was conducted to assess the efficacy of Nasyakarma and Uttarabhaktika snehapana with swalpamasha taila in Avabahuka.

Published by: Dr. Vikramaditya Jangir

Research Area: Ayurveda

108. A General Study of By-Products of Sericulture

Sericulture is an agro-based industry which involves raising food plants for silkworms, rearing silkworms for the production of cocoons, and reeling and spinning cocoons for the production of yarn, together with value-added activities such as processing and weaving. Sericulture is a part-time family occupation, mainly for poor people below the poverty line. Silk is an animal protein secreted by the fifth-instar larva for spinning the cocoon. The cocoon acts as a protective covering for the delicate caterpillar, which passes the pupal stage inside it and metamorphoses into an adult moth.

Published by: Ameet Singh

Research Area: Sericulture

Research Paper

109. Environmental Jurisprudence – A Journey from Vedic Culture to Supreme Court

Human civilized societies can flourish only when there is consistency and harmony among the various members of the Earth community. Earth jurisprudence is a part of legal philosophy and of human governance based on the idea that the welfare of each member of the community depends on the welfare of the Earth as a whole; it seeks to establish the relevance of the Earth community, and one area in which it operates is the environment. The basic sense of environmental protection is intrinsic to our ancient texts: at the international level it is a concept not more than a century old, but in Hinduism it was included in the ancient texts. As a result of international and national judicial interventions, many principles have evolved that are potentially applicable to all member nations of the international community in respect of the protection of all aspects of the environment. International legal instruments have played a vital role in fostering environmental law, environmental conservation, and sustainable development, and a number of treaties have been signed to achieve them. Numerous provisions are enshrined in our Constitution to prevent harm to and protect the environment; environmental protection is a Fundamental Right, and a duty has also been cast upon persons to preserve and conserve it. Our judiciary, by passing various remarkable judgments and elucidating various principles, has been deeply and actively involved in achieving the targets of environmental jurisprudence, which has expanded through the setting up of committees, the laying down of principles, and the creative and innovative thinking of various courts and judges.

Published by: C. S Priyanka Maheshwari

Research Area: Legal Education

Review Paper

110. Review on Distribution System Reconfiguration for Minimizing Losses and Utilization of DG for Improvement in Voltage Profile

Power distribution networks are mostly operated in a radial configuration, but the dynamics of distribution system operations often require reconfiguration of the network. Distribution network reconfiguration is achieved by using sectionalizing switches, which remain normally closed, and tie switches, which remain normally open. The main purpose of reconfiguration is to minimize active power losses and thereby improve distribution system performance. This paper addresses performance enhancement of a distribution network with distributed generator (DG) integration using a modified multi-objective genetic (MG) algorithm. The aim of network reconfiguration is to minimize active power losses and to improve voltage quality. The constraints of the network reconfiguration problem are the load flow equations, upper and lower limits on bus voltages, and upper and lower limits on line currents. Performance enhancement is pursued through optimization of the distribution network configuration, whose objective is to minimize active power loss and improve the voltage profile while maintaining the radial structure of the network. In this study, the configuration optimization method is based on a modified MG algorithm.
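The genetic-algorithm loop behind such configuration optimization can be sketched generically. Here a switch configuration is encoded as a bit string and the loss function is a placeholder; the real problem would evaluate load flow, voltage limits, and radiality, none of which are modelled in this toy example.

```python
import random

def evolve(loss_fn, n_bits=8, pop_size=20, generations=50, seed=1):
    """Minimal elitist GA: selection, one-point crossover, bit-flip
    mutation. loss_fn maps a bit-string configuration to a scalar loss."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss_fn)                  # lower loss is fitter
        survivors = pop[: pop_size // 2]       # elitism: keep the best half
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_bits)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.2:             # occasional bit-flip mutation
                i = rng.randrange(n_bits)
                child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=loss_fn)
```

With a loss function counting closed tie switches, for instance, the loop drives the population toward the lowest-loss configuration it can find.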

Published by: Vibhuti, Shavet Sharma

Research Area: Power System

Research Paper

111. Speech Enhancement Using Dual Transform-Normalized LMS Algorithm for Speech Recognition Application

Speech enhancement is one of the important preprocessing techniques required for any speech processing application. This work presents dual-channel speech enhancement, in which the desired signal is available and the input noisy speech is enhanced based on this reference signal; such a method can be used in robots, where the machine must recognize commands given by a human. In this paper, the noisy and desired speech signals are dual-transformed using the discrete cosine transform and the Hadamard transform and applied to an adaptive filter using the Normalized Least Mean Square (NLMS) algorithm. In NLMS the step size parameter varies with the input signal, whereas in LMS the step size is constant. This variable step size leads to faster convergence of the noisy speech towards the desired speech, and the enhanced signal performs better than with the conventional LMS algorithm. The performance analysis is carried out through various subjective and objective measures.
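The NLMS update at the core of this scheme can be sketched as below. The dual-transform (DCT/Hadamard) preprocessing from the paper is omitted, and the filter length and step-size values are illustrative assumptions; the point is only how the step size is normalized by the input power, unlike plain LMS.

```python
def nlms(noisy, desired, num_taps=8, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive filter: the effective step size mu/power
    scales with the input signal power, unlike LMS's constant step."""
    w = [0.0] * num_taps            # filter weights
    buf = [0.0] * num_taps          # tap delay line
    out = []
    for x, d in zip(noisy, desired):
        buf = [x] + buf[:-1]        # shift newest sample into the taps
        y = sum(wi * xi for wi, xi in zip(w, buf))
        e = d - y                   # error against the reference signal
        power = sum(xi * xi for xi in buf) + eps
        step = mu / power           # normalization -> faster convergence
        w = [wi + step * e * xi for wi, xi in zip(w, buf)]
        out.append(y)
    return out, w
```

Fed the reference signal itself as the desired output, the filter output converges to the input, which is a quick sanity check of the update rule.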

Published by: Dr. D. Deepa, Dr. C. Poongodi

Research Area: Speech Signal Processing

Research Paper

112. A Study on Direct Oxidation of O-Toluidine by Potassium Bromate

O-Toluidine is a very specific reagent used widely in various laboratories. It finds a place in forensic medicine laboratories, where human blood stains on various objects are tested for crime detection. Besides this, O-Toluidine is used in the synthesis of many useful organic compounds, and a valuable report has been published on its germicidal action. The selection of O-Toluidine for mechanistic studies therefore seems interesting and inviting.

Published by: Tapas Ghosh

Research Area: Chemistry

Research Paper

113. IOT Applications for Indian Based Farming and Hospitality Industry

Cloud computing and the Internet of Things (IoT), the interconnection via the Internet of computing devices embedded in everyday objects that enables them to send and receive data, are hot topics in Indian farming and the hospitality industry. Cloud computing, the practice of using a network of remote servers hosted on the Internet to store, manage, and process data rather than a local server or personal computer, provides software usage, data access, data storage, and other computation through the Internet, and lets customers rent resources on a pay-as-you-go model. Motivated by the goal of a sustainable world, this paper also surveys technologies and issues in green cloud computing, which refers to the potential environmental benefits that information technology (IT) services delivered over the Internet can offer society, while the green IoT targets a sustainable smart world by reducing the energy consumption of IoT. The paper further discusses the reduction in energy consumption achieved by combining cloud computing and the Internet of Things in agriculture and healthcare systems, and examines the green information and communications technologies (ICTs) enabling the green Internet of Things. It first establishes the green computing background and then focuses on current work relating to the two emerging technologies in both agriculture and healthcare, presenting an Internet of Things application for Indian farming and the hospitality industry via a sensor-cloud integration model. Finally, it lists the benefits, challenges, and future research directions of green application design. This work aims to broaden the green computing area and contribute to a sustainable application world.

Published by: V. Sasikumar, Dr. S. Priya

Research Area: Networking

Research Paper

114. Partial Replacement of Fine Aggregates with Waste Glass

The use of Waste Glass has recently gained popularity as a resource-efficient, durable, cost-effective, sustainable option for many types of Portland cement concrete (PCC) applications. The production of Portland cement is not only costly and energy-intensive, but it also produces large amounts of carbon dioxide. With the use of waste glasses available around the world at low costs, the use of Waste Glass seems to offer the best short-term solutions to rising river bed sand demand.

Published by: Ishan Srivastava, Dushyant Gupta, Sukhvinder Singh Sehmi, Kumar Shivam, Jhalki Bharadwaj

Research Area: Construction

Research Paper

115. A Critical Study on Make in India Program of NDA Government Special Reference To Tech India

Honorable PM Mr. Narendra Modi launched the 'Make in India' program at an event in Vigyan Bhawan, New Delhi on September 25, 2014. It is a major national initiative focused on making India a global manufacturing hub. The idea is to develop infrastructure without delays and make it very easy for companies to do business in India, along with supporting existing industries in the nation. The manufacturing sector, which currently contributes 15% of the country's Gross Domestic Product, needs to be raised to 25% in the next couple of years. Lengthy laws and regulations must also be eliminated, bureaucratic processes shortened, and the government made more transparent to the public. In this regime, manufacturing will improve investment opportunities, investment will improve skills, the need for specific skills will improve employment, employment will improve the purchasing power of the youth, and once again there will be demand for manufactured goods, their growth, and the growth of the nation. Apart from manufacturing initiatives, national investment schemes, the creation of industrial corridors, the development of smart cities, and FDI (Foreign Direct Investment) in important sectors like defense, construction, and railways are taken into consideration. To support the PM's mission there is a need for skill development in different sectors, based on current industry needs and today's changing environment. Flexibility in labor laws, the formation of new agencies, and the definition of objectives and roles are the key points taken into consideration.

Published by: Vikas Pathak, Dr. Pramesh Gautam

Research Area: Management

Research Paper

116. Multiple Face Detection for Colour Images

The emergence of high-resolution digital cameras for recording still images and video streams has had a significant impact on how communication and entertainment have developed in recent years. At the same time, Moore's law has made readily available tremendous computing power that only some 20 years ago was reserved for high-profile research establishments and intelligence services. These two tendencies have respectively called for and fostered the advent of unprecedentedly computation-heavy image processing algorithms, which in turn have allowed new processing of existing image-based material. Parallel to this technological development, the measures deployed in protection against attacks from the enemies of modernity call for more surveillance of the public space; as a result of this regrettable circumstance, more video cameras are installed in airports, in stations, and even on open streets in major cities. Whether the purpose is entertainment or dead-serious surveillance, tasks like detection and recognition of faces are solved using the same methods, and due to the varying and generally adverse conditions under which images are recorded, there is a call for algorithms capable of working in an unconstrained environment. In 2004 an article by Paul Viola and Michael J. Jones titled "Robust Real-Time Face Detection" was published in the International Journal of Computer Vision. The algorithm presented in this article has been so successful that today it is very close to being the de facto standard for solving face detection tasks, a success mainly attributed to its relative simplicity, fast execution, and remarkable performance. This report documents all relevant aspects of the implementation of the Viola-Jones face detection algorithm. The intended input for the face detection algorithm is any conceivable image containing faces, and the output is a list of face positions.
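A core building block of the Viola-Jones algorithm is the integral image, which lets any rectangular pixel sum (and hence any Haar-like feature) be evaluated with four array lookups. The sketch below shows only this building block, not the full cascade detector the report implements; the grid-of-lists image representation is an assumption for illustration.

```python
def integral_image(img):
    """Build a padded integral image: ii[y][x] holds the sum of all
    pixels above and to the left of (x, y) in the source image."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w-by-h rectangle with top-left corner (x, y),
    computed from just four lookups in the integral image."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]
```

A Haar-like feature is then simply the difference of two or three such `rect_sum` calls, which is what makes the detector fast enough for real-time use.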

Published by: Radhika Anchanala, Dr. G. Chenchu Krishnaiah

Research Area: Electronics & Communication Engineering

Review Paper

117. Review on Text Classification by NLP Approaches with Machine Learning and Data Mining Approaches

Natural language processing lies at the intersection of software engineering and linguistics and is concerned with the interactions between computers and natural languages. In principle, natural language processing should be a perfect fit for the human-computer interface. Natural language understanding is sometimes described as an AI-complete problem, since recognizing natural language appears to require extensive knowledge about the outside world and the ability to manipulate it. NLP shares fundamental elements with the field of computational linguistics and is often regarded as a sub-field of artificial intelligence. In this paper, the different techniques of text classification are reviewed.

Published by: Gurvir Kaur, Parvinder Kaur

Research Area: Text Mining

Case Study

118. Study on FM/FM/1 Queueing System with Pentagon Fuzzy Number using α Cuts

This paper studies the FM/FM/1 queueing system with pentagon fuzzy numbers using the α-cut method. The arrival rate and service rate are fuzzy in nature, and the performance measures are analyzed as pentagon fuzzy numbers. A numerical example illustrates the efficiency of the system.
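The α-cut approach can be illustrated on the mean system size L = λ/(μ − λ) of an M/M/1-type queue. The five-point pentagon representation and the α = 0.5 split used below follow one common convention and are assumptions, not necessarily the paper's exact definitions; the example only shows how interval arithmetic on α-cuts turns fuzzy rates into a fuzzy performance measure.

```python
def pentagon_alpha_cut(p, alpha):
    """Interval (lo, hi) of a pentagon fuzzy number p = (a1..a5) at
    level alpha, with linear sides meeting at alpha = 0.5."""
    a1, a2, a3, a4, a5 = p
    if alpha <= 0.5:
        lo = a1 + 2 * alpha * (a2 - a1)
        hi = a5 - 2 * alpha * (a5 - a4)
    else:
        lo = a2 + 2 * (alpha - 0.5) * (a3 - a2)
        hi = a4 - 2 * (alpha - 0.5) * (a4 - a3)
    return lo, hi

def fuzzy_mean_system_size(lam_p, mu_p, alpha):
    """Interval of L = lambda / (mu - lambda) at level alpha.
    Assumes a stable queue: the whole mu interval exceeds lambda's."""
    l_lo, l_hi = pentagon_alpha_cut(lam_p, alpha)
    m_lo, m_hi = pentagon_alpha_cut(mu_p, alpha)
    # L increases in lambda and decreases in mu, so the extremes pair up:
    return l_lo / (m_hi - l_lo), l_hi / (m_lo - l_hi)
```

At α = 1 both rate intervals collapse to their central values and the fuzzy result reduces to the crisp M/M/1 figure, which is a useful consistency check.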

Published by: K. Usha Madhuri, K. Chandan

Research Area: Operational Research

Review Paper

119. Bio Inspired Technique to Improve the Performance of VANETs

The Vehicular Ad-Hoc Network, or VANET, is a technology that uses moving cars as nodes to create a mobile network. VANET turns each participating car into a wireless router or node, permitting cars within roughly 100 to 300 metres of each other to connect and, in turn, form a network with an extensive range. Since vehicular ad hoc networks must transmit safety-related messages most of the time, the successful dissemination of these messages is very important. This paper therefore aims to improve the performance of the network by improving the packet delivery rate using the concept of ant colony optimization combined with the firefly algorithm. The proposed scheme has been implemented in NS-2.35, and its performance has been measured using packet delivery ratio, throughput, and routing overhead. These parameters showed an improvement over the existing scheme.
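The ant-colony component of such a scheme rests on a pheromone update over candidate routes, sketched below. The firefly hybridization is omitted and the parameter names (evaporation rate, deposit constant) are assumptions for illustration, not the paper's tuned values.

```python
def update_pheromone(tau, paths, evaporation=0.5, q=1.0):
    """One ACO iteration: evaporate pheromone on every known edge,
    then deposit on the edges of each discovered path in inverse
    proportion to its cost, so cheaper routes attract more traffic.

    tau:   dict mapping an edge (u, v) to its pheromone level
    paths: list of (route, cost) pairs found by the 'ants'
    """
    for edge in tau:
        tau[edge] *= (1 - evaporation)        # evaporation step
    for route, cost in paths:
        deposit = q / cost                    # cheaper path -> larger deposit
        for edge in zip(route, route[1:]):    # consecutive node pairs
            tau[edge] = tau.get(edge, 0.0) + deposit
    return tau
```

Repeating this update concentrates pheromone on reliable low-cost routes, which is the mechanism the abstract leans on to raise the packet delivery ratio.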

Published by: Shivani Sharma

Research Area: Engineering and Technology

Research Paper

120. Qualitative Risk Assessment and HAZOP Study of a Glass Manufacturing Industry

In recent years, many industries have realized that maintaining good occupational health and safety is as important as meeting production requirements. Hence there is considerable scope for risk assessment to protect workers' health and safety at the workplace. Risk assessment is a process in which identified hazards are evaluated to determine the potential causes of an accident and to reduce the risk to the lowest reasonable level; it is a part of risk management. Risk assessment can be categorized into qualitative and quantitative risk assessment, carried out using different techniques. In this paper, an attempt has been made to carry out a qualitative risk assessment, i.e. Hazard Identification and Risk Assessment, and a HAZOP study for the identified critical areas of a glass manufacturing industry. From the results obtained, physical hazards account for 47%, ergonomic hazards 10%, chemical hazards 9%, electrical hazards 2%, biological hazards 3%, and thermal hazards 29%. Risks are also categorized into low, medium, and high, at 12.09%, 45.05%, and 42.06% respectively. Hazard analysis from the design intent for the silane and ethylene gas station is also studied using the HAZOP technique, and some control valves are suggested for installation before the gas filter. ALOHA is also used to simulate the model and analyse the impact of leakage from various potential hole areas in the natural gas pipeline; leakage from the largest hole, of about 324.51 sq. cm, showed a highly hazardous zone, lethal to human beings, extending 12 m downwind. However, periodic maintenance and monitoring of the natural gas distribution pipelines, along with a gas detection system and a fire hydrant system with sprinklers, can prevent a disaster.

Published by: Yadhushree B. J, Shiva Kumar B. P, Keerthi D’ Souza

Research Area: Occupational Health and Safety

Research Paper

121. Role of Tamas in the Manifestation of Nidra and a Study on Incidence of Sleep Patterns in Health and Disease

Abstract not visible on author's request.

Published by: Dr. Priyanka

Research Area: Ayurveda

Research Paper

122. Relevance of Paradi Gunas with Special Reference to Samskara and Abhyasa in Understanding and Management of Santarpana Nimittaja Vyadhi

Abstract not visible on author's request.

Published by: Dr. Ragini Bhardwaj

Research Area: Ayurveda

Research Paper

123. A Research on Fault Detection and Diagnosis of Rolling Bearing

Mechanical failure prevention and condition monitoring have been among mechanical engineers' concerns in recent years due to personal safety, reliability, failure cost, and equipment downtime issues. A proper system failure prevention process helps to reduce the possibility of system malfunction and aids the identification of root causes and troubleshooting. The use of novel sensors, such as air-coupled ultrasonic transducers, eddy current sensors, and piezoelectric ultrasonic transducers, as diagnostic tools for the detection of bearing faults has been investigated. A series of experiments was carried out in a laboratory environment, with localized defects of different sizes created intentionally on the test bearing components to simulate evolving cracks or other related faults. Four different signal processing techniques were applied to extract the signal features. The resulting data for different bearing speeds and loads showed that the sensors are capable of detecting different types of defects located on the bearing components.
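Two time-domain features commonly used in bearing-fault signal analysis of this kind are RMS and kurtosis; the abstract does not name its four signal processing techniques, so the sketch below is an assumed illustration of the general feature-extraction step, not the paper's method. Kurtosis is particularly telling because it rises sharply when impulsive defect impacts appear in an otherwise smooth vibration signal.

```python
import math

def rms(signal):
    """Root-mean-square level: tracks overall vibration energy."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def kurtosis(signal):
    """Fourth standardized moment (Pearson definition): about 1.5 for
    a pure sinusoid, 3 for Gaussian noise, and much higher when the
    signal contains sharp defect impulses."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    m4 = sum((x - mean) ** 4 for x in signal) / n
    return m4 / (var ** 2)
```

On a smooth vibration trace with periodic impacts superimposed, kurtosis jumps well above its baseline value, which is the signature such monitoring systems look for.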

Published by: Ankur Gill

Research Area: Mechanical Engineering