International Journal of Computer Science and Technology
Vol. 7.3, Ver. 1 (July-Sept 2016)


A Real-Time Denial-of-Service Attack Detection System for Networks via Multivariate Correlation Analysis

G Bhargav, S Ravi Kanth


The reliability and availability of network services are increasingly threatened by the growing number of Denial-of-Service (DoS) attacks, so effective mechanisms for DoS attack detection are in demand. We therefore present a DoS attack detection system that uses Multivariate Correlation Analysis (MCA) for accurate network traffic characterization by extracting the geometrical correlations between network traffic features. The MCA-based detection system applies the principle of anomaly-based detection to attack recognition, which makes the solution capable of detecting both known and unknown DoS attacks effectively by learning the patterns of legitimate network traffic only; the system also detects several types of viruses. Furthermore, a triangle-area-based technique is proposed to enhance and speed up the MCA process. Because of the shared nature of the medium in wireless networks, mounting an attack is all too easy, which makes accurate and fast traffic characterization especially valuable.
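As a rough illustration of the triangle-area-based MCA idea, the sketch below builds a triangle-area map (TAM) for each traffic record, where the entry for feature pair (i, j) is the area of the triangle formed by the origin and the record's projections onto the i- and j-axes, then scores records by their Mahalanobis distance from a profile learned on legitimate traffic only. This is a minimal sketch under assumed feature values, not the authors' implementation.

```python
import numpy as np

def triangle_area_map(x):
    """TAM: for each feature pair (i, j), 0.5 * |x_i| * |x_j|, the area of the
    triangle spanned by the origin and the record's axis projections."""
    n = len(x)
    tam = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            tam[i, j] = tam[j, i] = 0.5 * abs(x[i]) * abs(x[j])
    return tam[np.triu_indices(n, k=1)]  # upper triangle as a flat vector

def fit_normal_profile(records):
    """Learn the mean and covariance of TAM vectors from legitimate traffic only."""
    tams = np.array([triangle_area_map(r) for r in records])
    return tams.mean(axis=0), np.cov(tams, rowvar=False)

def anomaly_score(x, mean, cov):
    """Mahalanobis distance of a record's TAM from the normal profile."""
    d = triangle_area_map(x) - mean
    return float(np.sqrt(d @ np.linalg.pinv(cov) @ d))
```

A record whose pairwise feature correlations deviate from the learned profile (as during a flood) scores far higher than legitimate traffic, which is the anomaly-detection principle the paper relies on.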
Full Paper


Facilitating Document Annotation Using Content and Querying Value

Joel Fedric Lodi, P Anuradha


Searching the World Wide Web can be both profitable and inadequate. Collections of textual data contain large amounts of structured information that remains hidden in unstructured form, and significant information is hard to find in such documents: a search may return endless amounts of information, or none of the kind being sought, and not all of it will be useful or of the highest quality. In this paper we consider an alternative approach to content and query searching based on facilitating document annotation with structured metadata: documents in the overall system that are likely to contain information of interest are identified, and this information is then useful for querying the database. Publishers are prompted to assign metadata, structured or unstructured, to the documents they upload, which helps users retrieve those documents. Many organizations today generate and share textual data about their products, services, and activities; such collections contain significant amounts of structured information that remains buried in the unstructured text. Information-extraction algorithms can extract structured relations, but they do so expensively and inaccurately, especially when run over text that contains no instances of the targeted structured information. The alternative approach rests on the idea that people are more likely to add the essential metadata at creation time, for instance when prompted by the interface; it should be much easier for people (and/or algorithms) to identify the metadata when such information actually exists in the document, rather than naively prompting users to fill in forms with information that is not present in it. Several algorithms identify structured attributes that are likely to appear within the document by jointly exploiting the content of the text and the query workload.
Full Paper


Search Optimization Using Dynamic Query Forms for Database Queries

Keerthi Pushpavalli, A Swathi


Conventional data mining technologies cannot cope with huge, heterogeneous, unstructured data. Modern web databases and scientific databases maintain massive, heterogeneous data; these real-world databases may contain hundreds or even thousands of relations and attributes. The query form is one of the most widely used interfaces for querying databases. Traditional query forms are designed and predefined by a developer or DBA in various data management systems, but extracting useful information from large datasets and data streams with such static forms is impractical, and it is difficult to design a set of static query forms that satisfies the diverse ad-hoc queries posed on these complex databases. In this paper we propose a search optimization framework using dynamic query forms, SODQF. SODQF is a query interface that dynamically generates query forms for the user. Unlike conventional document retrieval, users of database retrieval are often willing to perform many rounds of interaction before identifying the final results. The dynamic query form captures the user's interest during the interaction and adapts the form accordingly. Every iteration consists of three kinds of user interaction: selection from a collection of forms, query form renovation, and query execution. The query form is refined repeatedly until the user is satisfied with the query results. In this paper we mainly focus on the dynamic generation of query forms and the ranking of query form components.
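One simple way to rank query-form components, in the spirit of the abstract's "ranking of query form components", is to score candidate attributes by how often they occur in the historical query workload and surface the top-scoring ones in the next form iteration. The function and workload below are illustrative assumptions, not SODQF's actual scoring model.

```python
from collections import Counter

def rank_form_components(workload, k=3):
    """Rank candidate query-form attributes by frequency in the historical
    query workload; the top-k become components of the next form iteration."""
    counts = Counter(attr for query in workload for attr in query)
    return [attr for attr, _ in counts.most_common(k)]

# Hypothetical workload: each past query listed the attributes it filtered on.
workload = [["name", "price"], ["price", "category"], ["price"], ["name", "category"]]
print(rank_form_components(workload, k=1))  # ['price']
```

A real system would refine these scores with the user's clicks during each iteration rather than relying on frequency alone.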
Full Paper


Effective Processing of the Apriori Algorithm for Data Mining on Medical Data

Potta Swathi, Bodapati Prajna


Data mining is one of the most inspiring areas of research and has become increasingly popular in health organizations. It plays an essential role in uncovering new trends in healthcare, which in turn helps all the parties involved in this field. This study investigates the utility of various data mining methods, such as classification, clustering, association, and regression, in the health domain. We give a brief introduction to these techniques and their advantages and disadvantages, and the survey also highlights applications, challenges, and future issues of data mining in healthcare. Recommendations regarding the suitable choice among available data mining methods are also discussed. The broad amounts of data stored in medical databases require the development of dedicated tools for accessing the data, data analysis, knowledge discovery, and effective use of stored data. Widespread use of medical information systems and the explosive growth of medical databases require that routine manual data analysis be combined with methods for efficient computer-assisted analysis. In this paper, I use data mining methods for data analysis, data access, and knowledge discovery to demonstrate, experimentally and practically, how consistent, capable, and fast these methods are in this particular field. A fixed numerical threshold (0 to 1) is set to analyze the data. The obtained result is tested by applying the approach to databases, data warehouses, and data stores of different sizes with different threshold values.
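The Apriori algorithm named in the title, with its 0-to-1 support threshold, can be sketched in a few lines: each level keeps only itemsets whose support clears the threshold, and only those survivors are joined into the next level's candidates. The medical transactions below are invented for illustration.

```python
def apriori(transactions, min_support):
    """Level-wise Apriori: only itemsets whose support (fraction of
    transactions containing them) clears min_support are extended."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)
    level = {frozenset([item]) for t in transactions for item in t}
    frequent = {}
    while level:
        support = {c: sum(c <= t for t in transactions) / n for c in level}
        survivors = {c for c in level if support[c] >= min_support}
        frequent.update({c: support[c] for c in survivors})
        # join step: merge survivors that differ in exactly one item
        level = {a | b for a in survivors for b in survivors
                 if len(a | b) == len(a) + 1}
    return frequent

# Hypothetical patient records: each set lists the conditions co-occurring in one record.
records = [{"hypertension", "diabetes"}, {"diabetes", "obesity"},
           {"hypertension", "diabetes", "obesity"}, {"hypertension"}]
frequent = apriori(records, min_support=0.5)
```

With a threshold of 0.5, the pair {hypertension, diabetes} survives (it appears in half the records) while {hypertension, obesity} does not, which is exactly the pruning that makes Apriori tractable on large medical databases.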
Full Paper


The Early Augmentation for Diabetes Diagnosis Using Data Mining Approaches

Baratam Yasaswi, Bodapati Prajna

In the medical field, practitioners need accessible data for decision making. Data mining techniques are now applied in medical research to analyze large volumes of medical data. This study uses data mining to investigate a diabetes database for diagnosis, with the aim of documenting better results, so the paper concentrates on the analysis of diabetes data by various data mining techniques. Modern medicine generates a great deal of data that is deposited in medical databases; a proper analysis of such data may reveal interesting facts that might otherwise remain hidden or dispersed. Data mining is one field that tries to extract such facts from huge data sets. In this paper an attempt is made to analyze a diabetic data set and derive interesting facts from it that can be used to build a prediction model. Disease diagnosis is one application where data mining tools are showing effective results. Diabetes has been a leading cause of death all over the world in past years, and several researchers are using statistical data. The availability of enormous amounts of medical data prompts the need for powerful mining tools to help healthcare experts in the diagnosis of diabetes. The use of data mining in the diagnosis of diabetes has been thoroughly explored, demonstrating acceptable levels of accuracy, and researchers have recently been examining the effect of hybridizing more than one technique, showing improved results in the diagnosis of the disease.
Full Paper


Enhancement of CURE Clustering Technique in Spatial Data Mining Using Oracle 11G

Snehlata Bhadoria, U. Datta

CURE clustering divides the data sample into groups by identifying a few representative points from each group of the data sample. This paper presents an enhanced CURE clustering technique for data mining: a specially designed pattern serves as the representative, making CURE more usable and efficient on big data. Oracle 11G is used as the backend for its salient big-data storage features. A supervised trained model analyzes this pattern to enhance the CURE clustering, which executes its function with specified parameters or values. The algorithm makes clustering easier and applicable to huge data by reducing time complexity.
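The representative-point step that distinguishes CURE from centroid-based methods can be sketched directly: pick a handful of well-scattered points per cluster, then shrink them toward the centroid by a fraction alpha, so elongated clusters are captured while outliers are damped. This is a minimal sketch of the classical CURE step, not the paper's enhanced variant or its Oracle 11G integration.

```python
import numpy as np

def cure_representatives(cluster, c=4, alpha=0.3):
    """CURE's core step: choose c well-scattered points of a cluster and
    shrink each toward the centroid by a fraction alpha."""
    pts = np.asarray(cluster, dtype=float)
    centroid = pts.mean(axis=0)
    # farthest-point heuristic, starting from the point farthest from the centroid
    reps = [pts[np.argmax(np.linalg.norm(pts - centroid, axis=1))]]
    while len(reps) < min(c, len(pts)):
        d = np.min([np.linalg.norm(pts - r, axis=1) for r in reps], axis=0)
        reps.append(pts[np.argmax(d)])
    return np.array([r + alpha * (centroid - r) for r in reps])
```

Clusters are then merged by the distance between their closest representatives rather than between centroids, which is what lets CURE find non-spherical groups.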
Full Paper


Adoption of E-Government Services Among Citizens in the Selected Districts of Tanzania

Mercy Mlay Komba

The government of Tanzania has been making efforts to provide its information and services through the use of information and communication technology. However, e-government adoption has been quite slow, and few studies explore e-government adoption in the Tanzanian context; the purpose of this paper is therefore to assess the factors that influence citizen adoption of e-government in Tanzania. A multiple linear regression analysis was performed to assess the relationship between the independent variables (trust, intention to use, relative advantage, image, compatibility, perceived ease of use, perceived usefulness, and social influence) and the dependent variable, e-government adoption (net benefit). The results indicate that social influence determines adoption of e-government in Tanzania. In light of these findings, researchers should conduct a similar study using other models of e-government adoption, in order to identify additional factors that influence adoption.
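The multiple linear regression described above can be sketched with ordinary least squares: stack a column of ones (the intercept) next to the predictor matrix and solve for the coefficients. The variable names and numbers below are illustrative, not the paper's survey data.

```python
import numpy as np

def fit_regression(X, y):
    """Ordinary least squares for a dependent variable (e.g. net benefit)
    against predictors such as trust, perceived usefulness, and social
    influence; returns [intercept, one coefficient per predictor]."""
    A = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    beta, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    return beta
```

The relative magnitudes (and significance tests, omitted here) of the fitted coefficients are what let the study single out social influence as the determining factor.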
Full Paper


A Distortion-Resistant Routing Framework for Video Traffic in Wireless Multihop Networks

Chaitanya Balagiri, Dr. R.China Appala Naidu

In general, traditional routing paths designed for wireless networks are application-agnostic. We consider an application flow in a wireless multihop network consisting of video traffic. Many routing metrics address the problem of video distortion, but popular link-quality-based metrics such as ETX do not account for the dependence of video frames across the links of a path, which can funnel a video through a few paths and cause high video distortion. In this paper, we construct an analytical framework to evaluate the video frame loss process and understand the impact of the wireless network on video distortion. This framework allows us to minimize video distortion by formulating a routing policy that accounts for the distortion a flow incurs end-to-end, distributing frames across paths according to priority. Testbed experiments show that our protocol is efficient in reducing video distortion and minimizing the degradation of user experience.
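For reference, the ETX metric the abstract contrasts itself against is simple to state: a link's expected transmission count is the reciprocal of the product of its forward and reverse delivery ratios, and a path's ETX is the sum over its links. The sketch below uses invented delivery ratios; note that the sum carries no information about how losses on successive links correlate for a video flow, which is precisely the gap the paper targets.

```python
def link_etx(df, dr):
    """Expected transmission count of one link, given the forward (df) and
    reverse (dr) packet delivery ratios measured by probing."""
    return 1.0 / (df * dr)

def path_etx(links):
    """ETX of a path is the sum of its links' ETX values."""
    return sum(link_etx(df, dr) for df, dr in links)

# Hypothetical two-hop path: one lossy link (50% forward delivery), one clean link.
print(path_etx([(0.5, 1.0), (1.0, 1.0)]))  # 3.0
```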
Full Paper


Client Assignment Problem for Continuous Distributed Interactive Applications

Methuku Archana, Dr. R.China Appala Naidu

In general, interactivity is the primary performance measure in Distributed Interactive Applications (DIAs), which enable clients at different locations to interact with each other in real time. DIA performance depends not only on client-to-server latencies but also on inter-server network latencies, as well as on the synchronization delays needed to meet the consistency and fairness requirements of DIAs. In this paper, we present an approach for effectively assigning clients to servers to maximize interactivity in DIAs, using two algorithms: Greedy Assignment and Distributed-Modify Assignment. Evaluation of the proposed algorithms shows that they produce near-optimal interactivity and reduce the interaction time between clients compared with the intuitive algorithm that assigns each client to its nearest server.
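A toy version of the greedy idea makes the contrast with nearest-server assignment concrete: assign clients one at a time to the server that minimizes the worst client-to-client interaction delay so far, where the delay between two clients crosses both their access links and the inter-server path. This is an illustrative sketch, not the paper's Greedy Assignment algorithm, and the latency tables are invented.

```python
def greedy_assign(clients, servers, lat, inter):
    """Assign each client to the server minimizing the maximum interaction
    delay seen so far: client -> its server -> other server -> other client."""
    assign = {}
    def worst_with(c, s):
        delays = [lat[c][s]]  # lone client: just its access latency
        delays += [lat[c][s] + inter[s][assign[o]] + lat[o][assign[o]]
                   for o in assign]
        return max(delays)
    for c in clients:
        assign[c] = min(servers, key=lambda s: worst_with(c, s))
    return assign
```

With two clients each near a different server, nearest-server and the greedy rule agree; the interesting cases are where a client accepts a slightly worse access latency to sit on a server closer (in inter-server terms) to its interaction partners.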
Full Paper


Cloud Integrated Sensor Node Enabling Internet of Things Ecosystem

Vivek Shah, Rohit Bhattar

With the advent of technology, the human mind has always sought easier ways of doing difficult tasks, and automating a task makes it easily done. Any cumbersome task that requires constant approval or is monotonous can be easily managed if automation is achieved; with time constant and workloads increasing exponentially, full automation is the only way out. In this paper we have tried to automate and process our surrounding activities. We used an Arduino UNO as the development board and, with the help of an Ethernet shield and the cloud service XOBXOB, connected it to the internet. We show that we not only have the power to access the data involved but can also use the data to take proper actions.
Full Paper


Design and Development of Magnetometer Based Real-Time Traffic Density Sensing System

Gargi Sharma, Jaspal Singh

Traffic in India is chaotic and unstructured; lack of lane discipline, narrow roads, illegal parking, and a wide variety of vehicles mean a lot of time gets wasted on the roads. Several in-roadway (pneumatic road tube, inductive loop detectors, magnetic sensors, piezoelectric sensors, and weigh-in-motion) and over-roadway (video image processor, microwave radar, infrared sensor, ultrasonic sensors, and passive acoustic array sensors) traffic density sensing systems exist for traffic density monitoring, though most are not suitable for the traffic scenario in India. Therefore, a different method of estimating traffic density on the road is required: one that is inexpensive, non-intrusive, and energy-efficient, with low operation and maintenance cost and no human intervention in day-to-day operations. For this purpose, a magnetometer-based sensing system is proposed, tested, and found suitable for use.
Full Paper

A New Frontier of Alternate Source of Energy: 'Human Energy Harvesting System' for a Wearable UV Exposure Meter

Neha, Dr. Sanjay. P. Sood

Rapid evolution in the domains of wearable devices and instrumentation has brought about a paradigm shift in the design and applications of sensing devices. Such has been the impact of these advances that they have pushed the frontiers of portability toward an upcoming class of devices called wearable devices. Wearable devices are being used to gather physiological, motion-based, and weather-related information centered on the wearer. This research aims at extending the state of the art in measuring exposure to UV (ultraviolet) radiation, which, beyond certain limits, is harmful to human skin. Following extensive research on the concepts and technologies used for designing UV exposure monitors, a unique wearable UV exposure meter has been designed. It is not only an energy-efficient wearable device but also uses a new source of energy: a mechanism that draws on an inbuilt human energy harvesting system. This research presents the design and development aspects of this wearable UV exposure meter, which also guides the wearer to an appropriate Sun Protection Factor (SPF) protection scheme.
Full Paper

Smart Security Model by Predicting Future Crime with GIS and LBS Technology on Mobile Device

Gaurav Kumar, P. S. Game

People always try to ensure that their loved ones are safe and secure. Because of the increase in crime over the past few decades, people feel unsafe in outside environments. A traditional method such as an FIR cannot predict which type of crime will occur in the future. GIS (Geographic Information System) technology is used in the analysis of the data and helps perform future prediction based on the gathered data. A decision tree algorithm is used for the classification of a criminal dataset; after classification, the crime rate for each state is calculated, and on the basis of the crime rate, future prediction can be made using k-means clustering. If the current location of the user lies in a dense crime cluster, an alert message is shown on the user's screen, and as per the user's need the message can be sent to one or all contacts. The system is implemented using a client-server architecture in which an Android phone acts as the client and a MySQL database is used on the server.
Full Paper
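The clustering-and-alert step described above can be sketched as plain k-means over crime coordinates followed by a density check around the user's location. The coordinates, radius, and density threshold below are invented for illustration; a deterministic initialization keeps the sketch reproducible.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Plain k-means over (x, y) crime coordinates; naive deterministic
    initialization from the first k points keeps the sketch reproducible."""
    pts = np.asarray(points, dtype=float)
    centers = pts[:k].copy()
    for _ in range(iters):
        labels = np.argmin(((pts[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([pts[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return centers, labels

def in_dense_cluster(user, centers, labels, radius=0.05, min_points=5):
    """Alert when the user's location lies within `radius` of a cluster
    center holding at least `min_points` incidents (thresholds illustrative)."""
    sizes = np.bincount(labels, minlength=len(centers))
    dist = np.linalg.norm(centers - np.asarray(user, dtype=float), axis=1)
    return bool(((dist <= radius) & (sizes >= min_points)).any())
```

In the described system, a positive check would trigger the alert message on the Android client, with the cluster data served from the MySQL backend.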

A Game-Theoretical Model of the Interactions Among Nodes Exploiting the Timing Channel to Achieve Resilience to Jamming Attacks

Penmatsa Naga Venkata Divya, Dr. R.China Appala Naidu

Reactive jamming attacks launched by an energy-constrained malicious node can be countered by using a timing channel, a simple logical communication channel in which information is encoded in the timing between events. The timing information cannot be jammed: even if a jammer is able to disrupt the attacked packets, the timing information is still delivered to the receiver over the jammed communication channel. Game theory can be used to model the interactions between the attacked nodes and the jammer. This paper proposes and analyzes a game-theoretical model of the interactions among nodes exploiting the timing channel to achieve resilience to jamming attacks. Importantly, the Nash equilibrium is studied under best-response dynamics with respect to existence, uniqueness, and convergence. In addition, considering both perfect and imperfect knowledge of the jammer's utility function, the jammer's reaction to the communication nodes' strategies is modeled and analyzed as a Stackelberg game. Finally, the paper presents numerical results and visualizes the impact of network parameters on system performance.
Full Paper

An Overview of Software Vulnerability Detection

Yunfei Su, Mengjun Li, Chaojing Tang, Rongjun Shen

Software vulnerability is the main cause of computer security problems, and software vulnerability detection has recently become a research hotspot. A great deal of research has been done on detection techniques, models, and tools, all of which is covered in the literature. The main purpose of this paper is to provide a comprehensive survey and analysis of past and current research directions, including static analysis, fuzzing, taint analysis, symbolic execution, and hybrid methods. In addition, this paper provides an analysis and comparison of different tools and discusses the future direction of this field.
Full Paper

A Framework on Hand Gesture Recognition Using Fuzzy-Logic with Backpropagation Neural Network Algorithm

Abhishek Patil, Kanchan Pathak, Saurabh Tagalpallewar, Tejas Bharambe

This paper presents a method of hand gesture sequence recognition through multi-feature criteria based on a predefined area in two-dimensional space, the presence of skin color, and non-motion for a defined period of time, using a web camera. Hand gesture recognition is an active topic of research and development for human-computer interfaces. Hand gestures naturally consist of movement and paused motion, and this technique focuses on the non-motion features of a gesture. It is easy for a human being to recognize the meaning of a common hand gesture, but it is difficult for a computer to accomplish the same task. A systematic method was developed to distinguish paused motion from hand movements so that pattern recognition techniques can be effectively utilized to interpret the gesture. We have compared the results of gesture recognition using two algorithms: the backpropagation feedforward algorithm and a neuro-fuzzy algorithm.
Full Paper

Global Wireless E-Voting System

J.E.Nivetha, K.Kiruthika

In the current state of technology, the voting machine is not very secure. The present electronic voting machine cannot determine whether a voter is eligible, and the whole control is kept in the hands of the officer in charge of voting. Another problem with the present voting machine is that anyone can change the vote count, because the count is kept in the machine itself. In the new system, with a thumbprint scanner as the candidate button in a Global Wireless E-Voting system, the eligibility of the voter can be verified by scanning the fingerprint pattern at the candidate button itself, and the result is not kept in the machine: it is instead stored on a remote server by transmitting it as radio waves. Consequently there is no opportunity to change the count and no chance for an illegal voter to vote, and if there is any problem in the voting machine, the continuity of the election process and the vote count are not harmed.
Full Paper

An Analysis of Key Dependent Image Steganography using Hybrid Edge Detection in Spatial Domain

Indu Maurya

Steganography means hiding the fact that communication is taking place by hiding information in some other information. Various carrier file formats can be used for this purpose, but digital images are the most popular. For hiding secret information or messages in images, there is a large variety of steganographic techniques, some more complex than others, and all of them have their respective strong and weak points; different applications have different requirements of the technique used. In this paper, we propose an improved, secured image steganography scheme that uses secret-key-based random LSB substitution. The message to be embedded is first encrypted with AES (Advanced Encryption Standard) to provide another level of security; the encrypted message bits are then hidden in two areas: smooth-area pixels and edge-area pixels. For hiding the message bits in the smooth area, LSB substitution is used, whereas a two-component-based LSB substitution technique is used for hiding message bits in the edge area, since edges can bear more variation than smooth areas without being detected. This method ensures a higher PSNR value and high embedding capacity.
Full Paper
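The basic LSB substitution underlying the scheme is easy to see in code: each message bit overwrites the least significant bit of one cover byte, so every pixel value changes by at most 1, which is why the PSNR impact is small. This sketch omits the paper's AES encryption, secret-key-based pixel ordering, and edge/smooth split.

```python
def embed_lsb(pixels, bits):
    """Overwrite the least significant bit of each cover byte with one
    message bit; each pixel value changes by at most 1."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract_lsb(pixels, n_bits):
    """Recover the message by reading back the low bit of each stego byte."""
    return [p & 1 for p in pixels[:n_bits]]
```

In the full scheme, the bits fed to `embed_lsb` would be AES ciphertext, and the pixel positions would be chosen by the secret key and the hybrid edge detector rather than sequentially.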

Modified Privacy Preserving Data Mining System for Improved Performance

Harshada A. Deshpande, Harshali P.Patil

Privacy of information and security issues have nowadays become a requisite because of big data. Privacy Preserving Data Mining (PPDM) presents a framework for extracting and deriving information when the data is distributed among multiple parties; its concern is to protect against the disclosure of information and its misuse. The major open issue with PPDM is how to use a coherent data mining algorithm while preserving the privacy of the data. Various PPDM techniques have been proposed; one of them is the DPQR (Data Perturbation & Query Restriction) algorithm, which is implemented only on Boolean data. In the proposed approach, the SVD (Singular Value Decomposition) data perturbation technique is applied for data modification: the raw data is discretized and the perturbed/distorted data is generated. The SVD technique improves the level of privacy protection by providing a higher degree of data distortion. The algorithm is applied to the perturbed data with association rule mining and Hamiltonian matrix concepts to find frequent itemsets; in this way the confidential data is preserved. The performance metrics for the approach are privacy level, efficiency, scalability, and data quality. The time required for calculating matrix inversion is reduced by using the Hamiltonian matrix. The main performance metric is the privacy level, and the privacy-preserving degree is improved by dimensional-reduction-based perturbation, i.e., multi-dimensional (SVD and NMF) data perturbation techniques. The novelty of the proposed method is that it is applied on numeric data and is expected to achieve parameters comparable to those shown on Boolean data.
Full Paper
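The SVD perturbation idea can be sketched in one function: reconstruct the data from only its top-k singular components, so the dominant correlation structure that mining algorithms use survives while individual values are distorted by the discarded components. This is a generic sketch of rank-k SVD perturbation, not the paper's full DPQR/Hamiltonian pipeline; the matrix is invented.

```python
import numpy as np

def svd_perturb(data, rank):
    """Rank-k SVD reconstruction: keep the top `rank` singular components
    (the mining-relevant structure) and drop the rest (the distortion that
    protects individual values)."""
    U, s, Vt = np.linalg.svd(np.asarray(data, dtype=float), full_matrices=False)
    s[rank:] = 0.0
    return U @ np.diag(s) @ Vt
```

Choosing a smaller `rank` increases distortion (higher privacy level) at the cost of data quality, which is exactly the trade-off the abstract's performance metrics measure.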

Big Data Analytics: A Review

Srikant Singh, Akhilesh Kumar Shrivas

In the information era, large amounts of data have become available to decision makers. Big data refers to datasets that are not only enormous, but also big in variety and velocity, which makes them difficult to handle with traditional tools and techniques. Due to the rapid growth of such data, solutions need to be studied and provided in order to extract value and knowledge from these datasets. Furthermore, decision makers need to be able to draw valuable insights from this kind of varied and frequently changing data, ranging from daily transactions to customer service and social network data. Such value can be supplied using big data analytics, which is the application of advanced data analytics techniques to big data. This paper focuses on the analysis of the different data analytics methods and tools that can be applied to big data, as well as the results they provide and the software that helps in the analysis of big data in different decision domains.
Full Paper

Content Based Image Retrieval Using Combined Gabor and Image Features

Monika Soni, Dr. Parvinder Singh

The rapid growth of image data on the internet has spurred the demand for methods and tools for efficient search and retrieval. Content Based Image Retrieval (CBIR) is a technique that uses visual contents such as color, shape, texture, and other image features to retrieve similar images from a large repository against a given query image. This has become an active research area with the advent of digital media in almost all applications. Although much research has been done in the field of image search and retrieval, many challenging problems remain to be solved. As the semantic gap is considered the main issue, recent work has focused on semantic-based image retrieval. Most of the proposed approaches learn image semantics by extracting low-level features from the entire image; however, such approaches fail to take into consideration the semantic concepts that occur in the images. In this paper, we focus on an SVM-based classification model for the CBIR process, combining various image features.
Full Paper

Malignant Brain Tumor Detection Using Efficient Fuzzy Region Based Segmentation

Kanika Mehta, Deepak Bagga

Image segmentation is a task fundamental to many image processing and computer vision applications. Because of the presence of noise, low contrast, and intensity inhomogeneity, it is still a difficult problem in the majority of applications. The images produced by MRI scans are frequently grayscale images. An MRI image of the brain contains the cortex that lines the outer surface of the brain as well as the gray nuclei deep within it, including the thalami and basal ganglia. As cancer is a leading cause of death and the cause of the condition remains unknown, early detection and diagnosis is one of the keys to cancer control: it increases the success of treatment, saves lives, and reduces cost. Medical imaging is among the commonly used diagnostic tools for detecting and classifying defects. To eliminate operator dependence and increase the accuracy of diagnosis, computer-aided diagnosis systems are a valuable and beneficial aid in the detection and classification of malignant tumors. Segmentation procedures based on gray-level methods, such as thresholding and region-based techniques, are the simplest but find only limited application; their performance can, however, be improved by combining them with hybrid clustering methods. 
Atlas-based approaches built on textural characteristics and look-up tables can give good results in the segmentation of medical images, but they require expertise in the construction of the atlas. A limitation of atlas-based techniques is that, in some conditions, it becomes hard to choose and label data correctly, and they have difficulty segmenting complex structures of variable shape, size, and properties; in such conditions it is best to use unsupervised methods such as fuzzy algorithms. In this work we propose a novel fuzzy-based MRI image segmentation algorithm. Fuzzy segmentation involves partitioning data points into homogeneous classes or clusters, ensuring that items within the same class are as similar as possible and items in different classes are as dissimilar as possible.
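The fuzzy clustering the abstract builds on is typically fuzzy c-means: every point (e.g. an MRI intensity) receives a soft membership in [0, 1] to each cluster rather than a hard label, which suits ambiguous tissue boundaries. Below is a minimal textbook FCM sketch with a naive deterministic initialization, not the paper's proposed algorithm; the data is invented.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=50):
    """Fuzzy c-means: alternate between membership updates (inverse-distance
    weighting with fuzzifier m) and center updates (membership-weighted means)."""
    X = np.asarray(X, dtype=float)
    centers = X[:: max(1, len(X) // c)][:c].copy()  # naive deterministic init
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers, axis=2) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)        # memberships sum to 1 per point
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
    return centers, u
```

For segmentation, each pixel would be assigned to (or thresholded against) the cluster where its membership is highest, with the soft memberships themselves available to handle partial-volume voxels.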
Full Paper

Security-Reliability Trade-Off Analysis of Multi-Relay-Aided Decode-and-Forward Cooperation Systems With Multiple Destinations


We consider a wireless network having a source and multiple destinations in the presence of an eavesdropper, an attacker that taps the data from the source and destinations. The system proposes multi-relay selection with multiple destinations under a cognitive radio network; the multi-relay selection scheme sends data to a destination through multiple relays rather than a single relay. Previous systems include single-relay and multi-relay schemes with a single source and destination, and use a security-reliability trade-off to resist attacks from the eavesdropper. For a better comparison of outputs, we evaluate the proposed system against these previous systems. The proposed system sends data through multiple relays, where the relays are chosen according to the trust values of the nodes. In the previous work, multi-relay selection outperforms the single-relay trade-off; the proposed system in turn outperforms the previous systems.
Full Paper

The Exposure and Defend Against Malware Proliferation in Online Social Networks

Reddy.Sowjanya, A. Sharath Chandra, Nageswara Rao Kunjam

Online social networks are groups of individuals who share common interests; such networks are used by millions of people around the world. The massive adoption of this service among users has made it a popular means for malicious activities. The aim of this paper is to identify the parameters related to malware propagation in online social networks. To do this, we first build a sample network based on the characteristics of online social networks and then analyze the impact of the parameters that could influence the rate of malware spread. We propose a two-layer malware propagation model to describe the development of a given malware at the Internet level; compared with existing single-layer epidemic models, the proposed model represents malware propagation better in large-scale networks.
Full Paper

Estimating Effort in Comprehensive Software Development Using NN, ABE, GAFLANN

J. Sruthi, Dr. Jose Moses, M. Krishna Kishore

This research examines the use of Artificial Neural Networks for estimating software cost. Reliable effort estimation remains an ongoing challenge for software engineers. Accurate estimation is an intricate process, since software effort prediction, as the term indicates, is a forecast that never becomes exact. The objective of this paper is to focus on empirical software effort estimation. The essential conclusion is that no single technique is best for all situations, and that a careful examination of the results of several approaches is most likely to produce realistic estimates. The paper therefore gives a comparative analysis of the available software effort estimation techniques and compares different types of neural network architectures. The proposed ABE model increases the accuracy, and each technique yields different results, which are shown using MATLAB. The techniques used in this paper are among the most popular effort estimation methods: analogy-based estimation (ABE), Functional Link Artificial Neural Network (FLANN), and a Genetic Algorithm for optimizing the Functional Link Artificial Neural Network (GAFLANN).
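The core idea of analogy-based estimation can be sketched as a nearest-neighbour lookup over past projects; the feature vectors and effort values below are invented for illustration and are not from the paper's dataset:

```python
import math

# Sketch of analogy-based estimation (ABE): predict effort for a new
# project as the mean effort of its k most similar past projects,
# with similarity measured by Euclidean distance over project features.

def abe_estimate(history, features, k=2):
    """history: list of (feature_vector, effort) pairs.
    Returns the mean effort of the k nearest neighbours."""
    nearest = sorted(history, key=lambda p: math.dist(p[0], features))[:k]
    return sum(effort for _, effort in nearest) / k

# Hypothetical history: (size in KLOC, team size) -> person-months.
history = [
    ((10, 3), 120.0),
    ((12, 4), 150.0),
    ((40, 9), 600.0),
]
print(abe_estimate(history, (11, 3), k=2))  # -> 135.0
```

The FLANN and GAFLANN variants the paper compares replace this distance-based lookup with a functional-link neural network, optionally tuned by a genetic algorithm.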
Full Paper

Cloud Computing Using Elasticity

Karimella Vikram, Dr. Soundara Rajan, CH.Venkateshwara Rao, B.Rajani

Originating from the field of physics and economics, the term elasticity is nowadays heavily used in the context of cloud computing. In this context, elasticity is commonly understood as the ability of a system to automatically provision and deprovision computing resources on demand as workloads change. However, elasticity still lacks a precise definition as well as representative metrics coupled with a benchmarking methodology to enable comparability of systems. Existing definitions of elasticity are largely inconsistent and unspecific, which leads to confusion in the use of the term and its differentiation from related terms such as scalability and efficiency; the proposed measurement methodologies do not provide means to quantify elasticity without mixing it with efficiency or scalability aspects. In this short paper, we propose a precise definition of elasticity and analyze its core properties and requirements, explicitly distinguishing it from related terms such as scalability and efficiency. Furthermore, we present a set of appropriate elasticity metrics and sketch a new elasticity-tailored benchmarking methodology addressing the special requirements on workload design and calibration.
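In the spirit of the elasticity metrics the paper proposes (though not its exact definitions), one simple measurement compares a demand trace against the supply trace an autoscaler actually provisioned, and totals up under- and over-provisioning; both traces below are invented:

```python
# Toy elasticity accounting: per time step, compare demanded vs.
# supplied resource units and sum the shortage (under-provisioning)
# and surplus (over-provisioning) across the trace.

def provisioning_gaps(demand, supply):
    """Return (under, over): total resource-units of shortage and
    surplus over the whole trace."""
    under = sum(max(d - s, 0) for d, s in zip(demand, supply))
    over = sum(max(s - d, 0) for d, s in zip(demand, supply))
    return under, over

demand = [2, 4, 6, 6, 3, 2]   # instances needed per step
supply = [2, 2, 4, 6, 6, 3]   # instances actually provisioned
print(provisioning_gaps(demand, supply))  # -> (4, 4)
```

Note how this separates elasticity (how quickly supply tracks demand) from raw efficiency or scalability: a system can be efficient at steady state yet score poorly here if it lags behind demand changes.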
Full Paper

Control Identity Leakage in Cloud Data and Achieve the Full Anonymity

R Shiva Shankar, K Sravani, J Raghaveni, D Ravibabu

We propose an Attribute-Based Encryption system in which any party can become an authority and there is no requirement for any global coordination other than the creation of an initial set of common reference parameters. Various schemes based on attribute-based encryption have been proposed to secure cloud storage. However, most work focuses on data-content privacy and access control, while less attention is paid to privilege control and identity privacy. We present a semi-anonymous privilege control scheme, AnonyControl, that addresses not only data privacy but also user identity privacy in existing access control schemes. AnonyControl decentralizes the central authority to limit identity leakage and thus achieves semi-anonymity. Besides, it also generalizes file access control to privilege control, by which privileges of all operations on the cloud data can be managed in a fine-grained manner. We prove our system secure using the recent dual-system encryption methodology, where the security proof works by first converting the challenge ciphertext and private keys to a semi-functional form and then arguing security. Our security analysis shows that both AnonyControl and AnonyControl-F are secure under the decisional bilinear Diffie–Hellman assumption, and our performance evaluation demonstrates the feasibility of our schemes.
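The actual scheme relies on pairing-based cryptography; the toy helper below illustrates only the *privilege control* idea, in which each operation on a cloud file is gated by its own attribute policy. The attribute names and policies are hypothetical, not part of AnonyControl itself:

```python
# Fine-grained privilege control sketch: a policy is an OR of ANDs --
# a set of attribute clauses, any one of which the user must satisfy
# in full. The cryptographic enforcement in the real scheme is omitted.

def authorized(user_attrs, policy):
    """True iff the user holds every attribute in at least one clause."""
    return any(clause <= user_attrs for clause in policy)

# Per-operation policies for a single cloud file.
policies = {
    "read":   {frozenset({"staff"}), frozenset({"auditor"})},
    "write":  {frozenset({"staff", "project-A"})},
    "delete": {frozenset({"admin"})},
}

alice = {"staff", "project-A"}
print([op for op, p in policies.items() if authorized(alice, p)])
# -> ['read', 'write']
```

Generalizing from one access-control decision per file to one policy per operation is what the abstract means by managing "privileges of all operations" in a fine-grained manner.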
Full Paper

Review on Secure Storage Systems

J. R. Pansare, A. D. Ambade

In the last few years, the idea of connecting existing computing devices has given birth to a new concept called “connecting things”. These things include sensors, actuators, RFID tags, and any computing device that can sense the environment and act upon it. Advances in sensor data collection technology, such as ubiquitous embedded devices and RFID, have led to a large number of devices connected to the net that continuously transmit their data over time. This data is precious to many enterprises, so there is a need for a secure mass storage system for it. This paper presents a survey of various storage systems and methodologies for storing data securely.
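One building block common to many of the storage methodologies such a survey covers is keeping a cryptographic digest next to each stored object so that corruption or tampering is detected on read. This is a generic illustration, not tied to any specific system in the review:

```python
import hashlib

# Integrity-protected storage sketch: store each blob together with its
# SHA-256 digest, and verify the digest whenever the blob is read back.

def store(blob):
    """Return (blob, digest) as it would be written to storage."""
    return blob, hashlib.sha256(blob).hexdigest()

def verify(blob, digest):
    """True iff the blob still matches its stored digest."""
    return hashlib.sha256(blob).hexdigest() == digest

data, tag = store(b"sensor reading 42")
print(verify(data, tag))         # -> True
print(verify(b"tampered", tag))  # -> False
```

Real secure storage systems layer encryption and access control on top of such integrity checks; the digest alone protects only against undetected modification.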
Full Paper

A Review of Network Intrusion Detection and Countermeasure

K.Vikram, B.Anitha, G.Padmavathi, D.Sravani

Nowadays every industry, and even some parts of the public sector, is using cloud computing, either as a provider or as a consumer. But there are many security issues in the cloud computing environment, and many possible attacks; one such attack is the DoS attack or its variant, the DDoS attack. Generally, attackers can explore vulnerabilities of a cloud system and compromise virtual machines to deploy further large-scale Distributed Denial-of-Service (DDoS) attacks. DDoS attacks usually involve early-stage actions such as low-frequency vulnerability scanning, multi-step exploitation, and compromising identified vulnerable virtual machines as zombies, followed finally by DDoS attacks launched from the compromised zombies. Inside the cloud system, and especially in Infrastructure-as-a-Service clouds, the detection of zombie exploration attacks is very difficult. To prevent vulnerable virtual machines from being compromised in the cloud, we propose NICE, a multi-phase distributed vulnerability detection, measurement, and countermeasure selection mechanism built on attack-graph-based systematic models and reconfigurable virtual-network-based countermeasures. This paper provides a short review of techniques for network intrusion detection and countermeasure selection in virtual network systems.
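The attack-graph idea behind NICE-style analysis can be sketched as a reachability question: given directed edges meaning "an exploit on one VM lets the attacker compromise another", which VMs can a compromised entry point eventually reach? The topology below is invented for illustration:

```python
from collections import deque

# Attack-graph sketch: nodes are VMs, a directed edge u -> v means an
# exploit on u lets the attacker pivot to v. BFS finds every VM
# reachable from a compromised entry point.

def reachable(graph, start):
    """Return the set of VMs an attacker at `start` can reach."""
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in graph.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

attack_graph = {
    "web-vm":  ["app-vm"],   # web exploit pivots to the app tier
    "app-vm":  ["db-vm"],    # app exploit pivots to the database
    "mail-vm": [],
}
print(sorted(reachable(attack_graph, "web-vm")))
# -> ['app-vm', 'db-vm', 'web-vm']
```

Countermeasure selection then amounts to choosing which edges to cut (e.g. by reconfiguring the virtual network) so that high-value VMs drop out of the reachable set.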
Full Paper