JCT - Volume 4 Issue 10 (October 2015)


Sr. No.  Title
1 Integer Triples in Arithmetic Progression and Geometric Progression through Pythagorean Equation
M.A.Gopalan, S.Vidhyalakshmi, N.Thiruniraiselvi

Abstract- We search for three non-zero distinct integers x, y, z satisfying the Pythagorean equation x² + y² = z² such that the triples (i) (x, y, z) and (ii) (2x, ky, 2z), k ∈ Z − {0}, represent arithmetic and geometric progressions respectively.
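As an illustration of the progression conditions the abstract refers to (not the authors' construction), a triple (a, b, c) is an arithmetic progression when 2b = a + c and a geometric progression when b² = ac; a minimal check in Python:

```python
def is_ap(a, b, c):
    # arithmetic progression: the middle term is the average of the extremes
    return 2 * b == a + c

def is_gp(a, b, c):
    # geometric progression: the middle term squared equals the product of the extremes
    return b * b == a * c

# The classic Pythagorean triple (3, 4, 5) is itself an arithmetic progression:
print(is_ap(3, 4, 5))          # True, since 2*4 == 3 + 5
# (2, 4, 8) is a geometric progression, since 4**2 == 2 * 8:
print(is_gp(2, 4, 8))          # True
# and (3, 4, 5) satisfies the Pythagorean equation:
print(3**2 + 4**2 == 5**2)     # True
```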

2 Volitional Decision Making on interactivity as a result of multi-cyclic cognitive processes and emotions
R. John Martin, S. Sujatha

Abstract-Human decision making is a key point in the situations that arise while users interact with computers and mobile apps. Understanding the human decision-making process is essential, as it is a complex cognitive process and one that may also produce emotions. Previous studies state that decision making can be a result of perception as well as attention. Although perception and attention are two important cognitive processes underlying decision making, the abundance of persuasive technologies in web environments means that users may not make intelligent decisions in a single phase. This paper extends ongoing research in human ergonomics and argues that human decision making while interacting with computing systems is the result of multiple cognitive cycles of varied cognitive processes and emotions.

3 User Adaptive E-Commerce Website
Akshay Palekar, Anmol Sahota, Bhushan Parmar, Mihir Desai, Manisha Giri

Abstract-E-commerce websites today have become highly sensitive to end users' needs. The data available on these websites is systematically categorized so as to give the user a seamless experience. The technique we propose takes into account the user's behaviour on the website. Whenever a specific user logs on to his or her account, their surfing history is logged. The Apriori algorithm is used to find association rules that reveal a pattern in the user's preferences. Whenever the client makes a purchase on the website, the transaction is recorded, and over time a user profile is generated describing what that user finds interesting on the site. On subsequent visits the user is then given a more personalized experience.
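The frequent-itemset step behind Apriori can be sketched at its simplest level as counting item pairs that clear a support threshold; the item names and sessions below are hypothetical, and a real implementation would iterate to larger itemsets using the Apriori pruning property:

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(transactions, min_support):
    """Toy first Apriori pass: keep item pairs whose support (fraction of
    transactions containing both items) meets min_support."""
    pair_counts = Counter()
    for t in transactions:
        for pair in combinations(sorted(t), 2):
            pair_counts[pair] += 1
    n = len(transactions)
    return {p: c / n for p, c in pair_counts.items() if c / n >= min_support}

# Hypothetical browsing sessions for one user:
history = [{"phone", "case"}, {"phone", "case", "charger"}, {"case", "charger"}]
rules = frequent_pairs(history, min_support=0.6)
# ("case", "phone") co-occurs in 2 of 3 sessions, so its support is about 0.67
```

Pairs that survive this pass would then seed recommendation rules such as "users who view cases also view phones".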
4 Design and Analysis of a Novel Greedy Algorithm for Solving Client-Server Assignment Problem
Sandeep Teja Mummoji, P.Chaitanya Vandavasu

Abstract-A network is an interconnection of systems that transfers data from one system to another. Networks are generally classified into various types based on usage and configuration. A client-server network is one such type, in which a client always sends a request and the server always generates a response. Interactivity is the major performance factor for any type of network, including client-server architectures, since a client may connect to the server from any location to exchange data. In large-scale networks, interactive performance depends not only on client-to-server network latency but also on the reverse direction. When a client tries to connect to a server, it may succeed directly if the server is not busy with other clients, but connection failures are common when the server is overloaded. In this paper we therefore investigate the problem of efficiently assigning clients to servers so as to maximize interactivity. As an extension, we also track server status as idle, normal, busy, or overloaded. When a server starts it is in the idle state, since no clients are assigned; once a client is assigned, the state changes to normal, and the server can accept further clients up to a limit of three. Once the limit is exceeded, the state becomes overloaded: the server cannot accept any new requests and reports its status as busy. As an additional extension, we have implemented automatic redirection of client requests to another server when the chosen server is busy.
As a proof of concept, we implemented the proposed assignment scheme on a set of LAN-connected nodes, and our experimental results clearly show that the approach utilizes network resources efficiently and improves server performance.
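The state transitions and redirection described in the abstract can be sketched as follows; this is an illustrative model under the stated limit of three clients (class and function names are our own, and "busy" is reported when an overloaded server rejects a request), not the authors' implementation:

```python
class Server:
    """Sketch of the server states: idle with no clients, normal once a
    client is assigned, overloaded (reported as busy) at the limit."""
    LIMIT = 3

    def __init__(self, name):
        self.name, self.clients = name, []

    @property
    def state(self):
        if not self.clients:
            return "idle"
        return "overloaded" if len(self.clients) >= self.LIMIT else "normal"

    def assign(self, client):
        if self.state == "overloaded":
            return False          # busy: caller should redirect elsewhere
        self.clients.append(client)
        return True

def greedy_assign(client, servers):
    # greedy redirection: try the least-loaded server with capacity first
    for s in sorted(servers, key=lambda s: len(s.clients)):
        if s.assign(client):
            return s.name
    return None  # every server is overloaded
```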

5 Trends towards Efficient Fundus Image Segmentation
Manjinder Singh, Rajeev Vashisht

Abstract-Image segmentation is a method for partitioning a digital image into numerous regions. Its objective is to modify and/or simplify the representation of an image into something more meaningful and easier to evaluate. Nowadays the foremost cause of vision defects and blindness in many people is diabetic retinopathy (DR), an eye ailment associated with diabetes mellitus. Diabetes mellitus causes abnormalities in the retina of the eye and is a major health problem in developed countries. The foremost signs of DR are exudates, which appear as yellow-white dots with sharp boundaries. Detecting DR lesions such as exudates in fundus images can help in the early treatment of DR in its initial stages. Numerous studies in the literature have reported on detecting and segmenting exudates from fundus images, but none of these strategies provides the required results. A newer approach that segments exudates using an ant colony optimization algorithm delivered better results, but it suffers from the effects of noise.

6 A Review Study on Transaction Management in Distributed Database Systems
Deepali Aggarwal, Aastha Makkar

Abstract-The conventional transaction model does not provide much flexibility or high performance. Transaction processing systems are considered a well-established and mature technology, and transaction management in distributed database management systems remains an active area of research. This paper highlights the basic concepts of distributed database systems and explains transaction management in such databases. Various problems are encountered while accessing distributed data. The consistent termination and atomicity of a transaction in a distributed environment are ensured using different techniques, one of which is the Two-Phase Commit protocol. The main objective of this paper is to reduce communication traffic and improve transaction response time.
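The Two-Phase Commit protocol mentioned above can be sketched in a few lines; this is a minimal illustration assuming each participant exposes hypothetical prepare/commit/abort methods, and it omits the timeout and recovery handling a real coordinator needs:

```python
class Participant:
    """Hypothetical participant: votes in phase 1, acts in phase 2."""
    def __init__(self, can_commit):
        self.can_commit, self.state = can_commit, None
    def prepare(self):           # phase 1: vote yes/no
        return self.can_commit
    def commit(self):
        self.state = "committed"
    def abort(self):
        self.state = "aborted"

def two_phase_commit(coordinator_log, participants):
    """Commit only on a unanimous yes vote; otherwise abort everywhere."""
    votes = [p.prepare() for p in participants]      # phase 1: voting
    decision = "commit" if all(votes) else "abort"   # phase 2: decision
    coordinator_log.append(decision)                 # force-log before acting
    for p in participants:
        p.commit() if decision == "commit" else p.abort()
    return decision
```

A single "no" vote aborts the whole transaction at every site, which is how atomicity is preserved across the distributed database.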

7 Performance Analysis of a Secure Cloud Storage by Using Data De-Duplication
Surimalla Satya Sudheer Varma, B.Krishna

Abstract-Cloud computing is the practice of using a network of remote servers hosted on the internet to store, access, and retrieve data, rather than relying on local machines. Nowadays almost every firm or organization tries to adopt the cloud as a storage medium for keeping data on remote servers. As many users store data on a remote server and access it concurrently, the data needs to be stored in encrypted form rather than as plain text. The process of converting plain text into cipher text, which is unreadable by a normal user, is known as encryption, and the reverse process is decryption. Encryption and decryption of sensitive data is generally not available in current clouds, whether public or private. So in this application we implement encryption of the plain-text message into cipher text at the time of storage on the remote server, with the data viewable in plain text only after decryption by a valid user. Current cloud storage systems also lack any mechanism to reduce data redundancy; this is a major issue, as many cloud users pay excess charges for storing duplicate data on remote servers. We therefore integrate a de-duplication mechanism along with encryption and decryption to provide stronger security for data stored on remote servers. By conducting various experiments with this proposed concept on a hybrid cloud service (i.e. both public and private), we conclude that our approach incurs only minimal overhead compared to normal operations.
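The de-duplication idea can be sketched as content-addressed storage: identical blocks hash to the same key, so the second upload stores nothing. This toy sketch works on plain bytes; a secure scheme like the paper's would combine it with encryption (typically convergent encryption, so equal plaintexts still yield equal ciphertexts), which is omitted here:

```python
import hashlib

class DedupStore:
    """Content-addressed store: a block is kept once, keyed by its
    SHA-256 digest, so duplicate uploads consume no extra space."""
    def __init__(self):
        self.blocks = {}

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(digest, data)   # duplicate upload is a no-op
        return digest

    def get(self, digest: str) -> bytes:
        return self.blocks[digest]

store = DedupStore()
k1 = store.put(b"quarterly report")
k2 = store.put(b"quarterly report")   # same content uploaded again
# k1 == k2 and only one physical copy is kept
```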

8 A Novel Protocol for Efficient Data Transfer in Wireless Sensor Networks
Polisetti Veni, P.Chaitanya Vandavasu

Abstract-Nowadays security plays a vital role in every aspect of human life, and it is especially crucial in wireless sensor networks (WSNs), which need a high level of security during transmission. In a WSN there is often a lack of security between source and destination during communication over wireless media. Since wireless networks have no fixed topology, it is always a serious problem for a WSN to identify the malicious node causing a failure or attack during transmission, because such a node does not reside in a fixed region. In this paper we therefore introduce a novel cluster-based wireless sensor network, called NCWSN, for secure and efficient data transmission over wireless media. The clusters are formed dynamically and change periodically for each individual source and destination. We implement two algorithms, SET-IBS (Secure and Efficient data Transmission with an Identity-Based digital Signature scheme) and SET-IBOOS (with an Identity-Based Online/Offline digital Signature (IBOOS) scheme). These two algorithms are efficient in identifying failed nodes in the wireless medium and provide efficient data transmission. As an extension, we evaluate the proposed algorithms under three kinds of attacks in the wireless media: active, passive, and compromised. Handling these three attacks improves system performance in terms of efficiency and integrity.
In the proposed Novel Cluster-based Wireless Sensor Network (NCWSN), if a node very near the source, residing in the source cluster, creates a problem, it is classified by the server as an active attacker, and the server immediately provides an alternate best path to transfer the data to the destination. If a node belonging to the cluster immediately following the source cluster causes a failure, it is treated as a passive attack node, and the server immediately provides an alternate path from the problem area to the destination. If the problem node is far from the source and lies in the destination cluster, it is treated as a compromised attacker, and the server sends the data along an alternate path near the destination. By conducting various experiments with the two proposed algorithms on a LAN network, our simulation results clearly show that the proposed scheme achieves a high level of data security while transferring data over wireless media.

9 Design and Analysis of a Novel Authenticated Data Access Control Scheme on the Cloud
Nammi Hemanth Kumar, P.S.L.Sravani

Abstract-Cloud computing is the practice of employing a network of remote servers hosted on the internet to store, access, and retrieve data, rather than using local machines. Because the data is kept on a remote server, the user can retrieve it from that server whenever needed. Since data stored on a cloud server in plain text has no security, it can be retrieved without protection either by a person within the group or by someone outside it. In this paper we therefore implement a new decentralized access control mechanism to store data securely with anonymous authentication. In the proposed scheme, whenever a cloud user wants to store or retrieve data to or from the cloud server, the user is authenticated to verify their identity. This is done by a trusted third-party auditor at the cloud server. Only once the identity is verified and matches the user's original identity can the data be stored or retrieved; a user who fails to verify his identity is treated as unauthorized for data usage. The user's identity is verified against a token id submitted by the user; only then can he view the data in plain text, and otherwise the data remains in cipher mode. By conducting various experiments on our proposed scheme, we conclude that it provides strong security for cloud data during storage and retrieval in a decentralized manner compared with many earlier approaches.

10 A Low Power Single-Rail Pipelined Adder Based on Partial Element Reuse
Anu Priyanka R, Prabhakaran G

Abstract-Asynchronous circuits do not assume any quantization of time. Therefore, they hold great potential for logic design, as they are free from several problems of clocked circuits. This brief presents a parallel single-rail self-timed adder based on a recursive formulation for performing multi-bit binary addition. The operation is parallel for those bits that do not need any carry-chain propagation. Thus, the design attains logarithmic performance over random operand conditions without any special speedup circuitry or look-ahead schema. A practical implementation is provided along with a completion detection unit. The implementation is regular and does not have the practical limitation of high fan-out. A high fan-in gate is required, but this is unavoidable for asynchronous logic and is managed by connecting the transistors in parallel. Simulations performed using an industry-standard toolkit verify the practicality and superiority of the proposed approach over existing asynchronous adders.

11 Automated Attendance System
Prateek Neman, Mrunal Patil, Pravin Keluskar, Suvana Pansambal

Abstract-The face is the identity of a human, and strategies to take advantage of this physical feature have changed greatly since the advent of image processing techniques. The main aim of this system is the correct recognition of a person, whose identification is then used for other processes. Traditional face recognition systems use various strategies to identify a face from the given input, but the results are not usually as correct and precise as desired. The system described in this paper aims to deviate from such traditional systems and introduces a new approach to identifying a person using face recognition. This paper describes the operation of a face recognition system that can be deployed as an automatic attendance system in different classroom environments, and highlights the techniques and algorithms used along with their constraints and difficulties. The use of fuzzy logic and the concepts of Content-Based Image Retrieval (CBIR) are the main aspects of the proposed automated system.

12 Compressed Camera Fingerprint Matching Via Random Projections
Sowmiya N, Sadish Kumar S.T

Abstract-Sensor imperfections in the form of photo-response non-uniformity (PRNU) patterns are a well-established fingerprinting technique for linking pictures to the camera sensors that acquired them. For authentication and matching in forensic tasks, the fingerprint image has to be compressed and encrypted while the PRNU values still allow the original image quality to be reconstructed. Here, digital image processing is applied to compressed camera fingerprint matching via random projections. Camera fingerprints are used in authentication and identification processes in forensic tasks such as the detection of digital forgeries; such tasks include the device identification problem, the device linking problem, and the fingerprint matching problem. For random projections, a compression technique is required that incurs little information loss, as measured on the PRNU values.
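The core idea of matching compressed fingerprints can be sketched as projecting each long fingerprint vector through a shared random ±1 matrix and comparing the short results by normalized correlation. This is an illustrative stand-in with made-up data, not the paper's pipeline (real PRNU fingerprints come from camera noise residuals, and the projection dimensions would be far larger):

```python
import random

def random_projection(vec, k, seed=0):
    """Compress vec to k dimensions with a seeded ±1 random projection.
    Both parties must use the same seed so projections are comparable."""
    rng = random.Random(seed)
    rows = [[rng.choice((-1, 1)) for _ in vec] for _ in range(k)]
    return [sum(r * v for r, v in zip(row, vec)) for row in rows]

def correlation(a, b):
    # normalized correlation used as the matching score (near 1 = same camera)
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)
```

Because random projections approximately preserve inner products, fingerprints from the same sensor still correlate strongly after compression, while unrelated fingerprints score near zero.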

13 Repair of Gastrointestinal Tract Perforation with Free Parietal Peritoneal Patch-A Control Study
Deepak Gupta, B L Sunkaria, Nirmal Singh, Deepashu Sachdeva

Abstract-Perforation of the gastrointestinal tract is a life-threatening surgical emergency with high mortality. This study aims to compare the results of the free parietal peritoneal patch technique (study group) with the standard techniques of either omental patch or direct suturing (control group) in the surgical management of perforation. A prospective study of 50 patients with gastrointestinal perforation was carried out. The highest incidence was seen in the age group of 21 to 30 years. Patients in the study group showed the appearance of bowel sounds and passage of flatus and stool one day earlier than patients in the control group. The study group was also started on oral feeding much earlier, and the difference between the two groups was found to be statistically highly significant (p≤0.05). Six patients in the control group showed anastomotic leakage, compared to one patient in the study group. Four patients in the control group developed adhesions leading to intestinal obstruction, compared to none in the study group. Thus the free parietal peritoneal patch technique is associated with evidence of earlier gastrointestinal motility and oral intake and fewer complications compared with conventional methods in the management of gastrointestinal perforations.

14 Built-In Self-Test Architecture with Normalized Euclidean Distance Based Cyclic Redundancy Check
Vanitha.D, Kavitha

Abstract-Built-In Self-Test (BIST) schemes are a solution for testing VLSI devices, used to make integrated-circuit manufacturing tests faster and less expensive. The IC contains a function that verifies all or a portion of its internal functionality, which in some cases is valuable to customers as well; for example, a BIST mechanism is provided in advanced fieldbus systems to verify functionality. These mechanisms operate during circuit testing based on CTL. A cyclic redundancy check (CRC) is an error-detecting code commonly used in digital networks and storage devices to detect accidental changes to raw data. Existing work designs a low-power (LP) programmable generator capable of producing pseudorandom test patterns with desired toggling levels and an enhanced fault-coverage gradient compared with the best-to-date BIST-based pseudorandom test pattern generators, but it still has limitations. The proposed system therefore designs an LP-TPG-based BIST architecture that includes a normalized Euclidean distance (NED) based CRC process. The work modifies the test pattern generation scheme, and CRC polynomial generation is used to modify the signature generation, which helps identify the fault location. The proposed system improves BIST performance, reduces power consumption, and speeds up test pattern generation. The design is developed as VHDL programmable logic and simulated with the Xilinx 14.2 software.
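The CRC computation the abstract relies on is polynomial long division over GF(2); a minimal bitwise sketch (illustrative, not the paper's NED-based hardware) using the standard worked example of dividing 11010011101100 by the polynomial x³ + x + 1 (bits 1011):

```python
def crc_remainder(data_bits, poly_bits):
    """Compute the CRC remainder of a bit string by GF(2) long division.
    poly_bits includes the leading 1, e.g. x^3 + x + 1 -> "1011"."""
    pad = len(poly_bits) - 1                 # remainder width
    bits = list(data_bits + "0" * pad)       # append zeros for the shift
    for i in range(len(data_bits)):
        if bits[i] == "1":                   # XOR the divisor in wherever
            for j, p in enumerate(poly_bits):  # the leading bit is 1
                bits[i + j] = str(int(bits[i + j]) ^ int(p))
    return "".join(bits[-pad:])

print(crc_remainder("11010011101100", "1011"))  # "100"
```

Appending the remainder to the message makes the whole string divisible by the polynomial, so the receiver recomputes the division and flags any nonzero remainder as a transmission error.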

15 Fuzzy logic Control based IUPQC for Power Quality Improvement on Distribution Network
P.Venkatesan, Dr.S.Senthilkumar

Abstract-Power quality is a major concern of present-day power systems. To improve power quality, a custom power device, the Interline Unified Power Quality Conditioner (IUPQC), is used in the distribution network. A new control technique, namely fuzzy control, is implemented for the IUPQC in a distribution system on the IEEE 14-bus system. Using the IUPQC, results are analysed for reactive power compensation, voltage sag and swell mitigation, and voltage unbalance. This paper shows how the IUPQC can improve power quality by mitigating all of these PQ disturbances. The proposed configuration of the IUPQC is developed and verified for various power quality disturbances by simulating the model in MATLAB Simulink software.

16 Adaptive Voltage Control for Renewable Energy Integrated With Smart Grid
P.Venkatesan, Dr.S.Senthilkumar

Abstract-The integration of distributed generation (DG) into the smart grid represents a great challenge for future power systems. Integrating DG based on renewable energy sources (RESs) into distribution networks without compromising the integrity of the grid requires the development of proper control techniques that allow power delivery to customers in compliance with power quality and reliability standards. The transmission of electric power has to take place in the most efficient way without transmission network failure. Present-day transmission systems are becoming increasingly complex and stressed because of growing demand and restrictions on the installation of new lines. From the point of view of transmission network security and failure, it is therefore important to calculate the most sensitive node in the network.

17 Neuro-Fuzzy Logic Controller for Wind Energy Conversion System
P.Venkatesan, Dr.S.Senthilkumar

Abstract-In this paper, an adaptive control strategy is proposed for a wind energy conversion system based on a permanent magnet synchronous generator. Adaptive control is a control method in which the controller must adapt to a controlled system whose parameters vary or are initially uncertain. Grid integration of this type of generator requires a fully rated AC/AC converter which, in most cases, is based on voltage source converter (VSC) technology; a voltage source converter has the ability to control both active and reactive power. The adaptive control strategy uses an adaptive PI controller that is self-tuned based on a linear approximation of the power system calculated at each sample time. Adaptive control allows the integration of wind resources as plug-and-play devices in electric power systems. A model reference is also proposed in order to reduce post-fault voltages. Simulation results demonstrate the advantages of the proposed control.

18 Design and Implementation of a Secure DBaaS Service on an Encrypted Cloud Database
Borra Srujana, P.Vijaya Bharati

Abstract-Cloud computing is the practice of using a group of remote servers hosted on the internet to store, access, and retrieve information rather than local machines. Once information is stored on a remote server, the user can retrieve it from that server whenever needed. Since data stored in a cloud database is always kept in plain text, it has no security. Several alternatives exist for storage services, whereas data confidentiality solutions for the database-as-a-service paradigm are still immature. In this paper we therefore propose, for the first time, a design that integrates cloud database services with data confidentiality and the possibility of executing concurrent operations on encrypted data. With this new concept we can secure the data stored in the cloud database against unauthorized access. The application is also well suited to accessing data in a distributed environment, or to running the application concurrently for a number of cloud users at the same time. As an extension, we have deployed the proposed architecture on a live cloud service, Dropbox, so that data stored on Dropbox through this application is kept in encrypted form rather than as plain text. By conducting various experiments on our proposed system, our simulation results clearly show that the proposed concept gives more security for almost all types of cloud users compared with several earlier mechanisms.

19 Implementation of 13-Level Inverter Based Flexible Ac Distribution System for a Microgrid
A.Subramani, Dr.S.Senthilkumar

Abstract-This paper presents an implementation of a multilevel-inverter-based flexible AC distribution system for a microgrid application. The device aims to improve the power quality and reliability of the overall power distribution system to which the microgrid is connected. The control design employs a new model predictive control algorithm that allows faster computation for large power systems by controlling voltage sag, and extended Kalman filters are employed for frequency tracking. Common power quality problems include voltage sag, swell, interruption, transients, and fluctuation; among these, voltage sag is the most severe. It is therefore analysed and mitigated using a multilevel inverter connected in series to the grid. The design concept is verified through different test-case scenarios to demonstrate the capability of the proposed device, and the results obtained are discussed.

20 A Survey: Map Reduce Techniques to Process Big Data and Its Application
Aishwarya dubey, Pnkaj Kumar Sahu, Pranjal Tiwari

Abstract-In the field of medical data, a large amount of data is generated and needs to be processed rapidly, because a 24×7 service must be delivered at multiple points by the system over the long run. Knowledge of big-data processing techniques is always required when working with and accessing this kind of data, so an efficient mechanism is needed to process gigabytes of data using distributed, parallel processing, such as Hadoop's MapReduce technique. Among the numerous processing techniques, task-level scheduling is one; its problem is that it treats all machines as having equal configuration and load, dividing tasks equally regardless of system characteristics, which can lead to low efficiency when machines differ in configuration and capacity. To mitigate this, a phase-level architecture can make the technique more efficient. In this survey we study various MapReduce data processing techniques and their applications. We further intend to investigate how a final proposed algorithm could be applied to medical datasets and perform efficiently.
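The MapReduce pattern the survey covers can be sketched in miniature as three phases over toy medical-log records (the record contents below are hypothetical; Hadoop distributes these phases across machines rather than running them in one process):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # map: emit a (key, 1) pair for each symptom mentioned in a log line
    return [(word, 1) for word in record.split()]

def shuffle(pairs):
    # shuffle: group all values by key, as the framework does between phases
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(key, values):
    # reduce: aggregate each key's values into a single count
    return key, sum(values)

records = ["fever cough", "fever headache", "cough"]
mapped = chain.from_iterable(map_phase(r) for r in records)
result = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
# result: {"fever": 2, "cough": 2, "headache": 1}
```

Task-level scheduling assigns whole map or reduce tasks to machines; the phase-level refinement discussed above instead schedules within these phases using per-machine capacity information.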

21 M-SUSIE: Multi Input Parameter Searching From Web Searching
Kamini Khatik, Ayonija Pathre

Abstract-Searching requires a technique for efficiently binding input variables and accessing the related data; SUSIE provides such a technique. We show how this idea can be integrated into a Web service orchestration system; our approach is fully implemented in a prototype called SUSIE. We propose to use Web-based information extraction (IE) on web services to determine the right input values that maximize the number of outputs for a query keyword. The input requirement should be small while the goal remains the computation of maximal results. We use information extraction to guess bindings for multiple input variables and then find the maximal outputs for the keyword; our technique thus produces a large number of outputs for a user's input query in a single run.

22 An Analysis of different Query technique to spatial dataset: A Survey
Harsha Shrivastava, Prof Ratnesh Dubey, Dr.Vineet Richhariya

Abstract-Spatial data and its usage are a trending requirement today, as the environment faces various issues associated with natural phenomena; geographic features, temperature, climate, and other activity need to be monitored using different techniques. Based on previous log data and current data, various predictions can also be determined. Different resources such as mobile data, GPS, Twitter, Facebook, and other data-generating sources today create a wide range of datasets from user activity at different locations. Data mining is an efficient approach to extract the most relevant data and analyse it further; one recently studied approach is based on reverse top-k rank queries over datasets. In this paper we survey different data mining approaches that work on spatial data available under different geographic conditions. We compare the efficiency of the techniques in terms of precision, accuracy, recall, and detection rate, with the aim of devising a new and better algorithm for keyword search in spatial datasets.

23 Advanced Data Processing In Peer to Peer Technology With Peer++ Approach
Y. Revathi, V.Srikanth, Ramesh Challagundla

Abstract-Corporate networks are frequently employed for sharing data among participating companies, creating trouble-free alliances within a certain industry segment. To respond to ever-changing industrial demands and to the rise of cloud computing, BestPeer has evolved into the cloud-enabled Extended BestPeer system. This paper focuses on Extended BestPeer, a system that delivers flexible data-sharing services for corporate network applications in the cloud, built on BestPeer, a peer-to-peer (P2P) data management platform. By combining cloud computing, database, and P2P technologies, Extended BestPeer achieves query processing efficiency in a pay-as-you-go fashion. The present work evaluates Extended BestPeer on the Amazon EC2 cloud platform. The benchmarking results confirm that Extended BestPeer outperforms HadoopDB when both systems are used to support a typical corporate network.

24 Web Service Recommendation System for Quality of Services Approach with Web Survey
Pinninti LakshmiRoopini, K. Sangeeta, Ramesh Challagundla

Abstract-Today, web services are a frequent source of enormous amounts of information and also support machine-to-machine interaction over a network through integrated software components. Web services have thus become the building blocks of service-oriented applications in industry as well as academia, and the number of publicly available web services on the internet is rising steadily. There is therefore a clear requirement for automated methods that can trace, recognize, and retrieve information tailored to an individual's needs. The internet also creates new possibilities for categorizing and recommending data. Faced with a huge number of services, a user may find it hard to pick a suitable one, and an inappropriate selection may cause inconvenience in the resulting applications. This paper proposes a novel collaborative-filtering-based web service recommender system to assist users in finding services with the best Quality of Service (QoS). Our proposed recommender system applies data mining techniques to location information and QoS values to provide better web-based applications. The paper also presents an in-depth survey, especially of recent studies of recommendation systems based on web usage mining and the semantic web.

25 Access Control Mechanism in Relational DBMS for Acquiring Probabilistic Accuracy for Privacy Protection
M.Dedeepya, M.Kranthi Kumar, Ramesh Challagundla

Abstract-Access control mechanisms help safeguard sensitive information from illicit users. However, an authorized user may still compromise confidentiality when sensitive information is shared, leading to leakage of identity. A Privacy Protection Mechanism (PPM), through suppression and generalization of relational data, anonymizes records and satisfies confidentiality requirements via k-anonymity and l-diversity, guarding against leakage of identities and attributes. Confidentiality, however, is achieved at the expense of the accuracy of the authorized data. This paper focuses on an accuracy-constrained privacy-preserving access control framework. The access control policies define selection predicates available to roles, while the confidentiality requirement is to satisfy k-anonymity or l-diversity. A supplementary constraint that the PPM must fulfill is an imprecision bound for each selection predicate. An extensive literature review mostly reveals techniques for workload-aware anonymization of selection predicates; to the best of our knowledge, the problem of satisfying accuracy constraints for multiple roles has not yet been investigated. This paper formulates that problem, proposes anonymization heuristics, and confirms empirically that the proposed technique satisfies imprecision bounds for more permissions and has lower total imprecision than the current state of the art.
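The k-anonymity requirement above can be sketched as a simple check plus one generalization step; the table, quasi-identifiers, and the 10-year age-band hierarchy are illustrative choices, not the paper's anonymization algorithm:

```python
from collections import Counter

def generalize_age(age):
    # one step up a generalization hierarchy: exact age -> 10-year band
    lo = (age // 10) * 10
    return f"{lo}-{lo + 9}"

def is_k_anonymous(rows, quasi_ids, k):
    """True if every combination of quasi-identifier values appears
    at least k times, so no individual stands out in the release."""
    counts = Counter(tuple(r[q] for q in quasi_ids) for r in rows)
    return all(c >= k for c in counts.values())

raw = [{"age": 23, "zip": "530001"},
       {"age": 27, "zip": "530001"},
       {"age": 24, "zip": "530001"}]
released = [{"age": generalize_age(r["age"]), "zip": r["zip"]} for r in raw]
# after generalization all three rows share ("20-29", "530001"), so k=3 holds
```

The imprecision the paper bounds is the price of this step: a query such as "age = 23" can no longer be answered exactly against the released band "20-29".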

26 Numerical Simulation of Heat Transfer in Eccentric Annuli in the presence of Copper Nanofluid
K.V.Ramesh, B.Srinivas, B.Sreenivasulu, K.Rama Krishna Raju

Abstract-Numerical computations are presented for steady-state two-dimensional natural convection flow and heat transfer of a Cu-water nanofluid enclosed in the annulus between concentric and eccentric horizontal cylinders. Constant temperatures Th and Tc are imposed on the inner and outer cylinders, respectively. The governing Navier-Stokes and energy equations are discretized using a finite element technique and then solved by iteration. The simulated fluid flow patterns and temperature distributions in the annuli are vividly portrayed by means of contour maps of streamlines and isotherms. The influences of the pertinent parameters, namely the horizontal and vertical eccentricity ε, the nanoparticle volume fraction φ, the Rayleigh number Ra and the Prandtl number Pr, on the mean Nusselt number Nu are investigated and discussed in detail. The numerically evaluated mean Nusselt number is in excellent agreement with previous results available in the open literature.
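As a small illustration of how the volume fraction φ enters such nanofluid computations, the effective thermal conductivity of a Cu-water mixture is commonly estimated with the Maxwell model. The sketch below uses standard room-temperature textbook property values, not values taken from the paper:

```python
def maxwell_k_eff(k_f, k_p, phi):
    """Maxwell model for the effective thermal conductivity of a nanofluid:
    k_eff = k_f * (k_p + 2k_f + 2*phi*(k_p - k_f)) / (k_p + 2k_f - phi*(k_p - k_f))."""
    return k_f * (k_p + 2 * k_f + 2 * phi * (k_p - k_f)) / \
                 (k_p + 2 * k_f - phi * (k_p - k_f))

k_water, k_cu = 0.613, 400.0   # W/(m K), typical room-temperature values
for phi in (0.0, 0.05, 0.10):
    print(phi, maxwell_k_eff(k_water, k_cu, phi))
```

The monotone increase of k_eff with φ is one reason the mean Nusselt number responds to the volume fraction parameter studied in the paper.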

27 Personalized Web Search with Secure Privacy Preserving
Badi.srilekha, N Sagar Pavan, Ramesh Challagundla

Abstract-Personalized web search (PWS) has vividly demonstrated its ability to improve the quality of various search engines on the Internet. On the other hand, evidence shows that users' reluctance to disclose their private information during search has become a major hurdle for the wide proliferation of PWS. For privacy protection in PWS applications that model user preferences as hierarchical user profiles, we propose a PWS framework called UPS that can adaptively generalize profiles per query while respecting user-specified privacy requirements. We present a novel energy-aware routing algorithm. We also provide an online prediction mechanism for deciding whether personalizing a query is beneficial. Extensive experiments demonstrate the effectiveness of our framework.
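The profile-generalization idea can be illustrated with a toy sketch. The path representation, the sensitivity set, and the fixed depth cutoff are our own simplifications; the UPS framework itself generalizes adaptively per query:

```python
def generalize(profile_paths, sensitive, max_depth=2):
    """Generalize hierarchical profile paths: truncate any path ending in a
    user-marked sensitive topic back to a coarser ancestor level."""
    out = set()
    for path in profile_paths:          # e.g. ("Arts", "Music", "Jazz")
        if path[-1] in sensitive:
            out.add(path[:max_depth] if len(path) > max_depth else path[:-1])
        else:
            out.add(path)
    return sorted(out)

profile = [("Arts", "Music", "Jazz"), ("Health", "Clinics", "HIV")]
print(generalize(profile, sensitive={"HIV"}))
```

The generalized profile still exposes coarse interests for ranking personalization while hiding the sensitive leaf topics.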

28 A Novel Cryptography Algorithm for Avoiding Key-Escrow Problem in DTN Based Military Networks
Dharmireddi Pradeep, CH.Sunil

Abstract-MANETs are networks of self-configuring nodes connected without wires. In a MANET, every device can move from one location to another independently, without any restriction, and MANETs can cover small and large bandwidth areas all over the world. With the growth of mobiles and laptops, the usage of these mobile nodes has also increased considerably. MANETs are mainly used by military personnel for communication on battlefields or in hostile regions. While taking advantage of MANETs, military users may suffer from intermittent network connectivity and frequent network partitions. To avoid this problem, military organizations increasingly prefer Disruption-Tolerant Networks (DTNs). Using a DTN, soldiers can communicate with each other and access confidential data or commands without interruption of their communication. Even though many mobile users take advantage of DTN technology, they still face challenging issues such as authorization and secure data retrieval. To solve these issues we use an advanced cryptographic scheme, Attribute-Based Encryption with Ciphertext Policy (CP-ABE). We also use the Identity-Based Encryption algorithm to encrypt the data stored in the storage node by the sender, along with secret-key encryption for keys generated dynamically by the server. In this paper we apply the ABE scheme over decentralized nodes with multiple key authorities. As an extension, we implemented simulations on a network simulator tool, designed mainly in Java, to show the performance of DTN over military networks.
By conducting various experiments on the proposed ciphertext-policy ABE scheme for military networks, we conclude that the proposed system is more secure in keeping data confidential in the disruption-tolerant military network.
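In CP-ABE, the ciphertext carries an access policy over attributes, and a key holder can decrypt only if their attribute set satisfies that policy. The sketch below shows only the policy-evaluation step, not the underlying cryptography, and the attribute names and policy tree are hypothetical:

```python
def satisfies(policy, attrs):
    """Evaluate a CP-ABE style access policy tree against a user's attributes.
    A policy is either an attribute string or an ("AND"|"OR", [subpolicies]) node."""
    if isinstance(policy, str):
        return policy in attrs
    gate, children = policy
    results = (satisfies(c, attrs) for c in children)
    return all(results) if gate == "AND" else any(results)

# Illustrative ciphertext policy: Battalion1 AND (Region2 OR Commander)
policy = ("AND", ["Battalion1", ("OR", ["Region2", "Commander"])])
print(satisfies(policy, {"Battalion1", "Region2"}))    # soldier in region 2
print(satisfies(policy, {"Battalion2", "Commander"}))  # wrong battalion
```

With multiple decentralized key authorities, each authority would issue key components for a disjoint subset of these attributes.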

29 A Novel Protocol for Detecting and Preventing Abused Messages in OSN Networks
N.Purnima, Madugula Monica, Ramisetti Supriya, Balivada Sowmya, Sri Sravya Manne

Abstract-Online Social Networks (OSNs) play a very important role in today's day-to-day communication. Through OSNs, billions of users share their recent updates and personal feelings all around the world under public and private access policies. As OSNs grow in popularity by gaining more users' attention, a major issue faced by OSN users is the inability to control the messages posted on their own private space and to prevent unwanted content from being displayed. To resolve this problem, in this paper we propose a novel filtering system that allows all participating OSN users to have direct control over the messages posted on their walls. This new mechanism allows users to post wall messages free of abusive words. This is achieved by a Machine Learning (ML) based soft classifier algorithm that automatically labels a message if it is recognized as containing blacklisted words.
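The filtering rule can be sketched as a soft score thresholded at posting time. This is a deliberately simplified stand-in for the ML soft classifier the paper describes (the blacklist, scoring function, and threshold are our own assumptions):

```python
BLACKLIST = {"idiot", "stupid"}   # illustrative blacklist, not the authors' list

def score_message(text):
    """Soft label: fraction of tokens that are blacklisted."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    return sum(t.strip(".,!?") in BLACKLIST for t in tokens) / len(tokens)

def allow_on_wall(text, threshold=0.1):
    """Reject a wall post whose abusive-word score exceeds the threshold."""
    return score_message(text) <= threshold

print(allow_on_wall("have a nice day"))
print(allow_on_wall("you idiot !"))
```

A trained classifier would replace `score_message` with a learned probability, which is what makes the labeling "soft" rather than a hard blacklist match.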

30 Analyze the Performance of a Novel Live Traffic Index Indicator (NLTII) on Google Maps through Shortest Path Computation Model
Bejjipuram Asiri Naidu, Talada Indumathi, Ramesh Challagundla

Abstract-Data mining is the process of analyzing data from different perspectives and summarizing it into useful information. Nowadays Google Maps plays a very important role in almost all human lives. It was designed by Google for viewing maps either on a desktop or through web mapping on mobile. The information that appears on Google Maps comes directly from satellites, with satellite and panoramic views. One of the main uses of Google Maps is route navigation, widely used by car drivers in metropolitan cities; it provides route navigation for drivers at the time of need. Although it offers many advantages for users who are new to a city by showing the available routes, it falls short in displaying live traffic updates along the travelled routes. In this paper we implement a NLTII (Novel Live Traffic Index Indicator) to update live traffic based on index values recorded by wireless sensors placed on both sides of the road. We also implement shortest path computation in this project to obtain the available paths in order of increasing distance. As an extension we implement a stochastic model along with a graph view, where the stochastic model displays the routes direction-wise. By conducting various experiments on the proposed NLTII, we conclude that this approach has not yet been adopted by any car company in its route navigation system, and that if launched it could reduce many traffic problems in and around cities.
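The shortest-path computation with a live traffic index can be sketched as Dijkstra's algorithm over edge costs scaled by each road segment's index. The road graph and index values below are hypothetical, and the combination of distance and index is one plausible weighting, not necessarily the paper's:

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra over edge weights = segment length scaled by a live traffic index."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, km, traffic_index in graph.get(u, []):
            nd = d + km * traffic_index    # heavier traffic => costlier edge
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

# Hypothetical road graph: (neighbour, length_km, traffic_index >= 1.0)
roads = {
    "A": [("B", 2.0, 1.0), ("C", 1.0, 3.0)],
    "B": [("D", 2.0, 1.0)],
    "C": [("D", 1.0, 1.0)],
}
print(shortest_path(roads, "A", "D"))
```

Note that the congested short route through C loses to the longer free-flowing route through B once the index is factored in, which is the behaviour the NLTII aims for.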

31 A Novel Protocol to Integrate Various Security Primitives for a Secure Data Communication
Neerasa.Sindhuja, I.S. Srinivasa Rao

Abstract-Nowadays security plays a very vital role in every sector. To provide security for data, many security primitives have been proposed in the literature, each with its own advantage in protecting sensitive data. Because security carries the highest privilege, many adversarial users try to access the information of secured users illegally through various hacking codes. In this paper we propose a new security model that combines three security primitives to give more security to data stored on remote servers or on local machines: biometric authentication, spam filtering, and an intrusion detection system. We also designed an additional monitoring component that records user activities such as login and logout details, the availability of network IP addresses, and upload and download details. By conducting various experiments on our proposed integration model, our simulation results clearly show that with this new framework a user can be protected from adversary attacks during data storage as well as data retrieval.
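The integration of the three primitives can be pictured as a chain of checks that must all pass before access is granted. Every function below is a toy stand-in under our own assumptions (bit-string biometrics, a keyword spam filter, a failed-login counter), not the paper's actual implementations:

```python
def biometric_ok(template, sample, threshold=0.9):
    """Toy biometric match: fraction of equal positions in two feature strings."""
    matches = sum(a == b for a, b in zip(template, sample))
    return matches / max(len(template), 1) >= threshold

def is_spam(text, spam_words=("lottery", "winner")):
    """Toy keyword-based spam filter."""
    return any(w in text.lower() for w in spam_words)

def intrusion_suspected(failed_logins, limit=3):
    """Toy intrusion-detection rule: too many failed logins."""
    return failed_logins > limit

def admit(template, sample, message, failed_logins):
    """Grant access only if all three primitives pass, as in the combined model."""
    return (biometric_ok(template, sample)
            and not is_spam(message)
            and not intrusion_suspected(failed_logins))

print(admit("1011001110", "1011001110", "quarterly report attached", 0))
print(admit("1011001110", "0000000000", "you are a lottery winner", 5))
```

The monitoring component described in the abstract would log the outcome of each check alongside the user's login time and IP address.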