1. Managers Perception Of Trade Unions: Study Of Automobile Organizations In India
Saba Jafri
Abstract—Trade unions are often perceived as hampering the work of organizations and
affecting their productivity and profitability. In recent times, problems with unions
have affected productivity and profitability in automobile manufacturing organizations
in India. This research paper tries to empirically establish the perception of trade
unions held by managers responsible for industrial relations in these organizations.
Data was collected from HR personnel of small, medium and large organizations across
different automobile companies in India. A one-way ANOVA test was conducted to find out
what perceptions managers held about unions' bargaining power, communication with
unions, and their impact on work performance. Tukey's post hoc analysis was used to find
differences in perception between organizations of different sizes. Results revealed
that small-sized organizations held a more biased perception of trade unions than medium
and large-sized organizations.
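As a rough illustration of the analysis described above, the following minimal Python
sketch runs a one-way ANOVA followed by Tukey's HSD post hoc test on perception scores
grouped by organization size. The column names and score values are hypothetical and are
not taken from the paper.

```python
# Minimal sketch: one-way ANOVA + Tukey HSD on perception scores by organization size.
# Requires scipy and statsmodels; the data below is invented for illustration only.
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.DataFrame({
    "size": ["small"] * 4 + ["medium"] * 4 + ["large"] * 4,   # hypothetical groups
    "score": [2.1, 2.4, 1.9, 2.2, 3.5, 3.1, 3.4, 3.2, 3.6, 3.8, 3.3, 3.7],
})

groups = [g["score"].values for _, g in df.groupby("size")]
f_stat, p_value = f_oneway(*groups)          # overall difference between size groups
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Pairwise comparison of group means at the 0.05 level
print(pairwise_tukeyhsd(endog=df["score"], groups=df["size"], alpha=0.05))
```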
2. Wage and Salary Administration in Agriculture and Allied Sectors: A Case Study of Nagaland
Thangasamy Esakk
Abstract—Worldwide, human resources are considered one of the most important assets of a
country. Human resource management plays a crucial role in boosting the economic growth
and development of a nation, and compensation for employees undoubtedly plays a vital
role within it. The management function therefore extends its jurisdiction to the
crucial dimension of wage and salary administration, within an organization at the micro
level and within a nation at the macro level. By and large, this task is significant for
retaining and motivating employees, directly improving labour productivity and
indirectly enhancing their compensation and standard of living. This situation draws the
attention of the Government to enact suitable laws, rules and regulations to fix minimum
wage rates for multifarious jobs, taking all the internal and external factors into
account. As in other developed countries, regulation of minimum wage rates therefore
becomes very significant in India. Among all the sectors, agriculture and allied sectors
are the backbone of the socio-economic growth and development of a country. The scope of
human resource management thus draws more attention, especially in developing regions
such as the North Eastern Region, including the State of Nagaland. The purpose of this
paper is to identify the multifarious agriculture and allied sectors in Nagaland, to
analyze the minimum rates of wages for employment in these sectors, and to bring out
suggestive measures for the Government for framing suitable policies to improve the
wellbeing of the people engaged in such activities within the State.
3. A New Anticipatory Protocol for Removing of Fraud in Online Through Probit Auction Model
M. Priyanka, G. Seetharatnam, Kunjam Nageswara Rao, Peri Srinivasa Rao
Abstract—Nowadays, e-commerce is growing in reputation by creating a great deal of
supply and demand for end users all around the world; its popularity has risen faster
than predicted and is up over 500% compared with the past. This has increased user
attention mainly because of the ease with which customers can shop online rather than
spending time shopping manually. While e-commerce offers largely hassle-free purchasing
for customers, there are also criminals who attempt fraud to profit in illegal ways. As
people enjoy the advantages of online trading, fraudsters also take advantage of it to
carry out deceitful activities against honest parties and obtain dishonest profit. To
detect and prevent such illegal activities, online probit fraud-detection moderation
systems built on machine-learned models are commonly applied in practice. In this paper,
we show that this model can distinguish more deceptions and extensively decrease
customer complaints on a real-world online auction fraud detection dataset, compared
with several baseline models and a human-tuned rule-based system.
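For readers unfamiliar with probit-based scoring, the sketch below fits a probit
classifier to labelled auction records and thresholds the predicted fraud probability.
It is only an illustration of the general technique; the feature names and synthetic
data are hypothetical and not from the paper.

```python
# Minimal sketch of a probit fraud scorer (synthetic data, not the paper's dataset).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
price_dev = rng.normal(0, 1, n)            # deviation of listing price from category norm
complaints = rng.poisson(1, n)             # prior complaints against the seller
latent = 0.8 * price_dev + 0.6 * complaints - 1.0 + rng.normal(0, 1, n)
y = (latent > 0).astype(int)               # 1 = fraudulent listing (synthetic labels)

X = sm.add_constant(np.column_stack([price_dev, complaints]))
model = sm.Probit(y, X).fit(disp=0)        # maximum-likelihood probit fit

probs = model.predict(X)                   # P(fraud) for each listing
print(model.params)
print("flagged:", int((probs > 0.5).sum()), "of", n)   # moderation threshold is tunable
```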
4. Numerical Studies on Fuzzy Retrial Queues
G. Easwara Prasad, D. Jayalakshmi
Abstract—This paper aims to construct the membership function for a fuzzy retrial
queueing system using a non-linear programming approach with three fuzzy variables:
fuzzified exponential arrival, retrial and service rates, represented as fuzzy numbers.
Using the α-cut approach, fuzzy retrial queues can be reduced to a family of crisp
retrial queues with different α-cuts. Triangular fuzzy numbers are used to demonstrate
the validity of the proposal. The discussion in this paper is confined to systems with
one or two fuzzy variables; nevertheless, the procedure can be extended to systems with
more than two fuzzy variables. A numerical illustration has been carried out
successfully.
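As a reminder of the standard construction the abstract relies on, the α-cut of a
triangular fuzzy number reduces each fuzzy rate to a crisp interval. The following is
textbook material rather than the paper's specific formulation.

```latex
% \alpha-cut of a triangular fuzzy number \tilde{A} = (a_1, a_2, a_3)
\mu_{\tilde{A}}(x) =
\begin{cases}
\dfrac{x - a_1}{a_2 - a_1}, & a_1 \le x \le a_2,\\[4pt]
\dfrac{a_3 - x}{a_3 - a_2}, & a_2 \le x \le a_3,\\[4pt]
0, & \text{otherwise,}
\end{cases}
\qquad
A_\alpha = \bigl[\, a_1 + \alpha (a_2 - a_1),\; a_3 - \alpha (a_3 - a_2) \,\bigr],
\quad \alpha \in [0, 1].
```

Applying such cuts to the fuzzy arrival, retrial and service rates yields, for each α, a
crisp retrial queue whose performance measures can then be computed by standard means.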
5. Efficient Clustering Algorithm for Outlier Detection
Dhananjay Kumar Tiwari, Hari Mohan Singh, Shiv Pratap Pal
Abstract—Data mining helps to extract important and valuable knowledge from massive
collections of data. Several techniques and algorithms are used for extracting hidden
patterns from large data sets and finding the relationships between them. Clustering
algorithms are used for grouping data items based on their similarity: clustering is the
act of partitioning an unlabeled dataset into groups of similar objects, so that similar
objects are placed in the same cluster while dissimilar objects are in separate
clusters. The algorithms used in this research work are PAM (Partitioning Around
Medoids), CLARA (Clustering Large Applications), CLARANS (Clustering Large Applications
based on Randomized Search) and a new clustering algorithm, Enhanced CLARANS
(ECLARANS), for detecting outliers. In order to find the best clustering algorithm for
outlier detection, several performance measures are used. The experimental results show
that outlier detection accuracy is very good in the ECLARANS clustering algorithm
compared to the existing algorithms; however, this accuracy comes at a cost in running
time. The aim of this research is therefore also to reduce the time complexity of
ECLARANS.
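To make the medoid-based idea concrete, here is a minimal sketch that assigns points to
their nearest medoid and flags the points farthest from any medoid as outliers. It
illustrates the general PAM-family notion of outlier scoring only; it is not the paper's
ECLARANS algorithm, and the medoid selection here is a naive placeholder.

```python
# Minimal medoid-based outlier flagging sketch (illustrative; not ECLARANS itself).
import numpy as np

def nearest_medoid_distances(points, medoid_idx):
    """Distance from every point to its closest medoid."""
    medoids = points[medoid_idx]                        # shape (k, d)
    d = np.linalg.norm(points[:, None, :] - medoids[None, :, :], axis=2)
    return d.min(axis=1)

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.5, (50, 2)),
                  rng.normal(5, 0.5, (50, 2)),
                  [[10.0, 10.0], [-6.0, 8.0]]])         # two obvious outliers appended

# Naive medoid choice for illustration: the point closest to each blob's mean.
k = 2
medoid_idx = []
for c in np.array_split(np.arange(100), k):             # pretend we know the two blobs
    blob = data[c]
    centre = blob.mean(axis=0)
    medoid_idx.append(c[np.argmin(np.linalg.norm(blob - centre, axis=1))])

dist = nearest_medoid_distances(data, np.array(medoid_idx))
threshold = dist.mean() + 3 * dist.std()                 # simple cut-off for "outlier"
print("outlier indices:", np.where(dist > threshold)[0])
```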
6. Relationship Between Selected Anthropometric Measurements and Performance of Women Basketball Players
Kavita Sharma
Abstract—The role of anthropometry as a sports science is perhaps one of the most
crucial in this regard: physique, body composition, physical growth and motor
development are of fundamental importance in developing criteria for talent
identification and development in sports. Keeping this in mind, the scholar decided to
find out the relationship between selected anthropometric measurements and the
performance of women basketball players. Thirty-five female students studying in
different colleges of the University of Delhi, who had participated in inter-college or
zonal championships in basketball, were randomly selected as subjects for the study. The
average age of the subjects was 21 years. Weight, standing height, sitting height, leg
length, lower leg length, upper leg length, arm length, upper arm length and lower leg
length were the anthropometric measurements considered. A weighing scale, anthropometric
rod, steel tape and skinfold calliper were the tools used for the measurements, whereas
the performance of the selected female basketball players was rated out of 10 with the
help of three experts. The collected data were analyzed by computing descriptive
statistics followed by Pearson's product moment correlation. The results revealed that
the mean and SD values of weight, standing height, sitting height, leg strength, lower
leg length, upper leg length, arm length, upper arm length and lower leg length were
50.48±5.28, 154.91±3.85, 79.03±6.32, 86.82±7.38, 47.80±3.82, 43.80±1.99, 59.73±4.68,
29.15±3.20 and 31.0±4.21 respectively. A significant relationship was found between the
performance score and the selected variables, as the correlation values were 0.324,
0.828, 0.468, 0.481, 0.655, 0.533, 0.352 and 0.658 respectively, against the tabulated
value of 0.296, which is significant at the 0.05 level; only upper leg length was not
correlated with the performance score, as its value was 0.178. It was therefore
concluded that, for better prediction of the performance of women basketball players,
all the selected anthropometric variables may be considered in combination instead of
studying the influence of each independent variable.
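The statistical procedure described above is essentially a set of Pearson correlations
compared against a critical value; a minimal sketch of that computation is shown below.
The measurement arrays are hypothetical and are not the study's data.

```python
# Minimal sketch: Pearson correlation of one anthropometric measure with performance.
# The numbers below are made up for illustration; they are not the study's data.
import numpy as np
from scipy.stats import pearsonr

standing_height = np.array([152.0, 155.5, 158.2, 150.1, 160.3, 154.7, 157.0, 151.9])
performance     = np.array([6.0,   7.5,   8.0,   5.5,   8.5,   7.0,   7.8,   6.2])

r, p = pearsonr(standing_height, performance)
print(f"r = {r:.3f}, p = {p:.4f}")   # compare |r| with the tabulated critical value
```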
7. Analysis of Data Centre Resources for Businesses
Govinda.K, Saheel Sasi.A
Abstract—One practical concern that has attracted significant attention is efficient
resource management in the virtualised data center. In order to maximize revenue for
commercial cloud providers, an economic allocation mechanism is desired. Nowadays,
industries are seeking scalable IT solutions such as data centers, hosted either
in-house or by a third party, owing to advances in virtualization technologies and the
benefit of economies of scale, and it is now ubiquitous to provision data centers via
the cloud. Very little is known, however, about the interaction of workload demands and
resource availability; a large-scale survey of in-production data center servers over a
period of two years would fill this gap. The seasonality of resource demands and how
they are affected by different geographical locations are the main focus. This paper
presents a brief analysis of data center growth to meet business demands in different
geographical locations.
8. A New Privacy Preserving Protocol for Authenticated Group Key Transfer of Data Based on Secret Sharing
Appikatla Srihari, Dr. Koduganti Venkata Rao
Abstract—In recent years, the key generation center (KGC) has played a very important
role in generating keys for user account access in order to provide a high level of
security. Key transfer protocols rely on a mutually trusted KGC to select session keys
and transport them to all communicating entities secretly. Most often, the KGC encrypts
session keys under another secret key shared with each entity during registration. In
this paper, we propose an authenticated key transfer protocol based on a secret sharing
scheme, in which the KGC can broadcast group key information to all group members at
once and only authorized group members can recover the group key; unauthorized users
cannot recover it. The confidentiality of this data transfer is always secure, and we
also provide authentication for transporting the group key. By conducting several
experiments on the proposed model, we conclude that this mechanism provides high
security for data transfer and is also well suited for reducing key sizes in the
database.
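A short sketch of the secret-sharing building block is given below: the group key is
treated as the secret of a Shamir (t, n) scheme over a prime field, so any t authorized
shares recover it while fewer reveal nothing. This is a generic textbook construction
offered for orientation, not the paper's exact protocol (which also adds
authentication).

```python
# Minimal Shamir (t, n) secret sharing sketch over a prime field (illustrative only).
import random

P = 2**127 - 1  # a Mersenne prime used as the field modulus

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

group_key = random.randrange(P)                 # the group/session key chosen by the KGC
shares = make_shares(group_key, t=3, n=5)       # broadcast material for 5 members
assert recover(shares[:3]) == group_key         # any 3 authorized shares recover the key
```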
9. A New Privacy Preserving Access Control Policy for Event Processing System
K Revathi Prasanna, Ch Srinivasa Reddy, P Srinivas
Abstract—Event processing is a method of tracking and analyzing (processing) streams of
information (data) about things that happen (events) and deriving a conclusion from
them. Complex event processing, or CEP, is event processing that combines data from
multiple sources to infer events or patterns that suggest more complicated
circumstances. Today, event processing systems lack methods to preserve the privacy
constraints of incoming event streams in a chain of subsequently applied stream
operations. This problem is mainly observed in large-scale distributed applications,
such as a logistics chain, where many individual domains are available for processing
events. An intruder can always infer confidential input streams of the event processing
system from legally received outgoing event streams. This paper concentrates on a new
fine-grained access management scheme for complex event processing tasks: each operation
is performed by an individual role with its access policies specified. The paper
addresses both the specification of access policies and the proper enforcement of those
specifications. By conducting various experiments on the proposed access policy system,
we conclude that this access control policy suits almost all types of logistics
operations without any misuse in transactions; experiments on real-time courier/shipping
company web sites further suggest that the application is well suited to avoiding fraud
during courier deliveries.
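To give a flavour of fine-grained, role-scoped access control over event streams, the
sketch below filters each event's attributes against a per-role policy before handing
it to an operator. The policy format, roles and event fields are hypothetical
illustrations, not the paper's specification language.

```python
# Minimal sketch of per-role attribute filtering for event streams (illustrative only).
from typing import Dict, List

# Hypothetical policy: which event attributes each role may read.
POLICY: Dict[str, List[str]] = {
    "warehouse": ["parcel_id", "weight_kg"],
    "carrier":   ["parcel_id", "destination"],
    "auditor":   ["parcel_id", "weight_kg", "destination", "sender"],
}

def enforce(role: str, event: dict) -> dict:
    """Return only the attributes the given role is allowed to see."""
    allowed = POLICY.get(role, [])
    return {k: v for k, v in event.items() if k in allowed}

stream = [
    {"parcel_id": "P-1", "weight_kg": 2.4, "destination": "Vizag", "sender": "Alice"},
    {"parcel_id": "P-2", "weight_kg": 0.7, "destination": "Delhi", "sender": "Bob"},
]

for event in stream:
    print("carrier sees:", enforce("carrier", event))   # destination but not sender
```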
10. Classification of IDSs and Challenges
Gulshan Kumar
Abstract—Nowadays, the Internet is something we all want and rely on. In most cases we
depend on its abilities: the ability to publish and find information, to shop online,
and to communicate with others through various types of software. Unfortunately, most
popular software contains vulnerabilities and configuration errors. The basic cause of
these vulnerabilities is software flaws: programs fail to handle all possible conditions
correctly, especially unusual user input. Finding and patching all software flaws is a
major problem for the industry. Malicious users exploit vulnerabilities in software to
mount a variety of intrusions, which affect users in multiple ways. Protection from
intrusions forces organizations to bear additional costs, but the cost of protection is
often insignificant when compared with the actual cost of a successful intrusion. This
makes it necessary to develop an accurate intrusion detection system (IDS). Many efforts
have been made to develop an effective IDS, but IDSs still face many challenges in
providing true security against a variety of intrusions. In this paper, we explore
various IDSs and categorize them based on their architectural components. The IDSs are
critically analyzed for the major challenges and issues in detecting intrusions
effectively. The study will help in better understanding the different directions in
which research has been done in the field of intrusion detection. The findings provide
useful insights into the literature and are beneficial for those interested in the
applications and development of IDSs and related fields.
11. Detecting Node Replication Attack in Wireless Sensor Networks Using Distributed Routing
Roshini K, P Hasitha Reddy, A V S M Adiseshu, S Lavanya Reddy
Abstract—Current sensor nodes lack hardware support and are often deployed in
environments where they are vulnerable to capture and compromise by an adversary. A
serious consequence of node compromise is that once an adversary has obtained the
credentials of a sensor node, it can secretly insert replicas of that node at strategic
locations within the network. These replicas can be used to launch a variety of
insidious and hard-to-detect attacks on the sensor applications and the underlying
networking protocols. Security in sensor networks is therefore a particularly
challenging task. This paper discusses the current state of the art in security
mechanisms for WSNs and presents a novel distributed approach, called localized
multicast, for detecting node replication attacks. Two variants of the localized
multicast approach are analyzed: (1) Single Deterministic Cell (SDC) and (2) Parallel
Multiple Probabilistic Cells (P-MPC), which, as their names suggest, differ in the
number of cells to which a location claim is mapped and the manner in which the cells
are selected. We evaluate the performance and success rate of these approaches both
theoretically and via simulation.
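The distinction between the two variants can be pictured with a toy cell-selection
routine: a deterministic variant hashes the node ID to a single witness cell, while a
probabilistic variant picks several cells at random for each location claim. This is
only a schematic illustration of the idea described above, not the protocols' actual
geographic mapping.

```python
# Toy illustration of deterministic vs. probabilistic witness-cell selection.
import hashlib
import random

NUM_CELLS = 16  # the deployment area divided into a 4x4 grid of cells (hypothetical)

def sdc_cell(node_id: str) -> int:
    """Single Deterministic Cell: one cell, fixed by a hash of the node ID."""
    digest = hashlib.sha256(node_id.encode()).digest()
    return digest[0] % NUM_CELLS

def pmpc_cells(v: int = 3) -> list:
    """Parallel Multiple Probabilistic Cells: v cells chosen at random per claim."""
    return random.sample(range(NUM_CELLS), v)

print("SDC witness cell for node-42:", sdc_cell("node-42"))
print("P-MPC witness cells for this claim:", pmpc_cells())
```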
12. Computational Analysis of Multi Server Markovian Queue with Finite Capacity
P. Swamynathan, G. Eswara Prasad and V. S. Mathu Suresh
Abstract—A multi-server Markovian queue with a machine repair problem has been
considered. If any operating machine fails, it is replaced by a spare machine and the
failed machine is sent for repair. The life time and the repair time of machines are
exponentially distributed, and a balking rule is allowed for failed machines. The
failure and service rates change according to the state of the system, and the
management appoints an additional repairman if necessary. Computational values for the
expressions of the steady-state probabilities and the expected number of failed machines
are obtained, and the related curves are exhibited and compared.
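Models of this kind are finite birth-death processes, so their steady-state
probabilities follow the standard product form; the sketch below computes them
numerically for arbitrary state-dependent failure and repair rates. It is a generic
illustration of the computation, with made-up rates, not the specific rates used in the
paper.

```python
# Generic finite birth-death steady-state probabilities (illustrative rates only).
# p_n is proportional to prod_{i=1..n} lam[i-1] / mu[i]; probabilities then normalize.
import numpy as np

lam = [3.0, 3.0, 2.5, 2.0, 1.5]         # state-dependent failure (birth) rates, states 0..4
mu  = [0.0, 2.0, 4.0, 4.0, 6.0, 6.0]    # repair (death) rates; mu[n] leaves state n

K = len(lam)                             # capacity: states 0..K
weights = np.ones(K + 1)
for n in range(1, K + 1):
    weights[n] = weights[n - 1] * lam[n - 1] / mu[n]

p = weights / weights.sum()              # steady-state distribution
expected_failed = np.dot(np.arange(K + 1), p)
print("steady-state probabilities:", np.round(p, 4))
print("expected number of failed machines:", round(expected_failed, 4))
```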
13. A Novel Mechanism for Controlling Packet Coding Rate Dynamically in Delay Tolerant Networks
Saragadam Ramu, Dr. Koduganti Venkata Rao
Abstract—Delay-tolerant networking (DTN) is an approach to computer network architecture
that seeks to address the technical issues in heterogeneous networks that may lack
continuous network connectivity. Examples of such networks are those operating in mobile
or extreme terrestrial environments, or planned networks in space. The main
characteristic of DTNs is their lack of connectivity, resulting in a lack of
instantaneous end-to-end paths. In particular, routing schemes that leverage relays'
memory and mobility are a customary solution for improving message delivery delay. When
large files need to be transferred from source to destination, not all packets may be
available at the source prior to the first transmission. We determine the conditions for
optimality in terms of probability of successful delivery and mean delay, and we devise
optimal policies, so-called piecewise-threshold policies. We account for linear block
codes and rateless random linear coding to efficiently generate redundancy, as well as
for an energy constraint in the optimization. We numerically assess the higher
efficiency of piecewise-threshold policies compared with other policies by developing a
heuristic optimization of the thresholds for all flavors of coding considered.
14. A Survey on Mobile Threats and Detection Techniques
Gulshan Kumar, Manisha Batra, Sheenam Bhola
Abstract—In the past few years, the market adoption and utility of mobile devices has
expanded intensely. Mobile devices store personal details such as contacts and text
messages, and because of this widespread growth, smartphones have become attractive
targets for cyber-criminals. Mobile phone security has thus become an important aspect
of security in wireless multimedia communications. In this research work, we have made a
methodical review of the terms related to malware detection algorithms and also give a
concise explanation of some known mobile malware in tabular form. After careful study of
the available procedures and algorithms for the detection of mobile malware, we give
some recommendations for designing future malware detection algorithms, considering both
computational complexity and the detection ratio of mobile malware. In this paper, we
discuss mobile device attacks and the types of detection techniques for mobile malware,
and we conclude by analyzing the various techniques proposed by different researchers,
followed by some future recommendations.
15. Analysis of New Clustering Algorithm on High Dimensional Data based on Feature Sub Set Selection
Ponduru Praveen Kumar
Abstract—Clustering high-dimensional data is the cluster analysis of data with anywhere
from a few dozen to many thousands of dimensions. In clustering, feature selection
mainly involves identifying a subset of the most useful features that produces results
compatible with the original entire set of features. A feature selection algorithm may
be evaluated from both the efficiency and effectiveness points of view: efficiency
concerns the time required to find a subset of features, while effectiveness is related
to the quality of the subset of extracted features. Based on these criteria, a fast
clustering-based feature selection algorithm, FAST, is proposed and experimentally
evaluated in this paper. The FAST algorithm works in two steps. In the first step,
features are divided into clusters using graph-theoretic clustering methods. In the
second step, the most representative feature that is strongly related to the target
classes is selected from each cluster to form a subset of features. Because features in
different clusters are relatively independent, the clustering-based strategy of FAST has
a high probability of producing a subset of useful and independent features. To ensure
the efficiency of FAST, we adopt the efficient minimum spanning tree clustering method.
We have conducted several experiments comparing FAST with several representative feature
selection algorithms. Our experimental results show that FAST not only produces smaller
subsets of features but also improves the performance of the existing classifiers.
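The two-step structure described above can be sketched quickly: build a graph over
features weighted by a dissimilarity measure, take its minimum spanning tree, cut long
edges to obtain feature clusters, and keep one representative feature per cluster. The
dissimilarity used below (1 − |correlation|) and the edge-cut threshold are simplifying
assumptions for illustration, not necessarily the measures used in the FAST paper.

```python
# Sketch of MST-based feature clustering with one representative per cluster.
# Dissimilarity = 1 - |corr(feature_i, feature_j)|; this is an illustrative choice.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

rng = np.random.default_rng(0)
n, d = 200, 6
X = rng.normal(size=(n, d))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)      # feature 1 ~ duplicate of feature 0
X[:, 4] = X[:, 3] + 0.05 * rng.normal(size=n)      # feature 4 ~ duplicate of feature 3
y = X[:, 0] + X[:, 3] + 0.1 * rng.normal(size=n)   # target depends on features 0 and 3

dissim = 1.0 - np.abs(np.corrcoef(X, rowvar=False))
mst = minimum_spanning_tree(dissim).toarray()

threshold = 0.5                                    # cut edges joining weakly related features
mst[mst > threshold] = 0.0
n_clusters, labels = connected_components(mst != 0, directed=False)

# Keep, in each cluster, the feature most correlated with the target.
relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(d)])
selected = [int(np.argmax(np.where(labels == c, relevance, -1))) for c in range(n_clusters)]
print("feature clusters:", labels, "-> selected features:", selected)
```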
16. A New Enhanced Adaptive Acknowledgment Secure Intrusion Detection System for Mobile Adhoc Networks
Panga Narasimha Murthy, M M M Kumara Varma
Abstract—In recent years there has been great demand for wireless sensor networks
compared with wired networks, as many users have migrated from wired networks to
wireless ones, mainly because of the mobility and scalability that WSNs provide. As
these networks spread all around the world, many intruders try to attack data
transmission during wireless communication. Among all contemporary wireless networks,
the Mobile Ad hoc NETwork (MANET) is one of the most important and unique applications.
In contrast to traditional network architectures, a MANET does not require a fixed
network infrastructure; every single node works as both a transmitter and a receiver.
Nodes communicate directly with each other when they are within the same communication
range; otherwise, they rely on their neighbors to relay messages. The self-configuring
ability of nodes in a MANET has made it popular among critical mission applications such
as military use or emergency recovery. However, the open medium and wide distribution of
nodes make MANETs vulnerable to malicious attackers. There was no proper system for
identifying suspicious objects or intruders in such networks until intrusion detection
systems (IDSs) were proposed. In this paper, we propose and implement a new intrusion
detection system named Enhanced Adaptive ACKnowledgment (EAACK), specially designed for
MANETs. Compared with contemporary approaches, EAACK demonstrates higher
malicious-behavior-detection rates in certain circumstances while not greatly affecting
network performance.
17. A Novel Protocol for Preventing Posting of Abused Messages in any OSN Networks
Gopika Chaganti, Prof. Peri Srinivasa Rao
Abstract—A social network is a social structure made up of a set of social actors (such
as individuals or organizations) and a set of dyadic ties between these actors. These
social networks are generally spread all around the world through the internet. Online
Social Networks (OSNs) are today one of the most prominent interactive media to
communicate, share, and disseminate a considerable amount of human life information. As
social networks gain popularity among OSN users, a major problem faced by users is the
lack of ability to control the content of messages posted on their own private space, so
as to prevent unwanted content from being displayed. Nowadays many users post
unparliamentary or abusive messages on their own private walls and even post the same on
others' walls. To solve this problem, in this paper we propose a novel filtering
protocol that allows all participating OSN users to have direct control over the
messages posted on their walls. The protocol is implemented through automatic
identification of unparliamentary words in a message, using a machine learning (ML)
based soft classifier that labels messages as blocked content based on their category.
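As a rough illustration of such an ML-based soft classifier, the sketch below trains a
bag-of-words model that scores wall posts and blocks those whose predicted probability
of being abusive exceeds a threshold. The tiny training set and the threshold are
invented for illustration; they do not come from the paper.

```python
# Minimal sketch of a soft text classifier used as a wall-post filter (illustrative data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_posts = ["have a great day", "congrats on the award", "you are an idiot",
               "this is rubbish, get lost", "see you at the seminar", "what a fool you are"]
labels = [0, 0, 1, 1, 0, 1]          # 1 = abusive / unparliamentary (toy labels)

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_posts, labels)

def filter_post(post: str, threshold: float = 0.5) -> str:
    p_abusive = clf.predict_proba([post])[0][1]     # soft score from the classifier
    return "BLOCKED" if p_abusive > threshold else post

print(filter_post("wish you a great seminar"))
print(filter_post("you idiot, this is rubbish"))
```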
18. A New Delegating Auditing Task to TPA for Storage Correctness and Privacy in Cloud
Lakshmanarao Simhadri, Rajendra Kumar Ganiya
Abstract—Cloud computing is one of the recent attractions in almost all types of
business environments, where it is used to store a great deal of private individual data
on remote systems. As the data is always stored remotely, we cannot guarantee whether
the data is safe or has been misused. The cloud is a collection of storage servers
maintained by the cloud service provider, which minimizes investment cost for individual
users and organizations; it provides on-demand self-service, resource pooling, rapid
elasticity and measured service. However, users worry about their data stored on
untrusted cloud servers. To address this, a third-party auditor (TPA) is introduced
along with a privacy-preserving public auditing technique that audits and verifies
users' data in the cloud while preserving its privacy. In this paper, we propose a
secure cloud storage system supporting privacy-preserving public auditing. We further
extend our result to enable the TPA to perform audits for multiple users simultaneously
and efficiently. Extensive security and performance analysis shows that the proposed
schemes are provably secure and highly efficient.
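To illustrate the general shape of a challenge-response storage audit, greatly
simplified compared with the paper's scheme, the sketch below has the owner keep
per-block MAC tags, and the auditor challenge random block indices and re-verify their
tags. Real privacy-preserving public auditing uses homomorphic authenticators rather
than plain MACs; this is only an orientation sketch.

```python
# Greatly simplified storage-audit sketch: MAC tags + random block challenges.
# Real privacy-preserving public auditing schemes use homomorphic authenticators instead.
import hmac, hashlib, random

key = b"owner-secret-key"                       # held by the owner / delegated verifier
blocks = [f"data block {i}".encode() for i in range(100)]

# Owner computes a tag per block before outsourcing the data.
tags = {i: hmac.new(key, b"%d|" % i + blk, hashlib.sha256).digest()
        for i, blk in enumerate(blocks)}

def audit(storage, tags, key, num_challenges=5):
    """Auditor samples random block indices and re-checks their tags."""
    for i in random.sample(range(len(storage)), num_challenges):
        expected = hmac.new(key, b"%d|" % i + storage[i], hashlib.sha256).digest()
        if not hmac.compare_digest(expected, tags[i]):
            return False
    return True

print("audit passed:", audit(blocks, tags, key))
blocks[7] = b"corrupted"                        # simulate silent data corruption
# May pass or fail: detection depends on whether block 7 is among the sampled challenges.
print("audit passed:", audit(blocks, tags, key))
```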
19. A Novel Privacy Protection Protocol for Hiding Sensitive Data on Social Network
Ravi Kumar Maradana, H Swapna Rekha
Abstract—Social data analysis is a style of analysis in which people work in a social,
collaborative context to make sense of data. The term was introduced by Martin
Wattenberg in 2005 [1] and has recently also been addressed as big social data analysis
[2] in relation to big data computing. This project is motivated by the recognition of
the need for finer-grained and more personalized privacy in the data publication of
social networks. We propose a privacy protection scheme that prevents not only the
disclosure of the identity of users but also the disclosure of selected features in
users' profiles; an individual user can select which features of her profile she wishes
to conceal. The social networks are modeled as graphs in which users are nodes and
features are labels. Labels are denoted either as sensitive or as non-sensitive. We
treat node labels both as background knowledge an adversary may possess and as sensitive
information that has to be protected. We present privacy protection algorithms that
allow graph data to be published in a form such that an adversary who possesses
information about a node's neighborhood cannot safely infer its identity or its
sensitive labels. We show that our solution is effective, efficient and scalable while
offering stronger privacy guarantees than those in previous research.
20. Secure Anonymization Protocol: A Novel Protocol for Providing m-Privacy in Collaborative Data Publishing
Kalangi Vilson, Dr. Koduganti Venkata Rao
Abstract—Data publishing is the practice of preparing certain data or data set(s) for
public use, so as to make them available to everyone to use as they wish. This practice
is an integral part of the open science movement, and there is a large,
multidisciplinary consensus on its benefits; the main goal is to elevate data to
first-class research outputs. A number of initiatives are underway, along with points of
consensus and issues still in contention. In this paper, we consider the collaborative
data publishing problem of anonymizing horizontally partitioned data held by multiple
data providers. We consider a new type of "insider attack" by colluding data providers
who may use their own data records (a subset of the overall data) to infer the data
records contributed by other data providers, and this paper mainly concentrates on that
attack against the published data. The issue is addressed in three parts. First, we
introduce the notion of m-privacy, which guarantees that the anonymized data satisfies a
given privacy constraint against any group of up to m colluding data providers. Second,
we present heuristic algorithms that exploit the monotonicity of privacy constraints for
efficiently checking m-privacy given a group of records. Third, we present a data
provider-aware anonymization algorithm with adaptive m-privacy checking strategies to
ensure high utility and m-privacy of the anonymized data with efficiency. Building on
experiments covering these three parts, we finally propose a novel multiparty
computation protocol for collaborative data publishing with m-privacy. Experiments on
real-life datasets suggest that our approach achieves better or comparable utility and
efficiency than existing and baseline algorithms while satisfying m-privacy.
21. A New Privacy Preserving Access Control Policy for Event Processing System
P Revathi Prasanna, Ch Srinivas Reddy, P Srinivas
Abstract—Event processing is a method of tracking and analyzing (processing) streams of
information (data) about things that happen (events) and deriving a conclusion from
them. Complex event processing, or CEP, is event processing that combines data from
multiple sources to infer events or patterns that suggest more complicated
circumstances. Today, event processing systems lack methods to preserve the privacy
constraints of incoming event streams in a chain of subsequently applied stream
operations. This problem is mainly observed in large-scale distributed applications,
such as a logistics chain, where many individual domains are available for processing
events. An intruder can always infer confidential input streams of the event processing
system from legally received outgoing event streams. This paper concentrates on a new
fine-grained access management scheme for complex event processing tasks: each operation
is performed by an individual role with its access policies specified. The paper
addresses both the specification of access policies and the proper enforcement of those
specifications. By conducting various experiments on the proposed access policy system,
we conclude that this access control policy suits almost all types of logistics
operations without any misuse in transactions; experiments on real-time courier/shipping
company web sites further suggest that the application is well suited to avoiding fraud
during courier deliveries.
22. A New Tool with Piracy Protection for Hiding Valuable Data in Digital Media
S. Bala Sudha, K. V. N. Rajesh, G Jyothi
Abstract—Steganography is a branch of security through which one form of data can be
hidden inside another form of data, of either the same or a different type. This
mechanism is mainly implemented in order to provide much more security for data being
transferred over a network. As users regularly transfer many files from one system to
another, either over short or long distances using the internet or an intranet, they
eventually look for more security. Ordinary file encryption and decryption schemes,
readily available as Java examples, are easily captured midway (i.e., during
transmission), so a stronger combination of security measures is needed for sending
digital data. This paper analyzes how to send a file from one place to another in a
secure manner. First, the target file is encrypted using our new algorithm, called DES
Bit Shifting, and it is embedded into an audio, video or other media file; the resultant
file is protected by a password. The resultant media file shows no change in its
original format: it can still be played in a media player, and the encrypted data hidden
inside it cannot be detected. This file can be sent over the Internet or any form of
wired communication network. At the destination, it can be retrieved only with the same
steganography protection software and the relevant password, so it is highly secure.
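The embedding step can be pictured with a classic least-significant-bit (LSB) technique:
each bit of the (already encrypted) payload replaces the lowest bit of successive bytes
of the carrier media. The sketch below shows only that generic LSB idea on a raw byte
buffer; it is not the paper's DES Bit Shifting algorithm or its password protection.

```python
# Generic LSB embedding/extraction sketch on a raw byte buffer (illustrative only).
def embed(carrier: bytearray, payload: bytes) -> bytearray:
    """Hide payload bits in the least significant bit of successive carrier bytes."""
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for payload")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit      # overwrite only the lowest bit
    return out

def extract(carrier: bytes, num_bytes: int) -> bytes:
    """Read the hidden bytes back from the carrier's low bits."""
    bits = [carrier[i] & 1 for i in range(num_bytes * 8)]
    return bytes(sum(b << (7 - i) for i, b in enumerate(bits[k*8:(k+1)*8]))
                 for k in range(num_bytes))

media = bytearray(range(256)) * 4            # stand-in for audio/video sample bytes
secret = b"ciphertext"                       # in the paper this would be the encrypted file
stego = embed(media, secret)
assert extract(stego, len(secret)) == secret
```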
23. A Novel Technique for Secure Mining of Horizontally Distributed Databases: Model and Mechanism
G Kalyani Rajeswari, P Srinivas, K V N Rajesh
Abstract—We propose a novel protocol for the authentication of the players participating
in the secure mining of association rules in horizontally distributed databases. The
protocol proposed in this paper is based on the Fast Distributed Mining (FDM) algorithm.
Our algorithm mainly computes the union of the private subsets held by each of the
interacting players and tests whether an element is present in the subset held by
another player, imposing authentication on the data held by the players when finding the
associations. In our proposed application we take the example of a university website,
where the players are faculty, students, administrators, the principal and so on. Our
algorithm uses symmetric key encryption for the inequality verification of the players'
data; the symmetric key algorithm used here is the Tiny Encryption Algorithm (TEA),
which proves more efficient in terms of security and response time than other symmetric
key encryption algorithms. By conducting several experiments on university website data,
we show that the TEA-based approach is efficient for mining horizontally partitioned
databases in a secure manner.
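For reference, the Tiny Encryption Algorithm mentioned above is a small 64-bit block
cipher with a 128-bit key; a plain Python sketch of its standard encrypt/decrypt rounds
is given below. This is the textbook TEA routine, not the paper's surrounding protocol.

```python
# Standard TEA block cipher (64-bit block, 128-bit key, 32 rounds) in plain Python.
MASK = 0xFFFFFFFF
DELTA = 0x9E3779B9

def tea_encrypt(v0, v1, key, rounds=32):
    k0, k1, k2, k3 = key
    s = 0
    for _ in range(rounds):
        s = (s + DELTA) & MASK
        v0 = (v0 + ((((v1 << 4) & MASK) + k0) ^ ((v1 + s) & MASK) ^ ((v1 >> 5) + k1))) & MASK
        v1 = (v1 + ((((v0 << 4) & MASK) + k2) ^ ((v0 + s) & MASK) ^ ((v0 >> 5) + k3))) & MASK
    return v0, v1

def tea_decrypt(v0, v1, key, rounds=32):
    k0, k1, k2, k3 = key
    s = (DELTA * rounds) & MASK
    for _ in range(rounds):
        v1 = (v1 - ((((v0 << 4) & MASK) + k2) ^ ((v0 + s) & MASK) ^ ((v0 >> 5) + k3))) & MASK
        v0 = (v0 - ((((v1 << 4) & MASK) + k0) ^ ((v1 + s) & MASK) ^ ((v1 >> 5) + k1))) & MASK
        s = (s - DELTA) & MASK
    return v0, v1

key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)   # example 128-bit key
block = (0xDEADBEEF, 0xCAFEBABE)                          # one 64-bit plaintext block
cipher = tea_encrypt(*block, key)
assert tea_decrypt(*cipher, key) == block
```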
24. Web Stuggler: A New Tool for Mining Web Pages based on Page Traffic over Internet
M Sri Vidya, P Srinivas, Ch Srinivas Reddy
Abstract—A web crawler is a program that automatically traverses the web by downloading
documents and following links from page to page. Crawlers are mainly used by web search
engines to gather data for indexing; other possible applications include page
validation, structural analysis and visualization, update notification, mirroring, and
personal web assistants/agents. Web crawlers are also known as spiders, robots, worms,
etc. A crawler resides on a single machine and simply sends HTTP requests for documents
to other machines on the Internet, just as a web browser does when the user clicks on
links; all the crawler really does is automate the process of following links. Crawling
speed is governed not only by the speed of one's own Internet connection, but also by
the speed of the sites to be crawled. Especially when crawling from multiple servers,
the total crawling time can be significantly reduced if many downloads are done in
parallel. This work implements the breadth-first search algorithm, a refined version of
one of the first dynamic web search algorithms. We conducted several experiments on
various types of websites, including educational websites (universities, colleges,
schools), public sites, booking sites, banking sites and so on, and conclude that with
the proposed web crawler tool we can obtain a website's rank based on individual page
traffic rather than on overall page ranking. This proposed tool is mainly intended for
website administrators, in order to reduce the workload caused by participating users.
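A breadth-first crawler of the kind described above can be sketched in a few lines:
pages are fetched level by level from a frontier queue, and discovered links are
appended to the back of the queue. The sketch below uses the standard library only and a
fixed page budget; the seed URL is a placeholder and the link extraction is deliberately
crude.

```python
# Minimal breadth-first web crawler sketch (standard library only; seed URL is a placeholder).
import re
from collections import deque
from urllib.parse import urljoin
from urllib.request import urlopen

def bfs_crawl(seed: str, max_pages: int = 20):
    frontier = deque([seed])             # FIFO queue gives breadth-first order
    visited = set()
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue                     # skip unreachable or non-text pages
        visited.add(url)
        for href in re.findall(r'href="([^"#]+)"', html):   # crude link extraction
            frontier.append(urljoin(url, href))
    return visited

if __name__ == "__main__":
    for page in bfs_crawl("https://example.com/"):
        print(page)
```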
25. A Novel Identity based Secure Distributed Data Storage System in Cloud Computing based on Database-as-a-Service
D S Priyanka, B Prasad
Abstract—Storage security is a specialty area of security concerned with securing data
storage systems and ecosystems and the data that resides on them. The same storage
security schema, applied in a distributed environment, may give even more security for
data stored on remote servers. Nowadays the cloud has become a fascinating domain for
storing large amounts of data on servers at remote locations and for accessing the
stored data through a facility called "PAUZ". In general, the data stored in the cloud
is encrypted and stored at the server location with the help of intermediate proxy
servers: proxy servers can convert encrypted files for the data owner into encrypted
files for the data receiver without learning the original information. To save space,
the data owner removes the original files from his own system. Because the data is
stored on a remote server, two major issues must be addressed: the confidentiality and
the integrity of the outsourced data. In this paper, we propose two new identity-based
secure distributed data storage (IBSDDS) schemes. The two schemes capture the following
properties: (1) whenever a file is uploaded by its owner to the remote server, the owner
decides the file access permissions independently, without the help of any third-party
private key generator (PKG); (2) for one query, a receiver can access only the one
appropriate file, instead of all files stored by the owner. Our schemes are secure
against collusion attacks; namely, even if the receiver can compromise the proxy
servers, he cannot obtain the owner's secret key. To the best of our knowledge, these
are the first IBSDDS schemes in which access permission is made by the owner for an
exact file and collusion attacks can be protected against in the standard model. We also
implement a mailing facility as an extension, sending the file name and key to the
requesting receiver's mail id instead of sending them directly along with the data. This
facility gives high security, as all participating parties are properly authenticated
before a request reaches them, so unauthorized users have no chance of obtaining the
files.