REFERENCE LINKING PLATFORM OF KOREA S&T JOURNALS
Journal of Information Processing Systems
Journal Basic Information
Journal DOI :
Publisher : Korea Information Processing Society
Editor in Chief :
Young-Sik Jeong / Mohammad S. Obaidat
Volume & Issues
Volume 6, Issue 4 - Dec 2010
Volume 6, Issue 3 - Sep 2010
Volume 6, Issue 2 - Jun 2010
Volume 6, Issue 1 - Mar 2010
Hiding Secret Data in an Image Using Codeword Imitation
Wang, Zhi-Hui ; Chang, Chin-Chen ; Tsai, Pei-Yu ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 435~452
DOI : 10.3745/JIPS.2010.6.4.435
This paper proposes a novel reversible data hiding scheme based on a Vector Quantization (VQ) codebook. The proposed scheme uses the principal component analysis (PCA) algorithm to sort the codebook and to find two similar codewords for each image block. According to the secret data to be embedded and the difference between those two similar codewords, the original image block is transformed into a table of difference values. Finally, this table is compressed by entropy coding and sent to the receiver. The experimental results demonstrate that the proposed scheme achieves a greater hiding capacity, about five bits per index, at an acceptable bit rate. At the receiver end, after the compressed code has been decoded, the image can be recovered to a VQ-compressed image.
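The core embedding step can be sketched roughly as follows: for each image block, find its two most similar codewords in the codebook and select between them according to the secret bit, keeping the index difference so the original choice is recoverable. This is an illustrative reconstruction, not the paper's exact algorithm; the codebook, blocks, and helper names are hypothetical toy data.

```python
# Hypothetical sketch of the codeword-imitation idea: pick between the two
# nearest codewords of a block according to the secret bit, and record the
# index difference so the original VQ index can be restored.

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def two_nearest(codebook, block):
    """Return the indices of the two codewords closest to the block."""
    ranked = sorted(range(len(codebook)), key=lambda i: euclidean(codebook[i], block))
    return ranked[0], ranked[1]

def embed_bits(codebook, blocks, secret_bits):
    """Replace each block's index with one of its two nearest codewords,
    chosen by the secret bit; keep the index difference for recovery."""
    indices, differences = [], []
    for block, bit in zip(blocks, secret_bits):
        first, second = two_nearest(codebook, block)
        chosen = second if bit else first
        indices.append(chosen)
        differences.append(first - chosen)   # 0, or (first - second)
    return indices, differences

codebook = [(0, 0), (1, 1), (4, 4), (9, 9)]   # toy 2-D codewords
blocks = [(0.9, 1.1), (4.2, 3.8)]
idx, diff = embed_bits(codebook, blocks, [1, 0])
```

A receiver holding the same sorted codebook can undo the substitution from the difference table, which is what makes the scheme reversible.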
Security Properties of Domain Extenders for Cryptographic Hash Functions
Andreeva, Elena ; Mennink, Bart ; Preneel, Bart ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 453~480
DOI : 10.3745/JIPS.2010.6.4.453
Cryptographic hash functions reduce inputs of arbitrary or very large length to a short string of fixed length. All hash function designs start from a compression function with fixed-length inputs. The compression function itself is designed from scratch, or derived from a block cipher or a permutation. The most common procedure for extending the domain of a compression function in order to obtain a hash function is a simple linear iteration; however, some variants use multiple iterations or a tree structure that allows for parallelism. This paper presents a survey of 17 domain extenders from the literature. It considers the natural question of whether these extenders preserve the security properties of the compression function, in particular collision resistance, second preimage resistance, preimage resistance, and the pseudo-random oracle property.
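The "simple linear iteration" mentioned above is the classic Merkle-Damgard construction: pad the message, split it into fixed-size blocks, and chain a compression function over them. The sketch below uses a toy compression function built from SHA-256 purely for illustration; the block size and IV are assumptions, not any surveyed design.

```python
# A minimal Merkle-Damgard domain extender over a toy compression function.
import hashlib

BLOCK = 16  # compression-function message-block size in bytes (assumption)

def compress(chaining_value: bytes, block: bytes) -> bytes:
    """Toy fixed-input-length compression function built from SHA-256."""
    return hashlib.sha256(chaining_value + block).digest()[:16]

def md_pad(msg: bytes) -> bytes:
    """Merkle-Damgard strengthening: append 0x80, zeros, and the length."""
    length = len(msg).to_bytes(8, "big")
    msg += b"\x80"
    while (len(msg) + 8) % BLOCK:
        msg += b"\x00"
    return msg + length

def md_hash(msg: bytes, iv: bytes = b"\x00" * 16) -> bytes:
    """Iterate the compression function over the padded message blocks."""
    padded = md_pad(msg)
    h = iv
    for i in range(0, len(padded), BLOCK):
        h = compress(h, padded[i:i + BLOCK])
    return h
```

Appending the message length (Merkle-Damgard strengthening) is what makes this iteration collision-resistance preserving, which is exactly the kind of property the survey examines for each extender.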
Distributed and Scalable Intrusion Detection System Based on Agents and Intelligent Techniques
El-Semary, Aly M. ; Mostafa, Mostafa Gadal-Haqq M. ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 481~500
DOI : 10.3745/JIPS.2010.6.4.481
The explosion of the Internet and the increase in crucial web applications such as e-banking and e-commerce make network security tools essential. One such tool is an intrusion detection system, which can be classified by detection approach as signature-based or anomaly-based. Even though intrusion detection systems are well defined, their cooperation with each other to detect attacks still needs to be addressed. Consequently, a new architecture that allows them to cooperate in detecting attacks is proposed. The architecture uses software agents to provide scalability and distributability. It works in two modes: learning and detection. During the learning mode, it generates a profile for each individual system using a fuzzy data mining algorithm. During the detection mode, each system uses FuzzyJess to match network traffic against its profile. The architecture was tested against a standard data set produced by MIT's Lincoln Laboratory, and the preliminary results show its efficiency and capability to detect attacks. Finally, two new methods, the memory-window and the memoryless-window, were developed for extracting useful parameters from raw packets; the parameters are used as detection metrics.
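The detection-mode matching can be pictured as follows: each profile metric learned during the learning mode becomes a fuzzy set, and live traffic is scored by its membership degree. This is an illustrative sketch only; the membership shapes, metric names, and threshold are assumptions, not the paper's actual fuzzy rules or the FuzzyJess engine.

```python
# Illustrative fuzzy profile matching: score a traffic sample by its minimum
# membership degree across metrics and raise an alarm below a threshold.

def triangular(x, low, peak, high):
    """Degree of membership of x in a triangular fuzzy set."""
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

def match_profile(sample, profile, threshold=0.3):
    """Fuzzy AND (minimum) over all metrics; alarm when below the threshold."""
    degree = min(triangular(sample[name], *bounds) for name, bounds in profile.items())
    return degree, degree < threshold

profile = {                      # learned "normal" ranges (hypothetical)
    "packets_per_sec": (0, 200, 600),
    "syn_ratio": (0.0, 0.1, 0.4),
}
degree, alarm = match_profile({"packets_per_sec": 180, "syn_ratio": 0.12}, profile)
```

A SYN-flood-like sample with a high `syn_ratio` would fall far from the fuzzy peak and trip the alarm, which mirrors how anomaly detection against a learned profile works in spirit.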
Medium Access Control with Dynamic Frame Length in Wireless Sensor Networks
Yoo, Dae-Suk ; Choi, Seung-Sik ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 501~510
DOI : 10.3745/JIPS.2010.6.4.501
Wireless sensor networks consist of sensor nodes that are expected to be battery-powered and are hard to replace or recharge. Thus, reducing the energy consumption of sensor nodes is an important design consideration in wireless sensor networks. For the implementation of an energy-efficient MAC protocol, Sensor-MAC (S-MAC), which is based on the IEEE 802.11 protocol and provides energy-efficient scheduling, has been proposed. In this paper, we propose a Dynamic S-MAC that adapts dynamically to the network-traffic state. The Dynamic S-MAC protocol improves the energy consumption of S-MAC by changing the frame length according to the network-traffic state. Using the NS-2 simulator, we compare the performance of the Dynamic S-MAC with that of the S-MAC protocol.
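The adaptation idea can be sketched as a simple control rule: lengthen the frame (more sleep) when traffic is light and shorten it when queues build up. The bounds, thresholds, and halving/doubling rule below are assumptions for illustration, not the paper's exact algorithm.

```python
# Rough sketch of dynamic frame-length adaptation to the traffic state.

MIN_FRAME_MS, MAX_FRAME_MS = 100, 1600   # assumed bounds on the frame length

def next_frame_length(current_ms, queued_packets):
    """Halve the frame under load, double it when idle, within fixed bounds."""
    if queued_packets > 4:                      # heavy traffic: wake up sooner
        return max(MIN_FRAME_MS, current_ms // 2)
    if queued_packets == 0:                     # idle: sleep longer
        return min(MAX_FRAME_MS, current_ms * 2)
    return current_ms                           # moderate traffic: keep as is

frame = 800
frame = next_frame_length(frame, queued_packets=7)
```

A longer frame means a larger fraction of time asleep (lower energy, higher latency); shrinking it under load trades energy for throughput, which is the trade-off the Dynamic S-MAC tunes at run time.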
Fingerprint Detection Using Canny Filter and DWT, a New Approach
Islam, Md. Imdadul ; Begum, Nasima ; Alam, Mahbubul ; Amin, M.R. ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 511~520
DOI : 10.3745/JIPS.2010.6.4.511
This paper proposes two new methods to detect the fingerprints of different persons based on one-dimensional and two-dimensional discrete wavelet transforms (DWTs). Recent literature shows that fingerprint detection based on the DWT requires less memory space than pattern recognition and moment-based image recognition techniques. In this study, four statistical parameters, the cross-correlation coefficient, skewness, kurtosis, and the convolution of the approximation coefficients of one-dimensional DWTs, are used to evaluate the two methods on fingerprints of the same person and those of different persons. Across all of the statistical parameters, the second method shows better fingerprint detection results than the first.
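Three of those statistical parameters are easy to make concrete. The sketch below uses a one-level Haar transform as a stand-in for the paper's DWT and computes the cross-correlation coefficient, skewness, and kurtosis of the approximation coefficients; the scan-line data is hypothetical.

```python
# Haar DWT approximation coefficients plus the comparison statistics.
from statistics import mean, pstdev

def haar_dwt(signal):
    """One-level Haar transform: (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def cross_correlation(a, b):
    """Normalized cross-correlation coefficient of two coefficient vectors."""
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def skewness(a):
    m, s = mean(a), pstdev(a)
    return sum(((x - m) / s) ** 3 for x in a) / len(a)

def kurtosis(a):
    m, s = mean(a), pstdev(a)
    return sum(((x - m) / s) ** 4 for x in a) / len(a)

row = [10, 12, 9, 14, 11, 13, 8, 15]           # hypothetical scan line
approx, _ = haar_dwt(row)
same = cross_correlation(approx, approx)       # identical prints give 1.0
```

Two prints of the same finger yield a cross-correlation near 1.0 and closely matching shape statistics, while different fingers diverge on one or more of the parameters; that divergence is what the two methods threshold on.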
Mining Spatio-Temporal Patterns in Trajectory Data
Kang, Ju-Young ; Yong, Hwan-Seung ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 521~536
DOI : 10.3745/JIPS.2010.6.4.521
Spatio-temporal patterns extracted from the historical trajectories of moving objects reveal important knowledge about movement behavior for high-quality LBS services. Existing approaches transform trajectories into sequences of location symbols and derive frequent subsequences by applying conventional sequential pattern mining algorithms. However, spatio-temporal correlations may be lost due to inappropriate approximations of spatial and temporal properties, and an inefficient description of temporal information decreases both the mining efficiency and the interpretability of the patterns. In this paper, we address the problem of mining spatio-temporal patterns from trajectory data. We provide a formal statement of an efficient representation of spatio-temporal movements and propose a new approach to discover spatio-temporal patterns in trajectory data. The proposed method first finds meaningful spatio-temporal regions and then extracts frequent spatio-temporal patterns from the sequences of these regions using a prefix-projection approach. Our experiments show that the proposed method improves mining performance and derives more intuitive patterns.
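The prefix-projection step can be illustrated with a minimal PrefixSpan-style miner: grow a pattern one region symbol at a time, projecting the sequence database onto each frequent prefix. The region labels below are hypothetical, and real trajectory patterns would carry temporal information that this toy version omits.

```python
# Minimal prefix-projection mining over sequences of region symbols.

def prefix_span(sequences, min_support, prefix=()):
    """Return frequent region patterns mapped to their support counts."""
    patterns = {}
    counts = {}
    for seq in sequences:                 # sequence-level support per symbol
        for sym in set(seq):
            counts[sym] = counts.get(sym, 0) + 1
    for sym, count in counts.items():
        if count < min_support:
            continue
        new_prefix = prefix + (sym,)
        patterns[new_prefix] = count
        # project: keep the suffix after the first occurrence of sym
        projected = [seq[seq.index(sym) + 1:] for seq in sequences if sym in seq]
        patterns.update(prefix_span(projected, min_support, new_prefix))
    return patterns

trajectories = [["A", "B", "C"], ["A", "C"], ["A", "B"]]   # region sequences
frequent = prefix_span(trajectories, min_support=2)
```

Because each recursion only scans the projected suffixes, the search space shrinks with the prefix, which is the efficiency argument behind prefix-projection approaches.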
An Optimized Approach of Fault Distribution for Debugging in Parallel
Srivasatav, Maneesha ; Singh, Yogesh ; Chauhan, Durg Singh ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 537~552
DOI : 10.3745/JIPS.2010.6.4.537
Software debugging is the most time-consuming and costly activity in the software development process. Many techniques have been proposed to isolate different faults in a program, thereby creating separate sets of failing program statements. Debugging in parallel is a technique that distributes a single faulty program segment into many fault-focused program slices to be debugged simultaneously by multiple debuggers. In this paper we propose a new technique called Faulty Slice Distribution (FSD) to make parallel debugging more efficient by measuring the time and labor associated with each slice. Using this measure, we then distribute the faulty slices evenly among the debuggers. For this we propose an algorithm that estimates an optimized grouping of faulty slices, using as a parameter the priority assigned to each slice as computed from its complexity. This supports the efficient merging of two or more slices for distribution among debuggers so that debugging can be performed in parallel. To validate the effectiveness of the proposed technique we explain the process using an example.
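One plausible reading of the even distribution step is classic largest-processing-time balancing: sort slices by their complexity-derived priority and always hand the next one to the least-loaded debugger. This is a hedged sketch under that assumption, with toy complexity values; the paper's actual optimization may differ.

```python
# Greedy LPT-style distribution of fault-focused slices among debuggers.

def distribute_slices(slice_complexities, num_debuggers):
    """Map slice ids to debuggers so total complexity stays balanced."""
    loads = [0] * num_debuggers
    assignment = {}
    order = sorted(slice_complexities, key=slice_complexities.get, reverse=True)
    for slice_id in order:
        target = loads.index(min(loads))        # least-loaded debugger so far
        assignment[slice_id] = target
        loads[target] += slice_complexities[slice_id]
    return assignment, loads

slices = {"s1": 8, "s2": 7, "s3": 3, "s4": 2}   # hypothetical complexities
assignment, loads = distribute_slices(slices, num_debuggers=2)
```

Processing the heaviest slices first keeps the final loads close to equal, so no debugger sits idle while another works through a disproportionately complex slice.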
Efficient Server Virtualization using Grid Service Infrastructure
Baek, Sung-Jin ; Park, Sun-Mi ; Yang, Su-Hyun ; Song, Eun-Ha ; Jeong, Young-Sik ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 553~562
DOI : 10.3745/JIPS.2010.6.4.553
The core services in a cloud computing environment are SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service). Among these three core services, server virtualization belongs to IaaS and is a service technology for reducing server maintenance expenses. Normally, the primary purpose of server virtualization is to build and maintain a new, well-functioning server rather than to use several existing servers, and to improve various aspects of system performance. This often presents an issue, in that expenses may need to increase in order to build the new server. This study intends to use a grid service architecture for a form of server virtualization that utilizes existing servers rather than introducing a new one. More specifically, the proposed system enhances system performance and reduces the corresponding expenses by adopting a scheduling algorithm among the distributed servers and the constituents for grid computing, thereby supporting the server virtualization service. Furthermore, the proposed server virtualization system minimizes power consumption by adopting sleep servers, subsidized servers, and the grid infrastructure. The power maintenance expenses for the sleep servers are lowered by utilizing the ACPI (Advanced Configuration & Power Interface) standard, with the purpose of overcoming the limits of server performance.
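The scheduling-with-sleep-servers idea can be sketched simply: dispatch jobs to the least-loaded active server, and wake a sleeping server only when every active one is saturated. The capacity threshold, server names, and dispatch rule below are illustrative assumptions, not the paper's configuration.

```python
# Toy scheduler over existing servers, waking a sleep server only on demand.

def schedule(job_load, servers, capacity=100):
    """servers: {name: {"load": int, "asleep": bool}}; returns the chosen name."""
    active = {n: s for n, s in servers.items() if not s["asleep"]}
    # prefer the least-loaded active server that still has room for the job
    candidates = [n for n, s in active.items() if s["load"] + job_load <= capacity]
    if candidates:
        chosen = min(candidates, key=lambda n: active[n]["load"])
    else:
        # every active server is saturated: wake one sleeping server
        chosen = next(n for n, s in servers.items() if s["asleep"])
        servers[chosen]["asleep"] = False
    servers[chosen]["load"] += job_load
    return chosen

cluster = {
    "web1": {"load": 90, "asleep": False},
    "web2": {"load": 40, "asleep": False},
    "standby": {"load": 0, "asleep": True},
}
first = schedule(30, cluster)     # fits on the least-loaded active server
second = schedule(50, cluster)    # active servers saturated: wakes standby
```

Keeping `standby` asleep until it is actually needed is where the power savings come from; ACPI sleep states make that transition cheap in practice.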
Intercepting Filter Approach to Injection Flaws
Salem, Ahmed ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 563~574
DOI : 10.3745/JIPS.2010.6.4.563
The growing number of web applications in the global economy has made it critically important to develop secure and reliable software to support the economy's increasing dependence on web-based systems. We propose an intercepting filter approach to mitigate the risk of injection flaw exploitation, one of the most dangerous methods of attacking web applications. The proposed approach can be implemented in Java or .NET environments following the intercepting filter design pattern. This paper provides examples to illustrate the proposed approach.
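The intercepting filter pattern puts a chain of filters between the request and the target handler, so each filter can inspect and reject the request before any application code runs. The sketch below (in Python rather than the Java/.NET settings named above) uses two deliberately simplistic injection patterns; a real rule set would be far more extensive.

```python
# Minimal intercepting-filter chain with an injection-screening filter.
import re

SQLI_PATTERNS = [                      # illustrative patterns only
    re.compile(r"('|\")\s*(or|and)\s+.+=", re.IGNORECASE),
    re.compile(r";\s*(drop|delete|insert)\b", re.IGNORECASE),
]

def injection_filter(request):
    """Reject the request if any parameter matches a known injection pattern."""
    for value in request["params"].values():
        if any(p.search(value) for p in SQLI_PATTERNS):
            return False
    return True

def handle(request, filters, target):
    """Intercepting filter: run every filter, then the real handler."""
    if all(f(request) for f in filters):
        return target(request)
    return "403 Forbidden"

ok = handle({"params": {"user": "alice"}}, [injection_filter], lambda r: "200 OK")
bad = handle({"params": {"user": "x' OR 1=1"}}, [injection_filter], lambda r: "200 OK")
```

Because the filter chain sits outside the handlers, new screening rules can be added without touching application code, which is the main attraction of the pattern for retrofitting security.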
Generalized Proxy-Assisted Periodic Broadcasting (G-ProB) for Heterogeneous Clients in Video-on-Demand Service
Febiansyah, Hidayat ; Kwon, Jin-Baek ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 575~596
DOI : 10.3745/JIPS.2010.6.4.575
Video-on-Demand services are growing rapidly nowadays, and the load on servers can be very high, even exceeding their capacity. For popular content, a Periodic Broadcast (PB) strategy using multicast can serve all clients. Recent PB schemes broadcast segments of a movie over multiple channels in certain patterns, so that users only need to wait for a small segment before the service starts. However, users need a higher download capacity to download multiple segments at a time. To compensate for this, a proxy server can reduce the download bandwidth requirement by holding some segments for a certain time. This research focuses on more recent PB schemes that could not be covered by previous proxy-assisted periodic broadcast strategies.
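A back-of-the-envelope sketch shows why periodic broadcast cuts waiting time: split a movie into segments that grow geometrically, loop each on its own channel, and a viewer waits at most one slot of the small first segment. The geometric factor is an assumption used by several PB schemes, not G-ProB's specific pattern.

```python
# Geometric segmentation and the resulting worst-case startup wait.

def segment_lengths(total_minutes, num_channels, factor=2):
    """Split a movie into num_channels segments with sizes 1, f, f^2, ..."""
    weights = [factor ** i for i in range(num_channels)]
    unit = total_minutes / sum(weights)
    return [unit * w for w in weights]

def max_startup_wait(total_minutes, num_channels, factor=2):
    """Worst-case wait: one full loop of the first (smallest) segment."""
    return segment_lengths(total_minutes, num_channels, factor)[0]

# A 120-minute movie over 5 channels starts within roughly 3.9 minutes
# for every viewer, regardless of how many viewers tune in.
wait = max_startup_wait(120, 5)
```

The catch is that a client may need to download from several channels at once, which is precisely the bandwidth burden the proxy is introduced to absorb for heterogeneous clients.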
A Dynamic Approach to Estimate Change Impact using Type of Change Propagation
Gupta, Chetna ; Singh, Yogesh ; Chauhan, Durg Singh ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 597~608
DOI : 10.3745/JIPS.2010.6.4.597
Software evolution is an ongoing process carried out with the aim of extending base applications, either by adding new functionalities or by adapting software to changing environments. This brings about the need to estimate and determine the overall impact of changes to a software system. In the last few decades many change/impact analysis techniques have been developed to identify the consequences of making changes to software systems. In this paper we propose a new approach to change/impact analysis that classifies changes by (a) the nature and (b) the extent of change propagation. The impact set produced carries two dimensions of information: (a) the statements affected by change propagation and (b) the percentage of statements affected in each category and in the overall system. We also propose an algorithm for classifying the type of change. To establish confidence in its effectiveness and efficiency, we illustrate the technique with an example. The results of our analysis are promising with respect to the aim of enhancing change classification. The proposed dynamic technique for estimating impact sets and their percentage of impact will help software maintainers perform selective regression testing by analyzing impact sets with regard to the nature of the change and change dependency.
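The two dimensions of the impact set can be made concrete with a toy dependency graph: propagate the change transitively from the modified statement, then report both the affected statements and the percentage of the whole system they cover. Graph and labels here are hypothetical, and the paper's classification by nature of change is not modeled.

```python
# Toy change propagation over a statement-dependency graph.

def impact_set(dependencies, changed):
    """Collect every statement transitively dependent on the changed one."""
    affected, frontier = set(), [changed]
    while frontier:
        node = frontier.pop()
        for dependent in dependencies.get(node, []):
            if dependent not in affected:
                affected.add(dependent)
                frontier.append(dependent)
    return affected

def impact_report(dependencies, all_statements, changed):
    """Return the affected statements and their share of the whole system."""
    affected = impact_set(dependencies, changed)
    percentage = 100.0 * len(affected) / len(all_statements)
    return affected, percentage

deps = {"s1": ["s2", "s3"], "s3": ["s4"]}       # s2 and s3 depend on s1, etc.
affected, pct = impact_report(deps, ["s1", "s2", "s3", "s4", "s5"], "s1")
```

A maintainer seeing that a change touches 60% of the statements can scope regression tests to that impact set rather than rerunning the entire suite, which is the selective regression testing use case named above.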
A Study on Design and Implementation of the Ubiquitous Computing Environment-based Dynamic Smart On/Off-line Learner Tracking System
Lim, Hyung-Min ; Jang, Kun-Won ; Kim, Byung-Gi ;
Journal of Information Processing Systems, volume 6, issue 4, 2010, Pages 609~620
DOI : 10.3745/JIPS.2010.6.4.609
In order to provide a tailored education for learners within the ubiquitous environment, it is critical to analyze the learning activities of learners. For this purpose, SCORM (Sharable Content Object Reference Model), IMS LD (Instructional Management System Learning Design), and other standards provide learning design support functions such as progress checks. However, applying these standards requires content packaging, and because of the complicated standard dimensions, the facilitation level is lower than the work volume required when developing the contents; additional work is also needed whenever revision becomes necessary. In addition, since the learning results are managed by the server, data cannot be saved when the network is cut off. In this study, a system is realized to manage the actions of learners by intercepting web-browser events through event hooking. With this technique, all HTML-based contents can be reused without additional packaging work, and the saving and analysis of learning results remain available, which alleviates the problems that follow from applying the standards. Furthermore, the ubiquitous learning environment is supported by tracking learning results even when the network is cut off.
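The offline-tracking idea can be sketched independently of any browser: intercepted learner events go into a local buffer and are flushed to the server only when the network is available, so nothing is lost during a disconnection. The event fields and the send callback are hypothetical stand-ins for the real hooking and transport layers.

```python
# Buffer intercepted learner events locally and flush when back online.

class LearnerTracker:
    def __init__(self, send):
        self.send = send          # callable: returns True if delivery succeeded
        self.buffer = []          # events held locally while offline

    def record(self, event):
        """Buffer every intercepted event, then attempt a flush."""
        self.buffer.append(event)
        self.flush()

    def flush(self):
        """Deliver buffered events in order; stop at the first failure."""
        while self.buffer:
            if not self.send(self.buffer[0]):
                break             # still offline: keep events for later
            self.buffer.pop(0)

online = {"up": False}            # simulated network state
delivered = []

def send(event):
    if online["up"]:
        delivered.append(event)
        return True
    return False

tracker = LearnerTracker(send)
tracker.record({"page": "lesson1", "action": "scroll"})   # network down: buffered
online["up"] = True
tracker.record({"page": "lesson1", "action": "click"})    # both events flushed
```

Flushing in arrival order preserves the event sequence, so the server-side analysis of learning activities sees the same timeline whether or not the learner was ever offline.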