REFERENCE LINKING PLATFORM OF KOREA S&T JOURNALS
Journal of KIISE
Journal Basic Information
Korean Institute of Information Scientists and Engineers
Volume & Issues
Volume 41, Issue 12 - Dec 2014
Detecting Software Similarity Using API Sequences on Static Major Paths
Park, Seongsoo ; Han, Hwansoo ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1007~1012
DOI : 10.5626/JOK.2014.41.12.1007
Software birthmarks are used to detect software plagiarism. For binaries, however, only a few birthmarks have been developed. In this paper, we propose a static approach that generates API sequences along major paths, which are analyzed from the control flow graphs of the binaries. Since our API sequences are extracted along the most plausible paths of the binary code, they can represent the actual API sequences produced by binary executions, but in a more concise form. Our similarity measure uses the Smith-Waterman algorithm, one of the most popular sequence alignment algorithms in DNA sequence analysis. We evaluate our static path-based API sequences on multiple versions of five applications. Our experiments indicate that the proposed method provides a quite reliable similarity birthmark for binaries.
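The Smith-Waterman comparison step can be sketched as a local alignment over API-name sequences. The scoring values (match=+2, mismatch=-1, gap=-1) and the API names are illustrative assumptions, not the parameters used in the paper.

```python
# Hedged sketch: Smith-Waterman local alignment over API-call sequences.
def smith_waterman(seq_a, seq_b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between two API sequences."""
    rows, cols = len(seq_a) + 1, len(seq_b) + 1
    h = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = h[i-1][j-1] + (match if seq_a[i-1] == seq_b[j-1] else mismatch)
            h[i][j] = max(0, diag, h[i-1][j] + gap, h[i][j-1] + gap)
            best = max(best, h[i][j])
    return best

# Two versions of a binary with slightly different API paths.
a = ["CreateFileW", "ReadFile", "CloseHandle", "ExitProcess"]
b = ["CreateFileW", "ReadFile", "WriteFile", "CloseHandle"]
print(smith_waterman(a, b))
```

A higher score indicates a longer shared API subsequence, which the birthmark treats as evidence of similarity.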
A Black-Box based Testing for GUI Bug Detection
Lee, Jemin ; Kim, Hyungshin ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1013~1017
DOI : 10.5626/JOK.2014.41.12.1013
A variety of applications accessible through app markets provide useful features and functions. However, these applications can contain many GUI bugs due to deficient testing processes. Even though various approaches have been developed for mobile app testing, GUI bugs remain difficult to identify because of inefficiency, a lack of automation, and the need for access to the source code. In this paper, we propose an automated black-box testing method for efficient GUI bug detection. Our experimental results show that the proposed method achieves better code coverage and uncovers GUI bugs when compared with the existing black-box testing tool Monkey.
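The Monkey-style baseline the abstract compares against can be illustrated as a random GUI event generator, simulated here in plain Python rather than dispatched to a device. The event types and screen dimensions are illustrative assumptions.

```python
# Hedged sketch: a Monkey-style random GUI event generator (simulation only).
import random

EVENT_TYPES = ("tap", "swipe", "back", "rotate")  # assumed event set

def generate_events(n, width=1080, height=1920, seed=0):
    """Produce n pseudo-random GUI events as (type, x, y) tuples."""
    rng = random.Random(seed)          # seeded for reproducible test runs
    events = []
    for _ in range(n):
        kind = rng.choice(EVENT_TYPES)
        events.append((kind, rng.randrange(width), rng.randrange(height)))
    return events

for ev in generate_events(3):
    print(ev)
```

A black-box tester feeds such events to the app and watches for crashes or rendering faults; the paper's contribution is generating them more efficiently than this purely random baseline.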
Design and Implementation of Efficient Mitigation against Return-oriented Programming
Kim, Jeehong ; Kim, Inhyeok ; Min, Changwoo ; Eom, Young Ik ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1018~1025
DOI : 10.5626/JOK.2014.41.12.1018
An ROP attack creates gadget sequences, which consist of existing code snippets in a program, and hijacks the control flow of the program by chaining and executing the gadgets consecutively. Existing defense schemes suffer from high execution overhead, increased binary size, and low applicability. In this paper, we address these problems by introducing zero-sum defender, a fast and space-efficient mitigation scheme against ROP attacks. We identify a fundamental property of gadget execution: control flow enters the middle of a function without a call instruction and ends with a return instruction. We exploit this property by monitoring whether execution is being abused by an ROP attack. We achieve very low runtime overhead with a very small increase in binary size. In our experiments, we verified that our defense scheme prevents real-world ROP attacks, with only about 2% performance overhead and 1% binary size overhead across several benchmarks.
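The call/return property the abstract exploits can be illustrated with a balance counter over an instruction trace: each call increments it and each return decrements it, so a negative counter means a return executed without a matching call, the signature of gadget chaining. The trace representation is an illustrative simulation, not the paper's binary instrumentation.

```python
# Hedged sketch: a zero-sum call/return balance check over a simulated trace.
def zero_sum_check(trace):
    """Return True if the call/ret trace stays balanced (no ROP suspicion)."""
    counter = 0
    for event in trace:
        if event == "call":
            counter += 1
        elif event == "ret":
            counter -= 1
            if counter < 0:      # ret with no prior call: gadget execution
                return False
    return True

print(zero_sum_check(["call", "call", "ret", "ret"]))   # normal execution
print(zero_sum_check(["ret", "ret", "ret"]))            # ROP-like gadget chain
```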
Partial Garbage Collection Technique for Improving Write Performance of Log-Structured File Systems
Gwak, Hyunho ; Shin, Dongkun ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1026~1034
DOI : 10.5626/JOK.2014.41.12.1026
Recently, flash storage devices have become popular. Log-structured file systems (LFS) are suitable for flash storage since they issue only sequential writes to the flash device and thus provide high write performance. However, an LFS must perform garbage collection (GC) in order to reclaim obsolete space. Recently, a slack space recycling (SSR) technique was proposed to reduce the GC overhead. However, since SSR generates random writes, write performance can suffer if the device's random write performance is significantly lower than its sequential write performance. This paper proposes a partial garbage collection technique that copies only some of the valid blocks in a victim segment in order to increase the size of the contiguous invalid space available to SSR. Our experiments show that the partial GC technique significantly improves write performance on an SD card.
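The copy-savings idea can be sketched over a toy segment: partial GC migrates only the valid blocks inside the region it wants to free, instead of cleaning the whole segment. The block layout (1 = valid, 0 = invalid) and the scan-from-the-front policy are illustrative assumptions based on the abstract, not the paper's exact victim-selection logic.

```python
# Hedged sketch: partial GC over a victim segment represented as a block list.
def partial_gc(segment, want):
    """Copy out the minimum prefix of valid blocks so the segment starts
    with a contiguous invalid run of at least `want` blocks.
    Returns (copied_blocks, freed_run_length)."""
    copied = 0
    run = 0
    for block in segment:
        if run >= want:
            break
        if block:            # valid block must be migrated elsewhere
            copied += 1
        run += 1             # after migration the slot is invalid
    return copied, run

seg = [1, 0, 0, 1, 1, 0, 1, 0]   # victim segment: 4 valid, 4 invalid blocks
print(partial_gc(seg, want=4))    # copy cost to open a 4-block invalid run
```

Here opening a 4-block run costs 2 copies, whereas cleaning the whole segment would copy all 4 valid blocks; the freed run is then sequential space SSR can reuse.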
Meta-Modeling to Detect Attack Behavior for Security
On, Jinho ; Choe, Yeongbok ; Lee, Moonkun ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1035~1049
DOI : 10.5626/JOK.2014.41.12.1035
This paper presents a new method to detect attack patterns in security-critical systems, based on a new notion of Behavior Ontology. Security-critical systems are generally large and complex, and they are subject to attack in every possible way, so detecting various attacks through a single semantic structure is very complicated. This paper handles that complexity with Behavior Ontology, in which attack patterns are defined as sequences of actions on the class ontology of the system. Because the patterns are sequences of actions, they can be abstracted in a hierarchical order based on inclusion relations, forming a lattice. Once the behavior ontology for the attack patterns is defined, attacks on the target systems can be detected both semantically and hierarchically within the ontology structure. Compared to other attack models, the behavior ontology analysis proposed in this paper is found to be very effective and efficient in terms of time and space.
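The inclusion relation that orders attack patterns into a hierarchy can be sketched as subsequence containment between action sequences. The pattern names and actions below are illustrative assumptions, not the paper's ontology.

```python
# Hedged sketch: ordering attack patterns by action-sequence inclusion.
def is_subsequence(small, big):
    """True if `small` occurs in `big` as an (ordered) subsequence."""
    it = iter(big)
    return all(action in it for action in small)

patterns = {
    "probe":      ["scan"],
    "intrude":    ["scan", "login"],
    "exfiltrate": ["scan", "login", "read", "send"],
}

# Print the inclusion relation: a <= b means pattern a abstracts pattern b.
for a in patterns:
    for b in patterns:
        if a != b and is_subsequence(patterns[a], patterns[b]):
            print(f"{a} <= {b}")
```

Detecting the abstract pattern ("probe") then implies checking only its refinements further down the hierarchy, which is the efficiency argument of the abstract.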
Motor Imagery EEG Classification Method using EMD and FFT
Lee, David ; Lee, Hee-Jae ; Lee, Sang-Goog ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1050~1057
DOI : 10.5626/JOK.2014.41.12.1050
Electroencephalogram (EEG)-based brain-computer interfaces (BCI) can be used for a number of purposes in a variety of industries, such as replacing body parts like hands and feet or improving user convenience. In this paper, we propose a method to decompose and extract motor imagery EEG signals using Empirical Mode Decomposition (EMD) and the Fast Fourier Transform (FFT). The EEG signal classification consists of three steps. First, during signal decomposition, EMD is used to generate Intrinsic Mode Functions (IMFs) from the EEG signal. Then, during feature extraction, the power spectral density (PSD) is used to identify the frequency band of the generated IMFs, and the FFT extracts motor imagery features from the IMF that includes the mu rhythm. Finally, during classification, a Support Vector Machine (SVM) classifies the features of the motor imagery EEG signal. 10-fold cross-validation was used to estimate the generalization capability of the classifier, and the results show that the proposed method achieves an accuracy of 84.50%, which is higher than that of other methods.
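The Fourier-based feature step can be illustrated by computing band power around the mu rhythm (8-13 Hz). A pure-Python DFT stands in for an FFT library here, and the sampling rate, band edges, and synthetic test tones are illustrative assumptions, not the paper's data.

```python
# Hedged sketch: mu-band (8-13 Hz) power from a discrete Fourier transform.
import cmath
import math

def band_power(signal, fs, lo=8.0, hi=13.0):
    """Sum of squared DFT magnitudes for bins whose frequency lies in [lo, hi]."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if lo <= freq <= hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power

fs = 128                                                          # assumed Hz
t10 = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]    # 10 Hz tone
t30 = [math.sin(2 * math.pi * 30 * t / fs) for t in range(fs)]    # 30 Hz tone
print(band_power(t10, fs) > band_power(t30, fs))  # mu-band tone dominates
```

In the paper's pipeline this computation would run on the IMF that the PSD step identified as containing the mu rhythm, and the resulting band powers would feed the SVM.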
Query Expansion based on Word Sense Community
Kwak, Chang-Uk ; Yoon, Hee-Geun ; Park, Seong-Bae ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1058~1065
DOI : 10.5626/JOK.2014.41.12.1058
To assist users during a search, a query expansion method suggests keywords related to the input query. Several recent studies suggest keywords by identifying domains, using a clustering method over the retrieved documents. However, clustering is ill-suited to presenting diverse domains because the number of clusters must be fixed in advance. This paper proposes a method that suggests keywords by finding the various domains related to the input query with a community detection algorithm. The proposed method extracts words from the top 30 retrieved documents and builds communities over the resulting word graph. Keywords representing each community are then derived and used for query expansion. To evaluate the proposed method, we compared our results against two baselines: plain Google search and keyword recommendation using TF-IDF over the search results. The evaluation indicates that the proposed method outperforms the baselines with respect to diversity.
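Unlike clustering, community detection discovers however many groups the word graph contains. As a stand-in for the paper's (unspecified) algorithm, a deterministic label propagation over a toy co-occurrence graph shows the idea; the word graph is an illustrative assumption.

```python
# Hedged sketch: word-graph communities via deterministic label propagation.
def label_propagation(edges, rounds=10):
    """Assign each node the most common label among its neighbours."""
    nodes = sorted({n for e in edges for n in e})
    nbrs = {n: set() for n in nodes}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    label = {n: n for n in nodes}
    for _ in range(rounds):
        for n in nodes:                       # fixed sweep order: deterministic
            counts = {}
            for m in nbrs[n]:
                counts[label[m]] = counts.get(label[m], 0) + 1
            label[n] = min(counts, key=lambda l: (-counts[l], l))
    return label

# Two dense word clusters joined by a single weak edge.
edges = [("apple", "fruit"), ("fruit", "pear"), ("apple", "pear"),
         ("cpu", "ram"), ("ram", "disk"), ("cpu", "disk"),
         ("pear", "cpu")]
print(label_propagation(edges))
```

Each resulting community would then contribute its most representative word as an expansion keyword, one per discovered domain.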
Visual Analytics for Abnormal Event detection using Seasonal-Trend Decomposition and Serial-Correlation
Yeon, Hanbyul ; Jang, Yun ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1066~1074
DOI : 10.5626/JOK.2014.41.12.1066
In this paper, we present a visual analytics system that uses serial correlation to detect abnormal events in spatio-temporal data. Our approach extracts topic models from spatio-temporal tweets and then filters abnormal event candidates using Seasonal-Trend decomposition based on Loess smoothing (STL). We re-extract topics from these candidates and apply STL again to obtain a second set of candidates. Finally, we analyze the serial correlation between the first and second candidate sets in order to detect abnormal events. Because we take a visual analytics approach, users can intuitively analyze abnormal event trends and cyclical patterns. As a case study, we verified our system by analyzing two events: the Gyeongju Mauna Resort collapse and the Jindo ferry sinking.
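The candidate-filtering step can be sketched with a simplified seasonal split: subtract a per-phase seasonal mean and flag large residuals. The paper uses STL (Loess-based); the periodic-mean decomposition, period, threshold, and tweet-count series below are illustrative assumptions.

```python
# Hedged sketch: seasonal split + residual threshold as an STL stand-in.
def flag_anomalies(series, period, threshold):
    """Subtract the per-phase (seasonal) mean; flag residuals over threshold."""
    phase_sums = [0.0] * period
    phase_counts = [0] * period
    for i, v in enumerate(series):
        phase_sums[i % period] += v
        phase_counts[i % period] += 1
    seasonal = [phase_sums[p] / phase_counts[p] for p in range(period)]
    residuals = [v - seasonal[i % period] for i, v in enumerate(series)]
    return [i for i, r in enumerate(residuals) if abs(r) > threshold]

# Tweet counts with a regular rhythm and one burst (an event candidate).
series = [10, 12, 11, 10, 12, 11, 10, 12, 50, 10, 12, 11]
print(flag_anomalies(series, period=3, threshold=10))
```

The burst at index 8 survives the seasonal filter and becomes an abnormal event candidate for the serial-correlation stage.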
Designing an Algorithm for the Priority Deciding and Recommending of the Logistic Support with Stationary Distribution
Noh, Giseop ; Jeong, Sihyun ; Kim, Chong-Kwon ; Oh, Hayoung ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1075~1080
DOI : 10.5626/JOK.2014.41.12.1075
An important element of victory in war is maximizing overall military force and sustaining that capability for as long as possible. Although several researchers have proposed methodologies for logistics support, no prior work has investigated logistics support that considers all of the relevant elements together. Unlike previous trials, which analyze the system fault ratio as the main methodology, we propose an approach that decides and recommends logistics priority by combining item cost, transportation, fault ratio, and system complexity. We also designed an algorithm that recommends an optimized logistics support priority using a stationary distribution.
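The stationary-distribution idea can be sketched with power iteration on a small Markov chain whose transitions encode pairwise influence between items; items are then prioritized by their stationary mass. The 3x3 transition matrix is an illustrative assumption and does not reproduce the paper's weighting of cost, transportation, fault ratio, and complexity.

```python
# Hedged sketch: item priority from the stationary distribution of a chain.
def stationary(matrix, iters=200):
    """Power-iterate a row-stochastic matrix to its stationary distribution."""
    n = len(matrix)
    dist = [1.0 / n] * n
    for _ in range(iters):
        dist = [sum(dist[i] * matrix[i][j] for i in range(n)) for j in range(n)]
    return dist

# Rows: current item; columns: item whose support is demanded next (assumed).
transition = [
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
    [0.5, 0.3, 0.2],
]
pi = stationary(transition)
priority = sorted(range(3), key=lambda j: -pi[j])   # highest mass first
print([round(p, 3) for p in pi], priority)
```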
Context-sensitive Spelling Error Correction using Eojeol N-gram
Kim, Minho ; Kwon, Hyuk-Chul ; Choi, Sungki ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1081~1089
DOI : 10.5626/JOK.2014.41.12.1081
Context-sensitive spelling error correction methods are largely classified into rule-based methods and statistical methods, the latter of which are often preferred in research. Statistical methods treat context-sensitive spelling errors as word-sense disambiguation problems: given a correction pair consisting of a target word and a replacement candidate, the method chooses between them according to the context. This paper proposes integrating an eojeol (word-phrase) n-gram model into the correction-pair probability model from our previous study in order to improve its performance. The integrated model comes in two forms: one interpolates the sentence probabilities computed by each model, and the other applies the two models sequentially. Both forms exhibit higher accuracy and recall than conventional models or a model that uses the n-gram alone.
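The interpolation variant can be sketched as a weighted sum of the two models' probabilities for each member of a confusion pair. The component probabilities, the interpolation weight, and the Korean confusion pair are illustrative assumptions.

```python
# Hedged sketch: linear interpolation of two model probabilities.
def interpolate(p_eojeol_ngram, p_correction_model, lam=0.7):
    """Combine two model probabilities with weight lam on the n-gram model."""
    return lam * p_eojeol_ngram + (1.0 - lam) * p_correction_model

def pick_candidate(context_scores, lam=0.7):
    """Choose the replacement candidate with the higher interpolated score."""
    return max(context_scores, key=lambda w: interpolate(*context_scores[w], lam))

# Scores for an assumed confusion pair: (n-gram prob, correction-model prob).
scores = {"않": (0.02, 0.10), "안": (0.08, 0.05)}
print(pick_candidate(scores))
```

With the weight on the n-gram model, the candidate favored by sentence-level context wins even though the other model slightly prefers the alternative.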
An Adaptive Cauchy Differential Evolution Algorithm with Improved Convergence Speed
Choi, Tae Jong ; Ahn, Chang Wook ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1090~1098
DOI : 10.5626/JOK.2014.41.12.1090
In this paper, we propose ACDE2, an improved Adaptive Cauchy Differential Evolution (ACDE) algorithm with faster convergence. The baseline ACDE algorithm uses the "DE/rand/1" mutation strategy, which provides good population diversity and is appropriate for multimodal optimization problems. However, that strategy converges slowly and is therefore unsuitable for unimodal problems. ACDE2 instead uses the "DE/current-to-best/1" mutation strategy for fast convergence, together with a control parameter initialization operator that avoids convergence to local optima. The operator is executed every predefined number of generations, or whenever every individual fails to evolve, and assigns each individual's control parameter a highly explorative value, providing additional population diversity. Our experimental results show that ACDE2 performs better than several state-of-the-art DE algorithms, particularly on unimodal optimization problems.
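The "DE/current-to-best/1" mutation is standard in the DE literature: v_i = x_i + F(x_best - x_i) + F(x_r1 - x_r2). The population values and scale factor F below are illustrative assumptions.

```python
# Hedged sketch: the "DE/current-to-best/1" mutation strategy.
def mutate_current_to_best(pop, i, best, r1, r2, f=0.5):
    """Return the mutant vector v_i = x_i + F(x_best - x_i) + F(x_r1 - x_r2)."""
    return [pop[i][d]
            + f * (pop[best][d] - pop[i][d])
            + f * (pop[r1][d] - pop[r2][d])
            for d in range(len(pop[i]))]

# Toy 2-D population; individual 2 is assumed to be the current best.
pop = [[1.0, 4.0], [3.0, 0.0], [2.0, 2.0], [0.0, 1.0]]
mutant = mutate_current_to_best(pop, i=0, best=2, r1=1, r2=3, f=0.5)
print(mutant)
```

Pulling each individual toward the best solution is what buys the fast convergence; the paper's control parameter initialization operator then periodically restores the diversity this strategy sacrifices.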
Static Analysis of Web Accessibility Based on Abstract Parsing
Kim, Hyunha ; Doh, Kyung-Goo ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1099~1109
DOI : 10.5626/JOK.2014.41.12.1099
Web accessibility evaluation tools determine whether a website meets accessibility guidelines. Many such tools have been developed, but most of them dynamically fetch and analyze pages; as a result, some pages may be omitted due to missing access authorization or environment information. In this paper, we propose a static method that analyzes web accessibility via abstract parsing. Our abstract parsing technique understands the syntactic and semantic program structures that dynamically generate web pages according to external inputs and parameters. Because the static method covers all execution paths, it performs its analysis without omitting any pages. An experiment on a PHP-based website demonstrates that our tool discovers more accessibility errors than a dynamic page accessibility analysis tool.
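The "all execution paths" idea can be illustrated, very loosely, by enumerating every output a branching page template can produce and checking each against an accessibility rule (here, `<img>` elements must carry `alt` text). The branch structure and the regex-based check are illustrative assumptions; the paper's abstract parsing analyzes real PHP program structure rather than enumerating outputs.

```python
# Hedged sketch: checking an alt-text rule over every generation path.
import itertools
import re

# Each slot lists the alternative fragments the program may emit on some path.
slots = [
    ["<html><body>"],
    ['<img src="logo.png" alt="logo">', '<img src="banner.png">'],  # branch
    ["<p>hello</p>", ""],                                           # branch
    ["</body></html>"],
]

def violations():
    """Return the generated pages containing an <img> with no alt attribute."""
    bad = []
    for path in itertools.product(*slots):
        page = "".join(path)
        for img in re.findall(r"<img[^>]*>", page):
            if "alt=" not in img:
                bad.append(page)
                break
    return bad

print(len(violations()), "of 4 paths violate the alt rule")
```

A dynamic tool that only ever triggers the first branch would miss the `banner.png` violation entirely, which is the omission problem the abstract describes.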
Exploiting Friend's Username to De-anonymize Users across Heterogeneous Social Networking Sites
Kim, Dongkyu ; Park, Seog ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1110~1116
DOI : 10.5626/JOK.2014.41.12.1110
Social networking sites (SNSs) such as Twitter, LinkedIn, and Tumblr have come to the forefront as their user numbers have grown. While users voluntarily provide their information on SNSs, privacy leakage is becoming a problem owing to the evolution of large-scale data processing techniques and rising awareness of privacy. To address this problem, studies on protecting privacy on SNSs based on graphs and machine learning have been conducted. However, privacy leaks keep surfacing whenever a new SNS appears. In this paper, we propose a technique that enables a user to detect privacy leakage in advance, for the case where a service provider or third-party application developer maliciously threatens the SNS user's privacy.
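The de-anonymization signal named in the title, overlap between friends' usernames across sites, can be sketched with Jaccard similarity over friend lists. The similarity measure, threshold, and toy accounts are illustrative assumptions.

```python
# Hedged sketch: cross-SNS account linking via friend-username overlap.
def jaccard(a, b):
    """Jaccard similarity of two collections treated as sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def link_account(target_friends, candidates, threshold=0.5):
    """Return candidate accounts whose friend lists overlap enough."""
    return [name for name, friends in candidates.items()
            if jaccard(target_friends, friends) >= threshold]

twitter_friends = ["kim01", "lee_dev", "parkx", "choi77"]
tumblr_accounts = {
    "user_a": ["kim01", "lee_dev", "parkx"],   # likely the same person
    "user_b": ["alice", "bob"],
}
print(link_account(twitter_friends, tumblr_accounts))
```

Running such a check on one's own accounts before an attacker does is the kind of advance leak detection the abstract proposes.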
Exploiting Query Proximity and Graph Profiling Method for Tag-based Personalized Search in Folksonomy
Han, Keejun ; Jang, Jincheul ; Yi, Mun Yong ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1117~1125
DOI : 10.5626/JOK.2014.41.12.1117
Folksonomy data, which is derived from social tagging systems, is a useful source for understanding a user's intention and interests. Using folksonomy data, it is possible to create an accurate user profile, which can then be utilized to build a personalized search system. However, traditional methods such as the Vector Space Model (VSM) have limitations for user profiling and similarity computation. This paper suggests a novel method, based on graph-based user and document profiles, that uses the proximity information of query terms to improve personalized search. We demonstrate the performance of the suggested method by comparing it with several state-of-the-art VSM-based personalization models on two different folksonomy datasets. The results show that the proposed model consistently outperforms the other personalization models. Furthermore, a parameter sensitivity analysis shows that the proposed model is parameter-free, in that it is not affected by the idiosyncratic nature of the datasets.
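A much-simplified stand-in for the paper's approach: re-rank documents by the overlap between a user's tag profile and each document's tags, boosting tags that match the query (a crude proxy for query proximity). The weights, tags, and boost factor are illustrative assumptions, not the paper's graph model.

```python
# Hedged sketch: tag-profile re-ranking with a query boost.
def personalized_score(user_tags, doc_tags, query):
    """Weight shared tags, counting tags that appear in the query double."""
    score = 0.0
    for tag, w in doc_tags.items():
        if tag in user_tags:
            boost = 2.0 if tag in query else 1.0
            score += boost * w * user_tags[tag]
    return score

user = {"python": 0.9, "web": 0.4, "ml": 0.7}    # assumed tag profile
docs = {
    "d1": {"python": 0.8, "ml": 0.5},
    "d2": {"web": 0.9, "css": 0.6},
}
query = {"python"}
ranked = sorted(docs, key=lambda d: -personalized_score(user, docs[d], query))
print(ranked)
```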
A Dynamic Partitioning Scheme for Distributed Storage of Large-Scale RDF Data
Kim, Cheon Jung ; Kim, Ki Yeon ; Yoo, Jong Hyeon ; Lim, Jong Tae ; Bok, Kyoung Soo ; Yoo, Jae Soo ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1126~1135
DOI : 10.5626/JOK.2014.41.12.1126
In recent years, RDF partitioning schemes have been studied for the effective distributed storage and management of large-scale RDF data. In this paper, we propose a dynamic RDF partitioning scheme that supports load balancing in dynamic environments where RDF data is continuously inserted and updated. The proposed scheme creates clusters and sub-clusters according to how frequently the RDF data is used by queries, and these serve as the graph partitioning criteria. We partition the clusters and sub-clusters while considering the workloads and data sizes of the servers. The scheme thereby resolves the data concentration on specific servers that results from continuous insertion and update of RDF data, distributing the load among servers in dynamic environments. Performance evaluation shows that the proposed scheme significantly improves query processing time over the existing scheme.
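The size-aware placement step can be sketched with a greedy heuristic: place each cluster, largest first, on the currently lightest server. The cluster sizes and server count are illustrative assumptions, and the paper additionally weighs query workload, which this sketch omits.

```python
# Hedged sketch: greedy size-balanced placement of RDF clusters on servers.
def assign_clusters(cluster_sizes, num_servers):
    """Greedily place each cluster (largest first) on the lightest server."""
    loads = [0] * num_servers
    placement = {}
    for name, size in sorted(cluster_sizes.items(), key=lambda kv: -kv[1]):
        target = min(range(num_servers), key=lambda s: loads[s])
        placement[name] = target
        loads[target] += size
    return placement, loads

sizes = {"c1": 50, "c2": 30, "c3": 20, "c4": 20}
placement, loads = assign_clusters(sizes, num_servers=2)
print(placement, loads)
```

Re-running the placement as insertions grow a cluster is what keeps any one server from becoming the hotspot the abstract warns about.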
An Evaluation Method for Contents Importance Based on Twitter Characteristics
Lee, Euijong ; Kim, Jeong-Dong ; Baik, Doo-Kwon ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1136~1144
DOI : 10.5626/JOK.2014.41.12.1136
Twitter is a social network service that generates about 140 million items of content per day. Twitter content contains a wide variety of information and is studied by many researchers in various fields. In this paper, we propose a method for evaluating the importance of content based on characteristics of Twitter. We observe that the number of followers indicates a user's popularity, while the number of retweets indicates the popularity of a piece of content. We performed experiments on real Twitter data to demonstrate the effectiveness of the proposed method. We also found that information providers on Twitter are public users who represent a company or act as representatives of a specific group.
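The two signals the abstract identifies can be combined into a simple importance score. The log scaling and the weights below are illustrative assumptions, not the paper's formula.

```python
# Hedged sketch: tweet importance from follower and retweet counts.
import math

def importance(followers, retweets, w_follow=0.4, w_rt=0.6):
    """Blend log-scaled popularity of the author and of the content."""
    return w_follow * math.log1p(followers) + w_rt * math.log1p(retweets)

# Assumed accounts: a news outlet versus a casual user.
tweets = [("news_outlet", 1_000_000, 500), ("casual_user", 120, 3)]
for name, followers, retweets in tweets:
    print(name, round(importance(followers, retweets), 3))
```

The log scaling keeps an account with a million followers from drowning out content-level popularity entirely, which matters because the two signals measure different things (the author versus the tweet).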
A Multi-path Routing Mechanism with Local Optimization for Load Balancing in the Tactical Backbone Network
Kim, Yongsin ; Kim, Younghan ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1145~1151
DOI : 10.5626/JOK.2014.41.12.1145
In this paper, we propose MPLO (Multi-Path routing with Local Optimization) for load balancing in the tactical backbone network. MPLO manages a global metric and a local metric separately. The global metric is propagated to other routers via the routing protocol and is used to configure loop-free multiple paths. The local metric, which reflects link utilization, is used to find an alternate path when congestion occurs. Through simulation, we verified that MPLO can effectively distribute user traffic among the available paths.
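The two-metric decision can be sketched as follows: the global metric picks the primary next hop, and the local utilization metric diverts traffic only when the primary link is congested. The candidate set, utilization figures, and congestion threshold are illustrative assumptions.

```python
# Hedged sketch: next-hop choice from a global metric plus local utilization.
def pick_next_hop(candidates, congestion_threshold=0.8):
    """Prefer the primary (lowest global metric); divert to the least
    utilized alternate only when the primary link is congested."""
    primary = min(candidates, key=lambda c: c["global_metric"])
    if primary["utilization"] < congestion_threshold:
        return primary["router"]
    return min(candidates, key=lambda c: c["utilization"])["router"]

paths = [
    {"router": "R1", "global_metric": 10, "utilization": 0.95},  # congested
    {"router": "R2", "global_metric": 12, "utilization": 0.40},
]
print(pick_next_hop(paths))
```

Because the utilization check is purely local, the diversion needs no extra routing protocol traffic, which is the point of separating the two metrics.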
An Abnormal Activity Monitoring System Using Sensors and Video
Kim, Sang-Soo ; Kim, Sun-Woo ; Choi, Yeon-Sung ;
Journal of KIISE, volume 41, issue 12, 2014, Pages 1152~1159
DOI : 10.5626/JOK.2014.41.12.1152
In this paper, we present a system that ensures the safety of residents by taking appropriate action or raising an alarm when a resident encounters an emergency situation or exhibits abnormal activity. We collect and analyze real-time data on the residents' living environment using video and sensors. Existing systems make their determinations using only sensor data, which causes several problems; our system adds a camera to address them, using weighted difference images and motion vectors. Whereas the existing system takes about 48 hours to determine that an abnormal activity has occurred, our system takes less than 1 hour.
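The difference-image cue can be illustrated with a weighted absolute difference of two frames compared against a threshold. The tiny grayscale grids, weight, and threshold are illustrative assumptions; the paper additionally uses motion vectors, which this sketch omits.

```python
# Hedged sketch: motion flag from a weighted difference of two frames.
def motion_detected(prev, curr, weight=0.5, threshold=20.0):
    """Mean weighted absolute pixel difference compared to a threshold."""
    total = sum(weight * abs(c - p)
                for row_p, row_c in zip(prev, curr)
                for p, c in zip(row_p, row_c))
    pixels = len(prev) * len(prev[0])
    return total / pixels > threshold

still = [[10, 10], [10, 10]]
moved = [[10, 200], [180, 10]]
print(motion_detected(still, still), motion_detected(still, moved))
```

In the full system, a motion flag like this would be cross-checked against the sensor stream before raising an alarm, which is how the video channel shortens the detection delay.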