REFERENCE LINKING PLATFORM OF KOREA S&T JOURNALS
The KIPS Transactions:PartD
Journal Basic Information
Korea Information Processing Society
Volume & Issues
Volume 13D, Issue 7 - Dec 2006
Volume 13D, Issue 6 - Oct 2006
Volume 13D, Issue 5 - Oct 2006
Volume 13D, Issue 4 - Aug 2006
Volume 13D, Issue 3 - Jun 2006
Volume 13D, Issue 2 - Apr 2006
Volume 13D, Issue 1 - Feb 2006
An Enhanced Density and Grid based Spatial Clustering Algorithm for Large Spatial Database
Gao, Song; Kim, Ho-Seok; Xia, Ying; Kim, Gyoung-Bae; Bae, Hae-Young
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 633~640
DOI : 10.3745/KIPSTD.2006.13D.5.633
Spatial clustering, which groups similar objects based on their distance, connectivity, or relative density in space, is an important component of spatial data mining. Density-based and grid-based clustering are the two main approaches: the former is known for its ability to discover clusters of various shapes and to eliminate noise, while the latter is known for its high speed. Clustering large data sets has always been a serious challenge, because a huge data set makes the clustering process extremely costly. In this paper, we propose an enhanced density-grid based clustering algorithm for large spatial databases. It sets a default number of intervals per dimension and removes outliers effectively, using a suitable measure to identify areas of high density in the input data space. A density threshold DT is used to recognize dense cells, and neighboring dense cells are then combined to form clusters. When the algorithm is run on a large dataset, a proper granularity for each dimension of the data space and a proper density threshold for recognizing dense areas improve its performance. By combining the grid-based and density-based methods, we not only increase efficiency but also find clusters of arbitrary shape. Experimental evaluation on synthetic datasets shows that the proposed method achieves high performance and accuracy.
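As a rough illustration of the grid-plus-density idea described above, the sketch below bins 2-D points into a fixed number of intervals per dimension, keeps only cells whose point count reaches a density threshold DT, and merges neighboring dense cells into clusters. The interval count and DT mirror the paper's two tuning parameters; everything else (the 8-connectivity merge, the helper names) is a generic illustration, not the authors' exact algorithm.

```python
from collections import defaultdict, deque

def grid_density_cluster(points, intervals=10, density_threshold=3):
    """Toy grid-based density clustering for 2-D points.

    `intervals` fixes the grid granularity per dimension and
    `density_threshold` plays the role of the density threshold DT;
    cells below DT are treated as noise/outliers.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_min, y_min = min(xs), min(ys)
    x_step = (max(xs) - x_min) / intervals or 1.0
    y_step = (max(ys) - y_min) / intervals or 1.0

    def cell(p):
        # Clamp the max-valued point into the last cell.
        return (min(int((p[0] - x_min) / x_step), intervals - 1),
                min(int((p[1] - y_min) / y_step), intervals - 1))

    counts = defaultdict(int)
    for p in points:
        counts[cell(p)] += 1

    dense = {c for c, n in counts.items() if n >= density_threshold}

    # Merge 8-connected dense cells into clusters via BFS.
    clusters, seen = [], set()
    for start in dense:
        if start in seen:
            continue
        seen.add(start)
        comp, queue = [], deque([start])
        while queue:
            cx, cy = queue.popleft()
            comp.append((cx, cy))
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nb = (cx + dx, cy + dy)
                    if nb in dense and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        clusters.append(comp)
    return clusters
```

Because cluster membership is decided cell by cell rather than point by point, the pass over the data is linear, which is the source of the grid approach's speed; the dense-cell merge is what recovers arbitrarily shaped clusters.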
XML Schema Evolution Approach Assuring the Automatic Propagation to XML Documents
Ra, Young-Gook
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 641~650
DOI : 10.3745/KIPSTD.2006.13D.5.641
XML is self-describing and uses a DTD or an XML schema to constrain its structure. Although XML Schema is still only at the Recommendation stage, it will be widely used because a DTD is not itself XML and has limited expressive power. The structure defined by an XML schema, as well as the data of the XML documents, can change for various reasons: errors in the schema design, new requirements from new applications, and so on. We therefore propose XML schema evolution operators, extracted from an analysis of XML schema updates. These operators enable schema updates that would be impractical without supporting tools when a large number of XML documents conform to the schema being updated. In addition, the operators automatically find the places to update in the XML documents registered to the XSE system, keeping those documents valid with respect to the XML schema rather than merely well-formed. This paper is the first attempt to update the XML schemas of existing XML documents and provides a comprehensive set of schema updating operations. Our work supports XML application development and maintenance by helping to update the structure as well as the data of XML documents easily and precisely.
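As one hypothetical example of such a schema evolution operator, the sketch below shows an "add element" operation being propagated to an existing document: when the schema gains a new required child element, the document is patched with a default-valued child so it stays valid rather than merely well-formed. The operator name, element names, and document structure are all invented for illustration; the paper's actual operator set is richer.

```python
import xml.etree.ElementTree as ET

def add_element(doc_root, parent_tag, new_tag, default_text):
    """Hypothetical 'add element' evolution operator.

    When the schema gains a new required child under `parent_tag`,
    propagate the change to an existing document by inserting the
    child with a default value wherever it is missing.
    """
    for parent in doc_root.iter(parent_tag):
        if parent.find(new_tag) is None:
            child = ET.SubElement(parent, new_tag)
            child.text = default_text
    return doc_root

# A document valid under the old schema gains the new <price> child.
doc = ET.fromstring("<books><book><title>XML</title></book></books>")
add_element(doc, "book", "price", "0")
```

The point of automating this at the operator level is that the same patch can be applied uniformly to every registered document, which is exactly what becomes impractical by hand when the document collection is large.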
An Efficient Hashing Mechanism of the DHP Algorithm for Mining Association Rules
Lee, Hyung-Bong
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 651~660
DOI : 10.3745/KIPSTD.2006.13D.5.651
Algorithms for mining association rules based on the Apriori algorithm use a hash tree to store and count the supports of candidate frequent itemsets, and most of the execution time is spent searching this hash tree. The DHP (Direct Hashing and Pruning) algorithm reduces the number of candidate frequent itemsets to save search time in the hash tree. For this purpose, DHP performs a preparative, approximate counting of the supports of candidate itemsets, using a direct hash table to keep the overhead of this preparative counting low. This paper proposes and evaluates an efficient hashing mechanism for both the direct hash table, which is used for pruning in phase 2, and the hash tree, which is used for counting the supports of candidate frequent itemsets in all phases. The results show that the proposed hashing mechanism improves performance by up to 82.2%, and by 18.5% on average, compared to the conventional method that uses a simple mod operation.
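The DHP idea above can be sketched as follows: every 2-itemset occurring in a transaction is hashed into a small direct hash table, and a candidate pair is later pruned when its bucket count cannot reach the minimum support. The hash function shown is a placeholder for the paper's proposed mechanism (the baseline it improves on is a simple mod over the item pair); bucket collisions only ever over-count, so the pruning remains safe.

```python
from itertools import combinations

# Placeholder hash over an ordered item pair; the paper's contribution
# is a better-distributing mechanism than a plain mod of the pair.
def pair_hash(a, b):
    return a * 31 + b

def dhp_bucket_counts(transactions, num_buckets=7):
    """DHP-style preparative counting: hash every 2-itemset of every
    transaction into a direct hash table of bucket counters."""
    buckets = [0] * num_buckets
    for t in transactions:
        for a, b in combinations(sorted(t), 2):
            buckets[pair_hash(a, b) % num_buckets] += 1
    return buckets

def prune_candidates(candidates, buckets, minsup, num_buckets=7):
    """Keep only candidate pairs whose bucket count can reach minsup.

    A bucket count is an upper bound on the pair's true support, so
    pruning never discards a genuinely frequent pair."""
    return [c for c in candidates
            if buckets[pair_hash(*c) % num_buckets] >= minsup]
```

Because the direct hash table is a flat array of counters, the preparative pass is far cheaper than probing the hash tree, which is why distributing pairs well across buckets matters for pruning power.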
The Construction of Multiform User Profiles Based on Transaction for Effective Recommendation and Segmentation
Koh, Jae-Jin; An, Hyoung-Keun
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 661~670
DOI : 10.3745/KIPSTD.2006.13D.5.661
With the development of e-commerce and the proliferation of easily accessible information, information filtering systems such as recommender and SDI systems have become popular for pruning large information spaces and directing users toward the items that best meet their needs and preferences. Many information filtering methods have been proposed to support such systems. XML is emerging as a new standard for information, so filtering systems need new approaches for dealing with XML documents. In this paper, we suggest a method for creating multiform user profiles that exploits XML's ability to represent structure. The system consists of two parts: an administrator profile definition part, in which an administrator defines profiles for analyzing users' purchase patterns before a transaction such as a purchase actually happens, and a user profile creation module, which applies the defined profiles. Administrator profiles are built from DTD information and point to specific parts of documents conforming to the DTD. The proposed system builds user profiles that capture users' buying behavior more accurately, and uses these profiles to provide useful product information without inefficient searching.
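The two-part design can be sketched as follows: a hypothetical administrator profile points, XPath-style, at the parts of a DTD-conforming purchase document that matter, and the user-profile module accumulates the pointed-to values. All element names and paths here are invented for illustration, not taken from the paper.

```python
import xml.etree.ElementTree as ET

# Part 1 (administrator): a profile mapping profile fields to paths
# into purchase documents. Hypothetical fields and paths.
ADMIN_PROFILE = {
    "category": "./item/category",
    "price": "./item/price",
}

# Part 2 (user profile creation): apply the administrator-defined
# profile to a transaction document, collecting the pointed-to values.
def build_user_profile(purchase_xml, admin_profile=ADMIN_PROFILE):
    root = ET.fromstring(purchase_xml)
    return {field: [e.text for e in root.findall(path)]
            for field, path in admin_profile.items()}

doc = ("<purchase><item><category>books</category>"
       "<price>12000</price></item></purchase>")
profile = build_user_profile(doc)
```

Keeping the field-to-path mapping in the administrator's hands is what lets the profile structure evolve without changing the extraction code.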
Comparison and Analysis of Implicit and Explicit Collaboration Process Languages
Jo, Myung-Hyun; Park, Jung-Up; Sul, Joo-Young; Baeg, Moon-Hong; Son, Jin-Hyun
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 671~682
DOI : 10.3745/KIPSTD.2006.13D.5.671
Until now, a variety of standardization and research activities have been carried out in business process management. However, since no common standard for a collaboration process language has been settled, these activities have not been well systematized. We present a guideline for selecting and using a collaboration process language by comparing different collaboration process languages (BPEL4WS, BPML, WSCI, WS-CDL, BPSS, etc.). To this end, we first define implicit and explicit collaboration as the two collaboration methods, and then present the results of comparing and analyzing the features of the languages. First, the necessary elements of a collaboration process language are extracted using the inter-organizational workflow framework proposed by Bernauer and the collaboration process modeling procedure (CPMP). Second, we analyze the properties of each collaboration process language with respect to these essential elements. Finally, we give a complete example of a collaboration business process that reflects the characteristics of the collaboration business process languages.
Architecture Evaluation Utilizing CBAM and AHP
Lee, Ji-Hyun; Kang, Sung-Won; Cho, Jin-Hee; Kim, Jin-Sam
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 683~690
DOI : 10.3745/KIPSTD.2006.13D.5.683
The CBAM (Cost Benefit Analysis Method) provides a way to choose the best architectural strategy by considering cost, schedule, and risk as well as the benefits of each strategy. However, many parts of the CBAM do not present explicit quantitative evidence that a strategy is the best among the alternatives, because the method depends on the stakeholders' consensus, votes, and/or intuition. In this study, we apply the AHP (Analytic Hierarchy Process) to the CBAM to provide explicit quantitative evidence and to reduce the possibility of subjective decision-making errors that may occur in the CBAM.
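The AHP step grafted onto the CBAM reduces to deriving priority weights from a pairwise comparison matrix. A minimal sketch, using the row geometric-mean approximation to the principal eigenvector (a common shortcut, not necessarily the exact computation the authors used):

```python
def ahp_priorities(matrix):
    """Derive AHP priority weights from an n x n pairwise comparison
    matrix (matrix[i][j] = how strongly alternative i is preferred
    over j, with matrix[j][i] = 1/matrix[i][j]).

    Uses the row geometric-mean approximation to the principal
    eigenvector, then normalizes the weights to sum to 1."""
    n = len(matrix)
    geo_means = []
    for row in matrix:
        prod = 1.0
        for v in row:
            prod *= v
        geo_means.append(prod ** (1.0 / n))
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Illustrative judgment: strategy A is moderately (3x) preferred to B.
weights = ahp_priorities([[1, 3], [1 / 3, 1]])
```

In a full AHP application one would also compute the consistency ratio of the judgment matrix before trusting the weights; that check is part of what replaces raw consensus or voting with a traceable quantitative argument.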
Agent Based Process Management Environment
Kim, Jeong-Ah; Choi, Seung-Young; Choi, Sung-Woon
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 691~698
DOI : 10.3745/KIPSTD.2006.13D.5.691
Companies need an enterprise-wide support environment that builds the competency to gather the VOM (Voice of Market) while preparing and implementing strategy, and that helps establish and manage business processes in order to secure a continuous competitive edge. Such an environment must support establishing, operating, improving, and evaluating business processes. In this paper we define a method for specifying processes and business rules so that processes can be executed accurately. Furthermore, collecting and reflecting accurate data on the competency of individuals, the subjects of process execution, prevents weaknesses in the process execution results and forms the basis for identifying areas of improvement. High visibility can thus be attained through work knowledge and processes expressed as rules, and process improvement strictly based on data can help firmly establish a process-centered work culture in the organization.
Development of an Evaluation Module for Biometric Software Testing
Yang, Hae-Sool; Lee, Man-Ho; Yoon, Young-Mi
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 699~708
DOI : 10.3745/KIPSTD.2006.13D.5.699
The biometrics field has advanced rapidly along with IT security technology. As the importance of biometrics is recognized, the domestic and international biometric software markets are soaring, and demands for highly reliable, high-quality biometric software are growing accordingly. Evaluation items and criteria must be established for biometric software quality assurance. In this paper, we develop an evaluation module for biometric software testing based on ISO/IEC 12119, the standard for software quality requirements and testing; ISO/IEC 9126, the standard for software product evaluation; and ISO/IEC 14598-6, the standard for constructing evaluation modules. The constituents of a biometric software product (product description, user documentation, programs, and data) are the subject of the quality evaluation module developed here; used together with standards such as ISO/IEC 9126-3, which can be applied during the software development process, it can be expected to improve software quality.
Estimation of Software Project Success and Completion Rate Using Gompertz Growth Function
Lee, Sang-Un
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 709~716
DOI : 10.3745/KIPSTD.2006.13D.5.709
As software complexity increases, the development success rate decreases and the failure rate increases exponentially. The failure rate as a function of software size can be described by a growth function. Based on this observation, this paper estimates the development success and completion rates using the Gompertz growth function. First, we transform the reported software sizes into logarithms so that the data intervals become constant. We then try to derive a functional relationship between the development success rate and the completion rate as the logarithmic software size changes, but no function represents this relationship well. Therefore, we introduce the failure rate and the cancel rate, the complements of the development success rate and the completion rate, respectively, and express the relation between the failure rate and the cancel rate over software size as a growth function. Finally, by fitting the Gompertz growth function to the cancel rate and the failure rate, we can represent the actual data well. Applying the proposed growth function model, one can accurately estimate the success rate and completion rate for software of a particular size.
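The Gompertz curve used above has the form y(t) = a * exp(-b * exp(-c * t)). A small sketch, with t standing for the logarithmic software size and y for the failure (or cancel) rate; the parameter values are illustrative, not the paper's fitted values.

```python
import math

def gompertz(t, a, b, c):
    """Gompertz growth curve y(t) = a * exp(-b * exp(-c * t)).

    a is the upper asymptote (the rate the curve approaches for very
    large projects), b shifts the curve along t, and c controls how
    quickly the rate grows with t."""
    return a * math.exp(-b * math.exp(-c * t))

# Failure rate rising toward its asymptote as log-size grows
# (a=0.9, b=5.0, c=1.0 are illustrative parameters only):
rates = [gompertz(t, a=0.9, b=5.0, c=1.0) for t in range(1, 6)]
```

The S-shape is what makes the model fit this setting: small projects sit on the flat lower tail, mid-sized projects on the steep middle section, and very large projects saturate near the asymptote a.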
A Cost Estimation Technique using the PRICE S Model for Embedded Software in Weapon Systems
Shin, Eon-Hee; Kang, Sung-Jin
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 717~724
DOI : 10.3745/KIPSTD.2006.13D.5.717
The cost estimation of software is becoming more important as the portion of software in weapon system acquisition grows. The cost of embedded software in a weapon system is currently estimated with methods intended for general-purpose software, using the PRICE S model as a tool, but no validation of the costs estimated from the evaluated software size is well known. Hence, we propose an approach that estimates the cost by evaluating the size of the embedded software in a weapon system. To this end, we evaluate the software size using the lines of code and function points produced by the PRICE S model. Finally, we compare the estimated cost data with the actual cost data provided by the production company. As a result, we obtain an approach for estimating the size and cost of embedded software in weapon systems, which are otherwise hard to estimate objectively. We also expect the proposed approach to be used for cost validation and negotiation in future weapon system acquisitions.
Extended Semantic Web Services Retrieval Model for the Intelligent Web Services
Choi, Ok-Kyung; Han, Sang-Yong; Lee, Zoon-Ky
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 725~730
DOI : 10.3745/KIPSTD.2006.13D.5.725
Web services have recently become a key technology, indispensable for e-business. Because they can provide the desired information or service regardless of time and place, they make it possible to integrate application systems within a single business, or across multiple businesses, with standardized technologies over the open network and the Internet. However, current Web services retrieval systems, based on text-oriented search, cannot provide reliable search results because they do not perceive the similarity or interrelation between terms, and no existing Web services retrieval model incorporates such Semantic Web functions. This research addresses these problems by designing and implementing an extended Semantic Web services retrieval model capable of searching general Web documents, UDDI registries, and Semantic Web documents. Execution results are presented, and the model's efficiency and accuracy are verified through them.
The Design and Development of a WIPI Certification Toolkit
Lee, Sang-Yun; Lee, Hwan-Gu; Choi, Byung-Uk
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 731~740
DOI : 10.3745/KIPSTD.2006.13D.5.731
WIPI is a wireless Internet standard platform developed by the KWISF (Korea Wireless Internet Standardization Forum) and adopted by the TTA. A certification process for the standard specification is needed to confirm interoperability. WIPI is composed of the HAL, the Runtime Engine, and APIs (WIPI-C, WIPI-Java). Applications are implemented through the WIPI APIs, which may be self-contained or draw on essential functions provided by the runtime engine or the HAL. Therefore, when an error occurs in an application, it is necessary to determine where the problem arises. In this paper we propose the PCT, which certifies a WIPI platform's functionality and APIs, and the HCT, which certifies the HAL APIs. Because the PCT reports only the final certification result for the platform, it is impossible to know where a problem occurred when platform certification fails; hence the HAL needs to be certified independently of the platform certification.
Design and Implementation of Cooperation System of UPnP and ACAP
Kim, Dong-Hee; Park, Dong-Hwan; Park, Jun-Hee; Moon, Kyeong-Deok; Lim, Kyung-Shik
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 741~748
DOI : 10.3745/KIPSTD.2006.13D.5.741
This paper describes a system that provides home entertainment services to users through a middleware-cooperation module that makes UPnP and ACAP cooperate with each other. UPnP and ACAP each provide powerful functionality within their own networks for playing multimedia contents using their own resources. If one middleware can use the information and resources of the other, it can provide services with more contents and more powerful functionality than it originally offers. We therefore develop a middleware-cooperation module that lets the two middlewares cooperate and use each other's resources. The module makes UPnP and ACAP cooperate and share their information and resources without resource collisions.
Information Activity Monitoring for Enhancing the Utilization of the Enterprise Information System
Han, Kwan-Hee; Song, Hee-Seok
The KIPS Transactions:PartD, volume 13D, issue 5, 2006, Pages 749~754
DOI : 10.3745/KIPSTD.2006.13D.5.749
Recently, many enterprises have been introducing information systems for competitive advantage. To raise the utilization level of an enterprise information system, it is important to monitor the usage state of the system continuously, but most enterprise information systems lack this functionality. This paper proposes a framework for IAM (Information Activity Monitoring), defined as real-time reporting and alerting of significant information-related activities. The IAM framework provides four different views of the information system (data, IT system, business process, and participant) and was implemented as part of an integrated design/manufacturing information system developed by an aerospace parts manufacturer. Using the IAM function, IT personnel can monitor significant information-related activities systematically, give timely feedback to users, and ultimately raise the utilization level of the information system.
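A toy sketch of the reporting-and-alerting side of such a framework: count information-related activities per participant and flag under-utilizing users for feedback. The event schema (user, action) is invented for illustration; the paper's framework additionally breaks activity down by the data, IT-system, and business-process views.

```python
from collections import Counter

def usage_report(events, low_usage_threshold=1):
    """Toy IAM-style participant view.

    `events` is an iterable of (user, action) pairs, a stand-in for
    logged information-related activities. Returns activity counts per
    user plus an alert list of users at or below the usage threshold,
    i.e. the candidates for utilization feedback."""
    counts = Counter(user for user, _action in events)
    alerts = [u for u, n in counts.items() if n <= low_usage_threshold]
    return counts, alerts

# Illustrative event log:
counts, alerts = usage_report(
    [("kim", "login"), ("kim", "query"), ("lee", "login")]
)
```

In a real deployment the same counters would be refreshed continuously from the system's activity log, turning the one-shot report above into the real-time monitoring the framework describes.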