• Title/Summary/Keyword: unavoidable set


A Note on Unavoidable Sets for a Spherical Curve of Reductivity Four

  • Kashiwabara, Kenji;Shimizu, Ayaka
    • Kyungpook Mathematical Journal
    • /
    • v.59 no.4
    • /
    • pp.821-834
    • /
    • 2019
  • The reductivity of a spherical curve is the minimal number of times a particular local transformation, called an inverse-half-twisted splice, must be applied to obtain a reducible spherical curve from the initial spherical curve. It is unknown whether there exists a spherical curve whose reductivity is four. In this paper, an unavoidable set of configurations for a spherical curve of reductivity four is given by focusing on 5-gons. It has also been unknown whether there exists a reduced spherical curve that has no 2-gons and no 3-gons of types A, B, and C; this paper answers that question by constructing such a spherical curve.

Blocking-Artifact Reduction using Projection onto Adaptive Quantization Constraint Set (적응 양자화 제한 집합으로의 투영을 이용한 블록 현상 제거)

  • 정연식;김인겸
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.40 no.1
    • /
    • pp.79-86
    • /
    • 2003
  • A new quantization constraint set based on the theory of Projection onto Convex Sets (POCS) is proposed to reduce the blocking artifacts that appear in block-coded images. POCS-based postprocessing for alleviating blocking artifacts consists of iterative projections onto a smoothness constraint set and a quantization constraint set. The conventional quantization constraint set spans the maximum range within which the original image data can lie, so over-blurring of the restored image is unavoidable as the iteration proceeds. Projection onto the proposed quantization constraint set reduces blocking artifacts while preserving the sharpness of the decoded image, since it adaptively controls the size of the constraint set according to the DCT coefficients. Simulation results show that, with the proposed set substituted for the conventional one, blocking artifacts can be reduced in a small number of iterations while the postprocessed image retains the detail of the decoded image.
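
The projection step described above can be sketched compactly. The following is a minimal illustration, assuming a JPEG-like scheme in which each DCT coefficient c was quantized as k = round(c/q); the interval widths and the shrink rule for large coefficients are assumptions standing in for the paper's adaptive control, not its exact formulation.

```python
import numpy as np

def project_onto_qcs(smoothed_dct, received_dct, q_table, shrink=0.5):
    """Project a smoothed DCT block back into its quantization cells.

    smoothed_dct : block after the smoothness-constraint projection
    received_dct : dequantized block decoded from the bitstream
    q_table      : per-coefficient quantization steps
    shrink       : assumed adaptive factor that tightens the cell for
                   larger coefficients, limiting over-blurring
    """
    k = np.round(received_dct / q_table)                  # quantization indices
    half = 0.5 * q_table
    half = np.where(np.abs(k) > 1, shrink * half, half)   # adaptive narrowing
    lo, hi = k * q_table - half, k * q_table + half
    return np.clip(smoothed_dct, lo, hi)                  # stay consistent with data
```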

A Study on the Adaptive Refinement Method for the Stress Analysis of the Meshfree Method (적응적 세분화 방법을 이용한 무요소법의 응력 해석에 관한 연구)

  • Han, Sang-Eul;Kang, Noh-Won;Joo, Jung-Sik
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 2008.04a
    • /
    • pp.8-13
    • /
    • 2008
  • In this study, an adaptive node generation procedure for the radial point interpolation method is proposed. Since the initial node configuration is set by subdivision of the background cells, abrupt changes in inter-nodal distance between higher- and lower-error regions are unavoidable, and this undesirable nodal spacing induces additional errors. To obtain a smooth nodal configuration, the nodes are regenerated by a local Delaunay triangulation algorithm, a technique originally developed to generate sets of well-shaped triangles and tetrahedra. To demonstrate the performance of the proposed scheme, the resulting optimal nodal configurations produced by the adaptive refinement method are investigated for stress concentration problems.
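
The re-triangulation step can be illustrated as follows, assuming 2-D nodes and SciPy's Delaunay implementation; the per-node error indicator and the centroid insertion rule are placeholder assumptions, not the authors' radial point interpolation machinery.

```python
import numpy as np
from scipy.spatial import Delaunay

def refine_nodes(nodes, error, threshold):
    """Insert a node at the centroid of each high-error triangle, then
    re-triangulate so the spacing between dense and sparse regions stays
    smooth.  nodes: (n, 2) array; error: per-node error indicator (n,)."""
    tri = Delaunay(nodes)
    new_pts = [nodes[s].mean(axis=0)               # centroid of the triangle
               for s in tri.simplices if error[s].mean() > threshold]
    if new_pts:
        nodes = np.vstack([nodes, new_pts])
    return nodes, Delaunay(nodes)                  # regenerated triangulation
```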


An Application of ISODATA Method for Regional Lithological Mapping (광역지질도 작성을 위한 ISODATA 응용)

  • 朴鍾南;徐延熙
    • Korean Journal of Remote Sensing
    • /
    • v.5 no.2
    • /
    • pp.109-122
    • /
    • 1989
  • The ISODATA method, one of the best-known squared-error clustering methods, has been applied to two Chungju multivariate data sets in order to evaluate its effectiveness for regional lithological mapping. One is an airborne radiometric data set and the other is a mixed set of airborne radiometric and Landsat TM data. In both cases, the classification of the Bulguksa granite and the Kyemyongsan biotite-quartz gneiss is the most successful. The Hyangsanni dolomitic limestone and the neighboring Daehyangsan quartzite are also classified by their typically low radioactive intensities, though they are still confused with some other units such as water-covered areas, nearby alluvial deposits, and unaltered limestone areas. Topographically rugged valleys are also assigned to the same cluster, which could be due to unavoidable variations in flight height and attitude of the airborne system over such rugged terrain. The regional mapping of the sedimentary rock units of the Ockchun System is in general confused, which might be due to similarities between the different sediments. Considerable discrepancies in mapping some lithological boundaries might also be due to secondary effects such as contamination or smoothing in the digitizing process. Further study should be devoted to the variable selection scheme, as no absolutely superior method exists yet and performance appears to be rather data dependent. Study could also be made of data preprocessing to reduce the erratic effects mentioned above and thus, hopefully, yield much better results in regional geological mapping.
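
For orientation, an ISODATA-style loop (k-means assignment plus simple split and merge heuristics) can be sketched as below; the thresholds, the one-split-per-pass rule, and the shape of the input features (radiometric channels, TM bands) are illustrative assumptions rather than the parameters used in the study.

```python
import numpy as np

def isodata(X, k_init=5, n_iter=10, split_std=1.0, merge_dist=0.5, seed=0):
    """ISODATA-style clustering of feature vectors X, shape (n_samples, n_bands)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k_init, replace=False)]
    for _ in range(n_iter):
        # Assignment: nearest centre in feature space (squared-error criterion).
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        # Update, keeping the old centre if a cluster went empty.
        centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(len(centers))])
        # Split: a cluster whose widest per-band spread is too large gets halved.
        for j in range(len(centers)):
            m = X[labels == j]
            if len(m) and m.std(axis=0).max() > split_std:
                centers = np.vstack([centers, centers[j] + 0.5 * split_std])
                centers[j] = centers[j] - 0.5 * split_std
                break                     # at most one split per pass
        # Merge: collapse the closest pair of centres if they nearly coincide.
        if len(centers) > 1:
            dc = np.linalg.norm(centers[:, None] - centers[None], axis=2)
            np.fill_diagonal(dc, np.inf)
            a, b = divmod(dc.argmin(), len(centers))
            if dc[a, b] < merge_dist:
                centers[a] = 0.5 * (centers[a] + centers[b])
                centers = np.delete(centers, b, axis=0)
    # Final assignment against the settled centres.
    labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
    return labels, centers
```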

A VHDL Implementation of Baseband Predistorter for the Compensation of Nonlinear Distortion in OFDM Systems (OFDM시스템에서 비선형 왜곡 보상을 위한 기저대역 사전왜곡기의 VHDL 구현)

  • 성시훈;김형호;최종희;신요안;임성빈
    • Proceedings of the IEEK Conference
    • /
    • 2000.06a
    • /
    • pp.256-259
    • /
    • 2000
  • OFDM (orthogonal frequency division multiplexing) systems are based on the transmission of a given set of signals over multiple orthogonal subcarriers, which results in large amplitude variations of the transmit signal, so severe distortion by the nonlinear characteristic of a high power amplifier (HPA) is unavoidable. In this paper we propose a computationally efficient structure for a baseband predistorter that compensates for the nonlinear distortion of the HPA. Based on the proposed structure, a predistorter that can be used in high-speed transmission systems such as wireless ATM is designed in VHDL and synthesized with the Synopsys tool.
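
The predistortion idea can be sketched in a few lines, assuming a Saleh-model HPA and a gain lookup table that pre-inverts its AM/AM curve; the paper's actual computationally efficient VHDL structure is not reproduced here.

```python
import numpy as np

def saleh_am_am(r, a=2.0, b=1.0):
    """AM/AM curve of the assumed Saleh HPA model (monotone on [0, 1])."""
    return a * r / (1.0 + b * r * r)

def build_lut(n=256, r_max=1.0):
    """Tabulate the inverse AM/AM curve: desired output -> required input."""
    r_in = np.linspace(0.0, r_max, 4096)
    r_out = saleh_am_am(r_in)
    grid = np.linspace(0.0, r_out.max(), n)
    return grid, np.interp(grid, r_out, r_in)

def predistort(x, grid, lut):
    """Scale complex baseband samples so the HPA output becomes ~linear."""
    r = np.abs(x)
    r_pd = np.interp(np.clip(r, 0.0, grid[-1]), grid, lut)
    gain = np.divide(r_pd, r, out=np.ones_like(r), where=r > 0)
    return x * gain

# Usage: push samples through the predistorter, then the HPA model.
grid, lut = build_lut()
x = 0.8 * np.exp(1j * np.linspace(0, np.pi, 4))
y = saleh_am_am(np.abs(predistort(x, grid, lut)))   # ~|x|, i.e. linearized
```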


An Expert System for Fault Section Diagnosis in Power Systems using the information including operating times of actuated relays and tripped circuit breakers (보호 계전기와 차단기의 동작 순서를 고려한 전력 시스템 사고 구간 진단을 위한 전문가 시스템)

  • Min, S.W.;Lee, S.H.;Park, J.K.
    • Proceedings of the KIEE Conference
    • /
    • 2000.07a
    • /
    • pp.125-127
    • /
    • 2000
  • Multiple faults are hard to diagnose correctly because the operation of circuit breakers tripped by an earlier fault changes the topology of the power system. Information that includes the operating times of actuated relays and tripped circuit breakers is therefore used to account for changes in the network topology during fault section diagnosis. This paper presents a diagnosis method based on a set of matrices that represent the topology changes caused by circuit breaker operations. The method uses fuzzy relations to cope with the uncertainties that are unavoidable in fault section diagnosis of power systems, and the inference over the proposed matrices yields the fault section candidates as a matrix of membership degrees. Experimental studies on real power systems demonstrate the usefulness of the proposed technique for diagnosing multiple faults.
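
The fuzzy-relational inference hinted at here can be illustrated with a max-min composition, a standard operation on fuzzy relations; the relation matrix and membership values below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# R[i, j]: degree to which operation of device j implicates section i.
R = np.array([[0.9, 0.2, 0.0],     # section S1
              [0.3, 0.8, 0.4],     # section S2
              [0.0, 0.1, 0.7]])    # section S3

# Observed device operations (relays/breakers), weighted by operating-time order.
obs = np.array([1.0, 0.6, 0.0])

# Max-min composition: membership of each section in the fault candidate set.
candidates = np.max(np.minimum(R, obs[None, :]), axis=1)
print(dict(zip(["S1", "S2", "S3"], candidates.round(2))))  # S1 ranks highest
```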


Enhanced Genetic Programming Approach for a Ship Design

  • Lee, Kyung-Ho;Han, Young-Soo;Lee, Jae-Joon
    • Journal of Ship and Ocean Technology
    • /
    • v.11 no.4
    • /
    • pp.21-28
    • /
    • 2007
  • The importance of utilizing engineering data, which embodies the experience and know-how of experts, is gradually increasing, and data mining techniques are useful for extracting knowledge or information from the accumulated data. This paper deals with generating optimal polynomials using genetic programming (GP) as a module of a data mining system. Low-order Taylor series are used so that the polynomial can easily approximate a nonlinear function fitted to the accumulated data. Overfitting is unavoidable because, in real applications, the learning samples are minimal in size; this problem is handled with an extended data set and a function-node stabilization method. A data mining system for ship design based on polynomial genetic programming is presented.
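
As a rough illustration of the overfitting countermeasure, the following sketch augments a tiny learning sample with jittered copies (an "extended data set" in spirit) before fitting a low-order polynomial; the GP machinery and the function-node stabilization method are not reproduced, and the data are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])           # minimal learning sample
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(5)

# Extended data set: jittered replicas act as a mild regularizer, so the
# low-order polynomial no longer interpolates the noise exactly.
x_ext = np.concatenate([x + rng.normal(0.0, 0.02, 5) for _ in range(10)])
y_ext = np.concatenate([y + rng.normal(0.0, 0.02, 5) for _ in range(10)])

plain = np.polyfit(x, y, deg=4)       # 5 points, degree 4: exact interpolation
stable = np.polyfit(x_ext, y_ext, deg=4)             # least-squares, smoother
print(np.round(plain, 2))
print(np.round(stable, 2))
```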

A Diagnosis Study on the Korea Transport Database for Stable Feasibility Analysis on Transportation Facilities (국가교통시설 안정적 타당성 평가를 위한 국가교통데이터베이스 관리체제 진단 연구)

  • Kim, Jin-Tae
    • International Journal of Highway Engineering
    • /
    • v.16 no.4
    • /
    • pp.97-110
    • /
    • 2014
  • PURPOSES: This study identifies substantial shortcomings embedded in the government policies and administrative processes associated with the Korean Transportation Database (KTDB) and proposes preliminary approaches to overcome them. METHODS: Administrative and socioeconomic issues concerning inefficiency in public and private investment and redemption were identified from a literature review. Through interviews with groups of experts and practitioners, a set of faults embodied in the administrative procedures for utilizing and managing the KTDB was found and analyzed. RESULTS: This study found erroneous administrative elements in four categories: faulty socioeconomic data supporting local governors' optimism led to overestimation of future traffic demand; faulty data incidentally introduced into the KTDB burdened traffic demand analysis; unavoidable misuse of the KTDB worsened its instability; and indifference to managing the KTDB data undermined systematic management. The proposed remedies include altering the administrative and technical systems to overcome these shortcomings. CONCLUSIONS: The erroneous administrative elements associated with the KTDB should be addressed before blaming subsequent faults in demand analysis.

Error Concealment Method considering Distance and Direction of Motion Vectors in H.264 (움직임벡터의 거리와 방향성을 고려한 H.264 에러 은닉 방법)

  • Son, Nam-Rye;Lee, Guee-Sang
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.34 no.1C
    • /
    • pp.37-47
    • /
    • 2009
  • When H.264-encoded video streams are transmitted over a wireless network, packet loss is unavoidable. For this environment, we propose methods to recover lost motion vectors in the decoder. First, a candidate vector set for a missing macroblock is estimated by exploiting the high correlation between neighboring motion vectors and the missing block's vectors: the algorithm clusters the candidates according to the distances among the motion vectors of neighboring blocks, and the optimal candidate of each cluster is the median of its motion vectors. In the next stage, the final vector for the missing block is chosen from the candidate set as the one with the minimum distortion, taking the directions of the neighboring boundary pixels into account. Test results show that the proposed algorithm decreases the number of candidate motion vectors by 23~61% and reduces the average decoding time by 3~4 seconds compared with the existing H.264 codec, while the PSNR, as a measure of visual quality, is similar to that of existing methods.
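
The candidate-reduction stage can be sketched as a greedy distance clustering of the neighbouring motion vectors followed by a per-cluster median, as below; the distance threshold and the omitted boundary-matching stage are simplifying assumptions.

```python
import numpy as np

def candidate_mvs(neighbor_mvs, dist_thresh=2.0):
    """Greedy distance clustering of neighbour MVs; one median per cluster."""
    mvs = np.asarray(neighbor_mvs, dtype=float)
    unassigned = list(range(len(mvs)))
    candidates = []
    while unassigned:
        seed = unassigned.pop(0)                  # start a new cluster
        cluster = [seed]
        for i in list(unassigned):                # absorb nearby vectors
            if np.linalg.norm(mvs[i] - mvs[seed]) < dist_thresh:
                cluster.append(i)
                unassigned.remove(i)
        candidates.append(np.median(mvs[cluster], axis=0))
    return candidates                             # much smaller candidate set

# Five neighbour vectors collapse to two candidates: (1, 1) and (8.5, 7.5).
print(candidate_mvs([(1, 1), (1, 2), (8, 7), (9, 8), (1, 1)]))
```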

A Decision Support System for Product Design Common Attribute Selection under the Semantic Web and SWCL (시맨틱 웹과 SWCL하의 제품설계 최적 공통속성 선택을 위한 의사결정 지원 시스템)

  • Kim, Hak-Jin;Youn, Sohyun
    • Journal of Information Technology Services
    • /
    • v.13 no.2
    • /
    • pp.133-149
    • /
    • 2014
  • To survive the competition of the globalized market, firms must unavoidably provide products that meet customers' needs and wants. This paper focuses on how to set the levels of the attributes that compose a product so that firms may give the best products to customers. In particular, its main issues are how to determine the common attributes and the remaining attributes, with their appropriate levels, to maximize the firm's profit, and how to construct a decision support system that eases decision makers' choices about optimal common attribute selection using the Semantic Web and SWCL technologies. The parameter data of a problem and the relationships within the data are expressed as an ontology data model and a set of constraints using these technologies; the proposed system automatically generates a quantitative decision-making model from them and feeds it into a solver based on the Logic-based Benders Decomposition method to obtain an optimal solution, which is finally presented to the decision makers. This work suggests the opportunity to integrate the proposed system with broader structured data networks and other decision-making tools, owing to the easy data sharing, standardized data structures, and machine processability of the Semantic Web technology.
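
The front end of such a pipeline can be sketched with rdflib, using a toy vocabulary (ex:product, ex:level, ex:profit) in place of the paper's ontology and SWCL constraints, and brute-force enumeration in place of the Logic-based Benders solver; all figures are invented for illustration.

```python
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/design#")
g = Graph()
# Two products sharing one attribute with two candidate levels; the profit
# of each (product, level) pair is an invented illustrative figure.
for prod, lvl, profit in [("P1", "L1", 4), ("P1", "L2", 6),
                          ("P2", "L1", 5), ("P2", "L2", 2)]:
    node = EX[f"{prod}_{lvl}"]
    g.add((node, EX.product, Literal(prod)))
    g.add((node, EX.level, Literal(lvl)))
    g.add((node, EX.profit, Literal(profit)))

# Pull the parameter table out of the ontology graph ...
profit = {(str(g.value(n, EX.product)), str(g.value(n, EX.level))):
          g.value(n, EX.profit).toPython()
          for n in g.subjects(EX.profit, None)}

# ... and enumerate: a common attribute forces one shared level everywhere.
best = max(["L1", "L2"],
           key=lambda lvl: sum(profit[(p, lvl)] for p in ("P1", "P2")))
print("optimal common level:", best)    # L1: 4 + 5 = 9 beats L2: 6 + 2 = 8
```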