• Title/Summary/Keyword: Gestures


A Unit Touch Gesture Model of Performance Time Prediction for Mobile Devices

  • Kim, Damee; Myung, Rohae
    • Journal of the Ergonomics Society of Korea, v.35 no.4, pp.277-291, 2016
  • Objective: The aim of this study is to propose a unit touch gesture model for predicting performance time on mobile devices. Background: When estimating usability through Model-based Evaluation (MBE), the GOMS model measures 'operators' to predict execution time in the desktop environment. This study applies the operator concept in GOMS to touch gestures: since touch gestures are composed of unit touch gestures, performance time on mobile devices can be predicted from those units. Method: To extract unit touch gestures, participants' hand movements were recorded at 120 fps together with pixel coordinates, and touch gestures were classified by their 'out of range', 'registration', 'continuation' and 'termination' phases. Results: Six unit touch gestures were extracted: Hold down (H), Release (R), Slip (S), Curved-stroke (Cs), Path-stroke (Ps) and Out of range (Or). The movement time predicted by the unit touch gesture model did not differ significantly from the participants' measured execution time, and the six unit gestures could also predict the movement time of undefined gestures such as user-defined gestures. Conclusion: Touch gestures can be subdivided into six unit touch gestures, which account for almost all current touch gestures including user-defined ones, so the model has high predictive power and can be used to predict the performance time of touch gestures. Application: Unit touch gesture times can simply be added up to predict the performance time of a new gesture without measuring it directly, as sketched below.
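
The Application sentence describes an additive, GOMS/KLM-style prediction: a gesture's performance time is the sum of its unit touch gesture times. Below is a minimal Python sketch of that additive idea; the per-unit times and the example gesture sequence are hypothetical placeholders, not values reported in the paper.

```python
# Additive unit-touch-gesture prediction in the spirit of GOMS/KLM.
# The per-unit times below are HYPOTHETICAL placeholders, not the
# values measured in the paper.
UNIT_TIMES_SEC = {
    "H":  0.10,   # Hold down
    "R":  0.10,   # Release
    "S":  0.20,   # Slip
    "Cs": 0.35,   # Curved-stroke
    "Ps": 0.40,   # Path-stroke
    "Or": 0.25,   # Out of range
}

def predict_time(gesture_units):
    """Predict a gesture's performance time by summing its unit gesture times."""
    return sum(UNIT_TIMES_SEC[u] for u in gesture_units)

# Example: a hypothetical drag decomposed as Hold down -> Slip -> Release.
print(f"Predicted drag time: {predict_time(['H', 'S', 'R']):.2f} s")  # 0.40 s
```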

Towards Establishing a Touchless Gesture Dictionary based on User Participatory Design

  • Song, Hae-Won; Kim, Huhn
    • Journal of the Ergonomics Society of Korea, v.31 no.4, pp.515-523, 2012
  • Objective: The aim of this study is to investigate users' intuitive stereotypes for non-touch gestures and to establish a gesture dictionary that can be applied to gesture-based interaction design. Background: Interaction based on non-touch gestures is emerging as an alternative for natural interaction between humans and systems. However, before non-touch gestures can become a universal interaction method, studies on which gestures are intuitive and effective are a prerequisite. Method: Four devices (i.e. TV, audio, computer, car navigation) and sixteen basic operations (i.e. power on/off, previous/next page, volume up/down, list up/down, zoom in/out, play, cancel, delete, search, mute, save) were drawn from a focus group interview and a survey as applicable domains for non-touch gestures. A user participatory design was then performed: participants were asked to design three gestures suitable for each operation on each device and to evaluate the intuitiveness, memorability, convenience, and satisfaction of the gestures they produced. From the participatory design, agreement scores, frequencies, and planning times of each distinct gesture were measured (an agreement-score sketch follows below). Results: The derived gestures did not differ across the four devices, but diverse yet common gestures emerged across the kinds of operations. In particular, manipulative gestures were suitable for all kinds of operations, whereas semantic or descriptive gestures suited one-shot operations such as power on/off, play, cancel, or search. Conclusion: The touchless gesture dictionary was established by mapping intuitive and valuable gestures onto each operation. Application: The dictionary can be applied to interaction designs based on non-touch gestures and can serve as a basic reference for standardizing them.
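
The Method sentence mentions agreement scores for each elicited gesture but gives no formula. A common choice in gesture-elicitation studies is a Wobbrock-style agreement score; the sketch below assumes that formulation, and the proposal data are hypothetical, so treat it as an illustration rather than the paper's exact procedure.

```python
from collections import Counter

def agreement_score(proposals):
    """Wobbrock-style agreement score for one operation (referent).

    proposals: gesture labels proposed by participants for that operation.
    Returns the sum over identical-gesture groups of (group size / total)^2.
    """
    total = len(proposals)
    groups = Counter(proposals)
    return sum((size / total) ** 2 for size in groups.values())

# Hypothetical elicitation data for the "volume up" operation.
volume_up = ["swipe up", "swipe up", "swipe up", "raise palm", "clockwise circle"]
print(f"Agreement for 'volume up': {agreement_score(volume_up):.2f}")  # 0.44
```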

The Types and Features of Gestures in Science Discourse of Elementary Students (초등학생의 과학 담화에서 나타나는 몸짓의 유형과 특징)

  • Na, Jiyeon; Song, Jinwoong
    • Journal of Korean Elementary Science Education, v.31 no.4, pp.450-462, 2012
  • Gestures are a common phenomenon in human communication, yet there is little research on gestures in science education, and most gesture research has focused on individual gestures. However, learning occurs through sociocultural interaction with friends, family, teachers, and others in society. Hence, the purpose of this study was to investigate and identify the types and features of gestures made by elementary students to communicate with peers in science discourse. A group of six fourth-graders was observed in eight science discourse sessions in which they talked about ideas related to thermal concepts, and data were also collected through interviews and questionnaires. The analysis showed that students' gestures in science discourse could be classified into seven types: signal iconic, illustrative iconic, personal deictic, object deictic, beat, emotional metaphoric, and content metaphoric gestures. These gestures served to repeat, supplement, and replace utterances in communicating with others. Students frequently used gestures to express scientific terms metaphorically as everyday terms. Gestures were shared, imitated, and transferred in the communication process, and through these processes students' gestures also influenced other students' ideas.

Interacting with Touchless Gestures: Taxonomy and Requirements

  • Kim, Huhn
    • Journal of the Ergonomics Society of Korea, v.31 no.4, pp.475-481, 2012
  • Objective: The aim of this study is to build a taxonomy for classifying diverse touchless gestures and to establish the design requirements that should be considered when selecting suitable gestures in gesture-based interaction design. Background: The applicability of touchless gestures is growing as the relevant technologies advance. However, before touchless gestures are widely applied to various devices and systems, an understanding of the nature of human gestures and their standardization is a prerequisite. Method: Diverse gesture types reported in the literature were collected and, based on these, a new taxonomy for classifying touchless gestures was proposed; many gesture-based interaction design cases and studies were then analyzed against it. Results: The proposed taxonomy consists of two dimensions: shape (deictic, manipulative, semantic, or descriptive) and motion (static or dynamic). The case analysis based on the taxonomy showed that manipulative and dynamic gestures were the most widely applied (a small data-structure sketch of the taxonomy follows below). Conclusion: The four core requirements for valuable touchless gestures were intuitiveness, learnability, convenience, and discriminability. Application: The gesture taxonomy can be used to generate alternative touchless gestures, and the four design requirements can serve as criteria for evaluating those alternatives.
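
The two-dimensional taxonomy (shape by motion) maps naturally onto a small data structure. The sketch below encodes only the dimension values named in the abstract; the example gestures and their classifications are illustrative assumptions, not cases analyzed in the paper.

```python
from dataclasses import dataclass
from enum import Enum

class Shape(Enum):
    DEICTIC = "deictic"            # pointing at a target
    MANIPULATIVE = "manipulative"  # acting on a virtual object
    SEMANTIC = "semantic"          # carrying a conventional meaning
    DESCRIPTIVE = "descriptive"    # tracing a form, letter, or icon

class Motion(Enum):
    STATIC = "static"
    DYNAMIC = "dynamic"

@dataclass
class TouchlessGesture:
    name: str
    shape: Shape
    motion: Motion

# Illustrative classifications (assumed, not taken from the paper's case analysis).
catalog = [
    TouchlessGesture("point at an icon", Shape.DEICTIC, Motion.STATIC),
    TouchlessGesture("grab and drag", Shape.MANIPULATIVE, Motion.DYNAMIC),
    TouchlessGesture("thumbs-up", Shape.SEMANTIC, Motion.STATIC),
    TouchlessGesture("draw a circle in the air", Shape.DESCRIPTIVE, Motion.DYNAMIC),
]
for g in catalog:
    print(f"{g.name}: {g.shape.value} / {g.motion.value}")
```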

A Comparison of the Characteristics between Single and Double Finger Gestures for Web Browsers

  • Park, Jae-Kyu; Lim, Young-Jae; Jung, Eui-S.
    • Journal of the Ergonomics Society of Korea, v.31 no.5, pp.629-636, 2012
  • Objective: The purpose of this study is to compare the characteristics of single- and double-finger gestures for web browser operation and to extract appropriate finger gestures. Background: As electronic equipment is miniaturized to improve portability, various interfaces are being developed as input devices, and gesture recognition technology based on touch interfaces is favored for easy operation on small devices. In addition, users focus primarily on the simplicity of intuitive interfaces, which motivates further research on gesture-based interaction. In particular, finger gestures in such intuitive interfaces are simple, fast, and user-friendly. Single- and double-finger gestures have recently become more popular, and more applications for them are being developed; however, the systems and software that employ them lack consistency, and clear standards and guidelines are missing. Method: To examine how these gestures are applied, we used the sketch map method, a memory-elicitation technique, and evaluated the gesture interface with the MIMA (Meaning in Mediated Action) method. Results: This study created gestures suited to intuitive judgment and conducted a usability test of single- and double-finger gestures. Double-finger gestures showed shorter performance times than single-finger gestures. Single-finger gestures showed a wide satisfaction gap between similar and dissimilar function types: they can be judged intuitively for similar types, but it is difficult to associate them with functions of dissimilar types. Conclusion: Double-finger gestures were effective for associating functions in web navigation, and could be especially effective for complex forms such as curve-shaped gestures. Application: This study is intended to facilitate the design of products that utilize finger and hand gestures.

The Relationship between the Mental Model and the Depictive Gestures Observed in the Explanations of Elementary School Students about the Reason Why Seasons change (계절의 변화 원인에 대한 초등학생들의 설명에서 확인된 정신 모델과 묘사적 몸짓의 관계 분석)

  • Kim, Na-Young; Yang, Il-Ho; Ko, Min-Seok
    • Journal of the Korean Society of Earth Science Education, v.7 no.3, pp.358-370, 2014
  • The purpose of this study is to analyze the relationship between the mental models and the depictive gestures observed in elementary school students' explanations of why the seasons change. Analyzing the gestures associated with each mental model, the CM-type model was remembered as "motion" and was accompanied by more "exphoric" gestures that express the gesture's content as language. The CF-type model was remembered through "writing or pictures," and metaphoric gestures were used when explaining some alternative concepts. The CF-UM type was explained in detail with language and showed many "lexical" gestures. Analyzing the depictive gestures, even for sub-categories such as rotation, revolution, and meridian altitude, a great variety of gesture types appeared, such as indicating with fingers, palms, arms, ball-point pens, or fists, or drawing, spinning, or pointing, through which the students' understanding of the concepts could be checked. In addition, analysis of inconsistencies among external representations, such as between verbal language and gesture, writing and gesture, and picture and gesture, showed that gestures can help in understanding students' mental models and that information not conveyed by linguistic explanations or pictures was sometimes expressed in gestures. Finally, we examined two research participants who showed conspicuous differences: one seemed to be wrong because he used his own expressions, yet his gestures were precise, whereas the other seemed accurate, yet analysis of his gestures revealed that he held whimsical conceptions.

The Relationship between Lexical Retrieval and Coverbal Gestures (어휘인출과 구어동반 제스처의 관계)

  • Ha, Ji-Wan; Sim, Hyun-Sub
    • Korean Journal of Cognitive Science, v.22 no.2, pp.123-143, 2011
  • At what point in the process of speech production are gestures involved? According to the Lexical Retrieval Hypothesis, gestures are involved in lexicalization during the formulating stage. According to the Information Packaging Hypothesis, gestures are involved in the conceptual planning of messages during the conceptualizing stage. We investigated these hypotheses using a game situation from a TV program that induced players to engage in lexicalization and conceptualization simultaneously. The transcription of the verbal utterances was augmented with all arm and hand gestures produced by the players, and coverbal gestures were classified into two types: lexical gestures and motor gestures. As a result, concrete words elicited lexical gestures significantly more frequently than abstract words, and abstract words elicited motor gestures significantly more frequently than concrete words. The difficulty of conceptualization for concrete words was significantly correlated with the number of lexical gestures, whereas the number of words and word frequency were not correlated with the number of either gesture type. These results support the Information Packaging Hypothesis. Above all, the importance of motor gestures can be inferred from the finding that abstract words elicited motor gestures more frequently than concrete words. Motor gestures, long considered unrelated to verbal production, have been excluded from analysis in many gesture studies; this study suggests that motor gestures are connected to abstract conceptualization.

The Role and Importance of Gesture in Science Exploration (과학 탐구에서 몸짓의 역할과 중요성)

  • Han, Jae Young; Choi, Jung Hoon; Shin, Young Joo; Son, Jeong Woo; Cha, Jeong Ho; Hong, Jun Euy
    • Journal of Korean Elementary Science Education, v.25 no.1, pp.51-58, 2006
  • A teacher's language and gestures generally have a great influence on the effect of a lesson, because subject content is conveyed to students through them. In science lessons that focus on experiments, the language and gestures of both students and teachers support the learning of scientific content. However, despite its importance, the role of gestures has rarely been investigated in science education research, and the gestures of students and teachers remain a much-needed area of study. This study investigated the gestures observed during the experimental process of students participating in a science exploration activity. Students' gestures played an essential role in the successful performance of the experiment and functioned as a means of resolving contradictory situations. In addition, the demonstration and communication of gestures should be performed very carefully. The findings have a number of implications for the long-standing problem of the relation between understanding science concepts and performing experiments.

Development of Finger Gestures for Touchscreen-based Web Browser Operation (터치스크린 기반 웹브라우저 조작을 위한 손가락 제스처 개발)

  • Nam, Jong-Yong; Choe, Jae-Ho; Jung, Eui-S.
    • Journal of the Ergonomics Society of Korea, v.27 no.4, pp.109-117, 2008
  • Compared to the existing PC, which uses a mouse and a keyboard, the touchscreen-based portable PC lets the user operate it with fingers and therefore requires new operation methods. However, current touchscreen-based web browser operations often merely have the finger move and click like a mouse, or do not correspond well to the user's sensibility and the structure of the index finger, which makes them difficult to use while walking. Therefore, the goal of this study is to develop finger gestures that facilitate interaction between the interface and the user and make operation easier. First, the top eight functions were extracted based on frequency of use in the web browser and user preference. Then the users' structural knowledge was visualized through sketch maps, and finger gestures applicable to touchscreens were derived through the Meaning in Mediated Action method. Directional gestures were derived for the forward/back page and up/down scroll functions, and letter-type and icon-type gestures for the window closure, refresh, home, and print functions (an illustrative mapping is sketched below). A validation experiment compared the existing operation methods and the proposed gestures in terms of execution time, error rate, and preference; the directional and letter-type gestures outperformed the existing methods. These results suggest that the new gestures can make operation easier and faster, not only for touchscreen-based web browsers on portable PCs but also for telematics-related functions in automobiles, PDAs, and similar devices.
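
The abstract pairs browser functions with gesture styles (directional strokes for paging and scrolling, letter- or icon-type strokes for one-shot commands). The mapping below is a hedged illustration of that pairing; the concrete stroke names are assumptions, since the abstract does not spell out the exact gesture forms.

```python
# Hypothetical function-to-gesture mapping for a touchscreen web browser.
# Only the gesture styles (directional vs. letter/icon-type) come from the
# abstract; the concrete strokes are illustrative assumptions.
BROWSER_GESTURES = {
    # directional gestures
    "back_page":    "swipe left",
    "forward_page": "swipe right",
    "scroll_up":    "swipe up",
    "scroll_down":  "swipe down",
    # letter- or icon-type gestures
    "close_window": "draw an 'X'",
    "refresh":      "draw a circular arrow",
    "home":         "draw a house outline",
    "print":        "draw a 'P'",
}

def gesture_for(function: str) -> str:
    """Look up the gesture assigned to a browser function."""
    return BROWSER_GESTURES.get(function, "no gesture defined")

print(gesture_for("refresh"))  # draw a circular arrow
```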

The Roles of Maternal Responsiveness in the Relationship between Infants' Communicative Gestures and Play (13~18개월 영아의 의사소통적 몸짓과 놀이의 관계에서 어머니 반응성의 역할)

  • Lee, Jiyoung; Sung, Jihyun
    • Korean Journal of Child Studies, v.36 no.5, pp.19-36, 2015
  • The purpose of this study was, first, to investigate the relationship between infants' communicative gestures, play, and maternal responsiveness and, second, to examine the role of maternal responsiveness in the associations between infants' communicative gestures and play. The subjects comprised 42 infants (21 boys and 21 girls) and their mothers. The infants' communicative gestures, the infants' play, and maternal responsiveness were observed during free-play sessions lasting 20 minutes. The results are as follows. Mothers of girls showed higher levels of responsiveness than mothers of boys. In addition, there were positive correlations between infants' communicative gestures, play, and maternal responsiveness. Maternal responsiveness moderated the effect of infants' communicative gestures on the infants' average level of play. These results indicate that it is important for caregivers to interpret infants' communicative intentions appropriately and to respond promptly and adequately in play situations.