Research on Application of Cognitive-Driven Human-Computer Interaction

  • Qianwen Fu, Key Laboratory of Advanced Manufacturing Technology (Ministry of Education), Guizhou University, Guiyang 550025, China
  • Jian Lv, Key Laboratory of Advanced Manufacturing Technology (Ministry of Education), Guizhou University, Guiyang 550025, China
Keywords: human-computer interaction mode, user experience, design cognition, analysis and evaluation, information interaction

Abstract

Human-computer interaction (HCI) is a key research topic in human factors engineering for intelligent manufacturing. Natural human-computer interaction conforms to users' habitual cognition and can efficiently handle imprecise information exchange, thereby improving user experience and reducing cognitive load. By analyzing the information interaction process, users' cognition of the interaction experience, and the principles of interaction in human-computer systems, this paper establishes a cognitive-driven information transmission model for human-computer interaction. It surveys the main interaction modes in current HCI systems and discusses their application status, technical requirements, and open problems. Methods for analyzing and evaluating interaction modes are examined at three levels: subjective evaluation, physiological measurement, and mathematical evaluation. The aim is to advance the machine's understanding of imprecise information, achieve interaction self-adaptation, and guide the design and optimization of HCI systems. Finally, based on the current state of human-computer interaction in intelligent environments, the paper identifies research hotspots, open problems, and development trends.
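As a concrete illustration of the physiological-measurement level mentioned above, cognitive load is often estimated from the EEG power spectrum. The sketch below is a minimal, hypothetical example (not the paper's method): it assumes a single-channel EEG signal and uses the common theta/alpha band-power ratio, where theta (4–8 Hz) power tends to rise and alpha (8–13 Hz) power tends to fall as cognitive load increases.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Average periodogram power of `signal` in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def cognitive_load_index(eeg, fs=256.0):
    """Theta/alpha power ratio, a common EEG proxy for cognitive load."""
    theta = band_power(eeg, fs, 4.0, 8.0)
    alpha = band_power(eeg, fs, 8.0, 13.0)
    return theta / alpha

# Synthetic example: a strong 10 Hz (alpha-band) oscillation plus noise,
# mimicking a relaxed, low-load state, so the index should be well below 1.
rng = np.random.default_rng(0)
t = np.arange(0, 4.0, 1.0 / 256)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
print(cognitive_load_index(eeg))
```

In practice such an index would be computed per time window and per electrode, with artifact rejection beforehand; the raw ratio here only sketches the principle.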


Published
2020-02-01