
An expressional simplified mechanism in anthropomorphic face robot design

Published online by Cambridge University Press:  09 July 2014

Chyi-Yeu Lin*
Affiliation: Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei 10607, Taiwan

Chun-Chia Huang
Affiliation: Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei 10607, Taiwan. Email: [email protected]

Li-Chieh Cheng
Affiliation: Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei 10607, Taiwan. Email: [email protected]

*Corresponding author. E-mail: [email protected]

Summary

The goal of this research is to develop a low-cost face robot whose facial expression mechanism uses few degrees of freedom. Many face robot designs have been announced and published in the past, and they can be classified into two major types according to their degrees of freedom: the first type produces a wide range of facial expressions with many degrees of freedom, while the second produces a finite set of expressions with fewer degrees of freedom. Because high-degree-of-freedom face robots are expensive, most commercial face robots adopt the low-degree-of-freedom form with a finite set of expressions. This research therefore proposes a face robot with a simplified facial expression mechanism. Its main purpose is to develop a low-degree-of-freedom device that can generate many facial expressions while retaining one basic mouth shape variation. The work provides a new face robot example and a development direction for reducing cost and conserving energy.
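The low-degree-of-freedom approach described above can be illustrated with a minimal control sketch. The actuator names, the number of axes, and all angle values below are illustrative assumptions, not the mechanism actually built in the paper: each expression is treated as a static pose over a handful of actuators, with the mouth restricted to a single opening axis (the "one basic mouth shape variation").

```python
# Hypothetical sketch of a low-degree-of-freedom expression controller.
# Actuator names and angle values are illustrative assumptions, not the
# mechanism described in the article.

# Five actuators: brows, upper eyelids, eye pitch, cheeks, and a single
# mouth-opening axis shared by every expression.
ACTUATORS = ("brow", "eyelid", "eye_pitch", "cheek", "mouth_open")

# Each expression is a static pose: one target angle (degrees) per actuator.
POSES = {
    "neutral":   (0,   0,   0,  0,  0),
    "happiness": (5,  -5,   0, 20, 15),
    "surprise":  (25,  20, -5,  0, 30),
    "sadness":   (-15, 10, 10, -5,  5),
}

def pose_for(expression):
    """Return an actuator-name -> target-angle mapping for an expression."""
    angles = POSES[expression]
    return dict(zip(ACTUATORS, angles))
```

With only five axes, new expressions are added as data (another row in `POSES`) rather than as new hardware, which is the cost-saving property the design aims at.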

Type: Articles
Copyright © Cambridge University Press 2014

