A survey on movement analysis (hand, eye, body) and facial expressions-based diagnosis of autism disorders using Microsoft Kinect v2

Authors

  • Al-Jubouri AA College of IT, University of Babylon, Hillah, Iraq
  • Ali IH College of IT, University of Babylon, Hillah, Iraq

Keywords:

Autism Spectrum Disorders, Autism, Kinect v2, facial expressions, hand movement analysis, eye movement analysis, body movement analysis

Abstract

Kinect v2 may enhance the clinical practice of diagnosing autism spectrum disorders (ASD). ASD refers to lifelong neurodevelopmental disorders that appear in early childhood and are usually associated with unusual movement and gait disturbances. Earlier diagnosis of ASD helps in providing a better understanding and management of these disorders. The methods adopted by experts for diagnosis are expensive, time-consuming, and difficult to replicate, as they are based on manual observation and standard questionnaires that look for certain behavioral signs. This paper is, to the best of our knowledge, the first attempt to collect previous research on the use of the Kinect v2 in diagnosing these disorders. The relevant papers are divided into four groups: (1) papers that propose a system based on the analysis of facial expressions, (2) papers that propose a system based on the analysis of hand movement, (3) papers that propose a system based on the analysis of eye movement, and (4) papers that propose a system based on the analysis of body movement.

References

American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, 5th ed. Arlington, VA: American Psychiatric Publishing, 2013.

C. Rhind et al., “An examination of autism spectrum traits in adolescents with anorexia nervosa and their parents,” Molecular Autism, vol. 5, no. 1, p. 56, Dec 2014. doi: 10.1186/2040-2392-5-56.

A. Tapus et al., “Children with Autism Social Engagement in Interaction with Nao, an Imitative Robot: A Series of Single Case Experiments,” Interaction Studies: Social Behaviour and Communication in Biological and Artificial Systems, vol. 13, no. 3, pp. 315–347, Jan 2012. doi: 10.1075/is.13.3.01tap.

S. Jaiswal et al., “Automatic Detection of ADHD and ASD from Expressive Behaviour in RGBD Data,” 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), Washington, DC, USA, 30 May-3 June 2017, pp. 715-722. doi: 10.1109/FG.2017.95.

J. Bloom, American Council On Science And Health, June 7, 2017. [Online]. Available: https://www.acsh.org/news/2017/06/07/mit-researcherglyphosate-will-cause-half-all-children-be-autistic-2025-yeah-sure-11337. [Last Accessed: July 2019]

J. Orquin and K. Holmqvist, “Threats to the validity of eye-movement research in psychology,” Behavior Research Methods, vol. 50, pp. 1645–1656, Aug 2018. doi: 10.3758/s13428-017-0998-z.

T. Y. Tang, “Helping Neuro-typical Individuals to "Read" the Emotion of Children with Autism Spectrum Disorder: an Internet-of-Things Approach,” in Proceedings of the 15th International Conference on Interaction Design and Children (IDC '16), Manchester, United Kingdom, June 21-24, 2016, pp. 666-671. doi: 10.1145/2930674.2936009.

A. E. Youssef et al., “Auto-Optimized Multimodal Expression Recognition Framework Using 3D Kinect Data for ASD Therapeutic Aid,” International Journal of Modeling and Optimization, vol. 3, no. 2, pp. 112-115, 2013. doi: 10.7763/IJMO.2013.V3.247.

D. Zhao et al., “Facial Expression Detection Employing a Brain Computer Interface,” 2018 9th International Conference on Information, Intelligence, Systems and Applications (IISA), Zakynthos, Greece, 23-25 July 2018, pp. 1-2. doi: 10.1109/IISA.2018.8633661.

M. Pantic and L. J. M. Rothkrantz, “Automatic analysis of facial expressions: The state of the art,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, no. 12, pp. 1424–1445, Dec 2000. doi: 10.1109/34.895976.

T. Baudel and M. Beaudouin-Lafon, “Charade: Remote Control of Objects Using Free-Hand Gestures,” Comm. ACM, vol. 36, no. 7, pp. 28-35, Jul 1993. doi: 10.1145/159544.159562.

G. R. S. Murthy and R. S. Jadon, “A review of vision based hand gestures recognition,” International Journal of Information Technology and Knowledge Management, vol. 2, no. 2, pp. 405-410, Jul 2009.

L. Chen et al., “A survey of human motion analysis using depth imagery,” Pattern Recognition Letters, vol. 34, no. 15, pp. 1995-2006, Nov 2013. doi: 10.1016/j.patrec.2013.02.006.

L. M. Pedro and G. Augusto, “Kinect evaluation for human body movement analysis,” 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24-27 June 2012. doi: 10.1109/BioRob.2012.6290751.

Y. Li and A. S. Elmaghraby, “A framework for using games for behavioral analysis of autistic children,” 2014 Computer Games: AI, Animation, Mobile, Multimedia, Educational and Serious Games (CGAMES), Louisville, KY, USA, 28-30 July 2014. pp. 130–133. doi: 10.1109/CGames.2014.6934157.

A. Ramírez-Duque et al., “Robot-Assisted Autism Spectrum Disorder Diagnostic Based on Artificial Reasoning,” J Intell Robot Syst, vol. 96, pp. 267–281, Mar 2019. doi: 10.1007/s10846-018-00975-y.

F. Gomez-Donoso et al., “Automatic Schaeffer's Gestures Recognition System,” Expert Systems, vol. 33, no. 5, pp. 480–488, Jul 2016. doi: 10.1111/exsy.12160.

S. Oprea et al., “A recurrent neural network based Schaeffer gesture recognition system,” 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14-19 May 2017, pp. 425–431. doi: 10.1109/IJCNN.2017.7965885.

E. Marinoiu et al., “3D Human Sensing, Action and Emotion Recognition in Robot Assisted Therapy of Children with Autism,” 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18-23 June 2018, pp. 2158–2167. doi: 10.1109/CVPR.2018.00230.

M. Magrini et al., “An Interactive Multimedia System for Treating Autism Spectrum Disorder,” Lecture Notes in Computer Science, vol. 9914, Springer, Cham, pp. 331–342, Nov 2016. doi: 10.1007/978-3-319-48881-3_23.

B. Ge, “Detecting engagement levels for autism intervention therapy using RGB-D camera,” M.S. thesis, School of Electrical and Computer Engineering, Georgia Institute of Technology, May 2016. Accessed on: 25 June 2019. [Online]. Available: http://hdl.handle.net/1853/55043

J. Y. Kang et al., “Automated Tracking and Quantification of Autistic Behavioral Symptoms Using Microsoft Kinect,” Stud Health Technol Inform, vol. 220, pp. 167–170, 2016. doi: 10.3233/978-1-61499-625-5-167.

R. Barmaki, “Gesture Assessment of Teachers in an Immersive Rehearsal Environment,” Ph.D. dissertation, Computer Science, University of Central Florida, Aug 2016. [Online]. Available: http://purl.fcla.edu/fcla/etd/CFE0006260

A. Zaraki et al., “Toward autonomous child-robot interaction: development of an interactive architecture for the humanoid Kaspar robot,” in 3rd Workshop on Child-Robot Interaction (CRI2017) at the International Conference on Human-Robot Interaction (ACM/IEEE HRI 2017), Vienna, Austria, Mar 2017, pp. 6-9.

I. Budman et al., “Quantifying the social symptoms of autism using motion capture,” Scientific Reports, vol. 9, Article 7712, pp. 1–8, May 2019. doi: 10.1038/s41598-019-44180-9.

S. Piana et al., “Effects of Computerized Emotional Training on Children with High Functioning Autism,” IEEE Trans. Affect. Comput., p. 1, May 2019. doi: 10.1109/TAFFC.2019.2916023.

M. Uljarevic and A. Hamilton, “Recognition of Emotions in Autism: A Formal Meta-Analysis,” J Autism Dev Disord, vol. 43, pp. 1517–1526, 2013. doi: 10.1007/s10803-012-1695-5.

K. A. Pelphrey et al., “Visual Scanning of Faces in Autism,” J Autism Dev Disord, vol. 32, no. 4, pp. 249-261, Aug 2002. doi: 10.1023/a:1016374617369.

K. Humphreys et al., “A fine-grained analysis of facial expression processing in high-functioning adults with autism,” Neuropsychologia, vol. 45, no. 4, pp. 685–695, 2007. doi: 10.1016/j.neuropsychologia.2006.08.003.

C. Cook et al., “Alexithymia, not autism, predicts poor recognition of emotional facial expressions,” Psychol Sci, vol. 24, Issue 5, pp. 723-732, May 2013. doi: 10.1177/0956797612463582.

C. Nölker and H. Ritter, “Detection of Fingertips in Human Hand Movement Sequences,” Lecture Notes in Computer Science, vol. 1371, Springer, Berlin, Heidelberg, pp. 209-218. doi: 10.1007/BFb0053001.

A. Bulling et al., “Eye Movement Analysis for Activity Recognition Using Electrooculography,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 4, pp. 741–753, 2010. doi: 10.1109/TPAMI.2010.86.

J. Hyönä, “The usefulness and limitations of eye-tracking in the study of reading (and writing),” 2011. [Online]. Available: http://www.writingpro.eu/upload/presentations/Summerschool_EyeTracking_JukkaHyona.pdf

M. L. Spezio et al., “Analysis of face gaze in autism using ‘Bubbles’,” Neuropsychologia, vol. 45, pp. 144–151, 2007. doi: 10.1016/j.neuropsychologia.2006.04.027.

D. Fiedler and H. Müller, “Impact of Thermal and Environmental Conditions on the Kinect Sensor,” Advances in Depth Image Analysis and Applications, Lecture Notes in Computer Science, vol. 7854, 2013. doi: 10.1007/978-3-642-40303-3_3.

N. Smolyanskiy et al., “Real-time 3D face tracking based on active appearance model constrained by depth data,” Image and Vision Computing, vol. 32, no. 11, pp. 860–869, Nov 2014. doi: 10.1016/j.imavis.2014.08.005.

E. Lachat et al., “Assessment and Calibration of a RGBD Camera (Kinect v2 Sensor) Towards a Potential Use for Close-Range 3D Modeling,” Remote Sensing, vol. 7, pp. 13070-13097, 2015. doi: 10.3390/rs71013070.

R. Seggers, “People Tracking in Outdoor Environments: Evaluating the Kinect 2 Performance in Different Lighting Conditions,” Computer Science, June 26, 2015.

G. R. S. Murthy and R. S. Jadon, “Computer Vision Based Human Computer Interaction,” Journal of Artificial Intelligence, vol. 4, pp. 245-256, Dec 2011. doi: 10.3923/jai.2011.245.256.

Published

2024-02-26

How to Cite

Al-Jubouri, A. A., & Ali, I. H. (2024). A survey on movement analysis (hand, eye, body) and facial expressions-based diagnosis of autism disorders using Microsoft Kinect v2. COMPUSOFT: An International Journal of Advanced Computer Technology, 9(01), 3566–3577. Retrieved from https://ijact.in/index.php/j/article/view/554

Section

Review Article
