
Cerebral and gaze data fusion for wheelchair navigation enhancement: case of distracted users

Published online by Cambridge University Press:  25 September 2018

Hachem A. Lamti*
Affiliation:
COnception de Systèmes Mécaniques et Robotiques (COSMER) Laboratory, South University, Toulon-Var, France. E-mail: [email protected]
Mohamed Moncef Ben Khelifa
Affiliation:
Impact de l'Activité Physique sur la Santé (IAPS) Laboratory, South University, Toulon-Var, France. E-mail: [email protected]
Vincent Hugel
Affiliation:
COnception de Systèmes Mécaniques et Robotiques (COSMER) Laboratory, South University, Toulon-Var, France. E-mail: [email protected]
*Corresponding author. E-mail: [email protected]

Summary

The goal of this paper is to present a new hybrid system based on the fusion of gaze data and Steady-State Visual Evoked Potentials (SSVEP), not only to command a powered wheelchair but also to account for the user's distraction level (concentrated or distracted). For this purpose, a multi-layer perceptron neural network was set up to combine relevant gazing and blinking features from the gaze sequence with brainwave features from the occipital and parietal brain regions. The motivation behind this work is the shortcomings that arise when gaze-based and SSVEP-based wheelchair command techniques are used individually. The proposed framework is based on three main modules: a gaze module that selects a command and activates the flashing stimuli; an SSVEP module that validates the selected command; and, in parallel, a distraction-level module that estimates the user's intention by means of behavioral entropy and validates or inhibits the command accordingly. An experimental protocol was set up and the prototype was tested on five paraplegic subjects and compared with standard SSVEP-based and gaze-based systems. Navigation performance was assessed in terms of navigation time and obstacle collisions, and the results showed that the new framework performed better than the conventional gaze-based and SSVEP-based systems.
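
The decision logic described in the summary can be illustrated with a short sketch. The following Python code is an illustrative assumption, not the authors' implementation: it gates a gaze-selected command behind SSVEP confirmation and a Boer-style behavioral-entropy estimate of distraction. The bin edges, the 0.6 entropy threshold, and the function names (behavioral_entropy, gate_command) are hypothetical.

```python
# Hedged sketch (not the authors' implementation) of the gating logic outlined
# above: a gaze-selected command is issued only if the SSVEP module confirms it
# and a Boer-style behavioral-entropy estimate does not flag the user as
# distracted. Bin edges, the 0.6 threshold, and all names are illustrative.
import numpy as np

def behavioral_entropy(prediction_errors, bin_edges):
    """Normalized Shannon entropy of quantized gaze prediction errors.

    prediction_errors: 1-D array of errors between predicted and observed gaze.
    bin_edges: increasing edges used to quantize the errors into bins.
    """
    counts, _ = np.histogram(prediction_errors, bins=bin_edges)
    p = counts / max(counts.sum(), 1)
    p = p[p > 0]                               # drop empty bins, avoid log(0)
    n_bins = len(bin_edges) - 1
    return float(-(p * np.log(p)).sum() / np.log(n_bins))

def gate_command(gaze_command, ssvep_confirmed, entropy, entropy_threshold=0.6):
    """Validate the gaze-selected command only when SSVEP confirms it and the
    distraction-level estimate stays below an assumed threshold."""
    if not ssvep_confirmed:
        return None          # SSVEP module did not validate the selection
    if entropy >= entropy_threshold:
        return None          # distraction-level module inhibits the command
    return gaze_command

# Illustrative use with synthetic prediction errors (degrees of visual angle);
# small, regular errors stand in for a concentrated user.
rng = np.random.default_rng(0)
errors = rng.normal(0.0, 0.2, size=200)
H = behavioral_entropy(errors, bin_edges=np.linspace(-2.0, 2.0, 10))
print(H, gate_command("turn_left", ssvep_confirmed=True, entropy=H))
```

In the paper's framework, the gaze, blink, and brainwave features additionally feed a multi-layer perceptron classifier; the sketch reduces the distraction decision to a single entropy threshold purely for clarity.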

Type: Articles
Copyright © Cambridge University Press 2018
