Development of a Training Game Application Using Eye-gaze Control Technology to Support the Employment of Physically Challenged People

Assistive devices using eye-gaze control technology have been developed to allow children and adults with significant physical disabilities to control personal computers (PCs). In this study, we focus on human eye movements, which can be very effective as a final input control because they often remain available as a means of input for people with extensive physical disabilities. We describe a game application developed to help such users practice controlling their PCs via eye-gaze control. Subjects played a trial of an eye-controlled game in which they performed the task of counting vehicles. The experimental results confirmed the effectiveness of our proposed training application, although it must be reconsidered for extended use.


Introduction
In recent years, Japan's declining birthrate and aging population have led to a rapid increase in the proportion of elderly people, which has also resulted in an increasing burden on the working-age population. If the current situation continues, the per capita social security burden is expected to increase considerably. Therefore, we conjecture that this situation could be improved if more physically challenged people could be gainfully employed. For this reason, we focused on human eye movements, which can be very effective as a final input control because they often remain available as a means of input for people with extensive physical disabilities.
Herein, we examine whether eye-gaze control could provide a means by which people with physical disabilities could perform simple traffic survey work (1)(2)(3) . This job type was selected because we thought that it would be relatively simple to replace the hand-based counting actions with the eye-gaze-based counting actions. This is particularly relevant as software that allows users to control personal computers (PCs) using eye-gaze control technology has already been developed and is becoming increasingly available. These applications are primarily designed to assist children and adults with significant physical disabilities when controlling their PCs. However, in order to control a PC using eye-gaze control technology, proper training is necessary. Accordingly, in this study, we first report on the development of a game application that allows users to experience conducting a traffic survey simply by looking at the traffic, after which we examine the prospect of employing users of our eye-gaze tracking system when conducting actual traffic surveys.

Eye-gaze Control device "Tobii EyeX"
The sensor used to capture the line of sight in this study is the Tobii EyeX, manufactured by Tobii Technology (4) (Stockholm, Sweden). This sensor is installed under the screen of the PC monitor. The specifications and overview of the device are shown in Table 1 and Fig. 1.

Tracking Core Software for "Tobii EyeX"
Tobii Eye Tracking Core Software, which is the configuration software for Tobii EyeX, was used to perform calibrations and other settings. This software allows calibration information to be saved for each individual and is compatible with the use of eyeglasses and contact lenses. Figure 2 shows how the screen-based eye tracker works. The system consists of a near-infrared light-emitting diode (LED), an eye-tracking camera, and an arithmetic processing unit that includes image detection, 3D eye modeling, and eye-tracking algorithms.

Mechanism of operation of screen-based eye trackers
(1) The eye tracker consists of cameras, illuminators, and algorithms.
(2) The illuminators create a pattern of near-infrared light on the user's eyes.
(3) The cameras take high-resolution images of the user's eyes and the patterns.
(4) The image processing algorithms find specific details and reflection patterns in the user's eyes.
(5) Based on these details, the eyes' position and gaze point on the PC monitor are calculated using a sophisticated three-dimensional (3D) eye model algorithm.

Development environment
This section describes the development environment we used to develop system applications.

■ Unity 2018
Unity is a game development platform developed by Unity Technologies (San Francisco, CA). It was selected because a freeware package that supports Tobii EyeX is provided for it. The version used was 2018.4.9f1.

■ Visual Studio 2017
We used this application to write the scripts used in Unity. The development language was C#.

■ Tobii Eye Tracking Core Software
This is the configuration software for Tobii EyeX, used to perform calibrations and other settings. As stated above, it allows calibration information to be saved for each individual and supports the use of eyeglasses and contact lenses.

■ Tobii Unity for Desktop
This is a Tobii EyeX SDK utility package for Unity that enables users to obtain the eye positions as screen coordinates.

Overview of application developed for practicing PC operation with eye-gaze control
This system allows the user to experience a simple traffic survey in a virtual space via eye-gaze control. The rules are designed as a game in order to measure the prospects of conducting actual traffic volume surveys by eye gaze. In this traffic survey simulation, users count the number of cars traveling in both directions on a simulated road. Note that the application does not count each type of car separately; it simply counts the total number of cars.

Basic process
The basic process consists of the following three items: (1) Obtaining the current gaze position in screen coordinates from the eye-tracking sensor. A white circle (marker cursor) is displayed at the viewed position. (2) Using the acquired coordinates, the marker cursor changes the direction of the camera, determines whether a button is "pressed", and so on. (3) To count the vehicles, "+" and "−" buttons are provided, which count the number of vehicles up and down, respectively. The moment the user's gaze lands on a button, the button is activated. To follow rapid eye movements, the cursor movement was programmed to react instantly.
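The counting logic of step (3) can be sketched as follows. This is an illustrative Python sketch only (the actual application was written in C# for Unity), and all class, method, and parameter names are hypothetical; the guard against re-firing while the gaze rests on a button is our assumption, since the paper states only that activation is instantaneous.

```python
class CountingScreen:
    """Sketch of the counting logic: each gaze point is hit-tested against
    the "+" and "-" button rectangles, and the vehicle count is updated
    the moment the gaze lands on a button."""

    def __init__(self, plus_rect, minus_rect):
        # Each rect is (x, y, width, height) in screen coordinates.
        self.plus_rect = plus_rect
        self.minus_rect = minus_rect
        self.count = 0
        self._last_hit = None  # assumed guard: no re-fire while gaze stays put

    @staticmethod
    def _inside(rect, x, y):
        rx, ry, rw, rh = rect
        return rx <= x <= rx + rw and ry <= y <= ry + rh

    def on_gaze(self, x, y):
        """Called every frame with the gaze position from the eye tracker."""
        if self._inside(self.plus_rect, x, y):
            hit = "+"
        elif self._inside(self.minus_rect, x, y):
            hit = "-"
        else:
            hit = None
        # Fire only on the frame the gaze first enters a button.
        if hit is not None and hit != self._last_hit:
            self.count += 1 if hit == "+" else -1
        self._last_hit = hit
        return self.count
```

For example, a gaze point entering the "+" rectangle increments the count once; the gaze must leave the button and return before another increment is registered.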

Game rules and preferences
The game is cleared when the number of cars passing in each direction within the time limit has been counted and the error rate of the counted number, relative to the number of cars that actually passed, is less than 10%. The error rate is derived from Formula (1).

Default settings
When the app is launched, a time limit and a difficulty level must be set. For physically challenged persons, this process is usually performed by their caregivers. On the preferences screen, the buttons cannot be interacted with instantly; instead, the user or caregiver selects a button by holding his/her gaze steady on it for a certain amount of time, as shown in Fig. 3 (a)-(b). The user can then select one of three time limits (3, 5, or 10 min) and either "easy" or "normal" as the difficulty level.
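The gaze-hold selection used on the preferences screen can be sketched as a simple dwell timer. This is an illustrative Python sketch (the application itself was written in C# for Unity); the names and the 1.5 s default dwell duration are hypothetical, as the paper does not state the actual hold time.

```python
class DwellButton:
    """A button that activates only after the gaze has rested on it
    continuously for `dwell_time` seconds, preventing accidental
    selection on the preferences screen."""

    def __init__(self, rect, dwell_time=1.5):
        self.rect = rect            # (x, y, width, height)
        self.dwell_time = dwell_time
        self._gaze_start = None     # time the gaze first entered the button

    def update(self, x, y, now):
        """Call every frame with the gaze position and current time (s).
        Returns True on the frame the dwell threshold is reached."""
        rx, ry, rw, rh = self.rect
        if rx <= x <= rx + rw and ry <= y <= ry + rh:
            if self._gaze_start is None:
                self._gaze_start = now          # gaze entered: start timing
            elif now - self._gaze_start >= self.dwell_time:
                self._gaze_start = None         # reset so the button fires once
                return True
        else:
            self._gaze_start = None             # gaze left: restart the dwell
        return False
```

Looking away at any point resets the timer, so only a deliberate, sustained gaze triggers a setting change.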

Car counting
After the preferences have been set, the game begins. To count one car, the user holds his/her gaze on the white "+" button in the upper half of the screen; the button turns blue when the car is counted, as shown in Fig. 4 (b). Conversely, to subtract a car and correct the count, the user gazes at the "−" button on the bottom half of the screen, which reduces the count by one car.
Fig. 3. Initial settings: (a) setting the time limit; (b) setting the difficulty level.
Fig. 4. Example of the vehicle counting screen: (b) after adding one vehicle, the "+" button becomes blue.

Experimental results and discussion
The test subjects in this evaluation experiment were seven non-disabled persons. After the rules of the game were explained, the participants were seated in front of the PC monitor and guided through a few practice games before playing the game by themselves.
Each of the seven subjects played the two difficulty levels, easy and normal, for five minutes each. The number of vehicles per 10 seconds was four at the easy level and seven at the normal level; hence, the difficulty level of the practice game was determined by the number of passing vehicles per unit time. Table 2 shows the experimental results of a game trial at the easy difficulty level. The average error rate of the seven subjects was calculated from these results and is shown for each difficulty level in Fig. 5.
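For reference, the spawn rates above imply the following vehicle totals over a five-minute game (a simple arithmetic check of the stated settings, using hypothetical function names):

```python
def total_vehicles(per_10s, minutes):
    """Total passing vehicles implied by the spawn rate:
    six 10-second intervals per minute."""
    return per_10s * 6 * minutes

easy_total = total_vehicles(4, 5)    # easy: 4 vehicles / 10 s over 5 min
normal_total = total_vehicles(7, 5)  # normal: 7 vehicles / 10 s over 5 min
```

Thus a five-minute easy game presents 120 vehicles and a normal game 210, so the 10% clear condition allows a miscount of at most 11 and 20 vehicles, respectively.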

Experimental Results of Counting the vehicles
In this experiment, the subjects played the "easy" difficulty level first. In the first minute, all of the subjects tended to make numerous counting mistakes while gaining familiarity with the eye-gaze control operation. Figure 5 shows the mean error rate of vehicle counting during trial games at both the "easy" and "normal" difficulty levels; it also shows that the average error rate improved as the subjects became accustomed to making the required eye movements.

Discussion
Figure 5 also shows that in the "easy" difficulty level, both the mean error rate and the standard deviation tended to be large, probably due to the subjects' unfamiliarity with the eye-gaze control method. However, the mean error rate gradually improved with the passage of time. On the other hand, when playing at the "normal" difficulty level, the subjects rapidly became accustomed to the eye-gaze control and the average error rate tended to improve until around three minutes had passed.
Around that time, however, fatigue increased and the average error rate tended to worsen again. Based on these experimental results, we determined that vehicle counting by eye-gaze control is possible, but that it appears unsuitable for long-duration work. We were also able to confirm that controlling a PC by eye gaze can be a great support for people with physical disabilities.

Conclusions
Herein, we reported on the development of a practice application for vehicle counting tasks using eye-gaze control technology as a means to provide gainful employment to physically disabled persons. However, we found that there was a tendency for counting errors to increase after three minutes of the vehicle counting operation, primarily due to eye fatigue. This indicates that our system is unsuitable for tasks that require extended periods of concentration. As a potential solution, it might be possible to have artificial intelligence (AI) software perform the vehicle counting task and employ physically challenged persons to check the results.