Eye Gaze Pathway
Overview
The Eye Gaze Pathway develops the skill of using eye gaze to communicate and to learn. Each of the six steps follows a clear structure and contains practice activities, instructional videos and communication tips to help you get started with eye gaze.
Communication and learning are the focus of the Eye Gaze Pathway. The practice activities are designed to show that communication is present at each step of the pathway, regardless of the learner's experience with eye gaze. In addition, the communication tips under the "Say" section give the communication partner extra help to engage in and develop the conversation all the way through the pathway.
Learning objectives are connected to each practice activity. These relate to both the eye gaze skill that is being practised and the vocabulary that is developed through attempting the activity.
While we recommend starting at the "Screen Engagement" step and progressing through the steps at your own pace, it is not a requirement to complete one step before moving onto the next. Therefore, feel free to experiment with the pathway and the activities within each step in a way that works best for the individual.
Learning at this step is built through fun. This step shows how quick and simple it can be to get up and running with eye gaze. Learners are introduced to the idea that the movement of their eyes has a direct and immediate impact on what happens on the screen: "cause and effect".
The communication partner responds to the caterpillar's movements on-screen. These movements are directly linked to where the learner is looking, so communication is started from the learner looking at the screen.
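The cause-and-effect idea can be sketched in a few lines of code: the on-screen object simply follows wherever the learner looks. This is only an illustration of the concept, not Tobii Dynavox's actual implementation; the function name and the normalized-coordinate convention are assumptions for the sketch.

```python
# Illustrative sketch of gaze-driven "cause and effect": an on-screen
# object (like the caterpillar) follows wherever the learner looks.
# This is a conceptual model, not a real Tobii Dynavox API.

def follow_gaze(gaze_xy, screen_w, screen_h):
    """Map a normalized gaze point (0..1, 0..1) to pixel coordinates."""
    x, y = gaze_xy
    # Clamp so the object never leaves the screen, even with noisy tracking.
    x = min(max(x, 0.0), 1.0)
    y = min(max(y, 0.0), 1.0)
    return int(x * screen_w), int(y * screen_h)
```

Because the object's position is driven directly by the gaze point, the learner gets immediate feedback that their eyes control the screen.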
This step does not require any calibration from the learner! However, a simple one-point calibration can be useful if possible. How to do the calibration differs between software packages; check out the Support & Training pages for your gaze software for videos about calibration.
Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
Hearing sounds regularly indicates that the learner is engaged with what happens on screen when they move their eyes.
The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Screen Engagement:
Other sources for activities:
At this step we are learning that looking at different parts of the screen produces different responses. The communication partner can model words connected to what the learner is looking at to teach new vocabulary. This new vocabulary is all based on where the learner is looking!
Two-way communication is generated by the learner looking and the communication partner responding. The vocabulary modeled by the communication partner depends on what the learner is looking at.
This step does not require any calibration from the learner! However, a simple one-point calibration can be useful if possible. How to do the calibration differs between software packages; check out the Support & Training pages for your software for videos about calibration.
Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
The visible cursor should move around different parts of the screen. Caregiver responses should be relevant to what is being looked at!
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
Hearing the sounds within the visual scene indicates that the learner is gazing at different parts of the screen. Caregiver responses should be connected to the sounds heard within the visual scene!
The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Responding:
Other sources for activities:
Here we learn about cause and effect and that changes happen on-screen depending on where we look. Learning takes place through exploration, where the learner can freely look around and discover what happens when they look at different parts of the screen. When we are doing this we are also learning new vocabulary such as colours in painting activities or the names of different objects in scenes.
Communication at this step can come from the communication partner modelling the use of vocabulary across all areas of the screen. This teaches both navigation around the screen and new vocabulary!
For this step the learner needs to do a one-point calibration. If possible, a two-point calibration would work even better. How to do the calibration differs between software packages; check out the Support & Training pages for your software for videos about calibration.
Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
Can you see different colours in different parts of the screen? Great! The learner has shown that they can paint on-screen and dwell their gaze.
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
A sign of success is if some lights are on at all times, as they turn off automatically after a few seconds!
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
A simple way to assess success here is to see whether the learner has selected a grid/scene and gazed at some of the options within it. If the learner manages this for several grids/scenes, even better!
The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Exploring:
Other sources for activities:
In order for eye tracking to work as accurately as possible, the eye tracker must know more about your eyes. This is why you need to do a calibration. During the calibration the eye tracker measures how your eyes reflect light. The calibration is done by following a point, video or other graphic element that moves across the screen. This calibration data is then combined with our unique 3D model of a human eye, and together they give you an optimal eye tracking experience.
The learner can calibrate within the settings of Gaze Point, Windows Control or Classic Tobii Gaze Interaction Software.
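To make the idea of calibration concrete, here is a simplified sketch of what it accomplishes: learning, per axis, a gain and offset that map raw eye-tracker readings onto the screen positions the user followed during calibration. Real Tobii calibration combines this data with a 3D model of the eye, so this per-axis linear fit is only an illustration, and the function name is an assumption.

```python
# Simplified sketch of calibration: fit a linear mapping from raw
# eye-tracker readings to known on-screen calibration-point positions.
# (Actual Tobii calibration also uses a 3D eye model; this is conceptual.)

def fit_axis(raw, target):
    """Ordinary least-squares fit of target ≈ gain * raw + offset."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_t = sum(target) / n
    var = sum((r - mean_r) ** 2 for r in raw)
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, target))
    gain = cov / var
    offset = mean_t - gain * mean_r
    return gain, offset
```

With more calibration points, the fit averages over more measurements, which is why later steps of the pathway ask for as many points as the learner can manage.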
At this step we are learning that performing a "mouse-over" action with our eyes makes something happen. This step trains the learner to fix their gaze on a part of the screen that they have chosen.
This is the first step where the learner is actively selecting what they want to say with their eyes! Since audio is only heard if the learner looks at a specific part of the screen, communication can be based around what the learner has decided to look at. The communication partner can then respond to the audio that the learner has played with their eyes.
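The "mouse-over" behaviour described above can be sketched simply: an action (such as playing a sound) fires as soon as the gaze point enters a hotspot region. The hotspot layout and function name here are illustrative assumptions, not part of any Tobii Dynavox API.

```python
# Hedged sketch of gaze "mouse-over": return which hotspot, if any,
# the gaze point is currently inside, so its sound can be played.
# Hotspots are (x, y, width, height) rectangles; names are illustrative.

def gaze_over(hotspots, gaze_xy):
    """Return the name of the first hotspot containing the gaze point, if any."""
    gx, gy = gaze_xy
    for name, (x, y, w, h) in hotspots.items():
        if x <= gx < x + w and y <= gy < y + h:
            return name
    return None
```

Unlike dwell selection (introduced at the Choosing step), a mouse-over action triggers the moment the gaze enters the region, with no hold time required.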
For this step the learner needs as good a calibration as possible, with as many calibration points as they can manage. How to do the calibration differs between software packages; check out the Support & Training pages for your software for videos about calibration.
Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
Are any items within the scene painted? Then we can consider that to be a success. Are all items within the scene painted? Fantastic!
This practice activity is for the Windows version of Tobii Dynavox Snap Scene. Try for free by downloading the Snap Scene LITE app or buy the full version. Note that Snap Scene is also available for iPad on the App Store (only Touch access, no Eye tracking).
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication:
Looking for extra help with the activity?
Hearing audio associated with different parts of the visual scene represents that the learner has targeted their gaze effectively.
The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Targeting:
Other sources for activities:
Modeling is when the communication partner talks to the learner while also pointing/ selecting keywords on the learner's AAC device. This helps develop the learner's understanding of both language and symbols.
At the choosing step we are learning that on-screen choices lead to real-life actions. For example, communicating hunger, enjoyment or tiredness by choosing the appropriate core vocabulary on the screen. We do this by practicing the skill of dwelling our gaze over a specific point of the screen to make a selection.
When we know how to choose we can begin to explore other forms of learning activities.
Communication is based around real-life choices made by the learner. The choices made within a visual scene, communication grid or game lead to full two-way communication, where the communication partner responds appropriately by assisting with the action or request expressed by the learner.
For this step the learner needs as good a calibration as possible, with as many calibration points as they can manage. How to do the calibration differs between software packages; check out the Support & Training pages for your software for videos about calibration.
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
The learner can show success by selecting parts of the grid that are relevant to a particular activity or mealtime that you are both currently engaged in.
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Use phrases such as these to promote communication!
Looking for extra help with the activity?
A sure sign of success here is hearing aloud the words that the learner has typed with their eyes. If you can hear whole sentences, even better!
To get up and running with this activity you will need:
Use phrases such as these to promote communication!
A conversation that is underway via text message is a strong indication of success.
The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Choosing:
Other sources for activities:
The Fitzgerald Key is the traditional approach, presenting core words grouped together on the page by parts of speech. With this approach, you can start small and grow big (have a smaller grid full of words and increase its size to add more words), start big and fill in (have a larger grid with a few words and gradually show more words without adding buttons), or start with a full grid of words (select the grid size appropriate for the user and show all the words in it from the start).
Dwelling is the skill of focusing your gaze over a specific point of the screen for a prolonged period of time. Doing this allows us to actively make choices using our gaze.
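The dwelling mechanic defined above can be sketched as a small timer: a selection triggers only after the gaze has stayed on the same target for a set time. The class name, parameter names and sample-based timing are illustrative assumptions, not Tobii Dynavox's implementation.

```python
# Minimal sketch of dwell selection: a target is selected only after the
# gaze has rested on it continuously for dwell_time seconds.
# Conceptual model only; names and timing scheme are assumptions.

class DwellSelector:
    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time  # seconds of sustained gaze required
        self.current = None
        self.elapsed = 0.0

    def update(self, target, dt):
        """Feed one gaze sample; returns the target once the dwell completes."""
        if target != self.current:
            self.current = target     # gaze moved: restart the dwell timer
            self.elapsed = 0.0
            return None
        self.elapsed += dt
        if target is not None and self.elapsed >= self.dwell_time:
            self.elapsed = 0.0        # reset so the next selection needs a new dwell
            return target
        return None
```

A longer dwell time reduces accidental selections while the learner is still exploring; gaze software typically lets you tune this in its settings.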
Learning at this step provides exposure to a wider range of communication methods. This encompasses the use of standard and specialized keyboards and distance communication tools.
The level of eye-tracking skill at this step opens the user up to the world, where controlling their own environment, creativity and computer usage really become a reality.
Windows Control also provides the opportunity to affect the physical environment with eye tracking, letting users control things in the home such as the lights or the television.
Completion of this step signifies full control with eye gaze, where the learner's communication and environmental control have expanded significantly.
Communication stretches beyond the user’s immediate vicinity when tools such as Skype, Facebook, email and other distance communication tools are accessible and usable.
For this step the learner needs a high-quality calibration with the maximum number of calibration points. How to do the calibration differs between software packages; check out the Support & Training pages for your software for videos about calibration.
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Note that the learner can continue to use "sticky" mode to control Windows if they like, but it's usually more comfortable and efficient to move over to the ordinary two-step selection described above to get rid of the mouse pointer continuously following the user's gaze. With ordinary two-step selection, the learner has a much larger degree of control.
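The two-step selection just described can be sketched as a tiny state machine: the learner first picks a task (for example "left-click") from the taskbar, then gazes at the target; only then does the action execute at that spot. This is a conceptual model of the interaction pattern, not Tobii Dynavox Windows Control's actual code, and all names are illustrative.

```python
# Illustrative sketch of two-step selection: choose a task first, then a
# target. Nothing happens until both steps are done, which is what frees
# the learner from a pointer that constantly follows their gaze.

class TwoStepSelection:
    def __init__(self):
        self.pending_task = None

    def pick_task(self, task):
        self.pending_task = task      # step 1: choose what to do

    def select_target(self, xy):
        """Step 2: apply the pending task at the gazed location, then clear it."""
        if self.pending_task is None:
            return None               # no task chosen yet: gazing does nothing
        action = (self.pending_task, xy)
        self.pending_task = None      # each action needs a fresh task choice
        return action
```

The design choice is that gazing around the screen is free of side effects until a task has been explicitly selected, which is why it gives the learner a much larger degree of control than "sticky" mode.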
Looking for extra help with the activity?
If the web browser is open and a website has been navigated to then the learner has shown the skills needed to control Windows using their gaze!
To get up and running with this activity you will need:
Following these steps makes it simpler to get going with the activity.
Looking for extra help with the activity?
If the learner has logged in to Facebook and is navigating their newsfeed, they have shown the skills needed to use Facebook using their gaze!