Eye Gaze Pathway

Overview

Welcome to the Eye Gaze Pathway

The Eye Gaze Pathway develops the skill of using eye gaze to communicate and to learn. Each of the six steps follows a clear structure and contains practice activities, instructional videos and communication tips to help get you on your way with eye gaze.

Getting Started

Communication and learning are the focus points of the Eye Gaze Pathway. The practice activities are designed to demonstrate that communication is evident at each step of the pathway, regardless of the learner's experience with eye gaze. In addition, the communication tips under the "Say" section give the communication partner extra help to engage in and develop the conversation all the way through the pathway.

Learning objectives are connected to each practice activity. These relate to both the eye gaze skill that is being practised and the vocabulary that is developed through attempting the activity.

While we recommend starting at the "Screen Engagement" step and progressing through the steps at your own pace, it is not a requirement to complete one step before moving on to the next. Feel free to experiment with the pathway and the activities within each step in a way that works best for the individual.

Screen Engagement

Learning

Learning at this step is built through fun. This step shows how quick and simple it can be to get up and running with eye gaze. Learners are introduced to the idea that the movement of their eyes has a direct and immediate impact on what happens on the screen: cause and effect.

Communication

The communication partner responds to the caterpillar's movements on-screen. These movements are directly linked to where the learner is looking, so communication begins simply with the learner looking at the screen.

Calibration

This step does not require any calibration from the learner! However, a simple one-point calibration could be useful if possible. How to do the calibration can differ between pieces of software; check out the Support & Training pages for your gaze software for videos about calibration.

Give it a Try

Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).

Practice 1: Sensory Circles

Objectives

  • We are learning to look at the circles on the screen
  • We are learning about shapes and colours

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. Depending on eye gaze software:
    • Open Gaze Point, start it with the Play button and turn off Mouse Click. Hide the Mouse Cursor.
    • Or open Windows Control and select Sticky Move Cursor.
      • To select this option, start by going to Settings -> Taskbar -> Tasks -> Change and turn on Place Cursor.
      • Go to Taskbar -> Selection and turn on Tertiary Selection.
      • Exit Settings and gaze at the taskbar to activate the tertiary selection.
      • In the popup that shows, select the Sticky Move Cursor task.
    • Or open Classic Gaze Interaction Software.
      • In Settings, select Mouse Emulation.
      • Exit Settings and select Mouse Cursor.
  3. Open Eye-FX and select Sensory Circles.
  4. Turn up the volume!
  5. Encourage the learner to look at the screen to make the circles move.

Say

Use phrases such as these to promote communication!

  • “I can see that the circles move when you move your eyes around the screen!”  
  • “I can hear that the sounds stop when you look away!”

Top Tips

Looking for extra help with the activity?

  • Open the Eye Gaze Pathway on the learner's device and on another device so that you can read the tips and instructions while the learner is attempting the activities.

Look for

Hearing sounds regularly is an indicator that the user is engaged with what happens on screen when they move their eyes.

More Activities

The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Screen Engagement:

Other sources for activities:

Responding

Learning

At this step we are learning that looking at different parts of the screen produces different responses. The communication partner can model words connected to what the learner is looking at to teach new vocabulary. This new vocabulary is all based on where the learner is looking!

Communication

Two-way communication is generated by the learner looking and the communication partner responding. The vocabulary modeled by the communication partner depends on what the learner is looking at.

Calibration

This step does not require any calibration from the learner! However, a simple one-point calibration could be useful if possible. How to do the calibration can differ between pieces of software; check out the Support & Training pages for your software for videos about calibration.

Give it a Try

Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).  

Practice 1: Favourite Photos

Objectives

  • We are learning that responses are linked to what we look at on-screen
  • We are learning new vocabulary that is connected to our favourite photographs

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. Depending on eye gaze software:
    • Open Gaze Point, start it with the Play button and turn off Mouse Click. Show the Mouse Cursor.
    • Or open Windows Control and select Sticky Move Cursor.
      • To select this option, start by going to Settings -> Taskbar -> Tasks -> Change and turn on Place Cursor.
      • Go to Taskbar -> Selection and turn on Tertiary Selection.
      • Exit Settings and gaze at the taskbar to activate the tertiary selection.
      • In the popup that shows, select the Sticky Move Cursor task.
    • Or open Classic Gaze Interaction Software (double-click the Windows Control icon).
      • In Settings, select Mouse Emulation.
      • Exit Settings and select Mouse Cursor in the menu.
  3. Open a photograph so that it displays on the screen.
  4. Respond to what the user is looking at as the cursor moves across the screen, discussing what is happening in the parts of the photo the learner is looking at.
  5. This is a great chance to introduce new vocabulary connected to what the user is looking at!

Say

Use phrases such as these to promote communication!

  • "I see you are looking at the food on the table. It's almost time for breakfast/ lunch/dinner!"
  • "I can see a park in the background. I know you like going to the park!"

Top Tips

Looking for extra help with the activity?

  • Focus responses on the parts of the screen the learner is showing an interest in.
  • Refrain from guiding the learner to look at the parts of the screen you would like them to look at.
  • Open the Eye Gaze Pathway on the learner's device and on another device so that you can read the tips and instructions while the learner is attempting the activities.

Look for

The visible cursor should move around different parts of the screen. Caregiver responses should be relevant to what is being looked at!

Practice 2: Zoo Scene

Objectives

  • We are learning to gaze at different parts of visual scenes
  • We are learning about different types of animals

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. You don't need to start your gaze software to use Communicator 5 with eye tracking; it just needs to be installed.
  3. Open Communicator 5 and the Sono Primo page set, found in All Page Sets -> Emerging Communication -> Add-On Products.
  4. Choose one of the visual scenes, for example the “Zoo” scene. It has gaze-enabled regions that activate automatically when the user looks at them. If they don't, make sure your input method in Communicator 5 is set to Eye Gaze.
  5. Make sure the volume is turned up!
  6. Encourage the learner to explore the scene.
  7. Listen to the audio that plays when the learner gazes at certain parts of the screen.
  8. Communicate with the learner by responding to all audio that plays when the learner gazes at different parts of the screen.

Say

Use phrases such as these to promote communication!

  • “I see that you are looking at the elephant. I like elephants too!”

Top Tips

Looking for extra help with the activity?

  • Two-way communication can take place based on the audio that plays when the user looks at different parts of the visual scene.
  • Open the Eye Gaze Pathway on the learner's device and on another device so that you can read the tips and instructions while the learner is attempting the activities.

Look for

Hearing the sounds within the visual scene indicates that the learner is gazing at different parts of the screen. Caregiver responses should be connected to the sounds heard within the visual scene!

More Activities

The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Responding:

 

Other sources for activities:

Exploring

Learning

Here we learn about cause and effect and that changes happen on-screen depending on where we look. Learning takes place through exploration, where the learner can freely look around and discover what happens when they look at different parts of the screen. When we are doing this we are also learning new vocabulary such as colours in painting activities or the names of different objects in scenes.

Communication

Communication at this step can come from the communication partner modeling the use of vocabulary across all areas of the screen. This teaches both navigation around the screen and new vocabulary!

Calibration

For this step the learner needs to do a one-point calibration. If possible, a two-point calibration would work even better. How to do the calibration can differ between pieces of software; check out the Support & Training pages for your software for videos about calibration.

Give it a Try

Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).  

Practice 1: Jackson Pollock

Objectives

  • We are learning to paint different parts of the screen with different colours
  • We are learning the names of colours

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. Depending on eye gaze software:
    • Open Gaze Point, start it with the Play button and turn on Mouse Click. Hide the Mouse Cursor.
    • Or open Windows Control and select the Sticky Left Click task.
      • To select this task, start by going to Taskbar -> Selection and turn on Tertiary Selection (unless it's already on).
      • Exit Settings and gaze at the Left Click task on the taskbar to activate the tertiary selection.
      • In the popup that shows, select the Sticky Left Click task.
    • Or open Classic Gaze Interaction Software.
      • In Settings, select Mouse Emulation (unless already selected).
      • Exit Settings and select Left Mouse Button combined with Single Click in the menu.
  3. Click here to paint!
  4. The learner can paint on-screen using the movement of their gaze.

Say

Use phrases such as these to promote communication!

  • “I can tell that you are looking at the top of the screen as that is where you are painting!”
  • "I can see lots of different colours on the screen. I like that you changed the colurs with your eyes."

Top Tips

Looking for extra help with the activity?

  • The colour of paint changes automatically when the learner dwells their gaze on one point of the screen for a short amount of time. Congratulate the learner for changing paint colour with their eyes!
  • The hidden cursor emphasizes the relationship between eye movement and the changes on-screen.
  • Open the Eye Gaze Pathway on the learner's device and on another device so that you can read the tips and instructions while the learner is attempting the activities.

Look for

Can you see different colours in different parts of the screen? Great! The learner has shown that they can paint on-screen and dwell their gaze.

Practice 2: Lights

Objectives

  • We are learning to turn on lights displayed on the screen
  • We are learning to gaze at different parts of the screen

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. Depending on eye gaze software:
    • Open Gaze Point, start it with the Play button and turn off Mouse Click. Hide the Mouse Cursor.
    • Or open Windows Control and select Sticky Move Cursor.
      • To select this option, start by going to Settings -> Taskbar -> Tasks -> Change and turn on Place Cursor.
      • Go to Taskbar -> Selection and turn on Tertiary Selection.
      • Exit Settings and gaze at the taskbar to activate the tertiary selection.
      • In the popup that shows, select the Sticky Move Cursor task.
    • Or open Classic Gaze Interaction Software (double-click the Windows Control icon).
      • In Settings, select Mouse Emulation.
      • Exit Settings and select Mouse Cursor in the menu.
  3. Turn up the volume!
  4. Open Eye-FX and select the Lights game from the list.
  5. The learner can explore different parts of the screen with their eyes and see that different lights turn on depending on where on the screen they are looking.

Say

Use phrases such as these to promote communication!

  • “I can see that you looked at the top of the screen because a bulb turned on in the same place.”
  • “You looked at a lamp on the bottom row and I heard a sound at the same time!”

Top Tips

Looking for extra help with the activity?

  • A sound coincides with each light that turns on. Mention that you hear sounds because the learner is causing them to play with their gaze!

Look for

A sign of success is if some lights are on at all times, as they turn off automatically after a few seconds!

Practice 3: Exploring Scenes

Objectives

  • We are learning to explore a variety of visual scenes and communication grids
  • We are learning new vocabulary associated with visual scenes and communication grids

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. You don't need to start your gaze software to use Communicator 5 with eye tracking; it just needs to be installed.
  3. Open Communicator 5 and the Sono Flex page set, found in All Page Sets -> Emerging Communication -> Add-On Products.
  4. Make sure the volume is turned up!
  5. Different grids or visual scenes can be selected depending on where the learner is looking. Encourage the learner to look at the screen.
  6. The learner can gaze at different areas of the screen to communicate different things once a grid or scene has been selected.
  7. The word associated with the part of the screen being looked at will be heard through the speakers.
  8. Respond to what is heard!
  9. We are not focusing on learning how to accurately communicate feelings or needs at this point. Let’s simply get used to the idea that there are a variety of different grids and scenes that can be used for different purposes.

Say

Use phrases such as these to promote communication!

  • "You said 'play outside'. We played outside yesterday and it was fun. We can play outside again today!"
  • "You said 'blow lots of bubbles'. Now I will blow lots of bubbles! I think it is fun to blow bubbles."

Top Tips

Looking for extra help with the activity?

  • What is communicated at this stage is not important, though try to respond each time the user gazes at a point of the screen to promote the idea that caregiver responses are directly linked to user gaze.
  • Feel free to give assistance to exit the grid/scene and encourage the learner to select another.

Look for

A simple way to assess success here is to see whether the learner has selected a grid/scene and gazed at some of the options within it. If the learner manages this for several grids/scenes, even better!

More Activities

The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Exploring:

 

Other sources for activities:

Word to Remember

Calibration

In order for eye tracking to work as accurately as possible, the eye tracker must know more about your eyes. This is why you need to do a calibration. During the calibration the eye tracker measures how your eyes reflect light. The calibration is done by following a point, video or other graphic element that moves across the screen. This calibration data is then combined with our unique 3D model of a human eye, and together they give you an optimal eye tracking experience.

The learner can calibrate within the settings of Gaze Point, Windows Control or Classic Tobii Gaze Interaction Software.
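
For the technically curious, here is a minimal, purely illustrative sketch of the idea behind a simple one-point calibration: measure the offset between where the calibration target was shown and where the raw gaze estimate landed, then apply that offset to later gaze samples. The function names and numbers are hypothetical examples, and this is not Tobii Dynavox's calibration method, which combines multi-point data with a 3D eye model as described above.

  # Illustrative sketch only: a simplified one-point calibration treated as a
  # constant offset correction. Real eye trackers use far richer models; all
  # names and values here are hypothetical.

  def compute_offset(target_xy, measured_xy):
      """Offset between the known calibration target and the raw gaze estimate."""
      return (target_xy[0] - measured_xy[0], target_xy[1] - measured_xy[1])

  def apply_calibration(raw_gaze_xy, offset):
      """Shift a raw gaze sample by the stored calibration offset."""
      return (raw_gaze_xy[0] + offset[0], raw_gaze_xy[1] + offset[1])

  # Example: during calibration the learner looks at a dot in the screen centre.
  offset = compute_offset(target_xy=(960, 540), measured_xy=(935, 560))
  print(apply_calibration((400, 300), offset))  # corrected gaze point: (425, 280)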

Targeting

Learning

At this step we are learning that performing a “mouse-over” action with our eyes will make an action happen. This step really focuses on training the learner to focus their gaze on a part of the screen that they have chosen.

Communication

This is the first step where the learner is actively selecting what they want to say with their eyes! Since audio is only heard if the learner looks at a specific part of the screen, communication can be based around what the learner has decided to look at. The communication partner can then respond to the audio that the learner has played with their eyes.

Calibration

For this step the learner needs as good a calibration as possible, with as many calibration points as they can manage. How to do the calibration can differ between pieces of software; check out the Support & Training pages for your software for videos about calibration.

Give it a Try

Activities in this section are free to try if you have a Tobii Dynavox eye gaze device (separate or integrated eye tracker).  

Practice 1: Painting!

Objectives

  • We are learning to actively target gaze on different items and colours to give different results
  • We are learning new vocabulary connected to the scenes we look at

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. You don't need to start your gaze software to use Communicator 5 with eye tracking; it just needs to be installed.
  3. Open Communicator 5 and the Sono Flex page set, found in All Page Sets -> Emerging Communication -> Add-On Products.
  4. Make sure the volume is turned up!
  5. Open “Paint Activity”.
  6. The user can select a scene of their choice by gazing at it.
  7. Different elements of the chosen scene can then be painted by gazing at a colour to choose it, then gazing at an object to paint it in that colour.
  8. Communication can be fostered by discussing the different scenes, the content within the scenes and the different colours. This can be instigated by the caregiver in response to the items and colours the user is painting.

Say

Use phrases such as these to promote communication!

  • “I see that you painted the barn blue. Blue is my favourite colour!”
  • "I can see the a sun on the screen. The sun is in the corner of the screen."

Top Tips

Looking for extra help with the activity?

  • Encourage the learner to look at all parts of the screen by modeling words from different areas of the screen.

Look for

Are any items within the scene painted? Then we can consider that to be a success. Are all items within the screen painted? Fantastic!

Practice 2: Look & Listen

This practice activity is for the Windows version of Tobii Dynavox Snap Scene. Try it for free by downloading the Snap Scene LITE app or buy the full version. Note that Snap Scene is also available for iPad on the App Store (touch access only, no eye tracking).

Objectives

  • We are learning to understand the connection between what is looked at and the audio that plays
  • We are learning new vocabulary from the selected scenes

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker/ start your device.
  2. Depending on eye gaze software:
    • Open Gaze Point, start it with the Play button and turn on Mouse Click. Hide the Mouse Cursor.
    • Or open Windows Control and select the Sticky Left Click task.
      • To select this task, start by going to Taskbar -> Selection and turn on Tertiary Selection (unless it's already on).
      • Exit Settings and gaze at the Left Click task on the taskbar to activate the tertiary selection.
      • In the popup that shows, select the Sticky Left Click task.
    • If you use Classic Gaze Interaction Software, it only needs to be installed; you don't need to start it.
  3. Open Snap Scene.
  4. Explore Snap Scene and, when you feel comfortable, upload a picture from your computer.
  5. Add “hotspots” around different parts of the scene. For example, draw a hotspot around a friend or family member performing an action such as throwing.
  6. Add audio to the hotspot that describes what is happening, such as “throw the ball”.
  7. Make sure the volume is turned up.
  8. Encourage the learner to explore the scene and listen for audio!

Say

Use phrases such as these to promote communication:

  • "You looked at the scene and said 'throw the ball'. I want to throw the ball too. Let's throw the ball!"

Top Tips

Looking for extra help with the activity?

  • Download the Pathways app for iPad, a free supplementary learning app for Snap Scene!
  • Choose familiar locations, people and situations for the visual scenes to give extra context and meaning!

Look for

Hearing audio associated with different parts of the visual scene indicates that the learner has targeted their gaze effectively.

More Activities

The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Targeting:

 

Other sources for activities:

Word to Remember

Calibration

In order for eye tracking to work as accurately as possible, the eye tracker must know more about your eyes. This is why you need to do a calibration. During the calibration the eye tracker measures how your eyes reflect light. The calibration is done by following a point, video or other graphic element that moves across the screen. This calibration data is then combined with our unique 3D model of a human eye, and together they give you an optimal eye tracking experience.

The learner can calibrate within the settings of Gaze Point, Windows Control or Classic Tobii Gaze Interaction Software.

Modeling

Modeling is when the communication partner talks to the learner while also pointing to or selecting keywords on the learner's AAC device. This helps develop the learner's understanding of both language and symbols.

Choosing

Learning

At the choosing step we are learning that on-screen choices lead to real-life actions, for example letting others know about hunger, enjoyment or tiredness by choosing the appropriate core vocabulary on the screen. We do this by practicing the skill of dwelling our gaze on a specific point of the screen to make a selection.

When we know how to choose we can begin to explore other forms of learning activities.

Communication

Communication is based around real-life choices made by the learner. The choices made within a visual scene, communication grid or game lead to full two-way communication, where the communication partner responds appropriately by assisting with the action or request expressed by the learner.

Calibration

For this step the learner needs as good a calibration as possible, with as many calibration points as they can manage. How to do the calibration can differ between pieces of software; check out the Support & Training pages for your software for videos about calibration.

Give it a Try

Practice 1: Communicate by Choosing

Objectives

  • We are learning to select core vocabulary using grids and symbols to communicate feelings and needs
  • We are learning new vocabulary that is connected to various grids

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker / start your device.
  2. You don't need to start your gaze software this time; it just needs to be installed.
  3. Open Communicator 5 and the Sono Flex page set, found in All Page Sets -> Emerging Communication -> Add-On Products.
  4. Make sure the volume is turned up!
  5. Model the steps needed to select a context.
    • Through modeling, the communication partner can help the user select a context; the available contexts are shown on the right-hand side. A good place to start could be one of the meal-time contexts, though additional contexts can be chosen by selecting “more contexts” from the top right corner. For example, the communication partner can model by saying, “It is almost time for lunch and I am hungry. I would like to talk about lunch. Breakfast, dinner, lunch!” while pointing at breakfast, dinner, then lunch and saying each word.
  6. Model the steps needed to select an item from within the context.
    • The word types within each context are colour-coded according to the Fitzgerald Key. Using these different colours to model good practice is a great way to help the user become familiar with the words and categories within the context. For example, the communication partner can model by saying, “I want to say ‘pizza’. I know that ‘pizza’ is a noun. Those are orange. Let’s look through the orange ones. Cucumber, fruit, pizza!” while pointing at cucumber, fruit, then pizza and saying each word.
  7. Now the learner can try selecting content themselves. Remember to comment on everything the user says!
  8. Here we can really emphasize the notion that the choices made by gazing at the on-screen context result in real-life actions.

Say

Use phrases such as these to promote communication!

  • “You focused your gaze on the word 'thirsty'. Your favourite drink is on the table. Let’s have something to drink!”

Top Tips

Looking for extra help with the activity?

  • Make sure to repeat all words that the learner selects.
  • Look at the items under the “things” section. Gather some real-life versions of the items in this section close by, so that they can be reached easily, to emphasize that when something is selected on-screen it leads to a real action!

Look for

The learner can show success by selecting parts of the grid that are relevant to a particular activity or mealtime that you are both currently engaged in.

Practice 2: Keyboard Communication

Objectives

  • We are learning to communicate by selecting letters on a keyboard using eye gaze

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker / start your device.
  2. You don't need to start your gaze software to use Communicator 5 with eye tracking; it just needs to be installed.
  3. Open Communicator 5 and select “All page sets”.
  4. Add one of the keyboards to the homepage.
  5. Encourage the learner to gaze over the screen and select letters. The communication partner can use phrases such as “You selected the letter ‘M’ from the keyboard. When you press the ‘speak’ button I can hear the letter ‘M’.” Think of this as a “no-fail” exploration of the keyboard, and give everything the learner says a response.
  6. After this, we can now practice saying words using the same keyboard. A good way to start with this is to model words that you know the learner is familiar with and has used before, for example from communication grids in Sono Flex. We can use vocabulary such as, “I have seen you select the word ‘pizza’ before. ‘Pizza’ begins with a ‘P’. I wonder if you can select the letter ‘P’?" The communication partner can then guide the learner through the rest of the word in the same way.
  7. We can then encourage the learner to build on these skills by constructing sentences!

Say

Use phrases such as these to promote communication!

  • “You are spelling out what you want to say. I can see the word ‘pizza’ has been suggested. I wonder if you would like some pizza? I know that I would!”

Top Tips

Looking for extra help with the activity?

  • Look at the word suggestion list at the top of the screen while the learner is spelling the word. The word will often appear here after only a few letters have been chosen.

Look for

A sure sign of success here is hearing aloud the words that the learner has typed with their eyes. If you can hear whole sentences, even better!

Practice 3: Beam Messaging

Objectives

  • We are learning to communicate via distance communication methods by sending text messages through Communicator 5

Get Ready

To get up and running with this activity you will need:

Let’s Go!

  1. Watch the video above to set up Beam Messaging.
  2. Connect your eye tracker / start your device.
  3. You don't need to start your gaze software this time; it just needs to be installed.
  4. Now that Beam Messaging is set up, the learner can open “text messaging” from the Communicator 5 homepage.
  5. The learner can select “new message” and begin to type using the keyboard!
  6. Encourage the learner to send a new message to someone in the same room.
  7. When the recipient receives the message, they can show it to the learner to demonstrate how the process works from the receiver’s side.
  8. The receiver can then reply, of course!

Say

Use phrases such as these to promote communication!

  • "I can see that you can send and receive text messages from Communicator 5. I would like to communicate through text messaging. I will send you a message now!"

Top Tips

  • The learner can receive text messages through Communicator 5, too. Why not try sending a text message to their number and encouraging the learner to reply?
  • Give the learner a task to do within Communicator 5 and let them know about it via text message! Text them a task such as, “I wonder if you can add the ‘E-mail’ page set to the Communicator homepage? The first step is by selecting ‘All page sets’.”

Look for

A conversation that is underway via text message is a strong indication of success.

More Activities

The following software titles can be purchased from Tobii Dynavox and work well together with our eye trackers and gaze software. We recommend these activities for practicing Choosing:

 

Other sources for activities:

Words to Remember

Calibration

In order for eye tracking to work as accurately as possible, the eye tracker must know more about your eyes. This is why you need to do a calibration. During the calibration the eye tracker measures how your eyes reflect light. The calibration is done by following a point, video or other graphic element that moves across the screen. This calibration data is then combined with our unique 3D model of a human eye, and together they give you an optimal eye tracking experience.

The learner can calibrate within the settings of Gaze Point, Windows Control or Classic Tobii Gaze Interaction Software.

Fitzgerald Key

The Fitzgerald Key is the traditional approach of presenting core words grouped together on the page by part of speech. With this approach, you can start small and grow big (have a smaller grid full of words and increase its size to add more words), start big and fill in (have a larger grid with a few words and gradually show more words without adding buttons), or start with a full grid of words (select the grid size appropriate for the user and show all the words in it from the start).

Modeling

Modeling is when the communication partner talks to the learner while also pointing to or selecting keywords on the learner's AAC device. This helps develop the learner's understanding of both language and symbols.

Dwelling

Dwelling is the skill of focusing your gaze over a specific point of the screen for a prolonged period of time. Doing this allows us to actively make choices using our gaze.
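
As a purely illustrative aside, the sketch below shows the general logic behind dwell selection: a choice is triggered once gaze has stayed on the same on-screen target for long enough. It is not Tobii Dynavox code; the names and the 800 ms threshold are hypothetical examples.

  # Illustrative sketch of generic dwell-selection logic (hypothetical names,
  # not Tobii Dynavox code). A target is selected once gaze remains on it
  # for dwell_time_ms without interruption.

  def dwell_select(gaze_samples, dwell_time_ms=800):
      """gaze_samples: list of (timestamp_ms, target_id or None), in time order."""
      current_target, dwell_start = None, None
      for timestamp, target in gaze_samples:
          if target is not None and target == current_target:
              if timestamp - dwell_start >= dwell_time_ms:
                  return current_target  # dwell completed: make the selection
          else:
              current_target, dwell_start = target, timestamp  # gaze moved: restart the timer
      return None  # no selection was made

  # Example: gaze settles on "pizza" long enough to select it.
  samples = [(0, "drink"), (300, "pizza"), (700, "pizza"), (1200, "pizza")]
  print(dwell_select(samples))  # -> "pizza"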

Full Control

Learning

Learning at this step provides exposure to a wider range of communication methods. This encompasses the use of standard and specialized keyboards and distance communication tools.

The level of eye tracking at this step opens up the world to the user, where controlling their own environment, creativity and computer usage really become a reality.

Windows Control also provides the opportunity to impact the physical environment with eye tracking, where users can control aspects of the home such as turning on lights or the television.

Completion of this step signifies full control with eye gaze, where both communication and environmental control expand significantly.

Communication

Communication stretches beyond the user’s immediate vicinity when tools such as Skype, Facebook, email and other distance communication tools are accessible and usable.

Calibration

For this step the learner needs a high-quality calibration with the maximum number of calibration points. How to do the calibration can differ between pieces of software; check out the Support & Training pages for your software for videos about calibration.

Give it a Try

Practice 1: Opening Websites

Objectives

  • We are learning to control and browse the internet using eye gaze.

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Connect your eye tracker / start your device.
  2. Depending on eye gaze software:
    • Open Windows Control and make sure the ordinary Left Click task is available in the Taskbar.
      • If the Left Click task is in Sticky mode (following the eye movements), start by going to Settings -> Taskbar -> Selection and turn on Tertiary Selection (unless it's already on).
      • Exit Settings and gaze at the Sticky Left Click task on the taskbar to activate the Tertiary Selection.
      • In the popup that shows, select the Left Click task.
    • Or open Classic Gaze Interaction Software.
      • In Settings, select Gaze Selection (unless already selected).
      • Exit Settings.
  3. The first step is to look at the desired task on the Taskbar. We will start by looking at Left Click briefly to select it.
  4. When the learner gazes back at a section of the screen with Left Click activated, Windows Control will trigger a zoom function to make it easier to select the desired item.
  5. Why not try encouraging the learner to open a web browser by gazing at it? Tip: Windows can be set to open applications with a single left click, which means the user doesn't have to use the Double Click function. Otherwise, the Double Click task can be used to open applications on the Desktop.
  6. When the web browser is open, the learner can navigate to any website by typing into the browser’s address bar. This can be done using the Keyboard, which is a separate task on the Taskbar that you gaze on a little longer to open.
  7. Take some time to explore different websites, click on links, play videos and do anything online that can be done with a mouse click!

Note that the learner can continue to use "sticky" mode to control Windows if they like, but it is usually more comfortable and efficient to move to the ordinary two-step selection described above, which removes the mouse pointer that continuously follows the user's gaze. With ordinary two-step selection, the learner has a much larger degree of control.

Top Tips

Looking for extra help with the activity?

  • If you feel that accuracy is a little "off", why not try calibrating your device?
  • Encourage the learner to navigate to a website that is connected to an interest that you both share and use it as a base for a conversation.

Look for

If the web browser is open and a website has been navigated to then the learner has shown the skills needed to control Windows using their gaze!

Practice 2: Using Facebook with Windows Control

Objectives

  • We are learning to use eye gaze to communicate on Facebook.

Get Ready

To get up and running with this activity you will need:

Let’s Go!

Following these steps makes it simpler to get going with the activity.

  1. Follow the above instructions to get started.
  2. Encourage the learner to use Windows Control to open the web browser on their laptop or device.
  3. Use the Left Click task and the Keyboard to log in to Facebook.
  4. From the taskbar, use the Scroll, Left Click and Keyboard tasks to begin communicating through Facebook!

Top Tips

Looking for extra help with the activity?

  • If you feel that accuracy is a little off, why not try calibrating your device?
  • Why not try sending a Facebook message to the learner to promote communication?

Look for

If the learner has logged in to Facebook and is navigating their newsfeed, they have shown the skills needed to use Facebook using their gaze!

Keep Growing