
Page 1: University of South Dakota - Technical …apps.usd.edu/coglab/schieber/eyetracking/Gazepoint/p… · Web viewGazepoint 3 Eye Tracker Manual Note Prior to reading this manual, it is

UNIVERSITY OF SOUTH DAKOTA
Department of Psychology – Human Factors

Gazepoint 3 Eye Tracker Manual

Volume 1


Note: Prior to reading this manual, it is suggested that you have already read the manuals on the GP3 eye tracker and its software, Gazepoint Control and Gazepoint Analysis, provided by Gazepoint. This manual covers the same topics as the aforementioned manuals, but explains the functions of the various options in more depth.

This document is currently in a “Rough Draft” format. Editing and changes will be made.

Jonathan Vogl November 2014

Phone 605.651.1985 • E-mail [email protected]


Table of Contents

Section 1  Getting to Know the Eye Tracker......................................1
  Technical Specifications and Requirements...................................1
  GP3 Cable Connections.......................................................2
  GP3 Placement...............................................................2
  Example Experiment: Part 1..................................................2

Section 2  Gazepoint Control....................................................3
  The GP3 Camera..............................................................4
  Select Screen...............................................................5
  Calibrate...................................................................5
  Gaze Pointer................................................................6
  Switch Tracker Type.........................................................6
  Example Experiment: Part 2..................................................6

Section 3  Gazepoint Analysis...................................................7
  Creating a New Project......................................................8
  Layout......................................................................9
    Media List – Adding Stimuli...............................................9
    Recording List – Managing Recorded Data..................................13
    AOI List – Areas of Interest.............................................14
  Collecting Data – Using the Functions of the Collect Data Toolbar..........16
    Start/Stop Record........................................................16
    Select Screen............................................................16
    Next Media...............................................................17
    Visualization............................................................17
    Gaze Video...............................................................19
    Thinkaloud...............................................................19
    Show Cursor..............................................................19
    Show AOI.................................................................20
  Analyzing Data.............................................................21
    Data Analysis Toolbar....................................................21
  Exporting Data.............................................................23
  Example Experiment: Part 3.................................................26

Section 4  The ‘How To…’ Guide.................................................27
  How to Create Dynamic AOIs.................................................27
  How to Use Offsets to Correct Gaze Data....................................29
  How to Analyze Data when Using a Web Page Stimulus.........................31
  How to Transfer Data to iMap...............................................??


GAZEPOINT 3 EYE TRACKER MANUAL

Getting to Know the Eye Tracker

This section will provide a brief overview of the physical properties of the Gazepoint 3 (GP3) eye tracker. Connecting the eye tracker to the control computer and placement of the GP3 camera and participant will be discussed.

SECTION 1

Tech Specs
Connections
Placement
Experiment: Part 1

The GP3 eye tracker is an easy-to-use, high-performance eye tracker. It comes with a software suite, Gazepoint Control and Gazepoint Analysis, which allows you to calibrate a participant, build an experiment, collect data, and analyze data. Understanding the physical properties of the GP3 eye tracker will help you see the exciting possibilities the GP3 brings to the table as well as its limitations.

Technical Specifications and Requirements

The technical specifications and requirements are listed below. The information was taken from the Gazepoint Control User Manual Rev 2.0.

Tech Specs
Accuracy: 0.5 – 1 degree of visual angle
60 Hz update rate
5 and 9 point calibration
Easy-to-use open standard API
25 cm x 11 cm (horizontal x vertical) movement range
±15 cm range of depth movement
Powered by USB
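The "open standard API" in the spec list refers to Gazepoint's Open Gaze API, which accepts XML commands over a local TCP connection. The sketch below is illustrative only: the port (4242) and the command IDs reflect how the Open Gaze API is commonly documented, so verify them against the API manual shipped with your software version.

```python
import socket

GAZEPOINT_ADDR = ("127.0.0.1", 4242)  # Open Gaze API default address (assumption)

def api_command(command_id, enabled):
    """Build one XML SET command in the Open Gaze API's wire format."""
    return '<SET ID="{}" STATE="{}" />\r\n'.format(
        command_id, 1 if enabled else 0).encode("ascii")

def read_gaze_records(num_records=10):
    """Ask a running Gazepoint Control instance to stream fixation data.

    Requires Gazepoint Control to be running on this machine.
    """
    with socket.create_connection(GAZEPOINT_ADDR) as conn:
        conn.sendall(api_command("ENABLE_SEND_POG_FIX", True))  # fixation point of gaze
        conn.sendall(api_command("ENABLE_SEND_DATA", True))     # start streaming
        stream = conn.makefile("r", encoding="ascii")
        return [next(stream).strip() for _ in range(num_records)]
```

Because the data arrives as plain XML records, any language with a TCP socket can consume it, which is what makes the API "open standard".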


Requirements
Processor: Modern processor, i5 to i7 recommended
Memory: 2 GB
OS: Windows XP/Vista/7/8, 32/64-bit

GP3 Cable Connections

The GP3 has two physical connections to the control computer:

USB Power Cable – This DC power cable provides power to the eye tracker through a USB port in the control computer. This cable can be connected to the control computer through a USB hub as well.

USB Data Cable – This cable is used to communicate back and forth between the eye tracker and the control computer. This cable must be directly connected to the USB port on the control computer, as it uses almost 100% of the data bus. Ideally, refrain from using other devices in the same USB hub that the GP3 eye tracker is connected to (to ensure that no other devices will interfere with the eye tracker’s image transmission).

GP3 Placement

The GP3 eye tracker should be positioned directly below the computer screen on which you are going to present stimuli, henceforth known as the experiment screen. Avoid using the eye tracker in a room with direct or indirect sunlight on the faces of participants, as it will interfere with the detection of the corneal reflections.

The GP3 should be about arm’s length distance away from the participant, centered, and pointing at the face of the participant. The ideal distance is 65 cm from the participant’s eyes.
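The spec sheet's 0.5 – 1 degree accuracy can be converted into on-screen error at this recommended distance. A minimal sketch of the trigonometry (the 65 cm figure comes from the text above; the rest is standard geometry):

```python
import math

def gaze_error_cm(accuracy_deg, distance_cm):
    """On-screen gaze error (cm) for a given angular accuracy at a viewing distance."""
    return distance_cm * math.tan(math.radians(accuracy_deg))

# At the recommended 65 cm viewing distance, the 0.5-1 degree accuracy
# corresponds to roughly 0.6-1.1 cm of error on the screen.
best_case = gaze_error_cm(0.5, 65)
worst_case = gaze_error_cm(1.0, 65)
```

This is worth keeping in mind when sizing AOIs later: an AOI much smaller than the worst-case error will produce unreliable statistics.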

Important: If a participant wears glasses, it is best to tilt the GP3 unit upwards at a greater angle to prevent reflections off the lenses of the glasses.

Example Experiment: Part 1

If requested, details outlining the creation, running, and analysis of an example experiment will be highlighted in this section.


Gazepoint Control

This section will provide an overview of the Gazepoint Control software. This program is used to turn on the GP3 eye tracker, position participants in the camera’s field of view, and calibrate participants.

SECTION 2

Overview
GP3 Camera
Select Screen
Calibrate
Gaze Pointer
Switch Tracker Type
Experiment: Part 2

The Gazepoint Control program is one of two programs needed to run an experiment using the GP3 eye tracker. Gazepoint Control acts as a ‘power switch’ for the IR LEDs and camera and processes the images captured by the camera to estimate the point of the participant’s gaze on the screen. As such, Gazepoint Control needs to be running during data collection, but not when you analyze data at a later time.

Overview

Most of what Gazepoint Control does is automatic and requires no monitoring or input from the researcher. However, four options can be controlled by the researcher: Calibrate, Gaze Pointer, Select Screen, and Switch Tracker Type.


Note: It is important to note that you must minimize Gazepoint Control before switching over to Gazepoint Analysis to run your experiment. Failing to do so will keep the calibration image on the screen, thus not allowing stimuli to be presented.


When starting Gazepoint Control, the following screen will appear:

FIGURE 2.1 depicts the Gazepoint Control program on its initial startup. A simple toolbar with four options lines the top of the camera displays, which are blank until a participant is sitting in the proper position.

Note: You will also notice that the IR LEDs in the eye tracker turn on when Gazepoint Control is started.

The GP3 Camera

The camera inside the GP3 turns on as soon as Gazepoint Control is started. The display window shows four sets of information: Distance of Participant, Camera View, Right Eye Capture, and Left Eye Capture.

1. Distance of Participant – This slider gives you a rough estimate of the distance between the eye tracker camera and the participant’s eyes. Ideally, you want the green dot to be near the center of the spectrum, between close and far.

2. Camera View – This display shows you the entire image that the camera is able to view. You will want to position your participant to appear in the center of this display window. Coupled with the ‘Distance of Participant’ slider, you now have a defined physical area where your participant’s eyes should be located.


3. Right Eye Capture – This display zooms in on what Gazepoint Control believes to be the right eye of the participant. You can monitor the eye tracker’s ability to discern the participant’s pupil and corneal reflection in this display.

4. Left Eye Capture – This display functions the same as the ‘Right Eye Capture’ display, except for the left eye.

Select Screen

The ‘Select Screen’ option allows the researcher to quickly identify a screen as the active screen (the experiment screen) on which the calibration and experimental stimuli will be presented to the participant. Clicking the button once will bring up a small, black window (on the current experiment screen) listing the display’s dimensions.

Note: Take note of the display’s dimensions if you will need them for quantitative analysis after data collection.

Calibrate

Once the participant is seated in the proper, yet comfortable, position and the experiment screen is selected, a calibration needs to be performed to allow Gazepoint Control to calculate the participant’s point of gaze. This calibration measures the differences between the participant’s eyes and the GP3’s generic eye model. To perform a calibration, make sure your participant is ready and click the ‘Calibrate’ button. The following image will appear on the experiment screen once the participant completes the calibration:


FIGURE 2.2 depicts the calibration screen AFTER the participant completes the calibration. To complete the calibration, the participant is required to stare at the center of each of the five dots, one at a time.

The calibration results screen shows the level of error the system calculated in the participant’s point of gaze for both the right and left eyes (red and green dots, respectively).

If you are unhappy with the calibration results, you can run the entire calibration again by pressing the ‘c’ key. If you would like to recalibrate one point, simply click on the point you want to recalibrate and Gazepoint Control will rerun the calibration for that one point. If you realize that your previous calibration was your desired result and you accidentally performed another calibration, you can press the ‘u’ key to undo the most recent calibration and restore the previous one.

Gaze Pointer

The ‘Gaze Pointer’ option allows you to set the mouse pointer under the control of the participant’s point of gaze. To regain control of the mouse pointer, simply block the camera sensor’s ‘view’ with your hand and use the mouse to turn the ‘Gaze Pointer’ option off.

Important: DO NOT touch the eye tracker when blocking the sensor. Simply holding your hand in front of the sensor will be sufficient to block the view to the participant’s eyes.

Switch Tracker Type

The ‘Switch Tracker Type’ option allows you to switch between multiple eye tracker models created by Gazepoint. As of the date this manual was created, the Vision Lab only uses the GP3 eye trackers. Thus, this option will be used only if and when new eye trackers are purchased.

Example Experiment: Part 2

If requested, details outlining the creation, running, and analysis of an example experiment will be highlighted in this section.


Gazepoint Analysis

This section will describe how to use the Gazepoint Analysis software. This program presents stimuli, records eye tracking data, and offers multiple options to visualize and analyze the recorded data.

The processes of collecting data and qualitatively analyzing data are performed in the same program, Gazepoint Analysis. This program offers a wide range of options to view one or more participants’ point-of-gaze data as a fixation map, heat map, opacity map, or a single point, in both real-time and recorded formats. The data collected by Gazepoint Analysis can be exported in *.csv format along with snapshots or videos of the participant’s point-of-gaze data.

SECTION 3

Overview
Creating a New Project
Layout
Collect Data
Analyze Data
Export Data
Experiment: Part 3

Overview

Gazepoint Analysis performs two primary functions: collecting data and analyzing data. As such, this section will cover each part separately. First, a description of the layout of the Gazepoint Analysis program will be given. Each part of the user interface will be described in detail, though for complex functions you may simply be referred to Section 4, the “How to…” section.

Next, the processes of data collection and data analysis will be covered. Many of the functions offered by the data collection toolbar will also be offered in the data analysis toolbar. Thus, the descriptions of data analysis functions that are similar to data collection functions will be referred back to the data collection toolbar section, in an effort to avoid repetition.

Finally, details on the process of exporting data will be highlighted and we will collect, analyze, and export data in the example experiment at the end of the section.
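As a preview of what the exported data looks like downstream, the sketch below filters a *.csv export for valid fixation samples. The column names (TIME, FPOGX, FPOGY, FPOGD, FPOGV) are assumptions based on Gazepoint's fixation point-of-gaze records; check the header row of your own export before relying on them.

```python
import csv
import io

# Hypothetical excerpt of an exported *.csv file; real exports contain many
# more columns, and these header names are assumptions to verify.
SAMPLE_EXPORT = """TIME,FPOGX,FPOGY,FPOGD,FPOGV
0.016,0.51,0.48,0.20,1
0.033,0.52,0.47,0.22,1
0.050,0.90,0.10,0.05,0
"""

def valid_fixations(csv_text):
    """Return (x, y) for rows the tracker flagged as valid (FPOGV == 1);
    FPOGX/FPOGY are fractions of screen width and height."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(float(row["FPOGX"]), float(row["FPOGY"]))
            for row in reader if row["FPOGV"] == "1"]
```

Filtering on the validity flag first is worthwhile because blinks and tracking losses otherwise appear as spurious gaze points in any later analysis.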


Creating a New Project – A Suggested Procedure

When you open Gazepoint Analysis, you will be prompted by the following window:

FIGURE 3.1 depicts the project selection window of the Gazepoint Analysis program.

To create a new project, click the ‘New Project’ button. You will be prompted to name the project and designate a location to save the project. For efficient file management on our research computer, follow these steps to save your project:

STEP ONE: In the ‘Save As’ window, navigate to the C: drive and scroll down to find the folder named ‘GP3’. Double click this folder to open the directory.

STEP TWO: Create a new folder in the GP3 folder and give it a name that describes your project. This creates another directory that will hold the project file and the subfolders Gazepoint Analysis uses as destinations to retrieve stimuli and export data. Double click the folder you just created to open the directory.

STEP THREE: Name your project and press ‘Save’. This will save your project in the subfolder you created in the GP3 folder. This will ensure that no project is overwritten and allow for ease of navigation through the folder for you and other researchers using the eye tracker’s control computer.

Note: You may wish to create a folder with your name where you store all of your projects. Follow the process above, but create a folder with your name in the GP3 folder, open your new folder, then continue with Step Two.


Layout

Gazepoint Analysis offers two distinct functions: data collection and data analysis. Each of these functions shares the same layout with different toolbars. However, the Media List, Recording List, and AOI List remain the same in each function. The following figure displays Gazepoint Analysis at startup:

FIGURE 3.2 depicts the Gazepoint Analysis startup screen. The areas highlighted are the Media List (top), Recording List (middle), and AOI List (bottom).

Media List

The ‘Media List’ box displays a list of the stimuli you want to present during your experiment. Gazepoint Analysis allows you to utilize a wide variety of stimuli for your experiments: Images, Text files, Videos, Webpages, and Screen Captures. The media box is shown here:

FIGURE 3.3 depicts the ‘Media List’ in the Gazepoint Analysis program.


Note: It is important to note that your experiment can only consist of one of three stimulus groups: 1. Image/Text/Video stimuli, 2. Webpage stimuli, or 3. Screen Capture. As such, you cannot display 3 images and a webpage in the same experiment.

To add a new stimulus, simply click the ‘+’ button below the media list box, as shown in figure 3.3. When adding a new stimulus, the following box will appear:

FIGURE 3.4 depicts the Add Media Item prompt that can be accessed via the ‘Media List’ box in the Gazepoint Analysis program. The five available stimulus types are shown, with the ‘Image’ stimulus option chosen.

Adding an Image/Video/Text Stimulus

In this example, we are going to add an image stimulus. In the ‘Media Name’ field, you should input a name that describes the stimulus being presented. In the ‘File Name/URL’ field, you can type in the path to the stimulus file, or click the ‘Browse’ button to find the file you want to use as a stimulus. In the ‘Display Duration’ field, you can enter the amount of time, in seconds, you wish your stimulus to be presented.

Note: This value will always default to 10 seconds if left unchanged.

Finally, you have the option to include the stimulus you linked in a ‘Randomized Playback Order’ by checking the box, as shown in Figure 3.4.

Adding a text file stimulus follows the same protocol as adding an image stimulus. Adding a video stimulus also follows the same protocol, except an additional option to display the ‘Full Video’ is available. This will automatically change the duration to include the entire length of the video file you have included in your experiment. However, you can change the duration to any value under the maximum duration of the video file. Changing this duration only changes where the video ends, as the video stimulus always starts from the beginning of the file.

Adding a Web Page Stimulus

When adding a webpage, the following prompt appears on the screen:

FIGURE 3.5 depicts the Add Media Item window when adding a webpage stimulus in the Gazepoint Analysis program.

When adding a web page stimulus, the ‘Media Name’ field should be filled with a descriptive name, often including the name of the webpage you are presenting. In the ‘File Name/URL’ field, type the web address you want your participants to explore during the experiment.

When presenting a web page stimulus, the web page you typed in the ‘File Name/URL’ field will be displayed on the experiment screen in a Gazepoint Analysis web browser (that is based on the Internet Explorer program installed on the control computer). The following image depicts the Gazepoint Analysis web browser:

INSERT IMAGE HERE

FIGURE 3.6 depicts the Gazepoint Analysis web browser. The top toolbar contains new options.


Adding a Screen Capture Stimulus

When adding a screen capture stimulus, the following prompt appears:

FIGURE 3.7 depicts the ‘Add Media Item’ prompt when selecting a Screen Capture as the stimulus type in the Gazepoint Analysis program.

The only field that needs to be edited is the ‘Media Name’ field. The screen capture stimulus records everything occurring on the experiment screen and overlays the gaze data recorded by Gazepoint Control on the screen capture.

The actual stimulus must then be added to the experiment screen by your 3rd party program prior to running the experiment.

Note: While screen capture can be used for recording gaze data on web pages, it is designed to record gaze data while using 3rd-party programs, such as running a MATLAB script or playing a video game.


Recording List

The recording list displays the list of participants’ gaze point data you have recorded for your experiment.

FIGURE 3.8 depicts the ‘Recording List’ box of the Gazepoint Analysis program.

After recording the data of a new user, a new recording will be added to the ‘Recording List’ box with a name of “User X”. You can rename a user by double-clicking the data file you wish to edit, or by highlighting it and clicking the settings button in the lower right of the ‘Recording List’ box. The following box will then appear:


FIGURE 3.9 depicts the User Data Settings prompt of the Gazepoint Analysis program.

To edit the name of the gaze data recording, edit the ‘User Name’ field. You can also input the participant’s gender and age in the appropriate fields. The color associated with the user’s data can also be edited by clicking the colored box in the top right corner.

Using the ‘Data Offset X’ and ‘Data Offset Y’ features will be discussed in Section 4, on page 29.

AOI List

The AOI (Area of Interest) List box allows you to create, view, and run basic statistics on AOIs that you create within the Gazepoint Analysis program. The AOI List box is depicted below:

FIGURE 3.10 depicts the Gazepoint Analysis AOI List box.

After selecting a stimulus, the ‘AOI List’ box will show the AOIs currently being used for the selected stimulus. This allows you to assign unique areas of interest to each stimulus in the ‘Media List’.

If you would like to add an AOI, first, ensure you are in ‘Analyze Data’ mode by pressing the ‘Analyze Data’ button on the top toolbar. To create a static AOI, press the ‘+’ button in the bottom left corner of the ‘AOI List’. The AOI Settings window will appear:


FIGURE 3.11 depicts the AOI Settings window in the Gazepoint Analysis Program.

By default, the new AOI will be named AOI 0. You may wish to change this name to something more descriptive, as this will be the name of the AOI in the exported statistics file, discussed later.

To create a static AOI that needs to span the entire duration of the stimulus, first, make sure the time slider is set to the beginning of the stimulus. Next, simply click on the display window, near your desired AOI, and drag a box around the location. After creating the box, you can adjust the size and location by clicking and dragging the box’s edges and center, respectively.

Once you are happy with the placement of the AOI, you will notice a value was added to the ‘Frame Time’ list in the ‘AOI Settings’ window. This value indicates the time when the AOI will begin collecting data. To cover the entire stimulus duration, this value should be set to 0. When only one value is in the ‘Frame Time’ list, the AOI is active from that value to the end of the stimulus duration.

You can also have an AOI start or stop during the presentation of the stimulus. To do so, drag the time slider to the desired start point and highlight your AOI. Next, drag the time slider to the desired endpoint and press the ‘Copy Previous’ button. This will automatically insert a box that is the exact same size as the box you just created. Now, you will see two values in the ‘Frame Time’ box. The first time value indicates the start time of the AOI, while the second time value indicates the end time of the AOI. This can be helpful if you are presenting a video that has objects appear or disappear.
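The ‘Frame Time’ rule described above can be summarized as a small predicate. This is a sketch of the behavior as described, not Gazepoint’s internal implementation:

```python
def aoi_active(frame_times, t, stimulus_end):
    """True if an AOI with the given 'Frame Time' list is active at time t.

    One entry [start] keeps the AOI active until the stimulus ends;
    two entries [start, end] bound the interval, as described above.
    """
    if not frame_times:
        return False
    start = frame_times[0]
    end = frame_times[1] if len(frame_times) > 1 else stimulus_end
    return start <= t <= end
```

For example, an AOI with frame times [2, 4] ignores a fixation at t = 5, while an AOI with the single frame time [0] accepts it.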

Creating dynamic (moving) AOIs will be discussed in detail in Section 4, page 27.

Finally, you can run statistics on your AOIs in your AOI List. To do so, simply click the ‘Run Statistics’ button. Gazepoint Analysis will automatically run the statistics and display the results. See Figure 3.10 for an example of the statistical results. The statistics reported are:

1. Name – The name you have given the AOI.
2. Viewers – The number of participants who viewed the AOI.
3. 1st View – The time at which the participant first looked at the AOI.
4. View Time – How long the participant’s fixation remained in the AOI.
5. Viewed Time – The percentage of the stimulus duration spent in the AOI.
6. Revisitors – The number of viewers whose gaze revisited the AOI.
7. Revisits – The total number of times the AOI was revisited.
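To make these definitions concrete, here is a sketch that recomputes several of them from raw fixation records. The record layout is a hypothetical one for illustration, not Gazepoint’s exact export format or formulas:

```python
# Hypothetical fixation records: (participant, start_time_s, x, y, duration_s),
# with x and y as fractions of screen width/height.
FIXATIONS = [
    ("p1", 0.5, 0.30, 0.40, 0.25),
    ("p1", 1.0, 0.80, 0.80, 0.30),
    ("p1", 2.0, 0.32, 0.42, 0.20),
    ("p2", 0.7, 0.90, 0.10, 0.40),
]

def aoi_stats(fixations, aoi):
    """Recompute a few of the statistics listed above for one rectangular AOI.

    aoi is (x0, y0, x1, y1) in the same normalized coordinates.
    """
    x0, y0, x1, y1 = aoi
    per_participant = {}
    for participant, t, x, y, duration in fixations:
        if x0 <= x <= x1 and y0 <= y <= y1:
            per_participant.setdefault(participant, []).append((t, duration))
    all_hits = [hit for hits in per_participant.values() for hit in hits]
    return {
        "Viewers": len(per_participant),
        "1st View": min((t for t, _ in all_hits), default=None),
        "View Time": sum(d for _, d in all_hits),
        "Revisitors": sum(1 for hits in per_participant.values() if len(hits) > 1),
        "Revisits": sum(len(hits) - 1 for hits in per_participant.values()),
    }
```

With the sample data and the AOI (0.2, 0.3, 0.5, 0.6), only p1 has fixations inside the box (two of them), giving one viewer, a first view at 0.5 s, and one revisit.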


Collecting Data

Now that the constant elements of Gazepoint Analysis have been covered, this section will move on to the first function of Gazepoint Analysis: collecting data. To begin collecting data, click the ‘Collect Data’ button and the data collection toolbar will appear, as the following image depicts:

FIGURE 3.12 depicts the Gazepoint Analysis start up screen with the ‘Collect Data’ option used.

Collect Data Toolbar

1. Start Record – The ‘Start Record’ button runs the experiment on the experiment screen. Gazepoint Analysis will begin the experiment by presenting the first stimulus listed in the ‘Media List’ box and, after the duration you specified, will continue to the next stimulus until all stimuli have been used.

Note: If you randomized your stimuli, Gazepoint Analysis will present the stimuli in random order and end the experiment when all stimuli have been presented.

While the experiment is being presented on the experiment screen, the screen of the control computer will display the participant’s point of gaze in real time in the display window. You may also notice that the ‘Start Record’ button changes to a ‘Stop Record’ button. Pressing the ‘Stop Record’ button will end your experiment.

2. Select Screen – The ‘Select Screen’ button functions the same as the ‘Select Screen’ option in Gazepoint Control. Prior to collecting data, you should have calibrated the participant on the experiment screen using Gazepoint Control. Gazepoint Analysis will use the same experiment screen as Gazepoint Control, so this should not need to be changed. If the screen has been changed for some reason, simply press the ‘Select Screen’ button to choose what screen is the experiment screen.

3. Next Media – The ‘Next Media’ button can be used to switch to the next stimulus on the ‘Media List’. Typically, for image/video/text stimuli, you will input a duration for the stimuli to remain on the screen when you add them to the ‘Media List’. However, when using web page stimuli, there is no option for inputting a specified duration. So, when you would like the participant to move on to the next stimulus, you can press the ‘Next Media’ button to change the web page to the next web page on the list.

Note: The participant has the same ability to move to the next webpage by clicking the ‘Next’ button in the Gazepoint Analysis web browser, as described on page 11.

4. Visualization – The ‘Visualization’ button lets you change the method in which the real time gaze point data is presented on your screen. The same option exists on the ‘Analyze Data’ toolbar. After pressing the ‘Visualization’ button, the following prompt will appear:


FIGURE 3.13 depicts the Visualization Settings of the Gazepoint Analysis program.

There are four visualization options: Fixation Map, Heat Map, Opacity Map, and the Bee Swarm, also known as the Single Point render.

1. Fixation Map – The Fixation Map option will use dots, which represent fixations, connected by lines, which represent the saccades. The size of the fixation dots is proportional to the duration of the fixation. As such, you may see a fixation dot grow larger for longer durations.

2. Heat Map – The Heat Map option renders a classic heat map over the stimulus.


3. Opacity Map – The Opacity Map option shows only the portion of the screen the participant was fixating on. This provides a “spotlight” view of fixations.

4. Bee Swarm – The Bee Swarm option renders a single dot indicating where the eye tracker thinks the participant is looking. The term ‘Bee Swarm’ is used due to the fact that the playback of multiple participants’ recordings using this option looks like a swarm of bees flying around the stimulus.

With each of the visualization options, various properties can be adjusted.

1. Size – The ‘Size’ slider adjusts the size of the overlay. For example, the dots of a fixation map become larger or smaller based on the value of the Size slider. This can be helpful to clearly see data points of multiple users without the fixation dots overlapping.

2. Opacity – The ‘Opacity’ slider adjusts the transparency of the visualization overlays. It can be helpful to make a fixation map or heat map more transparent to get a sense of what stimuli are under the visualization overlays. If you are using an ‘Opacity Map’ you can adjust how tight you wish your fixations to be.

3. Gaze Duration – The ‘Gaze Duration’ field allows you to set the number of seconds of visualization overlays you want visible at one time. For example, setting the ‘Gaze Duration’ to 3 seconds means that the most recent fixation will be displayed on the screen, along with all of the previous fixations that occurred in the 3 seconds prior to the current fixation.
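Assuming fixations are represented as (start, end) time pairs, the rolling-window behavior of the ‘Gaze Duration’ setting can be sketched like this (the data layout is a made-up illustration, not the actual Gazepoint format):

```python
def visible_fixations(fixations, current_time, gaze_duration):
    """Return the fixations whose end times fall within the last
    `gaze_duration` seconds, mimicking the rolling-window overlay.

    `fixations` is a list of (start_s, end_s) tuples; this layout is
    an assumption for illustration only.
    """
    window_start = current_time - gaze_duration
    return [f for f in fixations if window_start <= f[1] <= current_time]

fixes = [(0.0, 0.4), (0.5, 1.2), (1.3, 2.9), (3.0, 3.4)]
# At t = 3.5 s with a 3-second window, the earliest fixation (which
# ended at 0.4 s) has aged out, so only the last three remain visible.
recent = visible_fixations(fixes, 3.5, 3.0)
```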

4. Heat Map Style – Relative vs. absolute scale; to be added at a later time.

5. Show All – The ‘Show All’ option allows you to show the entire gaze path over the stimulus. To do this, Gazepoint Analysis automatically changes the value in the Gaze Duration field to the duration of the stimulus.

6. Events – To be added at a later time.

5. Gaze Video – The ‘Gaze Video’ button acts as a toggle that controls the visibility of the eye video captured by Gazepoint Control. Simply pressing the ‘Gaze Video’ button will remove the eye video from the upper left corner of the preview area. Pressing the button again will bring the eye video back.

6. Thinkaloud – The ‘Thinkaloud’ option will allow you to record the participant with a webcam and save their voice recordings. This option is not supported in Gazepoint Analysis Pro; it is available in Gazepoint Analysis UX.

7. Show Cursor – The ‘Show Cursor’ button acts as a toggle that controls the visibility of the mouse cursor on the experiment screen. Simply pressing the ‘Show Cursor’ button will remove the image of the cursor from the preview screen. Pressing the button again will make the cursor reappear on the preview screen.

Note: This option will not affect the visibility of the cursor for the participant.

8. Show AOI – The ‘Show AOI’ button will bring up a set of options that lets you control the visibility of any AOIs you have created:

FIGURE 3.14 depicts the AOI Visualization Settings of the Gazepoint Analysis program.


There are three options that control the visibility of the AOIs you have created. If all three options are toggled off, the AOIs will not be displayed in the display window, but will still function as a normal AOI. The three options include:


1. Show AOI – The ‘Show AOI’ option will toggle the colored box indicating the locations of the AOIs on or off.

2. Show Label – The ‘Show Label’ option will toggle the text indicating the names of the AOIs on or off.

3. Show Stats – The ‘Show Stats’ option works as a toggle that can show the AOI’s stats over the AOI in the display window.

Familiar options to control the ‘Opacity’ and ‘Text Size’ of the AOI labels are also available in the ‘AOI Visualization Settings’ window.

NoteUsing the ‘Show Stats’ option will clutter your preview screen with the statistics for the AOI in question. It is recommended to view

your stats in the ‘AOI List’ box. If you adjust the text size, you may be able to include the stats on the image you export using this

option. Exporting images will be described in Section 3, page 23.Also,

Changing the visibility of the AOI will not affect the experiment screen in any way.

Analyze Data

After collecting your data, you may wish to view your participants’ gaze point data. To do so, simply click the ‘Analyze Data’ button to bring up the data analysis toolbar, as shown in the following figure:


FIGURE 3.15 depicts the Analyze Data toolbar of the Gazepoint Analysis program.

Important: There is no need to turn on Gazepoint Control if you are only analyzing data you have previously collected. Remember, starting up Gazepoint Control will turn on the IR lights, camera, and server inside the eye tracker. Thus, if there is no participant needing to be recorded, there is no need to turn on the eye tracker.

Analyze Data Toolbar

1. Visualization - Pressing the ‘Visualization’ button will bring up the same window as the ‘Visualization’ button in the Collect Data Toolbar. To read through the functions of the Visualization Settings, refer to page 17.

2. Gaze Video - The ‘Gaze Video’ button acts as a toggle that controls the visibility of the eye video captured by Gazepoint Control. Simply pressing the ‘Gaze Video’ button will remove the eye video from the upper left corner of the preview screen. Pressing the button again will bring the eye video back.

Note: If you are seeing strange patterns in a participant’s gaze data, be sure to analyze their eye video. You may be able to link an odd string of data to a movement of the head, a misreading from the eye tracker, or a shift in the participant’s position.

3. Thinkaloud - The ‘Thinkaloud’ option will allow you to play back the participant’s thinkaloud data. This function is currently not available with the lab’s current GP3 setup.


4. Show Cursor - The ‘Show Cursor’ button acts as a toggle that controls the visibility of the mouse cursor on the experiment screen. Simply pressing the ‘Show Cursor’ button will remove the image of the cursor from the preview screen. Pressing the button again will make the cursor reappear on the preview screen.

5. Show AOI - Pressing the ‘Show AOI’ button will bring up the same window as the ‘Show AOI’ button in the Collect Data Toolbar. To read through the functions of the AOI Visualization Settings, refer to page 20.

6. Export – Pressing the ‘Export’ button will allow you to export your participants’ raw data as a CSV file, AOI statistics in an Excel file, and images or videos of your participant’s fixation maps or heat maps. The various options of the ‘Export’ button will be described on the following pages.

Visualizing the Recorded Data

To view your participants’ gaze point data for a stimulus, follow these steps:

1. Select and highlight the stimulus you wish to view with the data in the ‘Media List’ box. This will display the stimulus in the display window.

2. In the ‘Recording List’ box, select which participants’ data you would like to visualize over the stimulus by checking the boxes in the first column associated with each recording. This allows you to separate recordings to view data for a specific subset of your sample or to determine if a participant’s data may be flawed.

3. Press the ‘Play’ button on the bottom toolbar. This will display the stimulus you selected and play out the recording of the gaze point data for each participant that was selected. You can then modify the presentation of the gaze data by using the ‘Visualization’ options.

Export Data

Exporting your data from Gazepoint Analysis is a process, rather than a one-click operation. This process lets you pick as much or as little data as you want to export, determine how your exported images will render, and choose whether you would like videos of your data rendered. Following the steps below, you will be able to export your data quickly and efficiently.

Step 1: Selecting the Data to Export

To begin the export process, you will need to select which user data you want to export. This selection process functions exactly like choosing a user data recording to analyze in Gazepoint Analysis. As seen below, two users are selected to be exported:

FIGURE 3.16 depicts the ‘Recording List’ of the Gazepoint Analysis program. Two recordings are selected, thus each data set will be exported.

Selecting a user means that the entire recording will be exported. This includes data for each stimulus, and as such, the stimulus currently in the display window does not affect the export. However, if you wish to export images or videos of your participants’ data, you will need to select the stimulus and set up the display window, described in Step 2.

If you do not need to export images of your stimuli, proceed to Step 3.

Step 2: Setting Up the Display Window for Image/Video Export

Image

To export an image of a stimulus, you must set up the display window exactly how you would like the image to look. First, select the stimulus you would like an image of. Then, drag the time slider to a desired point. This lets you decide what moment you want the image to portray. For example, suppose I want a fixation map of my participant’s data, superimposed over a roller coaster video they watched, at the 6-second mark.


FIGURE 3.17 depicts setting up the export image for the example.

1. I first select the stimulus of interest. So, I select the Roller Coaster stimulus in the ‘Media List’ box.

2. Next, I drag the time slider to the 6 second mark. Alternatively, I can type in 6 seconds in the ‘Time’ field at the end of the time slider.

3. Finally, I decide that I want both the gaze video and cursor in the image when I export it. So, I keep the ‘Gaze Video’, AOIs, and ‘Show Cursor’ toggles on.

a. Alternatively, if you do not want to render the gaze video, AOIs, or cursor, simply toggle them off and they won’t be included in the exported image.

This static image currently in the display window can now be exported by continuing to step 3.

Video

To export a video of the stimulus, simply select the stimulus of which you want to render a video, then proceed to Step 3. The same rules do not apply for the gaze video, cursor, and AOIs: these overlays will be visible on every video you export from Gazepoint Analysis.

Note: You can only export one image or video at a time. To export another image or video for another stimulus, repeat the process with the new stimulus and only choose to export the image or video in Step 3.

Step 3: Name and Export Your Desired Files

After you have selected your data and set up your display window, simply press the ‘Export’ button on the Analyze Data Toolbar. The following box will appear:

FIGURE 3.18 depicts the Export window of the Gazepoint Analysis program.

Note: By default, the files that you export are saved in the ‘Result’ folder of your project’s directory.

First, you will want to input a name for the files you are exporting. To do so, type in the name in the ‘Custom export file label’ field. Next, you have to decide what you want to export. The available options are detailed below.

Export CSV

To export the raw data of the recordings you selected, press the ‘Export CSV’ button. This will create up to three files in the result folder: XXXX-user.csv, XXXX-user-fix.csv, and CurrentAOIStatisticsX.csv.

XXXX-user.csv – This file contains every data record recorded.

XXXX-user-fix.csv – This file contains a subset of the data in which only the fixations are listed. This file often simplifies the data analysis by targeting the information we are likely to pick out ourselves later.

CurrentAOIStatisticsX.csv – This file contains a list of the AOIs and their statistics on a per-user basis, as well as the average over all users selected.
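As a quick sketch of post-processing the exported fixation file, the snippet below computes a mean fixation duration with Python’s standard csv module. The column name FPOGD (fixation point-of-gaze duration) is an assumption based on Gazepoint’s Open Gaze API naming — check the header row of your own XXXX-user-fix.csv before relying on it.

```python
import csv
import io

def mean_fixation_duration(csv_text, duration_col="FPOGD"):
    """Average fixation duration (seconds) from an exported fixation
    CSV. `duration_col` is assumed; adjust it to match the actual
    header of your XXXX-user-fix.csv export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    durations = [float(row[duration_col]) for row in reader]
    return sum(durations) / len(durations) if durations else 0.0

# Toy two-row example in the assumed format:
sample = "FPOGX,FPOGY,FPOGD\n0.42,0.51,0.30\n0.61,0.48,0.50\n"
```

For the toy rows above, the mean works out to roughly 0.40 s; in practice you would read the file from the ‘Result’ folder instead of a string.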


Important: If you want to export the AOI statistics, you need to first run the statistics in the Gazepoint Analysis program. If the statistics are not run in Gazepoint Analysis, the CurrentAOIStatisticsX.csv file will not be created.

Export Image

To export the image you set up in the display window during Step 2, simply press the ‘Export Image’ button. This will create a file called ‘CurrentImageX.png’ in the result folder. You can repeat the export process to export another image. The X value of the file name will be incremented by 1 for each export to avoid overwriting any previous exports.

Export Video

To export a video of the stimulus currently visible in the display window, press the ‘Export Video’ button. This will create a file called ‘CurrentVideoX.avi’ in the result folder.

Note: If you would like to view the video after exporting the file, be sure you have a media player that supports the *.avi file type.

Checking the ‘Copy Audio’ box will ensure that the audio portion of the video is exported as well. If this is not checked, the video’s audio will not be exported.

Experiment: Part 3

If requested, details outlining the creation, running, and analysis of an example experiment will be highlighted in this section.


Note: Concerning the CurrentAOIStatisticsX.csv file name: the X is a placeholder for a number that indicates which recording was exported. This placeholder will be incremented by 1 each time an export is performed to avoid overwriting previously exported files.


The ‘How To…’ Guide

This section will offer step-by-step instructions on how to use various features of the GP3 software, Gazepoint Control and Gazepoint Analysis. Instructions for exporting data to .mat files and utilizing the iMap software with GP3 data will be listed.

Section 4

Dynamic AOIs
Utilizing Offsets
Web Page Stimuli
Using Data in iMap

With the basics under your belt, you may be interested in some of the more complex functions of the GP3 eye tracker and its accompanying software, Gazepoint Control and Gazepoint Analysis. I will highlight some of the more interesting functions that we have discovered to date.

How to Create Dynamic AOIs

This portion of the ‘How To…’ guide will detail how to create a dynamic, moving AOI. The process, while simple, can be time consuming if you have stimuli with long durations and arced trajectories of motion. This section assumes that you have already uploaded the stimulus to which you wish to add a dynamic AOI into your ‘Media List’.

Creating a Dynamic AOI

STEP ONE: Pick a stimulus that has some type of moving element. Often this will be a video stimulus (or even a Screen Capture stimulus that is running a 3rd-party program with moving elements) that has an object moving across the screen.


Select the stimulus to which you want to add a dynamic AOI in the ‘Media List’ to bring it up in the display window.

STEP TWO: Click the ‘+’ button below the ‘AOI List’ to add a new AOI. Give the AOI an appropriate name such as “Ball”, “Bird”, or “Hazard” that describes the moving object for which you are creating the dynamic AOI, as this will be the only method to identify your AOI when the data is exported. You can also select the color of the AOI at this time or change it later.

STEP THREE: Move the time slider to the moment when the object you wish to track is in the display window. This could be at the very beginning, middle, or near the end of the file depending on your stimulus. Either way, the time slider needs to be set at the precise moment when the object should be tracked with a dynamic AOI.

STEP FOUR: In the display window, click near the object and drag a box around the object. You will see the box is tinted the same color as the AOI Appearance color in the AOI Setting window. You can adjust the size and location of the box by dragging the edges and center of the AOI box, respectively. You will now see the ‘Frame Time’ list contains the same value as the ‘Time’ field at the end of the time slider. This is the time when the AOI will begin recording statistics for that region.

STEP FIVE: So far, you have created a static AOI that will not move until the end of the stimulus. To create the movement effect, click the time slider to select it. You can now use the arrow keys to move the time slider forward or backwards. In this case, we want to move the time slider forward to a point where the object has moved in some direction IN A STRAIGHT LINE.

Important: It is important that the object you are tracking with a dynamic AOI only moves in straight-line segments. This is because Gazepoint Analysis interpolates the movement between two AOIs as a straight line.

STEP SIX: Now that you have found the point in time where your object stops moving, disappears, or begins to angle in a different direction, you need to put another AOI over the object. If your object has not changed in size between the two points in time, you can press the ‘Copy Prev’ button in the AOI Settings window to make Gazepoint Analysis copy the previous AOI box over to this point in time. This AOI box will have the exact same dimensions and location as the previous AOI box, and you can simply drag it over to cover your object in its new position. If the object changed size, you may want to draw a new AOI box around the object in its new position. Gazepoint Analysis will adjust the size of the AOI box accordingly during the movement.


STEP SEVEN: You will notice that another value was added to your ‘Frame Time’ list in the AOI Settings window. The second value shows the time where the second AOI was put in place. Gazepoint Analysis will now start the AOI recording at the time indicated by the first value of the ‘Frame Time’ list and end the AOI recording at the time indicated by the second value in the list. In between these two times, Gazepoint Analysis will move the first AOI location in a linear fashion to the second AOI location.
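The linear movement described above can be sketched in a few lines. This is an illustration of the interpolation behavior, not Gazepoint code, and the (x, y, width, height) box layout is an assumption:

```python
def interpolate_aoi(t, t0, box0, t1, box1):
    """Linearly interpolate an AOI box between two keyframes, the way
    Gazepoint Analysis moves a dynamic AOI between its 'Frame Time'
    values. Boxes are (x, y, width, height) tuples; t0 <= t <= t1."""
    frac = (t - t0) / (t1 - t0)
    return tuple(a + frac * (b - a) for a, b in zip(box0, box1))

# Halfway between a box at (100, 100) at t = 0 s and one at (300, 200)
# at t = 2 s, the interpolated box sits at (200, 150):
midway = interpolate_aoi(1.0, 0.0, (100, 100, 50, 50), 2.0, (300, 200, 50, 50))
```

Note that the box dimensions interpolate too, which matches the manual’s remark that Gazepoint Analysis adjusts the AOI size during the movement if the two keyframe boxes differ in size.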

Note: At this point, it would be a good idea to set the time slider before your dynamic AOI appears and press play to watch the AOI move from one location to another. If the distance between two locations is large, you may notice that the object you are tracking does not line up perfectly with Gazepoint Analysis’s movement of the AOI. In this case, it is advised that you reduce the distance you cover and add more AOIs between the two points.

STEP EIGHT: If your object keeps moving, repeat steps 5, 6, and 7 to add more AOIs at different time points.

Important: If your object moves and then remains stationary until the end of the experiment, follow the steps above, except that you will want to add another frame time to keep the AOI on the screen until the stimulus is finished playing; otherwise it will disappear at the last ‘Frame Time’ value. To do this, simply move the time slider to the end of the stimulus and create another AOI box over the object. This sets the AOI to stop recording at the end of the stimulus.

A Note on Dynamic AOIs for Curved Movements

As you can imagine, an object that moves in a curved trajectory is hard to track with a dynamic AOI, because Gazepoint Analysis moves the AOI in straight segments. It can be done, but the task is quite tedious. Luckily, you only need to set it up once for each stimulus in an experiment!

To mimic the arc path, you will have to move the time slider in only small increments (I used 0.100 seconds for the Angry Birds example). This process basically boils down to:

1. Draw the first AOI box
2. Move the slider by 0.100 s
3. Draw the next AOI box, or copy the previous box and move it into position
4. Move the slider by 0.100 s again
5. Repeat steps 3 and 4 until finished
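Planning those keyframes ahead of time can take some of the tedium out of the clicking. The sketch below samples a hypothetical trajectory function every 0.100 s to produce the frame-time/box pairs you would then enter by hand; none of this is a Gazepoint API, and the path and box size are made up for illustration:

```python
def keyframes_along_path(path, t_start, t_end, step=0.100, size=(60, 60)):
    """Sample `path(t)` -> (x, y) centre coordinates every `step`
    seconds and return (time, (x, y, w, h)) AOI keyframes."""
    w, h = size
    frames = []
    steps = int(round((t_end - t_start) / step))
    for i in range(steps + 1):  # integer counter avoids float drift
        t = t_start + i * step
        x, y = path(t)
        frames.append((round(t, 3), (x - w / 2, y - h / 2, w, h)))
    return frames

# A simple arcing path: constant horizontal speed, parabolic height.
arc = lambda t: (100 * t, 300 - 50 * t * t)
frames = keyframes_along_path(arc, 0.0, 1.0)  # keyframes from 0.0 s to 1.0 s
```

Each entry in `frames` corresponds to one slider position and one hand-drawn AOI box in the procedure above.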

How to Use Offsets to Correct Gaze Data

Sometimes, you may notice that a participant’s data (or even all of your participants’ data!) is generally shifted in a particular direction. For example, when a participant was reading the banner of a webpage, it may appear as if they are looking at the space below the banner. This means that Gazepoint Control is placing the participant’s point of gaze lower than it should be. If your participant is present when you notice this shift, you can recalibrate, adjust the eye tracker, or adjust the participant’s positioning to alleviate the problem.

However, what could you do to fix the shift if you recorded your data at some other time and only now noticed the shift when you went to perform your analysis?

You can apply an offset, unique to each participant, which adjusts their gaze data to fall in line with the stimulus being presented. To add an offset to a participant’s data, follow these steps:

STEP ONE: Double click on the participant recording to which you would like to apply the offset, or highlight it by clicking on the name and pressing the ‘User Settings’ button in the lower right of the ‘Recording List’ box. This will open the ‘User Data Settings’ window.

STEP TWO: Judge the severity of the shift in data and input values into the ‘Data Offset X’ and ‘Data Offset Y’ fields to shift the data to a more reasonable position. The quadrant system used by Gazepoint Analysis does not follow the norm that you would assume when asked for X and Y values. Below is a chart showing which sign, positive or negative, will yield a particular direction:

FIGURE 4.1 shows the direction of change depending on the sign used, positive or negative numbers.

Press OK to get a short glimpse of your data offset before the stimulus unselects itself. To reevaluate your newly shifted gaze data, replay the gaze data over the stimulus and decide if the offset needs to be adjusted again.
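In data terms, the offset is just a constant shift added to every gaze sample. The sketch below assumes screen-pixel coordinates and plain addition; whether a positive value moves the data left, right, up, or down must still be checked against Figure 4.1, since the sign convention is not the usual one:

```python
def apply_offset(points, dx, dy):
    """Shift every gaze point by a per-participant offset in pixels.
    The simple (x + dx, y + dy) convention here is an assumption for
    illustration; verify the sign directions against Figure 4.1."""
    return [(x + dx, y + dy) for x, y in points]

# Data that landed 40 px too low gets shifted back up:
corrected = apply_offset([(400, 350), (415, 352)], 0, -40)
```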

Note: Unfortunately, the offsets are measured in pixels. Without a way to measure the desired distance in pixels, you will have to guess, check, and modify the offset values until you feel the gaze data is following a more reasonable set of positions.

Utilizing a Calibration Image

One way to ease the pain of judging the fit of a data offset is to include “Calibration Stimuli” in your experiment. This stimulus would ask the participant to move their gaze between two points multiple times, just for the sake of getting a well-defined gaze path between two defined points. Here is an example of using a calibration image:

INSERT IMAGE HERE

FIGURE 4.2 shows an example calibration image where the participant was asked to alternate his gaze between the two eyes of the face in the stimulus.

As you can see, this calibration image yields fixations at each of the eyes and a scan path between them. You can also see that the data is shifted slightly toward the top of the screen, so we will want to shift the data down until the fixations land directly on the eyes. This simple change will then adjust the participant’s gaze data for each stimulus.
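With a calibration stimulus like this, the required offset can even be computed instead of guessed: compare the known target location with the centroid of the fixations aimed at it. The helper below is a sketch with made-up coordinates, and the resulting signs must still be mapped onto Figure 4.1’s convention:

```python
def estimate_offset(fixations, target):
    """Return the (dx, dy) pixel correction that moves the centroid of
    `fixations` (a list of (x, y) points aimed at `target`) onto the
    target itself."""
    n = len(fixations)
    cx = sum(x for x, _ in fixations) / n
    cy = sum(y for _, y in fixations) / n
    return (target[0] - cx, target[1] - cy)

# Fixations aimed at an eye drawn at (420, 300) landed 28 px too high,
# so a downward correction of 28 px is suggested:
dx, dy = estimate_offset([(418, 270), (422, 274)], (420, 300))
```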

How to Analyze Data When Using a Web Page Stimulus

One of the most interesting features the GP3 eye tracker brings to the table is the ability to collect and analyze eye tracking data for web sites. A number of studies based on advertisement effectiveness and web page usability can be completed using the GP3 eye tracker. This ‘How to…’ will highlight some of the features of collecting and analyzing data when using web page stimuli.

The Web Page Stimulus

One of the possible stimulus types that you can add to an experiment is the Web Page stimulus. When adding the stimulus, you simply type the URL of the web page in the appropriate field and Gazepoint Analysis will present the website on the experiment screen when prompted. What is Gazepoint Analysis really doing behind the scenes?

First, Gazepoint Analysis utilizes the control computer’s Internet Explorer engine. It then uses its own browser, known as the Analysis Browser, which limits what the participant can do while surfing the web. This removes the possibility of participants interacting with elements of the web browser when they should be focusing on the web page stimuli.

Note: It is a good idea to keep your Internet Explorer up to date with the latest version for best operation.

Another thing to keep in mind when using web page stimuli is that the recorded data does not include the browser’s frame. As such, you will not have the option to see if your participant is looking at the address bar or one of the browser’s buttons.

Pre-Defined Web Pages vs. Surfing the Web

It is possible to allow the participant to surf the web freely during the experiment. If you would like the participant to surf the web, create a web page stimulus with a starting page such as Google and give them access to a keyboard and mouse. They can freely change the web address in the address bar and surf for the duration of the experiment. Doing so will create a new recording with the same participant for each web page visited, allowing you to analyze each web page separately.

However, when you look at the recording list, you will see the web pages you are interested in analyzing mixed into a mess of recordings with very short durations. This is due to Gazepoint Analysis creating a new recording each time a web page loads or changes while loading. You will have to sift through these recordings and pick out the ones you wish to keep.

When using pre-defined web pages, the browser will only create a new recording when the participant moves on to the next stimulus, thus removing the loading screen recordings.
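The sifting can also be automated once the recordings are exported. Assuming each recording reduces to a (label, duration) pair, a simple duration threshold separates real page views from loading screens; the 2-second cutoff and the data layout are illustrative assumptions, not Gazepoint behavior:

```python
def keep_real_pages(recordings, min_duration=2.0):
    """Drop the very short recordings created while web pages were
    loading. `recordings` is a list of (label, duration_s) pairs;
    the threshold is an arbitrary assumption to tune per study."""
    return [r for r in recordings if r[1] >= min_duration]

recs = [("google.com", 14.2), ("loading redirect", 0.3), ("usd.edu", 22.8)]
kept = keep_real_pages(recs)
```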

Analyzing Multiple Participants’ Data on the Same Web Page

To be added at a later time.


A Final Word

In this manual, I provided you with a complete understanding of how the GP3 eye tracker and its accompanying software, Gazepoint Control and Gazepoint Analysis, can be used to collect and analyze eye tracking data. I also highlighted various functions and listed tips and tricks I learned along the way. If you have a suggestion for a new procedure or element to add to this manual, please let me know.

If you need help with more advanced modifications, or if you run into a snag in the program, feel free to send an e-mail detailing your questions/concerns to [email protected].
