3D Semi-Immersive VR Air Traffic Management Project – Phase V Final Report

Report on progress during Phase V (to November 2005)

Marcus Lange, Matthew Cooper
NVIS, ITN, University of Linköping
Campus Norrköping, 601 74 Norrköping, Sweden

Vu Duong, Marc Bourgois
Eurocontrol Innovative Research Group
Eurocontrol Experimental Research Centre
Brétigny-sur-Orge, France

16 November 2005, Version 1.0



Table of Contents

1 Abstract
2 User interaction
2.1 Interaction handler and devices
2.2 Remote ‘tablet’ interaction device
2.2.1 Tablet mouse interaction
2.2.2 Tablet gesture input
2.2.3 Tablet text input
2.2.4 Integration with voice interaction
2.2.5 Existing problems with the tablet device
3 Information representation
3.1 Virtual Windows
3.2 Reference Views
3.3 Interactive display of predicted flights through ‘scratching’
3.4 Data reduction methods
3.4.1 Levels of Information
3.4.2 Reduced trajectory information
4 Conclusions


1 Abstract

NVIS has been working with Eurocontrol’s INO group for the last five years to explore the usability of 3D display and ‘Virtual Reality’ technologies in Air Traffic Management and Control. NVIS’s task has been to explore the potential of this approach from the viewpoint of information visualization and interaction, and we have produced five versions of an interactive, semi-immersive 3D visualization system for evaluation by INO, each building on the experience gained with the previous development.

Apart from the many internal changes made to the software to extend and improve its function, common to any large software development under constant redesign and extension, the work over the last year has focussed on two main areas: improvements in the information display, and extensions and improvements in the interaction mechanisms present in the system. One particular extension, which is not visible to the user, is the extension and generalization of the inter-process communication scheme used to control configuration and use of interaction devices. This new scheme makes it possible to combine multiple interaction devices with a running instance of the system, or for more than one user to work within the same space, producing a truly collaborative environment. We are also exploring the use of the same inter-process communication features to allow interconnection of multiple instances of the visualization application, creating a networked collaborative environment.

The major additions visible to the user this year are the introduction of sophisticated new tools for interaction, such as the moded tablet interface, and the data reduction mechanisms which reduce visual clutter and make the system more usable in the high-traffic-density scenarios at which the system is targeted.


2 User interaction

The existing speech recognition, tracked wand pointer and graphical user interfaces provide an easy-to-use and intuitive command scheme to control the system, but questions have been raised over its usability in an environment where the controller must also use voice communication to direct the flights under their control. In extended use, the additional voice interaction and the wand pointer may also be tiring, shortening the length of shift which each controller can sustain. To avoid this potential problem, alternative interaction schemes have been examined to spread the load across the user’s faculties.

2.1 Interaction handler and devices

The mechanism for interfacing user interaction devices with the system, as developed over the last four years, had become extremely complex. As is the nature of research systems, whose specification tends both to be explicitly redefined and to drift over time as demands change, the system had become unwieldy, inflexible and so difficult to extend. This was rectified during this development phase with a major revision of the interaction device API. The system now relies on management sub-processes to provide interaction control for each of the different device types through a standard communication protocol. This moves all of the existing interaction devices (2D mouse, 2D graphical user interface, head tracking, ‘wand’ pointer and buttons, data-glove and gestures, and voice recognition) into separate processes which communicate with the visualization and simulation processes of the system to control the interaction.

Separating the interaction management from the visual display has a number of benefits. Firstly, it permits new interaction modes to be created easily, since there now exists a standard protocol by which they can communicate with the existing system. Secondly, it permits interaction devices to be added to or removed from the running application: new processes start, connect with the running system and can be used as desired; when stopped, they detach from the running system and can be reattached later if required. Finally, and most useful for our extensions this year, the interaction management processes can run anywhere, not necessarily on the same computer system as the visual display and simulation. This last feature opens up the possibility of a decentralized system where interaction devices are attached to computers which manage specialized interaction mechanisms and communicate with the main system over a network. This has the further benefit of permitting both collaborative interaction and, potentially, collaborative visualization.
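The report does not specify the wire format used by this device protocol. Purely as an illustration, a newline-delimited JSON scheme along the following lines would support the attach/use/detach lifecycle described above; all message fields and function names here are hypothetical, not the system’s actual API:

```python
import json

# Hypothetical message format for a detachable interaction-device process.
# The actual protocol used by the system is not described in this report.
def encode_event(device: str, action: str, payload: dict) -> bytes:
    """Serialize one interaction event as a newline-delimited JSON message."""
    msg = {"device": device, "action": action, "payload": payload}
    return (json.dumps(msg) + "\n").encode("utf-8")

def decode_event(data: bytes) -> dict:
    """Parse one message received from a device process."""
    return json.loads(data.decode("utf-8"))

# A device process would announce itself on connect and sign off before
# exiting, letting the visualization add or remove it at runtime.
attach = encode_event("tablet", "attach", {"modes": ["mouse", "gesture", "text"]})
move = encode_event("tablet", "pointer", {"dx": 4, "dy": -2, "button": 1})
detach = encode_event("tablet", "detach", {})
```

Because such messages travel over an ordinary network socket, the same scheme covers devices attached to the local machine and to remote computers alike.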

2.2 Remote ‘tablet’ interaction device

Recent developments in low-power computing devices and new lightweight but high-power battery technologies have made possible the creation of tablet PCs. These are fully functional portable computers running near-standard operating systems, which have been offered as a replacement for the common ‘laptop’ computer. The attempt to market tablets for this purpose has largely failed, as the needs of the travelling computer user are not so easily met without a fully configured keyboard, but the arrival of tablet devices has opened up a quick and easy way to develop powerful user interface tools for complex user interaction. We have made use of a standard tablet PC, equipped with wireless networking, to construct a portable user interface for our application.

The tablet PC uses a pointer device to interact through a graphical user interface, written in C# and running on the Microsoft Windows CE platform, which is made up of three large screen areas; see Figure 1, below. Each of these areas has a specific purpose within the display. The interface is designed so that the user can control the system without having to look at the tablet’s screen: we have kept it as clear and uncluttered as possible, simply dividing the whole screen into areas and not using any small buttons or widgets. The pointer, combined with a small number of finger buttons on the pointer and on the edge of the tablet device, should be sufficient to provide the full interface.

Figure 1: User interface layout on tablet screen. The three screen areas provide ‘2.5D’ mouse input, gesture recognition and text recognition.


2.2.1 Tablet mouse interaction

The largest area is the top half of the screen, which is used as a mouse interaction area. Any user familiar with the 2D mouse metaphor will have little difficulty using this tool, as it can simply be thought of as a 3D abstraction of the 2D mouse. The two buttons provided by the ‘pen-like’ tablet pointer (one at the tip, one on the barrel of the device) control navigation through this area. Moving the pointer while pressing against the screen (activating button 1) allows the user to rotate the display around the central focus point of the view, left/right and up/down. The second button allows the user to control zooming of the display. Using the pointer in this main area in combination with a third button (on the side of the tablet device) permits selection of objects (aircraft and waypoints) and moving them. The user’s mouse pointer is visible in the main ATC display at all times despite the two systems not being directly connected: communication with the main application is provided through a network connection which, in the case of the tablet, is usually wireless. This use of the system can be thought of simply as a portable mouse pad, but the system offers a great deal more.
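The button scheme described above amounts to a small dispatch rule from pen state to camera command. The sketch below is illustrative only; the command names are hypothetical:

```python
# Dispatch from pen state to a camera/selection command, following the
# button scheme described above (command names are hypothetical).
def pen_to_camera(dx, dy, tip_down, barrel_down, side_down):
    """Map a pen-move delta and button states to one interface command."""
    if side_down:                        # third button: select / move objects
        return ("select_drag", dx, dy)
    if barrel_down:                      # second button: zoom the display
        return ("zoom", dy)
    if tip_down:                         # tip pressed: rotate about the focus
        return ("rotate", dx, dy)
    return ("move_pointer", dx, dy)      # hover: echo pointer in main display
```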

2.2.2 Tablet gesture input

The second area of the tablet user interface, in the lower left of the tablet display, is used for gesture input. Gestures are symbolic marks drawn by the user which the user interface system identifies and interprets to cause some operation to occur. The possibilities of this input are largely open at the moment. We have made some use of it to include simple commands to control navigation and other features of the display system, and to switch between different modes for some of the more sophisticated interaction methods.

Simple examples of the gesture input include recognizable symbols such as a gesture like a small ‘w’ to switch off the display of aircraft waypoints and a similar, but larger, gesture to switch them on, or a ‘teardrop’ shape, representing a raindrop, drawn to toggle the inclusion of weather information in the 3D display. The range of possibilities for gestures is enormous, but it is essential to ensure that the shapes used are both memorable and easily distinguished by the recognition system. This approach is something which we have only begun to explore this year and will continue to examine in the future.
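The report does not describe the recognizer itself. One common lightweight approach, sketched below under the assumption of simple template matching, resamples each stroke to a fixed number of points and picks the nearest stored template; a production recognizer would also normalize position, scale and rotation, which is what would let a small and a large ‘w’ be told apart deliberately:

```python
import math

# Illustrative template matcher for pen gestures: each stroke is resampled to
# a fixed number of points and compared point-wise against stored templates.
# This is a sketch only, not the project's actual recognition system.
def resample(points, n=16):
    """Resample a polyline to n evenly spaced points along its arc length."""
    if len(points) < 2:
        return [points[0]] * n
    cum = [0.0]
    for a, b in zip(points, points[1:]):
        cum.append(cum[-1] + math.dist(a, b))
    total = cum[-1] or 1.0
    out = []
    for k in range(n):
        target = total * k / (n - 1)
        i = 1
        while i < len(cum) - 1 and cum[i] < target:
            i += 1
        seg = (cum[i] - cum[i - 1]) or 1.0
        t = (target - cum[i - 1]) / seg
        (ax, ay), (bx, by) = points[i - 1], points[i]
        out.append((ax + t * (bx - ax), ay + t * (by - ay)))
    return out

def classify(stroke, templates, n=16):
    """Return the name of the template nearest to the stroke."""
    s = resample(stroke, n)
    scores = {name: sum(math.dist(p, q) for p, q in zip(s, resample(tpl, n)))
              for name, tpl in templates.items()}
    return min(scores, key=scores.get)
```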

2.2.3 Tablet text input

When using the gesture interface it is frequently useful to be able to include additional information about the operation that the user wishes to carry out. For example, if the controller wishes to move the display to a particular conflict, being able to specify which one with a text entry is useful. We have allowed this through the inclusion of a third area of the screen, which is used to perform text recognition. Thus, if the controller wishes to move the viewpoint to conflict four, the number ‘4’ is written in the text recognition pad before using a gesture to indicate the command “focus on conflict”; in this case a gesture like a letter ‘C’ is used. The text entry of “4” is included in the command sent to the visualization system and the viewpoint is shifted accordingly. Similarly, the call-sign of the flight to be focussed upon can be entered by text entry into the text recognition area, perhaps “SK254”, before a gesture for “focus on flight”, a stylized ‘f’ symbol, is entered. Again, the call-sign information is included in the control sent to the display and the system reacts accordingly.

Once again, this component of the user interface obviously requires a substantial evaluation and technical development to establish the most appropriate way to interact with the system.
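The pairing of a text-pad entry with the following gesture can be modelled as a small composer that buffers the last recognized text and attaches it to the next gesture. The sketch below takes its two gesture mappings from the examples in the text; everything else (class and command names) is an assumption:

```python
# Sketch of combining a text-pad entry with the following gesture into one
# command; the gesture-to-command vocabulary is taken from the two examples
# in the text, everything else is an assumption.
class CommandComposer:
    """Buffer the last recognized text and attach it to the next gesture."""
    GESTURES = {"C": "focus_conflict", "f": "focus_flight"}

    def __init__(self):
        self.pending_text = None

    def on_text(self, text):
        self.pending_text = text

    def on_gesture(self, gesture):
        cmd = (self.GESTURES.get(gesture, gesture), self.pending_text)
        self.pending_text = None         # each text entry is consumed once
        return cmd
```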

2.2.4 Integration with voice interaction

The tablet device, despite being relatively low-powered by modern standards, has sufficient processing capability to handle the speech recognition interface developed in previous phases of this project. The standard USB DSP headset, introduced in 2004, can be connected to the tablet PC and the same speech recognition software installed on it to carry out the voice interaction tasks which were previously implemented using a relatively static laptop or desktop PC. The speech recognition is as effective when used through the tablet PC, and running the graphical interface on the same device does not affect it. The user can thus combine the two modes of interaction, pointer-based and voice-based, into a highly featured and very flexible interface which is portable and reliable.

Figure 2: User working with the immersive stereoscopic 3D display, using voice recognition and the tablet screen interface to interact with the system.

2.2.5 Existing problems with the tablet device

The major limitations affecting use of the tablet device are still its weight and battery lifetime, which are too high and too short, respectively. The device weighs 1.3 kilograms and the battery lifetime is approximately 2.5 hours when the wireless network interface is in use. The weight is not prohibitive, but the device is more comfortable to use while sitting, where the user can rest it on their hip or thigh to support the weight. When using the device while standing, the user’s arm can grow fatigued or even painful in less than an hour. Both these problems can be removed by connecting the device to its mains supply and removing the battery. The battery is the heaviest single component of the system, without which the device weighs less than a kilogram, making it much more comfortable to hold for extended periods. Of course, the cable reduces portability somewhat, but it is lightweight and can be quite long. Ultimately, we can expect improvements in battery technology to reduce or remove these problems of weight and lifetime.

A second, more unexpected, problem with the tablet device is the sheer number of possibilities which it opens up. We have initially implemented a simple interaction interface based on combinations of three separate pointer-based interaction modes, but the device offers an enormous range of interaction opportunities. The moded interface, where the display changes as the user switches between tasks so that the interface reflects their needs at all times, presents enormous opportunities and challenges in interface design. The ability to combine the user interaction we have provided with information display on the tablet, perhaps specific to the current task or currently selected flight(s), extends the possibilities still further. We feel there are many exciting new ideas to be explored with this tool. It certainly must be thought of as much more than simply a portable mouse pad.


3 Information representation

3.1 Virtual Windows

The user may often find that, despite its great flexibility and manoeuvrability, the 3D immersive display is still not sufficient to monitor all the activity which they would like. To help resolve this we have taken ideas from work on data and sensor fusion to enhance the display by including multiple views within it. These alternative viewpoints, possibly with different camera settings, are present as sub-windows in the display. These sub-windows, which we have dubbed ‘Virtual Windows’, can contain a wide range of alternative viewpoints onto the immersive 3D world. They can be defined in a wide variety of ways, and we are aware that we have only explored a few of these so far.

Possibilities we have implemented for the virtual windows include:

• An ‘overview’ mode, where a virtual window contains a wide angle view of the scene which the user can use to navigate to interesting places or features or just to maintain a global view of the data

• Closer views of one or more airport approaches so that these can be monitored more closely while the controller’s main display deals with other activity.

• Predicted conflicts occurring in the scenario can pop up as virtual windows allowing the user to keep track of them and quickly assess them before deciding which ones to resolve first.

• Bookmarked views, described in section 3.2, can be stored as virtual windows either to be monitored until no longer of interest or until the user wishes to switch back to them.

The virtual windows can act as more than just alternative views of the display. They can also be used as buttons which, when selected, swap the current main view with that of the virtual window. Thus the controller can quickly switch to a conflict, resolve it, and switch back, or can step through a series of preferred reference views as is useful in their work.

Virtual windows can be created dynamically in the running system and can be deleted when no longer required.
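The swap-on-select behaviour could be structured as below. This is a minimal sketch with invented class and attribute names, not the system’s actual architecture:

```python
# Minimal sketch of virtual windows that swap with the main view when
# selected; class and attribute names are invented for illustration.
class View:
    def __init__(self, name, camera):
        self.name, self.camera = name, camera

class Display:
    def __init__(self, main_view):
        self.main = main_view
        self.windows = []                # created and deleted dynamically

    def add_window(self, view):
        self.windows.append(view)

    def remove_window(self, name):
        self.windows = [w for w in self.windows if w.name != name]

    def select(self, name):
        """Acting as a button: swap the main view with the named window."""
        for i, w in enumerate(self.windows):
            if w.name == name:
                self.windows[i], self.main = self.main, w
                return self.main
        return None
```

Selecting the same window again restores the original arrangement, which matches the “switch to a conflict, resolve it, and switch back” usage described above.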


Figure 3: Main display including three virtual windows showing an overview of the area being surveyed in the main display, a predicted conflict which the system has detected and a reference view of Stockholm’s Arlanda airport.

3.2 Reference Views

We have examined methods to help the user orientate themselves, since the flexibility and manoeuvrability of the camera can leave the user disorientated with respect to the scene. One approach we have implemented is to define a set of ‘reference’ views, standard for each user, to which they can return with a single command, gesture or button press. This ensures that, should they become lost, the controller can easily locate themselves and so restore confidence in the display they are seeing. The user is also able to add to their set of reference views by ‘bookmarking’ a specific view at runtime. This view is stored and can be returned to at any later time. This approach permits the user to build up a set of useful views which they frequently use as starting points for monitoring the data, and also allows them to build special safety views for use should they become confused. These reference views can also be included in the virtual windows (see section 3.1) displayed on the main screen, and the user can then switch between views using that interface as well.
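A reference-view store with runtime bookmarking reduces to a small named-view map. The sketch below is illustrative, with an assumed camera representation:

```python
# Per-user reference views with runtime 'bookmarking' (a minimal sketch;
# the stored camera representation is an assumption).
class ReferenceViews:
    def __init__(self, standard):
        self.views = dict(standard)      # name -> stored camera state

    def bookmark(self, name, camera):
        """Store the current view under a name at runtime."""
        self.views[name] = camera

    def recall(self, name):
        """Return the stored view, reachable by one command or gesture."""
        return self.views.get(name)
```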

3.3 Interactive display of predicted flights through ‘scratching’

The facility to precisely predict and locate conflicts in the planned trajectories of aircraft was introduced into the system in 2004. The ability to detect these potential problems long before they occur is obviously a benefit, but understanding how they can most efficiently be resolved remains difficult. Knowing that a conflict will occur in the next few seconds obviously requires the controller to act immediately. Knowing that a conflict is likely to occur in 10 or 20 minutes’ time does not require immediate action; in fact, immediate action may result in a less than optimal solution to what is, of course, an enormously complex problem. The ability to examine how the conflict will come about, playing the scenario backwards and forwards to determine the optimal moment to act, could be enormously useful for the controller. To achieve this we have implemented a facility for ‘scratching’, by analogy with the DJ’s trick of distorting playback of a piece of music from a vinyl recording by spinning the disc backwards and forwards by hand. We have implemented a similar approach using a slider to allow the controller to advance and reverse the evolution of the simulation.

Figure 4: A series of images as the ‘scratching’ control is applied. The top left shows the unscratched display where a conflict has been detected. The others show views as the controller slides forwards in time to examine how the conflict comes to be.

The user can play the scenario back and forth, inserting markers at specific times so that the scenario can be reset to any marker if they wish. The user can also send out ‘probes’ which explore the future flight paths; these probes are connected to their source aircraft by a line, making it possible to see which probe belongs to which aircraft. The interaction is reminiscent of video playback, where a ‘jog-shuttle’ control is used to move rapidly backwards and forwards through the recording, and gives the user an excellent opportunity to examine how complex events come about and, hopefully, to identify the best moment to intervene to avoid a particular problem developing.
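Under the assumption that flights follow piecewise-linear planned trajectories, the scratching preview can be sketched as a clamped slider offset applied to the simulation clock, with aircraft positions interpolated along their trajectories. All names and the ±20-minute slider range are assumptions:

```python
# Sketch of the 'scratching' control, assuming piecewise-linear planned
# trajectories; all names and the +/- 20-minute slider range are assumptions.
def position_at(trajectory, t):
    """Interpolate a trajectory of (time, x, y, alt) tuples at time t."""
    if t <= trajectory[0][0]:
        return trajectory[0][1:]
    for (t0, *p0), (t1, *p1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return tuple(a + u * (b - a) for a, b in zip(p0, p1))
    return trajectory[-1][1:]

def scratch(sim_time, slider, max_offset=1200.0):
    """Map a slider position in [-1, 1] to a preview time for the display."""
    return sim_time + max(-1.0, min(1.0, slider)) * max_offset
```

The display would then be rendered for the preview time rather than the live clock, without disturbing the underlying simulation state.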


3.4 Data reduction methods

3.4.1 Levels of Information

One criticism frequently levelled at the system is that the display can rapidly become cluttered, since many new features for information display have been added during each development phase. This has been done in order to examine the means by which information can be added and the benefits it may offer to the controller; it is obviously not appropriate to use all of these display methods at all times, but only when and where they are required. To help with this problem we have examined mechanisms by which the amount of information in the display can be reduced by focussing on those features which the controller will find most useful, based on criteria such as the current centre of their attention and the nature of their current task. An example of a mechanism which we employed in 2004 to help with this problem is the means to identify and emphasize critical weather information automatically from the data supplied to us. This proved quite effective and formed a starting point for our work this year.

Our first idea was to make use of graphics hardware, specifically the ability to include so-called ‘fragment shaders’, to remove complex data far from the current centre of view. This can be thought of as an extension of the ‘clip-box’ idea introduced during our 2003 development, which allowed the controller to focus only on their area of interest and ignore the other parts of the display. We have carried this idea further by using the graphics hardware to gradually reduce the amount of detailed information with distance from the centre of the user’s view. As we move away from that centre, the level of detail in the map, the amount of information carried by the flights (using the flags display introduced in 2004), and even the aircraft models can fade, so that the only information remaining is a basic display of the distant flights’ trajectories alongside detailed information for those close to the centre of view. As the user moves their point of view by moving their head in the tracked 3D environment, the display is updated in real time, since the graphics hardware is quite capable of keeping up with the speed of the user’s movements.
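On the GPU this fade would be computed per fragment in the shader; the weighting itself can be written as a simple ramp between two radii, shown here in Python for clarity. Both radii are hypothetical values, not taken from the project:

```python
# The per-fragment fade used for data reduction, expressed as a simple ramp
# between two radii; on the GPU this would live in the fragment shader.
# Both radii are hypothetical values, not taken from the project.
def detail_weight(distance, full_radius=50.0, fade_radius=150.0):
    """1.0 = full detail at the centre of view, 0.0 = basic display only."""
    if distance <= full_radius:
        return 1.0
    if distance >= fade_radius:
        return 0.0
    return 1.0 - (distance - full_radius) / (fade_radius - full_radius)
```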


Figure 5: Four views showing data reduction in action. Top left: the cluttered, unreduced view. Top right: the old-style clip-box reduces the data by simply cutting it off. Lower left: the reduced view, where only flights close to the centre of view are displayed. Bottom right: a conflict has been detected in the distance, so the flights involved and the conflict position are displayed despite being remote from the centre of view.

The reduced view allows the user to concentrate on the objects closest to their centre of view, but a problem such as a potential conflict detected in a remote location would then not be displayed. Consequently, we have included the ability for the system to override the display settings when a remote emergency situation is detected, so that the flights involved and any other pertinent information, such as the position of a potential conflict, can be included in the rendered image despite being outside the area of interest which the user has defined. Example views of the data reduction system are shown in Figure 5.
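The override amounts to one extra condition in the visibility test: a flight involved in a detected conflict is always drawn in full detail, however far from the centre of view. A minimal sketch, with assumed data structures:

```python
# Sketch of the override rule: distance-based culling, with flights involved
# in a detected conflict always shown in full detail (data layout assumed).
def show_full_detail(flight, centre, radius, conflict_flights):
    """True if the flight should be drawn with full detail."""
    if flight["id"] in conflict_flights:
        return True                      # emergency override, however remote
    dx = flight["x"] - centre[0]
    dy = flight["y"] - centre[1]
    return dx * dx + dy * dy <= radius * radius
```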

It is, of course, important for the controller to know where the boundaries of their area of interest lie and hence what data is being excluded. We are currently addressing this through an approach we have called the ‘spotlight-marker’: the area of the map outside the area of interest is coloured in greyscale, while the area where the level of information is high is displayed in colour. Other effects are also being explored.
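The spotlight-marker colouring can be illustrated as a per-object (or per-texel) colour rule that desaturates everything outside the area of interest. The hard inside/outside boundary below is a simplification of the actual effect, and the luma weights are the standard Rec. 601 coefficients rather than values from the project:

```python
# The 'spotlight-marker' as a simple colour rule: greyscale outside the area
# of interest, full colour inside.  The hard boundary is a simplification;
# the luma weights are the standard Rec. 601 coefficients.
def spotlight_colour(rgb, inside):
    """Return rgb unchanged inside the spotlight, desaturated outside."""
    if inside:
        return rgb
    grey = 0.299 * rgb[0] + 0.587 * rgb[1] + 0.114 * rgb[2]
    return (grey, grey, grey)
```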


3.4.2 Reduced trajectory information

As the number of flights included in the display increases, the number of different inbound, outbound and other trajectories will become unmanageable, and the display is likely to become an unintelligible mass of lines of various colours. To address this we have considered a number of approaches to deal with the number and extent of the trajectory lines in the display.

The ‘fading’ effect described in section 3.4.1 could be extended to include the trajectories, but this would leave an empty display in remote areas, removing any indication of the presence of aircraft and rendering the display useless there. We have therefore combined it with an extension of the ‘lookahead’ function described in earlier reports on this work, in which the trajectory of each flight is visible only for the next few minutes of the simulation. The flight trajectories are included in the display, but only for a short period into the future. This reduced view retains the presence and future behaviour of the flights but with less detail, leaving more space to include more flights in the scene; it effectively declutters the display without removing all the useful information. When the user points at an aircraft, its lookahead trajectory is highlighted, making it easy to distinguish from other aircraft’s flight paths.

Figure 6: Long-lookahead display of flight trajectories. The trajectories are shown here for ten minutes into the future, but this can be changed by the user.
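The lookahead display reduces each stored trajectory to the points falling within a short window ahead of the current simulation time. A minimal sketch, with illustrative names; a fuller version would also interpolate the endpoints of the window:

```python
# Sketch of the short-lookahead display: keep only trajectory points within
# a window ahead of the current simulation time (names are illustrative; a
# fuller version would also interpolate the window's endpoints).
def lookahead(trajectory, now, window=600.0):
    """Clip a list of (time, x, y) points to the range [now, now + window]."""
    return [(t, x, y) for (t, x, y) in trajectory if now <= t <= now + window]
```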

Alternative displays, in which trajectories are intelligently merged based on distance so that the number of lines is drastically reduced and aircraft are mapped to these ‘super-trajectories’, have also been considered but have not yet been implemented in the display system.


4 Conclusions

In this development phase we have explored a number of new approaches to interaction with, and visualization of, the data and have found many interesting results which await detailed evaluation of their effectiveness for the operational controller. We have also explored mechanisms to reduce the information present in the display, to help avoid the information overload which is a risk when exploring these new opportunities in this complex display and interaction environment. One of the interaction mechanisms in particular, the tablet device, has thrown up a huge number of new possibilities which we hope to explore in future work in collaboration with experts at Eurocontrol.

We have had frequent interaction with experts at Eurocontrol and, through our own contacts, with the Swedish Civil Aviation Authority, and have received a great deal of feedback from both groups, but many of the features remain to be examined through a formal evaluation process. We have recently increased the level of effort at NVIS through the creation of a new postdoctoral position, funded by Eurocontrol’s INO group, specifically to address this issue. We expect to carry out a number of evaluation experiments over the coming year and hope to gain much more insight into how 3D immersive display and interaction could benefit the controller in the future.