Control of depth movement for visual display with layered screens

Information

  • Patent Grant
  • Patent Number
    7,724,208
  • Date Filed
    Friday, August 18, 2000
  • Date Issued
    Tuesday, May 25, 2010
Abstract
A multi-level visual display system has a plurality of screens spaced in the depth direction. A user can move a visual indicator such as a cursor between the screens via a user-selectable input such as a mouse button. In drawing applications a visual link such as a line can be created between two screens. In game applications a user can move an image both within and between screens by dragging a cursor while moving it between the screens, to provide an illusion of three-dimensional movement. The screens may comprise layered liquid crystal displays.
Description
TECHNICAL FIELD

This invention relates to a visual display system.


BACKGROUND ART

Particularly, the present invention relates to a visual display system including multi-level screens which are placed physically apart.


Such screens are described in PCT Application Nos. PCT/NZ98/00098 and PCT/NZ99/00021.


These devices are created by combining multiple layers of selectively transparent screens. Each screen is capable of showing an image. In preferred embodiments the screen layers are liquid crystal displays. Preferably the screens are aligned parallel to each other with a pre-set distance between them.


With this device, images displayed on the screen furthest from the viewer (the background screen) appear at some distance behind the images displayed on the screen closer to the viewer (the foreground screen). The transparent portions of the foreground screen allow viewers to see images displayed on the background screen.
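

By way of illustration only, the perceived composition can be modelled as in the following Python sketch. The character-grid representation is an assumption made purely for this description; the physical effect requires no software compositing at all, since the rear image is literally visible through the transparent portions of the front panel.

```python
# Illustrative model only: wherever the foreground layer is transparent
# (blank), the viewer sees the background layer directly behind it.
FRONT = [
    "  AA  ",
    "  AA  ",
    "      ",
]
BACK = [
    "BBBBBB",
    "BBBBBB",
    "BBBBBB",
]

def viewer_sees(front, back):
    """Compose what the viewer perceives through the stacked panels."""
    return ["".join(f if f != " " else b for f, b in zip(fr, br))
            for fr, br in zip(front, back)]

for row in viewer_sees(FRONT, BACK):
    print(row)
# BBAABB
# BBAABB
# BBBBBB
```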


This multiple-screen arrangement allows images to be presented at multiple levels, giving the viewer true depth without the use of glasses or lenses.


Up until now, software has been written to create visual sequences on the multi-level screens. These sequences have been largely passive, intended for viewing rather than for interaction.


While the visual effect of these sequences is spectacular, it would be desirable if the potential uses of a multi-level screen display could be explored further.


It is an object of the present invention to address this problem, or at least to provide the public with a useful choice.


Aspects of the present invention will now be described by way of example only with reference to the following description.


DISCLOSURE OF INVENTION

According to one aspect of the present invention there is provided a visual display system including


multi-level screens spaced physically apart,


wherein each screen has a two-dimensional plane,


a visual indicator,


an input device,


a user selectable input,


the visual display system being characterised in that


the user can use the user selectable input to move the visual indicator via the input device out of the two-dimensional plane of a particular screen.


According to another aspect of the present invention there is provided a method of using a visual display system which has multi-level screens spaced physically apart,


wherein each screen has a two-dimensional plane,


the visual display system also including


a visual indicator,


an input device,


a user selectable input,


the method characterised by the step of


the user using the user selectable input to move the visual indicator out of the two-dimensional plane of a particular screen and on to another screen.


According to a further aspect of the present invention there is provided media containing instructions for the operation of a visual display system as described.


In preferred embodiments of the present invention the multi-level screens are similar to those described in PCT Application Nos. PCT/NZ98/00098 and PCT/NZ99/00021, although this should not be seen as limiting.


The term two-dimensional plane refers to the effective viewing plane on a particular screen, similar to that seen on a normal display screen.


The visual indicator may be any type of indicator, for example a cursor, image, icon or screen image. It is envisaged that the visual indicator is something which the user of the system can move via some input mechanism.


The input device may be any suitable input device, for example a mouse, tablet data glove, keyboard, touch screen, joystick, trackball, pen, stylus, touch pad, voice input and so forth.


The user selectable input is preferably an input the user can make, via the input device, to affect the operation of software running the display device.


For example, if the input device is a mouse, then the user selectable input may be a mouse button. If the input device is a joystick, then the user selectable input may be the trigger. If the input device is a keyboard, then the user selectable input may be the arrow keys. And so forth.


We envisage that the present invention could be used extensively by those in the graphics industry. Therefore, in one embodiment of the present invention, it is envisaged that by having the input device be a pen or stylus, the present invention could be utilised in these industries to its fullest.


In some embodiments, the user selectable input may actually be a software button on a touch screen that may be independent of the input device. This allows standard input devices and drivers to be used without modification.


In further embodiments of the present invention, the input device shall be referred to as a mouse and the user selectable input shall be referred to as a mouse button. The mouse button may be an existing button on the mouse, or in some embodiments may be a dedicated button for use with the present invention.


This should not be seen as limiting.


The visual indicator shall now be referred to as a cursor, although this should not be seen as limiting.


The user can use a mouse to move a cursor around a display screen as can be achieved with usual software. However, with one embodiment of the present invention, the user can then click a particular mouse button to cause the visual indicator to move from one screen to another screen. In one embodiment the applicant uses the centre button or a configurable button on a three-button mouse, but this should not be seen as limiting.


In preferred embodiments the software controlling the cursor position is supplemental to the usual mouse drivers.


Therefore a program can run as usual with standard mouse driver commands, but the cursor position between screens can change as a consequence of the supplemental program responding to the additional input from the mouse.
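

By way of illustration only, the following Python sketch shows how such a supplemental program might behave; the event format, button codes and the MultiLevelCursor structure are assumptions made for this description and are not taken from the patent. Ordinary motion events retain their usual meaning, while the additional centre-button press changes the screen on which the cursor appears.

```python
from dataclasses import dataclass

LEFT_BUTTON, MIDDLE_BUTTON, RIGHT_BUTTON = 0, 1, 2  # assumed button codes

@dataclass
class MultiLevelCursor:
    x: int = 0
    y: int = 0
    layer: int = 0        # 0 = foreground screen; higher values lie further back
    num_layers: int = 2   # two layered screens in the simplest embodiment

def handle_event(cursor: MultiLevelCursor, event: dict) -> None:
    """Supplemental handler: standard events keep their usual meaning;
    only the extra (centre) button press changes the cursor's screen."""
    if event["type"] == "motion":
        # Usual driver behaviour: x-y movement within the current plane.
        cursor.x, cursor.y = event["x"], event["y"]
    elif event["type"] == "press" and event["button"] == MIDDLE_BUTTON:
        # Supplemental behaviour: a direct transpose in the z axis,
        # with the x-y coordinates left unchanged.
        cursor.layer = (cursor.layer + 1) % cursor.num_layers

cursor = MultiLevelCursor()
handle_event(cursor, {"type": "motion", "x": 120, "y": 80})
handle_event(cursor, {"type": "press", "button": MIDDLE_BUTTON})
print(cursor)  # MultiLevelCursor(x=120, y=80, layer=1, num_layers=2)
```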


This ability enables the user to actually interact with different screens and work on separate screens, in the sense of having an input device which can interact with whichever screen has been selected. The advantages of this feature are self-evident.


In some embodiments, the movement from the two-dimensional plane of one screen to another screen may be discrete: the visual indicator may appear merely to jump from one screen to the other, remaining at the same x-y coordinate with the only change being in the z axis.


In other embodiments, a more continuous, linear movement may be perceived as a consequence of the movement from one screen to the other.


For example, the present invention may be used in conjunction with a drawing package. The person drawing may start drawing on the front screen of the visual device using the mouse and cursor.


The person then may wish to take advantage of the three-dimensional quality allowed by the present invention and effectively draw in the z axis (the x and y axes having already been drawn in on the two-dimensional screen). This may be achieved by the user clicking the mouse button and dragging the cursor so that it appears to pass from one screen to the other, with an image (say a line) appearing to provide a visual bridge between the front screen and another screen or screens in the background.
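

By way of illustration only, the following Python sketch suggests one way such a bridging image might be generated; the Point3D structure and the equal-fragment split are assumptions made for this description. Because each physical screen can only draw within its own two-dimensional plane, the dragged line is divided into per-screen fragments which, viewed through the stacked transparent panels, read as one continuous line receding in depth.

```python
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float
    y: float
    layer: int  # index of the screen plane on which the point sits

def bridge_segments(start: Point3D, end: Point3D) -> dict:
    """Split a line dragged between screen planes into per-screen fragments.

    The line is divided into equal parts, one for each plane it spans;
    each fragment is drawn on its own screen so the stack of transparent
    panels shows an apparently continuous bridging line."""
    planes = abs(end.layer - start.layer) + 1
    step = 1 if end.layer >= start.layer else -1

    def lerp(t: float) -> tuple:
        return (start.x + (end.x - start.x) * t,
                start.y + (end.y - start.y) * t)

    segments = {}
    for k in range(planes):
        layer = start.layer + k * step
        segments[layer] = (lerp(k / planes), lerp((k + 1) / planes))
    return segments

# Dragging from (10, 10) on the front screen to (60, 40) on the back screen:
for layer, seg in bridge_segments(Point3D(10, 10, 0), Point3D(60, 40, 1)).items():
    print(layer, seg)
# 0 ((10.0, 10.0), (35.0, 25.0))   front screen draws the near half
# 1 ((35.0, 25.0), (60.0, 40.0))   back screen draws the far half
```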


In other embodiments of the present invention this ability may be used with entire screen images. For example, the present invention may be used with an interactive game which gives the impression that the user is moving deep within a scene. For example, the user may be flying a craft in the game, and as the user moves forward in the game, the images may pass from the background screen or screens to the foreground screen, giving the illusion of full movement. In this embodiment the visual indicator may be the images and the input device a joystick.
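

By way of illustration only, a minimal Python sketch of such a game follows; the depth values, threshold and names are assumptions made for this description. Each scene object carries a depth ahead of the viewer, and a simple threshold determines which physical screen displays it. This is also the behaviour pictured later in FIGS. 3a and 3b.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    depth: float  # distance ahead of the viewer, arbitrary units

FOREGROUND, BACKGROUND = 0, 1
LAYER_SPLIT = 50.0  # objects nearer than this appear on the front panel

def advance(objects, distance):
    """Move the viewpoint forward; objects slide toward (and past) the viewer."""
    for obj in objects:
        obj.depth -= distance

def assign_layers(objects):
    """Map each visible object to a panel; objects behind the viewer vanish."""
    placement = {FOREGROUND: [], BACKGROUND: []}
    for obj in objects:
        if obj.depth <= 0:
            continue  # flown past: removed from both screens
        layer = FOREGROUND if obj.depth < LAYER_SPLIT else BACKGROUND
        placement[layer].append(obj.name)
    return placement

scene = [SceneObject("flower", 10), SceneObject("tree", 60), SceneObject("cloud", 500)]
advance(scene, 20)           # fly forward through the scene
print(assign_layers(scene))  # flower gone; tree now foreground; cloud stays back
```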


Aspects of the present invention will now be described with reference to the following drawings which are given by way of example only.





BRIEF DESCRIPTION OF DRAWINGS

Further aspects of the present invention will become apparent from the following description which is given by way of example only and with reference to the accompanying drawings in which:



FIG. 1 illustrates one embodiment of the present invention, and



FIG. 2 illustrates a second embodiment of the present invention, and



FIG. 3 illustrates a third embodiment of the present invention.





BEST MODES FOR CARRYING OUT THE INVENTION


FIGS. 1a and 1b illustrate a stylised version of one embodiment of the present invention at work. These figures have foreground screens 1 and background screens 2.


It should be appreciated that the reference to just two screens is by way of example only and the present invention may work with any number of screens.



FIG. 1a shows the positioning of the visual indicator 3, in the form of a cursor arrow, on the foreground screen 1.


In this embodiment of the present invention a simple click of a mouse button causes the cursor 3 to appear at exactly the same x-y coordinates as on the foreground screen 1, but positioned on the background screen 2.


Thus in this embodiment, the user selectable input merely performs a direct transpose along the z axis between screens.



FIG. 2 likewise has a foreground screen 1 and a background screen 2. In FIG. 2a, a triangle 4 has been drawn on the x-y two-dimensional plane of the foreground screen 1.


In FIG. 2b, to give the triangle 4 depth, the user has selected and dragged the image in the x-y direction to give not only the image of a triangle 5 on the background screen 2, but also a plane in the z axis 6, forming a solid-looking representation. As the screens are physically quite separate, the illusion of the solid wall 6 is accomplished by sophisticated software shading techniques.
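

By way of illustration only, the following Python sketch suggests one such shading technique; the band count, brightness values and the choice to draw the strips on the front panel are assumptions made for this description. The connecting face is rendered as a series of quadrilateral strips whose brightness falls off with apparent depth, so that the eye reads the strips as a receding solid surface. This should not be seen as limiting.

```python
def wall_shades(front_edge, back_edge, bands=8, near=0.9, far=0.4):
    """Return (quadrilateral, brightness) strips interpolated between an edge
    drawn on the front screen and the matching edge of the back-screen image,
    with brightness decreasing to suggest depth."""
    (ax, ay), (bx, by) = front_edge
    (cx, cy), (dx, dy) = back_edge
    strips = []
    for k in range(bands):
        t0, t1 = k / bands, (k + 1) / bands
        quad = [
            (ax + (cx - ax) * t0, ay + (cy - ay) * t0),
            (bx + (dx - bx) * t0, by + (dy - by) * t0),
            (bx + (dx - bx) * t1, by + (dy - by) * t1),
            (ax + (cx - ax) * t1, ay + (cy - ay) * t1),
        ]
        brightness = near + (far - near) * (t0 + t1) / 2
        strips.append((quad, round(brightness, 3)))
    return strips

# One wall face: an edge of triangle 4 on the front screen, and the matching
# edge of triangle 5 as it would project onto the same plane.
for quad, brightness in wall_shades(((0, 0), (40, 0)), ((10, 5), (50, 5)), bands=4):
    print(brightness, quad)
```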



FIG. 3 again has a foreground screen 1 and background screen 2.


This embodiment of the present invention can be used for moving through three-dimensional landscapes. For example, in FIG. 3a there is pictured a flower 7 on the foreground screen 1, while a tree 8 and a cloud 9 are positioned on the background screen 2.


The user may then use the input device to effectively move through the scene visually. This causes the flower 7 depicted in FIG. 3a to disappear from the foreground screen 1, as shown in FIG. 3b. It also causes the tree 8 to move from the background screen 2 to the foreground screen 1. The cloud 9, being in the far background, stays on the background screen 2.


Thus it can be seen that the present invention allows a considerable amount of interaction between the user and the screens.


Aspects of the present invention have been described by way of example only and it should be appreciated that modifications and additions may be made thereto without departing from the scope of the appended claims.

Claims
  • 1. A system comprising: a multi-component display comprising: a first display screen comprising a first plurality of pixels, wherein said first display screen is configured to display a visual indicator using said first plurality of pixels; and a second display screen comprising a second plurality of pixels, wherein said first and second display screens overlap, and wherein each of said first and second display screens is partially transparent; and a user interface component comprising a user-selectable input component, wherein said user-selectable input component is configured to move said visual indicator from a first plane to a second plane in response to a first user interaction with said user-selectable input component, and wherein said first plane corresponds to said first display screen.
  • 2. The system of claim 1, wherein said second plane corresponds to said second display screen.
  • 3. The system of claim 1, wherein said user interface component is selected from a group consisting of a mouse, a keyboard, a joystick, and a tablet data glove.
  • 4. The system of claim 1, wherein said user interface component is selected from a group consisting of a touchscreen and a touch pad.
  • 5. The system of claim 1, wherein said user interface component is selected from a group consisting of a pen and a stylus.
  • 6. The system of claim 1, wherein said user interface component is a voice-activated user interface component.
  • 7. The system of claim 1, wherein said user-selectable input comprises a button of said user interface component.
  • 8. The system of claim 1, wherein said user interface component is configured to move said visual indicator on said second plane in response to a second user interaction with said user interface component.
  • 9. The system of claim 8, wherein said user interface component is further configured to move said visual indicator on said second plane after movement of said visual indicator from said first plane to said second plane.
  • 10. The system of claim 1, wherein said visual indicator is selected from a group consisting of an icon, a cursor, an image and a screen image.
  • 11. The system of claim 1, wherein said visual indicator is associated with an application selected from a group consisting of a gaming application, a drawing application and a graphical application.
  • 12. The system of claim 1, wherein said first and second plurality of pixels overlap.
  • 13. The system of claim 1, wherein said user-selectable input component is further configured to display an image between said visual indicator on said first plane and said visual indicator on said second plane.
  • 14. A method of using a multi-component display, said method comprising: displaying a visual indicator using a first plurality of pixels of a first display screen of said multi-component display, wherein said multi-component display further comprises a second display screen, wherein said first and second display screens overlap, wherein each of said first and second display screens is partially transparent, and wherein said second display screen comprises a second plurality of pixels; detecting a first user interaction with a user interface component, wherein said user interface component comprises a user-selectable input component, and wherein said detecting further comprises detecting a first user interaction with said user-selectable input component; and in response to said detecting a first user interaction, moving said visual indicator from a first plane to a second plane, wherein said first plane corresponds to said first display screen.
  • 15. The method of claim 14, wherein said second plane corresponds to said second display screen.
  • 16. The method of claim 14, wherein said user interface component is selected from a group consisting of a mouse, a keyboard, a joystick, and a tablet data glove.
  • 17. The method of claim 14, wherein said user interface component is selected from a group consisting of a touchscreen and a touch pad.
  • 18. The method of claim 14, wherein said user interface component is selected from a group consisting of a pen and a stylus.
  • 19. The method of claim 14, wherein said user interface component is a voice-activated user interface component.
  • 20. The method of claim 14, wherein said user-selectable input comprises a button of said user interface component.
  • 21. The method of claim 14 further comprising: in response to detecting a second user interaction with said user interface component, moving said visual indicator on said second plane.
  • 22. The method of claim 21, wherein said moving said visual indicator further comprises moving said visual indicator on said second plane after movement of said visual indicator from said first plane to said second plane.
  • 23. The method of claim 14, wherein said visual indicator is selected from a group consisting of an icon, a cursor, an image and a screen image.
  • 24. The method of claim 14, wherein said visual indicator is associated with an application selected from a group consisting of a gaming application, a drawing application and a graphical application.
  • 25. The method of claim 14, wherein said first and second plurality of pixels overlap.
  • 26. The method of claim 14 further comprising: in response to said detecting said first user interaction, displaying an image between said visual indicator on said first plane and said visual indicator on said second plane.
  • 27. A computer-readable medium having computer-readable program code embodied therein for causing a computer system to perform a method of using a multi-component display, said method comprising: displaying a visual indicator using a first plurality of pixels of a first display screen of said multi-component display, wherein said multi-component display further comprises a second display screen, wherein said first and second display screens overlap, wherein each of said first and second display screens is partially transparent, and wherein said second display screen comprises a second plurality of pixels; detecting a first user interaction with a user interface component, wherein said user interface component comprises a user-selectable input component, and wherein said detecting further comprises detecting a first user interaction with said user-selectable input component; and in response to said detecting a first user interaction, moving said visual indicator from a first plane to a second plane, wherein said first plane corresponds to said first display screen.
  • 28. The computer-readable medium of claim 27, wherein said second plane corresponds to said second display screen.
  • 29. The computer-readable medium of claim 27, wherein said user interface component is selected from a group consisting of a mouse, a keyboard, a joystick, and a tablet data glove.
  • 30. The computer-readable medium of claim 27, wherein said user interface component is selected from a group consisting of a touchscreen and a touch pad.
  • 31. The computer-readable medium of claim 27, wherein said user interface component is selected from a group consisting of a pen and a stylus.
  • 32. The computer-readable medium of claim 27, wherein said user interface component is a voice-activated user interface component.
  • 33. The computer-readable medium of claim 27, wherein said user-selectable input comprises a button of said user interface component.
  • 34. The computer-readable medium of claim 27, wherein said method further comprises: in response to detecting a second user interaction with said user interface component, moving said visual indicator on said second plane.
  • 35. The computer-readable medium of claim 34, wherein said moving said visual indicator further comprises moving said visual indicator on said second plane after movement of said visual indicator from said first plane to said second plane.
  • 36. The computer-readable medium of claim 27, wherein said visual indicator is selected from a group consisting of an icon, a cursor, an image and a screen image.
  • 37. The computer-readable medium of claim 27, wherein said visual indicator is associated with an application selected from a group consisting of a gaming application, a drawing application and a graphical application.
  • 38. The computer-readable medium of claim 27, wherein said first and second plurality of pixels overlap.
  • 39. The computer-readable medium of claim 27, wherein said method further comprises: in response to said detecting said first user interaction, displaying an image between said visual indicator on said first plane and said visual indicator on said second plane.
Priority Claims (1)
Number: 337332; Date: Aug 1999; Country: NZ; Kind: national
PCT Information
Filing Document: PCT/NZ00/00160; Filing Date: 8/18/2000; Country: WO; Kind: 00; 371(c) Date: 2/6/2002
Publishing Document: WO01/15132; Publishing Date: 3/1/2001; Country: WO; Kind: A
US Referenced Citations (141)
Number Name Date Kind
3863246 Trcka et al. Jan 1975 A
4239349 Scheffer Dec 1980 A
4294516 Brooks Oct 1981 A
4333715 Brooks Jun 1982 A
4371870 Biferno Feb 1983 A
4423929 Gomi Jan 1984 A
4443062 Togashi et al. Apr 1984 A
4472737 Iwasaki Sep 1984 A
4523848 Gorman et al. Jun 1985 A
4556286 Uchida et al. Dec 1985 A
4562433 Biferno Dec 1985 A
4568928 Biferno Feb 1986 A
4648691 Oguchi et al. Mar 1987 A
4649425 Pund Mar 1987 A
4712869 Claxton Dec 1987 A
4768300 Rutili Sep 1988 A
4927240 Stolov et al. May 1990 A
4947257 Fernandez et al. Aug 1990 A
5049870 Fitzgerald et al. Sep 1991 A
5050965 Conner et al. Sep 1991 A
5091720 Wood Feb 1992 A
5112121 Chang et al. May 1992 A
5113272 Reamey May 1992 A
5124803 Troxel Jun 1992 A
5198936 Stringfellow Mar 1993 A
5255028 Biles Oct 1993 A
5255356 Michelman et al. Oct 1993 A
5283560 Bartlett Feb 1994 A
5289297 Bollman et al. Feb 1994 A
5317686 Salas et al. May 1994 A
5333255 Damouth Jul 1994 A
5361165 Stringfellow et al. Nov 1994 A
5367801 Ahn Nov 1994 A
5396429 Hanchett Mar 1995 A
5416890 Beretta May 1995 A
5416895 Anderson et al. May 1995 A
5418898 Zand et al. May 1995 A
5463724 Anderson et al. Oct 1995 A
5465101 Akiba et al. Nov 1995 A
5473344 Bacon et al. Dec 1995 A
5475812 Corona et al. Dec 1995 A
5479185 Biverot Dec 1995 A
5502805 Anderson et al. Mar 1996 A
5585821 Ishikura et al. Dec 1996 A
5590259 Anderson et al. Dec 1996 A
5600462 Suzuki et al. Feb 1997 A
5600765 Ando et al. Feb 1997 A
5604854 Glassey Feb 1997 A
5623591 Cseri Apr 1997 A
5638501 Gough et al. Jun 1997 A
5651107 Frank et al. Jul 1997 A
5663746 Pellenberg et al. Sep 1997 A
5664127 Anderson et al. Sep 1997 A
5675755 Trueblood Oct 1997 A
5694150 Sigona et al. Dec 1997 A
5694532 Carey et al. Dec 1997 A
5695346 Sekiguchi et al. Dec 1997 A
5721847 Johnson Feb 1998 A
5729219 Armstrong et al. Mar 1998 A
5757522 Kulick et al. May 1998 A
5764317 Sadovnik et al. Jun 1998 A
5772446 Rosen Jun 1998 A
5796455 Mizobata et al. Aug 1998 A
5805163 Bagnas Sep 1998 A
5805171 St. Clair et al. Sep 1998 A
5813742 Gold et al. Sep 1998 A
5825436 Knight Oct 1998 A
5828420 Marshall et al. Oct 1998 A
5831615 Drews et al. Nov 1998 A
5835088 Jaaskelainen, Jr. Nov 1998 A
5880742 Rao et al. Mar 1999 A
5883623 Cseri Mar 1999 A
5883627 Pleyer Mar 1999 A
5883635 Rao et al. Mar 1999 A
5890174 Khanna et al. Mar 1999 A
5923307 Hogle, IV Jul 1999 A
5924870 Brosh et al. Jul 1999 A
5999191 Frank et al. Dec 1999 A
6005654 Kipfer et al. Dec 1999 A
6016385 Yee et al. Jan 2000 A
6018379 Mizobata et al. Jan 2000 A
6031530 Trueblood Feb 2000 A
6037937 Beaton et al. Mar 2000 A
6057814 Kalt May 2000 A
6061110 Hisatake et al. May 2000 A
6072489 Gough et al. Jun 2000 A
6075531 DeStefano Jun 2000 A
6085202 Rao et al. Jul 2000 A
6097361 Rohner Aug 2000 A
6100862 Sullivan Aug 2000 A
6111614 Mugura et al. Aug 2000 A
6118427 Buxton et al. Sep 2000 A
6163318 Fukuda et al. Dec 2000 A
6181349 Bardon et al. Jan 2001 B1
6204902 Kim et al. Mar 2001 B1
6215490 Kaply Apr 2001 B1
6215898 Woodfill et al. Apr 2001 B1
6239852 Oono et al. May 2001 B1
6246407 Wilks et al. Jun 2001 B1
6269173 Hsien Jul 2001 B1
6282551 Anderson et al. Aug 2001 B1
6300990 Yamaguchi et al. Oct 2001 B1
6317128 Harrison et al. Nov 2001 B1
6327592 Yoshikawa Dec 2001 B1
6341439 Lennerstad Jan 2002 B1
6351298 Mitsui et al. Feb 2002 B1
6356281 Isenman Mar 2002 B1
6369830 Brunner et al. Apr 2002 B1
6377229 Sullivan Apr 2002 B1
6418426 Schlesinger Jul 2002 B1
6438515 Crawford et al. Aug 2002 B1
6443579 Myers Sep 2002 B1
6466185 Sullivan et al. Oct 2002 B2
6468157 Hinami et al. Oct 2002 B1
6496832 Chi et al. Dec 2002 B2
6505209 Gould et al. Jan 2003 B1
6525699 Suyama et al. Feb 2003 B1
6538660 Celi, Jr. et al. Mar 2003 B1
6587094 Anderson Jul 2003 B2
6587118 Yoneda Jul 2003 B1
6593904 Marz et al. Jul 2003 B1
6609799 Myers Aug 2003 B1
6610102 Aldred et al. Aug 2003 B1
6661425 Hiroaki Dec 2003 B1
6693692 Kaneko et al. Feb 2004 B1
6721713 Guheen et al. Apr 2004 B1
6725422 Bauchot et al. Apr 2004 B1
6760003 Sase Jul 2004 B1
6771327 Sekiguchi Aug 2004 B2
6845578 Lucas Jan 2005 B1
6859907 McGarry Feb 2005 B1
20010026625 Azima et al. Oct 2001 A1
20020067373 Roe et al. Jun 2002 A1
20020091728 Kjaer et al. Jul 2002 A1
20020093516 Brunner et al. Jul 2002 A1
20020163728 Myers Nov 2002 A1
20020163729 Myers Nov 2002 A1
20030069074 Jackson Apr 2003 A1
20030132895 Berstis Jul 2003 A1
20030184665 Berstis Oct 2003 A1
20040239582 Seymour Dec 2004 A1
Foreign Referenced Citations (87)
Number Date Country
8248298 Sep 1998 AU
2480600 Jul 2000 AU
2453800 Aug 2000 AU
6821901 Dec 2001 AU
1011678 Dec 1999 BE
2009960 Sep 1990 CA
2075807 Aug 1991 CA
2320694 Aug 1999 CA
1293805 May 2001 CN
1294695 May 2001 CN
2730785 Jan 1979 DE
29912074 Nov 1999 DE
19920789 May 2000 DE
0389123 Sep 1990 EP
454423 Oct 1991 EP
595387 May 1994 EP
0703563 Mar 1996 EP
0802684 Oct 1997 EP
0935191 Aug 1999 EP
1057070 Aug 1999 EP
0999088 May 2000 EP
1151430 Aug 2000 EP
1177527 Nov 2000 EP
1093008 Apr 2001 EP
1287401 Mar 2003 EP
2145897 Apr 1985 GB
2312584 Oct 1997 GB
2347003 Aug 2000 GB
2372618 Aug 2002 GB
93472 Nov 1994 IL
62-235929 Oct 1987 JP
63-65795 Mar 1988 JP
63-100898 May 1988 JP
1-229591 Sep 1989 JP
2-90127 Mar 1990 JP
2-146087 Jun 1990 JP
3021902 Jan 1991 JP
3174580 Jul 1991 JP
3-226095 Oct 1991 JP
3226095 Oct 1991 JP
4191755 Jul 1992 JP
6-274305 Sep 1994 JP
6-314181 Nov 1994 JP
63-39299 Dec 1994 JP
7-44349 Feb 1995 JP
8-30243 Feb 1996 JP
08-036375 Feb 1996 JP
8-83160 Mar 1996 JP
8095741 Apr 1996 JP
09-033858 Feb 1997 JP
9-230825 Sep 1997 JP
09-282357 Oct 1997 JP
9308769 Dec 1997 JP
10003355 Jan 1998 JP
10039782 Feb 1998 JP
10039821 Feb 1998 JP
10105829 Apr 1998 JP
10228347 Aug 1998 JP
10-260784 Sep 1998 JP
10-334275 Dec 1998 JP
11205822 Jul 1999 JP
2000-142173 May 2000 JP
2000-347645 Dec 2000 JP
2000-99237 Oct 2001 JP
2001-215332 Apr 2002 JP
2001-56675 Sep 2002 JP
2002-350772 Dec 2002 JP
2002-099223 Oct 2003 JP
1005868 Oct 1997 NL
20005178 Apr 2001 NO
343229 Apr 2001 PL
9112554 Aug 1991 WO
9627992 Sep 1996 WO
9847106 Oct 1998 WO
9942889 Aug 1999 WO
0036578 Jun 2000 WO
0048167 Aug 2000 WO
0068887 Nov 2000 WO
0101290 Jan 2001 WO
0115127 Mar 2001 WO
0115128 Mar 2001 WO
0195019 Dec 2001 WO
0235277 May 2002 WO
02084637 Oct 2002 WO
02091033 Nov 2002 WO
03003109 Jan 2003 WO
9703025 Nov 1997 ZA