This invention relates to the field of display of a multidimensional space, specifically apparatus for allowing a user to control navigation and viewing of a multidimensional space, or controlling the display of selected portions of a multidimensional space to a user and adapted for use with computer systems in virtual reality environments.
Computer visualization and interaction systems such as that described by Maples in “Muse, A functionality-based Human-Computer Interface,” Journal of Virtual Reality, Vol. 1, Winter 1995, allow humans to interact with multidimensional information represented in a multidimensional space. Such information can represent many types of virtual reality environments, including the results of scientific simulations, engineering analysis, what-if scenarios, financial modeling, three dimensional structure or process design, stimulus/response systems, and entertainment.
In many of these applications, the multidimensional space contains too much information for the user to view or assimilate at once. Displaying different aspects of the multidimensional space can also aid human understanding. Consequently, the user must select portions of the space for viewing, usually by changing the position and orientation of the user's viewpoint into the multidimensional space. The user must navigate to different what-if scenarios, to visualize different parts of a simulation or model result, to visit different parts of a structure or process design, and to experience different stimulus/response situations or different entertainment features. While the ubiquitous mouse has all but conquered navigation in two-dimensional spaces, navigation in higher dimensions is still problematic.
The mouse and joysticks have seen use as multidimensional display controllers. They are inherently two-dimensional devices, however, and are not intuitive to use when adapted for use in more dimensions.
A three-dimensional spaceball has also seen use as a multidimensional display controller. A spaceball remains stationary while the user pushes, pulls, or twists it. The spaceball does not provide intuitive control of motion because the spaceball itself cannot move. A spaceball can control relative motion, but is ill-suited for large displacement or absolute motion. Booms and head mounted displays combine visualization display with multidimensional display control and can be intuitive to use in multidimensional applications. Booms and head mounted displays can be expensive, however, and the physical limits of the boom structure can limit intuitive navigation. For example, booms typically require an additional input device to control velocity. Booms can control relative motion, but are ill-suited for large displacement or absolute motion.
Other motion devices such as treadmills and stationary bicycles have seen use in multidimensional display control. These are often expensive and too bulky for desktop use. They are also intrusive, often requiring the user to be strapped in to the device. Changing directions in the dimensions using a treadmill or bicycle can also be non-intuitive.
Multi-dimensional tracked objects have also seen use as multidimensional display controllers. These can be intuitive since they can move in multiple dimensions, but they do not allow nonvisual feedback to the user. Tracking can also be difficult when, for example, an electromagnetically tracked device is used near large metal items or an acoustically tracked device is used in settings where line of sight is difficult to maintain.
There is an unmet need for multidimensional display controllers that are intuitive to use, suitable for desktop use, and robust enough for use in a wide range of multidimensional display situations.
The present invention provides a multidimensional display controller adapted for use with multidimensional information, especially for use in virtual reality or other computer displays. The display controller allows a user to establish a base viewing location and a base viewing orientation. The user can also establish a relative viewing orientation. The display controller combines the base viewing orientation and relative viewing orientation to determine a desired viewing orientation. The display controller depicts an aspect of the multidimensional space visible along the desired viewing orientation. The user can establish the base viewing location and base viewing orientation by moving a user-defined point (or multiple points, which define an object) relative to the multidimensional space or relative to a separate reference frame, or by some other type of input such as by changing an input object. The user can change the relative viewing orientation by changing the location, orientation, deformation, or other property of an input object. The relative viewing orientation can also be changed by tracked user body motions, for example by tracked motion of the user's head or eyes.
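As an illustrative sketch only (not part of the claimed apparatus), the combination of a base viewing orientation with a relative viewing orientation to produce a desired viewing orientation can be expressed as a composition of rotations. The yaw-only representation and the particular angles below are assumptions chosen for brevity:

```python
import math

def yaw_matrix(theta):
    """3x3 rotation about the vertical axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def compose(a, b):
    """Matrix product a @ b: apply rotation b, then rotation a."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Base viewing orientation: craft heading rotated 90 degrees in the space.
base = yaw_matrix(math.radians(90))
# Relative viewing orientation: user looks a further 30 degrees to the left.
relative = yaw_matrix(math.radians(30))
# Desired viewing orientation used to select the displayed aspect.
desired = compose(base, relative)
```

Because the two rotations share an axis in this simplified case, the result is equivalent to a single 120 degree yaw; with full three-dimensional orientations the same composition applies, only the matrices differ.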
Advantages and novel features will become apparent to those skilled in the art upon examination of the following description or may be learned by practice of the invention. The objects and advantages of the invention may be realized and attained by means of the instrumentalities and combinations particularly pointed out in the appended claims.
The accompanying drawings, which are incorporated into and form part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
The present invention provides a display controller adapted for use with multidimensional information, especially for use in virtual reality or other computer displays.
The user can move the base viewing location and base viewing orientation by moving a user-defined point relative to the multidimensional space or relative to a separate reference frame. For example, the base viewing location can be translated through the multidimensional space in response to user translation of a device such as that described in U.S. Pat. Nos. 5,506,605 and 5,296,871, incorporated herein by reference. The base viewing location and base viewing orientation can also be navigated through the multidimensional space by other user input such as voice commands. The display controller 1 can establish a separate reference frame. The separate reference frame can correspond to allowable directions and velocities of motion of the base viewing location and base viewing orientation. The direction of base viewing location motion can be determined from user motion commands or can be set relative to the base viewing orientation. Force or other feedback means can help make user motion of the base viewing location and base viewing orientation more intuitive. Representing the base viewing location and base viewing orientation as the location and orientation of a user-navigable craft can make navigation thereof intuitive. A user navigable craft can correspond to a vehicle separate from a representation of a character, or can correspond to a representation of a character. See, e.g., Anderson I pp. 10-11.
For control of the base viewing location, a tracked device can be used to move a user point U2 relative to reference frame F2. Force, visual, or other feedback can be used to indicate the position of the user point U2 relative to the reference frame F2. The base viewing location can be moved in a direction derived from the base viewing orientation and the location of the user point U2 relative to the reference frame F2. The base viewing location can be moved at a velocity corresponding to the distance of the user point U2 from the reference frame F2 or the force applied by the user to the tracked device. The user can thus control the base viewing location as though the user were in a craft capable of motion in any direction.
Reference frame F2 can be communicated to the user in various ways. It can be displayed. It can conform to the frame of the navigable or multidimensional space, or to a reference frame corresponding to a navigable entity surrounding the user. The reference frame can be displayed as a sphere, ellipsoid, or polyhedron (in three dimensions) on the dashboard of a navigable entity, or can be displayed as a spatial form hovering near the user's head or where the user might expect to find a steering wheel in a conventional craft. The reference frame displayed can change under user control, or multiple reference frames can be displayed for the user to select.
Control from the user can be accepted in various other ways, including, for example, from force applied by the user to a pointer, from sound commands from the user, from pressure on a pressure sensitive input means, or from tracking selected user movements. The feedback to the user of the position of the user point relative to the reference frame can be done visually. It can also be accomplished with sound, for example by changing pitch or intensity as the desired viewing location and orientation change. It can also be accomplished by force feedback, for example by applying progressive resistance to movement away from a base viewing location or orientation. It can also be accomplished by other methods such as by varying the temperature of an input device, the speed of air flow over the user, or by varying vibrations in an input device, for example. The implementation of suitable sensor communication and control software is known to those skilled in the art.
Rotating the relative viewing orientation to the left, without changing the base viewing location or base viewing orientation, will cause another aspect S32 of the multidimensional space to be displayed to the user in display D3. The control panel display C3 can continue to display the original control panel C31 when the relative viewing orientation is changed, corresponding to a fixed instrument panel like in a conventional automobile. Alternately, the control panel display C3 can change to display the controls in control panel C32, corresponding to a cockpit that moves with the user or a heads up display.
Rotating the relative viewing orientation to the right, without changing the base viewing location or base viewing orientation, will cause another aspect S33 of the multidimensional space to be displayed to the user in display D3. The control panel display C3 can continue to display the original control panel C31 when the relative viewing orientation is changed, corresponding to a fixed instrument panel like in a conventional automobile. Alternately, the control panel display C3 can change to display the controls in control panel C33, corresponding to a cockpit that moves with the user or a heads up display.
Allowing separate user control of the relative viewing orientation has several benefits. The modification of viewing orientation separate from the control panel or other indicators of viewing position can help the user retain a spatial reference. In some applications, the user desires to change the viewing orientation much more rapidly than the viewing location (as when looking around when driving a car); using a free hand to control relative viewing orientation provides a low overhead way of accommodating the desired viewing orientation changes.
The relative viewing orientation can be changed by the user by changing the location, orientation, deformation, or other property of an input object. For example, the user can rotate a tracked object to rotate the relative viewing orientation. The user can also apply torque to an object to rotate the relative viewing orientation. Changes in other properties of an object can also be used to change the relative viewing orientation; for example, translation or deformation of an object can correspond to rotation of the relative viewing orientation. The relative viewing orientation can also be changed by tracked user body motions, for example by tracked motion of the user's hand, head or eyes.
The display controller also comprises appropriate driver software to accept user input for establishment of the relative viewing orientation 211. Those skilled in the art will appreciate suitable driver software corresponding to the input device employed. The user input can indicate a change in relative viewing orientation 212. If it indicates no change 213, then the current relative viewing orientation is still valid, pending new user input 211. If the relative viewing orientation has changed 213, then the display controller determines the new relative viewing orientation. Determination of the new relative viewing orientation can be based on numerous types of user input; those skilled in the art will appreciate methods for determining the relative viewing orientation based on the input device employed and the desired user responsiveness characteristics. The new relative viewing orientation is communicated to the display software 215.
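The input-handling cycle described above (steps 211 through 215) can be sketched as a simple event loop. The callback names here are hypothetical and stand in for the device-specific driver and display software:

```python
def orientation_loop(read_events, compute_orientation, send_to_display):
    """Accept user input (211); when it indicates a change (212/213),
    determine the new relative viewing orientation and pass it to the
    display software (215)."""
    current = None
    for event in read_events():
        if event is None:
            # No change: the current relative viewing orientation stays valid.
            continue
        current = compute_orientation(event)
        send_to_display(current)
    return current
```

In practice `read_events` would poll the input device and `compute_orientation` would map raw device readings to an orientation according to the desired responsiveness characteristics.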
The display software interacts with the multidimensional data to select the aspect visible from the base viewing location along a viewing orientation determined from a combination of the base viewing orientation and the relative viewing orientation. Those skilled in the art will appreciate methods of selecting aspects of multidimensional data for display. The display controller displays the selected aspect to the user 231.
A display controller according to the present invention was implemented using a Silicon Graphics Indigo II High Impact workstation running the IRIX 6.2 operating system. A PHANTOM™ from SensAble Technologies of Cambridge, Mass. was used as the means for allowing the user to set a user point, and for communicating force feedback to the user. Rotation of encoders on the PHANTOM™ was used for viewing orientation input. The PHANTOM™ was connected to the workstation's EISA communications port. Torque encoders on a spaceball, U.S. Pat. No. 4,811,608, from Spacetec were used to sense torque applied by the user to determine changes in relative viewing orientation desired by the user. The display controller was operated with a virtual reality environment like that described by Maples in “Muse, A functionality-based Human-Computer Interface,” Journal of Virtual Reality, Vol. 1, Winter 1995.
The representation of a user point presented to a user can comprise a graphical element such as a dot, an arrow, or a more complex graphical element such as a character or vehicle. The user point can comprise multiple points (the aggregation termed a “user object” for convenience of description), as described in Anderson I on p. 3 lines 6-8, p. 5 lines 4-7, and p. 7 lines 6-7. Such an aggregation of points can allow the position of the user point relative to the reference frame to include distances from the multiple points (or components of the object), which inherently allows “the position of the user object” to represent a multidimensional position; e.g., three dimensional position and orientation of the user object relative to the reference frame.
A reference frame can be established relative to the multidimensional space, for example, a representation of a vehicle, character, or other navigable entity can be presented to the user as part of the display of the multidimensional space. See, e.g., Anderson I p. 8 lines 25-27. The user can then position a user point within the multidimensional space, and the position of the user point relative to the reference frame used to determine a base viewing location and base viewing orientation. In a simple example, the position of the user point relative to the reference frame can directly correspond to the base viewing location (e.g., the base viewing location can appear to follow any apparent motion of the reference frame relative to the multidimensional space). The direction from the user point to some aspect of the reference frame, for example to the center of the representation of the navigable entity, can be established as the base viewing orientation. Additionally, the base viewing orientation can be controlled by the user point within limitations such that the angle of the base viewing orientation relative to the reference frame can be constrained to be within a maximum and minimum amount.
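A minimal sketch of the constrained base viewing orientation described above, assuming a yaw-only orientation in the horizontal plane and an illustrative limit of plus or minus 45 degrees (the limit value is an assumption, not taken from the specification):

```python
import math

def base_orientation_yaw(user_point, frame_center, max_yaw=math.radians(45)):
    """Yaw from the user point toward the center of the reference frame,
    clamped so it stays within +/- max_yaw of the frame's forward axis."""
    dx = frame_center[0] - user_point[0]
    dz = frame_center[2] - user_point[2]
    yaw = math.atan2(dx, dz)
    return max(-max_yaw, min(max_yaw, yaw))
```

A user point directly behind the frame center yields a zero yaw; a point far off to one side would request a 90 degree yaw, which the clamp limits to the configured maximum.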
The motion of the user point, which directs the apparent motion of the vehicle by change of the portion of the multidimensional space display, can be controlled by the user with one or more hand-manipulable input devices such as joysticks. As an example, a first joystick can be used to indicate motion of the user point forward or backward along the base viewing orientation, allowing the controller to adjust the display to give the perception of motion along the base viewing orientation. A second joystick can be used to indicate motion of the user point around the reference frame, allowing the controller to adjust the display to provide displays of the multidimensional space visible at various angles to the vehicle's apparent motion. Separate control of base and relative views is also depicted in Davidson
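The two-joystick scheme can be sketched as a single update step. The axis conventions, gains, and yaw-only motion model below are assumptions for illustration:

```python
import math

def step_user_point(point, yaw, stick_fwd, stick_orbit, dt,
                    speed=1.0, orbit_rate=math.radians(90)):
    """One control step: the first joystick (stick_fwd, -1..1) moves the user
    point forward or backward along the base viewing orientation; the second
    (stick_orbit, -1..1) swings the point around the reference frame by
    changing the yaw."""
    yaw = yaw + stick_orbit * orbit_rate * dt
    x = point[0] + stick_fwd * speed * dt * math.sin(yaw)
    z = point[2] + stick_fwd * speed * dt * math.cos(yaw)
    return [x, point[1], z], yaw
```

The controller would call a step like this each frame and redraw the multidimensional space from the updated point, giving the perception of motion.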
The position of the user point can be further communicated to the user using force feedback. See, e.g., Davidson col. 3 lines 37-39; Anderson I p 6 lines 18-27, p. 9 lines 12-24. For example, when the user point, or reference frame apparently moving responsive to the user point, encounters certain conditions (e.g., obstacles) in the multidimensional space, force feedback such as varying vibrations or directional forces can be communicated to the user. See, e.g., Davidson col. 3 lines 37-39, col. 4 lines 28-34; Anderson I p 6 lines 18-27, p. 9 lines 12-24. While shown in
A reference frame can be established in relation to a multidimensional space, and communicated intuitively to the user as part of a display of the multidimensional space, e.g., as a representation of a vehicle, or as representations of objects in the multidimensional space. See, e.g., Davidson col. 4 lines 5-16; Anderson I p. 8 lines 25-27. The user can position a user point relative to the reference frame, for example by using a first input device, and the relative position used to determine a base viewing orientation 814, as illustrated schematically in
The present invention can be combined with various other methods of navigating through a multidimensional space. For example, as illustrated schematically in
A base viewing location and base viewing orientation into a multidimensional space can be communicated to the user with a character or vehicle metaphor. See, e.g., Davidson col. 3 lines 54-58, col. 4 lines 5-16. The location of the base viewing location in the multidimensional space can be presented as the location of a character or vehicle, generally one that is moveable by the user. The direction of the base viewing orientation can be presented as the direction of motion, or the direction of next motion if the base viewing location is currently at rest, in the multidimensional space. The direction of motion can be communicated by changes in the display of the multidimensional space, and can be communicated by indicators such as wheels, a rudder, a pointer, or some aspect of a representation of a character or vehicle that corresponds with or indicates a direction of motion. See, e.g., Anderson I p. 8 line 25-p. 9 line 11; Anderson II p. 11 lines 6-11. The metaphor can be reinforced by displaying a representation of the character or vehicle (sometimes called a “third person” view). The display can instead display only the portion of the multidimensional space visible from the character or vehicle (sometimes called a “first person” view). The user can control the motion of the character or vehicle relative to the multidimensional space in a variety of ways; e.g., the user can manipulate an input device to affect such motion, aspects of the application can affect such motion (e.g., the character can appear to be pushed in some direction), or a combination thereof. Once a base viewing location and base viewing orientation have been established, the present invention allows the user to control a relative viewing orientation. As an example, the user can manipulate a second input device to control a relative viewing orientation, extending the character metaphor to allow the character to look to one side while moving. 
The relative viewing orientation can be combined with the base viewing orientation to determine a direction in the multidimensional space, and a view of the multidimensional space along that direction communicated to the user. Combining the base and relative viewing orientations can foster more intuitive control by the user: the base motion of the character or vehicle is controllable, as is the viewing orientation relative to the base, in a manner resembling behavior practiced in the real world. The present invention can also allow the user to specify a relative viewing location, which can be combined with the base viewing location, and base and relative viewing orientations, to determine a location for the view presented to the user. As an example, the user can control a relative viewing location to move the starting location for the view along the combined viewing orientation, giving the appearance of moving behind or in front of the character or vehicle.
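The relative viewing location described above can be sketched as an offset along the combined viewing direction. The yaw-only representation is an assumption for brevity:

```python
import math

def view_position(base_location, combined_yaw, relative_offset):
    """Move the eye point along the combined viewing direction.
    Negative offsets place the view behind the character or vehicle
    (a third-person vantage); positive offsets move it in front."""
    return [base_location[0] + relative_offset * math.sin(combined_yaw),
            base_location[1],
            base_location[2] + relative_offset * math.cos(combined_yaw)]
```
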
The user can control a relative viewing orientation in the above example by an input device such as a joystick. Another joystick can be used to control the apparent motion (the base viewing location and orientation) through the multidimensional space. Separate control of base and relative views is also depicted in Davidson
As another example, a reference frame can comprise a representation of a vehicle 1001, as shown schematically in
The particular sizes and equipment discussed above are cited merely to illustrate particular embodiments of the invention. It is contemplated that the use of the invention may involve components having different sizes and characteristics. It is intended that the scope of the invention be defined by the claims appended hereto.
This application claims priority as a continuation-in-part of U.S. patent application Ser. No. 09/785,696 (“Anderson III”), filed on Feb. 16, 2001, incorporated herein by reference, which claimed the benefit of U.S. Provisional Application 60/202,448 (“Anderson II”), filed on May 6, 2000, incorporated herein by reference, and was a continuation-in-part of U.S. patent application Ser. No. 08/834,642 (“Anderson I”) and Ser. No. 08/834,616 (“Davidson”), now U.S. Pat. No. 6,208,349, each of which was filed on Apr. 14, 1997, each of which is incorporated herein by reference.
Number | Date | Country
---|---|---
60202448 | May 2000 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 09785696 | Feb 2001 | US
Child | 11244584 | Oct 2005 | US
Parent | 08834642 | Apr 1997 | US
Child | 11244584 | Oct 2005 | US
Parent | 08834616 | Apr 1997 | US
Child | 11244584 | Oct 2005 | US