The disclosures herein relate generally to input devices, and more particularly, to input devices for information handling systems (IHSs).
Information handling systems (IHSs) process, transfer, manipulate, communicate, compile, store or otherwise handle information. IHSs include, but are not limited to, mainframes, minicomputers, microcomputers, nanocomputers, desktop computers, portable computers, laptop computers, notebook computers, personal digital assistants (PDAs), servers, networked systems, telephone devices, communication devices, and microcontroller systems.
An input device typically couples to an IHS to provide input information thereto. Many different types of input devices can provide position information to IHSs. For example, the conventional computer mouse that moves laterally on a flat surface can provide position information in two dimensions, namely the x and y axes. A tablet input device also provides x and y coordinate information to the IHS when a user moves a stylus in the x and y plane of the tablet. Joystick input devices also provide position information to IHSs. For example, a typical analog joystick input device provides pitch information when the user moves the joystick from front to back and from back to front. The analog joystick input device also provides yaw information when moved from side to side, i.e., from left to right and from right to left. Game controller input devices are also known that include four buttons arranged so that the user can move a cursor on a display from left to right, from right to left, or backward and forward, somewhat like a joystick.
The mouse, tablet and joystick discussed above are examples of input devices that employ an actuated control mode because these devices translate the position of an actuator (e.g., joystick, stylus/tablet) into a corresponding effect in virtual space. Input devices are also available that employ a direct (kinematic) control mode. In direct control mode input devices, the position in virtual space is a direct function of the position coordinates of the input device itself in real space. A virtual glove is one example of a direct control mode input device. When a user wears a virtual glove input device, movement of the virtual glove by the user in real space causes movement of a locus in virtual space. Unfortunately, the user may have difficulty moving the virtual glove into some locations, for example under an object such as a chair or other difficult-to-reach location. The user may experience further difficulty in moving the virtual glove to some locations because the virtual glove may be tethered to a computer, which limits motion of the virtual glove.
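As a purely illustrative aside (not part of the original disclosure), the distinction between the two control modes can be sketched in a few lines of Python; all names below are hypothetical stand-ins.

```python
# Illustrative sketch only: contrasts actuated vs. direct (kinematic) control.

def actuated_update(virtual_pos, stick_deflection, gain=0.05):
    """Actuated mode (joystick-like): actuator deflection is integrated
    into virtual motion, so the virtual position is not tied to where
    the device itself sits in real space."""
    x, y = virtual_pos
    dx, dy = stick_deflection  # e.g., each in [-1.0, 1.0]
    return (x + gain * dx, y + gain * dy)

def direct_update(real_pos):
    """Direct (kinematic) mode (virtual-glove-like): the virtual position
    is a direct function of the device's real-space coordinates."""
    return real_pos  # the identity mapping is the simplest direct function
```

In the actuated case the device can sit still while the virtual position keeps moving; in the direct case the virtual position moves only when the device itself does.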
What is needed is a method and apparatus that addresses the problems discussed above.
Accordingly, in one embodiment, a method is disclosed for operating an input device to provide position information that includes both location information and spatial orientation information of the input device. The method includes determining, by a location sensor in the input device, the absolute location of the input device in real space, thus providing the location information. The method also includes determining, by a spatial orientation sensor in the input device, the spatial orientation of the input device in real space, thus providing the spatial orientation information. The method further includes processing, by a processor, the location information and the spatial orientation information of the input device in real space to determine an image view from the perspective of the input device in virtual space. In one embodiment, the input device provides location information which defines the location of the input device in an x, y, z coordinate system. In another embodiment, the input device provides spatial orientation information that defines the spatial orientation of the input device in terms of yaw, pitch and roll.
In another embodiment, an input device is disclosed that provides position information including location information and spatial orientation information of the input device. The input device includes a location sensor that determines the absolute location of the input device in real space to provide the location information. The input device also includes a spatial orientation sensor that determines the spatial orientation of the input device in real space to provide the spatial orientation information. The input device further includes a processor, coupled to the location sensor and the spatial orientation sensor, that processes the location information and the spatial orientation information of the input device in real space to determine an image view from the perspective of the input device in a virtual space.
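The processing step that both embodiments describe, combining the (x, y, z) location with yaw, pitch and roll to produce a view from the device's perspective, can be sketched as follows. This is a minimal illustration rather than the disclosed implementation; the rotation order, the axis conventions and the use of a 4x4 view matrix are assumptions layered onto the text.

```python
import numpy as np

def view_matrix(x, y, z, yaw, pitch, roll):
    """Build a 4x4 view matrix for a hypothetical observer at (x, y, z)
    oriented by yaw (about z), pitch (about x) and roll (about y), the
    axis conventions suggested later in the disclosure. Angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch
    Ry = np.array([[cr, 0, sr], [0, 1, 0], [-sr, 0, cr]])   # roll
    R = Rz @ Rx @ Ry                 # assumed composition order
    view = np.eye(4)
    view[:3, :3] = R.T               # inverse rotation for a view matrix
    view[:3, 3] = -R.T @ np.array([x, y, z])
    return view
```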
The appended drawings illustrate only exemplary embodiments of the invention and therefore do not limit its scope because the inventive concepts lend themselves to other equally effective embodiments.
In one embodiment display device 105 couples to a server system 115 so that server system 115 can augment the local image processing abilities of display device 105 to display the view specified by the position information it receives from input device 100. More particularly, connector 145 couples a processor 140 of input device 100 to display device 105. Connector portion 145A of input device 100 mates with connector portion 145B of display device 105 to achieve this coupling. Server system 115 receives the position information that display device 105 receives from input device 100. Server system 115 renders or manipulates the real time position information into image information representative of the real time view seen by a hypothetical observer located on input device 100. Server system 115 supplies the image information to display device 105 for viewing by the user. Other embodiments are possible wherein display device 105 includes a processor having sufficient computational power to perform image processing or image rendering locally rather than offloading that function to server system 115. Yet another embodiment is possible wherein input device 100 includes sufficient computational processing power to perform the above described image processing as will be discussed below in more detail with reference to
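The disclosure does not specify how position information travels to server system 115 or how image information returns; the sketch below assumes a simple TCP exchange with a length-prefixed image reply, purely for illustration.

```python
import socket
import struct

POSE_FMT = "<6f"  # x, y, z, yaw, pitch, roll as little-endian floats (assumed)

def _recv_exact(sock, n):
    """Read exactly n bytes from the socket or raise."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed the connection early")
        buf += chunk
    return buf

def request_view(server_addr, pose):
    """Send the current pose to the rendering server and read back a
    length-prefixed image buffer for the local display."""
    with socket.create_connection(server_addr) as sock:
        sock.sendall(struct.pack(POSE_FMT, *pose))
        (length,) = struct.unpack("<I", _recv_exact(sock, 4))
        return _recv_exact(sock, length)
```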
Input device 100 includes a printed circuit board 120 which couples a location sensor 125, a heading sensor 130, and a tilt sensor 135 to a processor 140. In this particular embodiment, input device 100 employs a Model PIC16F628 microcontroller made by Microchip Technology Inc. as processor 140, although input device 100 may employ other microcontrollers and processors as well. Processor 140 mounts on printed circuit board 120 as shown. Location sensor 125, such as a Global Positioning System (GPS) receiver, determines the x, y and z location coordinates of input device 100 and provides this location information to processor 140. GPS receiver 125 thus keeps processor 140 informed of the current absolute position of input device 100 in real time. In one embodiment, GPS receiver 125 determines the x and y coordinates and ignores the z coordinate. In such an embodiment, input device 100 assumes that it is located at a fixed height, z, above the xy plane. In other words, in this simplified embodiment, GPS receiver 125 provides the absolute location information of input device 100 relative to the xy plane as defined in
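The simplified fixed-height embodiment reduces to discarding the z component of each GPS fix; a minimal sketch follows, with the height constant an assumed value.

```python
FIXED_HEIGHT = 1.2  # assumed constant height above the xy plane, in meters

def simplified_location(gps_fix):
    """Simplified embodiment: keep the GPS x and y coordinates but
    discard z, substituting an assumed fixed height."""
    x, y, _z = gps_fix
    return (x, y, FIXED_HEIGHT)
```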
Heading sensor 130 determines the current absolute heading or direction of input device 100 in real time. In other words, heading sensor 130 determines the direction that input device 100 currently points in real time. Heading sensor 130 provides absolute heading information to processor 140 in one embodiment. The Model HMC6352 digital compass manufactured by Honeywell produces acceptable results when employed as heading sensor 130.
Tilt sensor 135 determines the pitch and roll of input device 100 in real time. In other words, tilt sensor 135 determines when the user pitches input device 100 up and down. Tilt sensor 135 also determines when the user rolls input device 100 clockwise to the right or counterclockwise to the left. Tilt sensor 135 provides pitch information and roll information to processor 140 in real time. Pitch information and roll information are types of spatial orientation information. The Model ADXL202E accelerometer manufactured by Analog Devices, Inc. produces acceptable results when employed as tilt sensor 135. This particular accelerometer is a dual axis accelerometer. Input device 100 employs one axis of dual axis tilt sensor 135 to measure positive and negative pitch, namely the tilt that input device 100 exhibits when the user tilts input device 100 upward and downward. Input device 100 employs the remaining axis of dual axis tilt sensor 135 to measure roll, namely the tilt that input device 100 exhibits when the user tilts input device 100 clockwise or counterclockwise. In one embodiment, input device 100 ignores the roll information that tilt sensor 135 provides.
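The disclosure does not give the arithmetic for turning sensor readings into angles. One common approach, assumed here rather than taken from the text, treats each accelerometer axis reading as the projection of gravity onto that axis (which axis maps to pitch versus roll is also an assumption) and converts the compass heading directly to yaw.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tilt_from_accelerometer(ax, ay):
    """Derive pitch and roll (radians) from a dual-axis accelerometer,
    treating each axis reading as the projection of gravity onto that
    axis. Readings are clamped to +/- 1 g so asin() stays defined."""
    pitch = math.asin(max(-1.0, min(1.0, ax / G)))
    roll = math.asin(max(-1.0, min(1.0, ay / G)))
    return pitch, roll

def yaw_from_heading(heading_deg):
    """Convert a compass heading in degrees (0 = north, clockwise
    positive) to a yaw angle in radians."""
    return math.radians(heading_deg)
```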
However, input device 200 integrates many of these elements in a common housing.
In this coordinate system, yaw defines rotational motion about the z axis, pitch defines rotational motion about the x axis, and roll defines rotational motion about the y axis. When the user moves input device 200 in the xy plane, GPS receiver 125 determines the coordinates of input device 200 in the xy plane. In one embodiment, GPS receiver 125 also provides z axis information with respect to input device 200. In this manner, GPS receiver 125 provides the absolute position of input device 200 to processor 140. When the user rotates input device 200 to the right in the xy plane, heading sensor 130 detects this as positive yaw. When the user rotates input device 200 to the left in the xy plane, heading sensor 130 detects this as negative yaw. However, when the user rotates or tilts input device 200 upward in the yz plane, tilt sensor 135 detects this as positive pitch. Conversely, if the user rotates or tilts input device 200 downward in the yz plane, tilt sensor 135 detects this as negative pitch. When the user rotates input device 200 about the y axis in a clockwise direction, tilt sensor 135 detects this as positive roll. However, when the user rotates input device 200 about the y axis in a counterclockwise direction, tilt sensor 135 detects this action as negative roll. Processor 140 receives all of this position information, namely the x, y, z location information and the yaw, pitch and roll spatial orientation information, as a serial data stream. Display device 105 displays an image in virtual space that corresponds to the location and spatial orientation of input device 200 in real space.
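Processor 140 is said to receive the position information as a serial data stream, but no framing format is specified. The encoding below, a sync byte, six little-endian floats and an XOR checksum, is one hypothetical format offered for illustration only.

```python
import struct

FRAME_FMT = "<B6fB"  # sync byte, six floats, checksum -- hypothetical framing
SYNC = 0xA5

def encode_pose(x, y, z, yaw, pitch, roll):
    """Frame one pose sample for the serial link: a sync byte, six
    little-endian floats, and a simple XOR checksum over the payload."""
    payload = struct.pack("<6f", x, y, z, yaw, pitch, roll)
    checksum = 0
    for b in payload:
        checksum ^= b
    return struct.pack("<B", SYNC) + payload + struct.pack("<B", checksum)

def decode_pose(frame):
    """Parse a frame produced by encode_pose(), validating the checksum."""
    sync, *pose, checksum = struct.unpack(FRAME_FMT, frame)
    expected = 0
    for b in frame[1:-1]:
        expected ^= b
    if sync != SYNC or checksum != expected:
        raise ValueError("bad frame")
    return tuple(pose)
```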
As seen in
In one embodiment, input device 300 couples by wire or wirelessly to an external IHS 355. In such a configuration, input device 300 acts as a location and spatial orientation sensing device for IHS 355. IHS 355 includes a display (not shown) that displays the rendered image received from input device 300.
IHS input device 300 loads application software 360 from nonvolatile storage 330 to memory 315 for execution. The particular application software 360 loaded into memory 315 of IHS input device 300 determines the operational characteristics of input device 300. In one embodiment, application software 360 controls the processing of the location and spatial orientation information that input device 300 receives from location sensor 341, heading sensor 342 and tilt sensor 343 as discussed in more detail below with reference to the flowchart of
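A hedged sketch of the overall control flow that application software 360 might implement follows, reusing the angle helpers from the earlier sketches; the sensor objects, polling rate and publish callback are hypothetical stand-ins, not details from the disclosure.

```python
import time

def run_input_device(location_sensor, heading_sensor, tilt_sensor,
                     publish_pose, hz=30):
    """Hypothetical main loop: poll the location, heading, and tilt
    sensors, fuse their outputs into one pose tuple, and publish it
    for rendering (locally or on a server)."""
    period = 1.0 / hz
    while True:
        x, y, z = location_sensor.read()                 # GPS fix
        yaw = yaw_from_heading(heading_sensor.read())    # compass heading
        pitch, roll = tilt_from_accelerometer(*tilt_sensor.read())
        publish_pose((x, y, z, yaw, pitch, roll))
        time.sleep(period)
```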
In one embodiment, input device 400 may be configured as a personal digital assistant (PDA) that provides a virtual view from a particular location to allow a user to effectively see at night, in fog, through water or from a higher elevation than the user's current location. In another application, input device 400 may provide orientation, tilt and/or location information as input to a gaming device.
Those skilled in the art will appreciate that the methodology disclosed, such as seen in the flowchart of
In one embodiment, the disclosed methodology is implemented as an application 360, namely a set of instructions (program code) in code modules that may, for example, be resident in the system memory 315 of system 400 of
The foregoing discloses a method and apparatus that, in one embodiment, determines a virtual position, virtual orientation and virtual velocity as a direct function of the real position coordinates, orientation and velocity of the input device itself. One embodiment of the input device enables a user to move the input device in real time and space to effect the desired virtual movement independent of the user's hand position on the input device. This allows the user to move the input device in a fashion that can provide an alternative and independent perspective that is not generally achievable with some input devices such as a glove type input device, for example. In one embodiment, the disclosed input device is more intuitive than a joystick or other type of actuated controller. For example, a user can move the input device in real space to a position which corresponds to a space below a chair in virtual space displayed on the input device's display. This creates a “bug's eye view” of a chair leg, a position which is very awkward for a virtual glove and cognitively challenging with a joystick actuator. In one embodiment, the input device itself maps its own motions in 3D real space to 3D virtual space that displays on the input device's own on-board display.
Modifications and alternative embodiments of this invention will be apparent to those skilled in the art in view of this description of the invention. Accordingly, this description teaches those skilled in the art the manner of carrying out the invention and is intended to be construed as illustrative only. The forms of the invention shown and described constitute the present embodiments. Persons skilled in the art may make various changes in the shape, size and arrangement of parts. For example, persons skilled in the art may substitute equivalent elements for the elements illustrated and described here. Moreover, persons skilled in the art after having the benefit of this description of the invention may use certain features of the invention independently of the use of other features, without departing from the scope of the invention.