The importance to the consumer electronics industry of continuously striving to produce devices that are convenient to use cannot be overstated. No doubt this is one of the reasons devices are made with more storage capacity, more processing capacity, and more user options. For example, the functionality of one or more devices such as digital televisions, digital video disk (DVD) players, video cassette recorder (VCR) players, compact disk (CD) players, set-top boxes, stereo receivers, media centers, personal video recorders (PVRs), and so forth, may be combined into one device having combined functionality.
Convenience of use for such a device having combined functionality may decrease if the graphical user interface (GUI) for that device contains too many selections to use conveniently with a typical remote control. For example, a typical remote control used today for interactive televisions has a number of color-coded buttons for navigating and selecting among many options. Due to the limited ability to navigate and select, many button pushes are often required and/or multiple screens are presented to the user. The many button pushes and/or multiple screens are often too much information for the user to remember over time. An additional constraint of the typical remote control is the so-called “10-foot” user interface, in which the user must operate the on-screen interface from roughly ten feet away.
The invention may be best understood by referring to the following description and accompanying drawings that are used to illustrate embodiments of the invention. In the drawings:
A method and system for activating a graphical user interface (GUI) via a laser beam are described. Here, at least some of the problems described above with devices having increased functionality may be alleviated by allowing a user to interact with a GUI displayed on a screen of such a device by using a laser beam to activate the GUI. In an embodiment of the invention, a laser pointer may be incorporated into the remote control of the device. In the following description, for purposes of explanation, numerous specific details are set forth. It will be apparent, however, to one skilled in the art that embodiments of the invention can be practiced without these specific details.
In the following detailed description of the embodiments, reference is made to the accompanying drawings that show, by way of illustration, specific embodiments in which the invention may be practiced. In the drawings, like numerals describe substantially similar components throughout the several views. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the present invention.
The example GUI 100 shown in
For example, via program selections 102, the user may select to view his or her options regarding cable programs 102a, recorded programs 102b, satellite programs 102c and pay-per-view programs 102d. Via music selections 104, the user may select from AM radio 104a, FM radio 104b, satellite radio 104c and CDs 104d. The user, via picture selections 106, may view family pictures 106a, vacation pictures 106b and work-related pictures 106c. Via home appliance control selections 108, the user may control his or her thermostat via thermostat control 108a, turn on or off the building lights via lights control 108b, lock or unlock the doors via door lock control 108c, lock or unlock the windows via window lock control 108d, control the alarm system via alarm system control 108e and control the pool features via control 108f. Audio may also be controlled by the user via speaker control selections 110 and may include media room speaker control 110a, pool area speaker control 110b and library speaker control 110c. In an embodiment of the invention, GUI 100 may also include a “back” option or “abort” option that the user may activate to go back to the previous GUI (if applicable).
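By way of illustration only, the hierarchy of selections described above could be modeled as a simple nested mapping; the Python structure below (including its key names) is a hypothetical sketch, not a required layout of GUI 100.

```python
# Hypothetical sketch of the GUI 100 selection hierarchy as a nested mapping.
# The keys mirror the selections described above; the structure itself is an
# illustrative assumption, not a required implementation.
GUI_100_SELECTIONS = {
    "programs": ["cable", "recorded", "satellite", "pay-per-view"],       # 102a-102d
    "music": ["AM radio", "FM radio", "satellite radio", "CDs"],          # 104a-104d
    "pictures": ["family", "vacation", "work-related"],                   # 106a-106c
    "home appliances": ["thermostat", "lights", "door locks",
                        "window locks", "alarm system", "pool"],          # 108a-108f
    "speakers": ["media room", "pool area", "library"],                   # 110a-110c
}
```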
Referring to
In an embodiment of the invention, housing unit 204 may house a projector 208, a processor 210, a GUI module 212, a laser beam detector 214 and a laser beam processing module 216. Other embodiments of the invention may include more or fewer components than those described in
At a high level and in an embodiment of the invention, laser beam detector 214 detects a laser beam directed at screen 202. Laser beam detector 214 is directed at the back of screen 202 and detects the laser beam as it goes through screen 202. Once a laser beam is detected, laser beam detector 214 waits for a period of time and continues to scan screen 202 to ensure that the user is actually trying to interact with the GUI displayed on screen 202. Laser beam processing module 216 then calculates the position of the laser beam on screen 202. If module 216 can determine the position of the laser beam on screen 202, then the position of the laser beam is sent to processor 210 and GUI module 212 to process the selection or interaction with the GUI in a normal fashion.
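A minimal sketch of this high-level flow, in Python, is shown below. The callables scan_screen, locate_beam, and dispatch_selection are hypothetical stand-ins for laser beam detector 214, laser beam processing module 216, and processor 210/GUI module 212, and the dwell time is an assumed value.

```python
import time

def run_detection_loop(scan_screen, locate_beam, dispatch_selection,
                       dwell_seconds=0.5):
    """Illustrative loop: detect a laser spot, wait to confirm intent,
    compute its screen position, and hand the position to the GUI logic.
    All three callables are hypothetical stand-ins for the components above."""
    while True:
        frame = scan_screen()             # raw view of the back of the screen
        if locate_beam(frame) is None:
            continue                      # no laser spot in this frame
        time.sleep(dwell_seconds)         # wait to confirm the user means it
        frame = scan_screen()             # re-scan after the dwell period
        position = locate_beam(frame)     # (x, y) in screen pixels, or None
        if position is not None:
            dispatch_selection(position)  # processor/GUI module handle it
```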
Screen 202 may display a GUI, such as GUI 100 of
In an embodiment of the invention, laser pointer 206 may be a typical laser pointer that is well known in the art. In another embodiment, laser pointer 206 may represent a remote control that incorporates laser beam technology. In this embodiment, the remote control with laser beam technology may also incorporate typical remote control buttons and/or functionality. For example, one or more control buttons on the remote control may be implemented as a hard button or switch. One or more control buttons on the remote control may also be implemented as a soft button, for example, implemented via a liquid crystal display (LCD) touch screen on the remote control. These example implementations and/or functions of laser pointer 206 are provided as illustrations only and are not meant to limit the invention.
In an embodiment of the invention, projector 208 may be a typical projector that is well known in the art and used for rear projection televisions. Projector 208 may display objects on screen 202 as directed by processor 210. Processor 210 interacts with GUI module 212 to display one or more GUIs on screen 202 to use when interacting with rear projection device 200.
Laser beam detector 214 detects a laser beam from the rear of screen 202 as the laser beam is projected onto screen 202 via laser pointer 206. In an embodiment of the invention, laser beam detector 214 may be a video camera that views screen 202 and measures the narrow frequency band of laser light in a raster scan over screen 202. In an embodiment of the invention, laser beam detector 214 is mounted inside of device 200 to get the best view of screen 202. Detector 214 may also be off-axis to projector 208 and the raw images captured by detector 214 may be warped through graphic transforms to account for the warping effect of laser beam detector 214 being off-axis. In an embodiment of the invention, the position of the laser beam is measured in x/y pixel locations relative to screen 202 and is processed by laser beam processing module 216 as is described in more detail with reference to
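One plausible form of the graphic transform mentioned above is a planar homography that maps off-axis camera pixels to x/y pixel locations relative to the screen. The sketch below assumes a 3x3 homography matrix has already been obtained by calibration (e.g., from known points on screen 202); the identity matrix shown is only a placeholder.

```python
import numpy as np

# Hypothetical calibrated homography mapping camera pixels -> screen pixels.
# In practice this would come from a one-time calibration of detector 214
# against known points on screen 202; the identity here is a placeholder.
H = np.eye(3)

def camera_to_screen(x_cam, y_cam, homography=H):
    """Warp a detected laser-spot position from the off-axis camera view
    into x/y pixel coordinates relative to the screen."""
    p = homography @ np.array([x_cam, y_cam, 1.0])
    return float(p[0] / p[2]), float(p[1] / p[2])  # perspective divide

# Example: with the identity placeholder, coordinates pass through unchanged.
print(camera_to_screen(320.0, 240.0))  # -> (320.0, 240.0)
```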
In another embodiment of the invention, laser beam detector 214 may also be embedded in screen 202 and implemented as a photo sensor (e.g., a photo diode or photo transistor array). Here, screen 202 may be an LCD or plasma screen. The photo sensor may be “deposited” onto the screen directly, and the x/y position of the laser beam may be detected by virtue of the array itself.
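For the embedded photo-sensor embodiment, the x/y position may be read directly from the array. The sketch below assumes the sensor readings are available as a two-dimensional intensity array and that the brightest cell above a threshold corresponds to the laser spot; both assumptions are illustrative.

```python
import numpy as np

def spot_from_sensor_array(readings, threshold=0.8):
    """Return the (x, y) index of the brightest photo-sensor cell, or None
    if nothing exceeds the threshold. 'readings' is a hypothetical 2D array
    of normalized sensor intensities indexed as [y][x]."""
    readings = np.asarray(readings, dtype=float)
    y, x = np.unravel_index(np.argmax(readings), readings.shape)
    return (int(x), int(y)) if readings[y, x] >= threshold else None
```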
Referring to
At a high level and in an embodiment of the invention, laser beam detector/processing module 308 detects a laser beam directed at screen 302. Laser beam detector/processing module 308 is directed at the front of screen 302 and detects the laser beam as it is reflected off of screen 302. Once a laser beam is detected, laser beam detector/processing module 308 waits for a period of time and continues to scan screen 302 to ensure that the user is actually trying to interact with the GUI displayed on screen 302. Laser beam detector/processing module 308 then calculates the position of the laser beam on screen 302. If module 308 can determine the position of the laser beam on screen 302, then the position of the laser beam is sent to projector 304 to process the selection or interaction with the GUI in a normal fashion.
As with screen 202 of
In an embodiment of the invention, laser beam detector/processing module 308 may include all of the functionality of laser beam detector 214 and laser beam processing module 216 as described above. In an embodiment of the invention, laser beam detector/processing module 308 may be a video camera that is mounted to projector 304 to get the best view of screen 302. Laser beam detector/processing module 308 may also be off-axis to projector 304 and the raw images captured may be warped through graphic transforms to account for the warping effect of laser beam detector/processing module 308 being off-axis. Laser beam detector/processing module 308 may also be embedded in screen 302 and implemented as a photo sensor, as described above with reference to laser beam detector 214.
Operations for the above components described in
In processing block 404, once a laser beam is detected, the laser beam detector receives two or more raw images of the screen by performing raster scans of the screen. The two or more raw images are received over a period of time to ensure that the user is actually trying to interact with the GUI displayed on the screen. The laser beam processing module (such as module 216 in
At decision block 408, if enough noise can be eliminated from the raw images (i.e., the laser beam processing module can determine one consistent position on the screen) then the process continues at block 412 in
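A sketch of one way to implement the consistency check of decision block 408 is shown below, under the assumption that each raw image yields a single candidate (x, y) position and that "one consistent position" means the candidates agree to within a small pixel tolerance; the tolerance value is an illustrative assumption.

```python
def consistent_position(candidates, tolerance=10.0):
    """Given candidate (x, y) laser positions from two or more raw images,
    return their average if they all lie within 'tolerance' pixels of it
    (the noise was eliminated), or None if they scatter (ignore the beam)."""
    if len(candidates) < 2:
        return None
    cx = sum(x for x, _ in candidates) / len(candidates)
    cy = sum(y for _, y in candidates) / len(candidates)
    if all((x - cx) ** 2 + (y - cy) ** 2 <= tolerance ** 2 for x, y in candidates):
        return (cx, cy)
    return None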
At processing block 412 in
In an embodiment of the invention, laser pointers 206 and 306 (
In processing block 704, once a laser beam is detected, the laser beam detector receives two or more raw images of the screen by performing raster scans of the screen. The two or more raw images are received over a period of time to ensure that the user is actually trying to interact with the screen and to capture enough raw images of the laser beam to combine into a gesture command. The laser beam processing module (such as module 216 in
At decision block 708, if enough noise can be eliminated from the raw images (i.e., the laser beam processing module can determine position(s) on the screen) then the process continues at block 709. Otherwise, the process continues at block 710 where the laser beam is ignored. The process goes back to processing block 702 where the laser beam detector views the screen for the next laser beam.
At processing block 709, the laser beam processing module combines the two or more raw images to produce a combined raw image. At processing block 712 in
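A sketch of the combining step at block 709, assuming each raw image has been reduced to a binary map of detected laser-spot pixels and that combining amounts to superimposing those maps into a single image of the gesture stroke:

```python
import numpy as np

def combine_raw_images(binary_maps):
    """Superimpose two or more binary laser-spot maps (2D arrays of 0/1)
    into a single combined image containing the full gesture stroke.
    Treating 'combine' as a pixel-wise OR is an illustrative assumption."""
    combined = np.zeros_like(np.asarray(binary_maps[0]), dtype=bool)
    for m in binary_maps:
        combined |= np.asarray(m, dtype=bool)
    return combined
```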
If a gesture command has been performed in decision block 714, then the gesture command is sent to the processor to display the appropriate GUI on the screen or to execute the appropriate command at processing block 716. Otherwise, at processing block 718, a message is displayed on the screen that informs the user that an invalid gesture command has been drawn on the screen. In either event, the process then continues at block 702 (
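A minimal sketch of the dispatch performed at blocks 714 through 718, where recognize_gesture, execute_command, and show_message are hypothetical stand-ins for the gesture matching, the processor, and the on-screen message:

```python
def handle_gesture(combined_image, recognize_gesture, execute_command,
                   show_message):
    """If the combined image matches a known gesture command, execute it
    (block 716); otherwise inform the user the gesture is invalid (block 718).
    All callables are hypothetical stand-ins for the components above."""
    command = recognize_gesture(combined_image)
    if command is not None:
        execute_command(command)
    else:
        show_message("Invalid gesture command drawn on the screen.")
```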
In an embodiment of the invention, the screen of a device may be divided into two areas. One area of the screen is used to display an active GUI and the other area is used for gesture commands. Here, one laser beam detector scans the area with the active GUI for a laser beam and a second laser beam detector scans the area of the screen used for gesture commands for a laser beam. The side of the screen used for the active GUI is processed according to
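A sketch of this split-screen routing, under the assumption that the screen is divided at a fixed x coordinate, with the active GUI on one side and the gesture area on the other:

```python
def route_laser_position(position, gui_area_width, handle_gui_selection,
                         handle_gesture_input):
    """Route a detected (x, y) laser position to the active-GUI handler or the
    gesture-command handler depending on which side of the screen it falls in.
    The fixed vertical split at gui_area_width is an illustrative assumption."""
    x, _ = position
    if x < gui_area_width:
        handle_gui_selection(position)   # processed per the active-GUI flow
    else:
        handle_gesture_input(position)   # processed per the gesture flow
```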
Embodiments of the present invention may be implemented in software, firmware, hardware or by any combination of various techniques. For example, in some embodiments, the present invention may be provided as a computer program product or software which may include a machine or computer-readable medium having stored thereon instructions which may be used to program a computer (or other electronic devices) to perform a process according to the present invention. In other embodiments, steps of the present invention might be performed by specific hardware components that contain hardwired logic for performing the steps, or by any combination of programmed computer components and custom hardware components.
Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). These mechanisms include, but are not limited to, a hard disk, floppy diskettes, optical disks, Compact Disc Read-Only Memories (CD-ROMs), magneto-optical disks, Read-Only Memories (ROMs), Random Access Memory (RAM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), magnetic or optical cards, flash memory, a transmission over the Internet, electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.) or the like.
Some portions of the detailed descriptions above are presented in terms of algorithms and symbolic representations of operations on data bits within a computer system's registers or memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to convey the substance of their work to others skilled in the art most effectively. An algorithm is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the above discussions, it is appreciated that discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or the like, may refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.