The present application relates generally to remote controls (RC) for controlling audio video display devices (AVDD) such as TVs.
Modern TVs such as the Sony Bravia (trademark) present native user interfaces (UI) to allow viewers to select an audio video (AV) input source, to launch non-broadcast TV applications such as video telephone applications (e.g., Skype), and so on. As understood herein, many viewers of TVs may prefer to access application-based UIs, with which many viewers may be as familiar as, or more familiar than, native TV UIs, and which increase a viewer's range of choices by allowing the user to view application-based content such as Internet video.
In any case, users continue to expect to control TVs using remote controls (RC). Conventionally, user input to consumer electronics products is provided mainly through buttons and a mouse, except on devices with touch screens. As understood herein, however, user gestures and touch input are a convenient, easy, and intuitive way for a user to provide input, particularly for entertainment devices such as TVs, set top boxes (STB), and application-supporting devices without touch screens. Because such devices are not hand held, they typically do not have touch screens but do have remote controls.
A remote control (RC) for a video display device (VDD) uses touch gestures for ease of operation of entertainment devices. Absolute touches are used, in which a track pad area of the RC is mapped to a screen area of the VDD so that the track pad simulates the screen display (touch screen) for the user, allowing the user to touch specific areas on the screen by touching the corresponding areas on the track pad. Touch inputs such as taps, presses, etc. are sent to the VDD, and the VDD processes the inputs as if they came from the (non-touch) display of the VDD.
Additionally, various gestures can be derived based on movement of a user finger over the RC touch pad and can be mapped to various events depending on the application involved.
Accordingly, a remote control (RC) includes a portable hand held housing, a touch sensitive surface on the housing, and a processor in the housing communicating with the surface. A wireless transmitter is controlled by the processor. A computer readable storage medium is accessible to the processor and bears instructions executable by the processor to configure the processor to receive a signal representing a touch on the surface, and determine a type of touch based on the signal representing a touch on the surface. The processor determines a location of the touch on the surface and transmits a signal representing the type of touch and the location of the touch to a video device.
The location can be a geometric location on the surface, and specifically can be a location on a matrix grid system, and the signal sent to the display device indicates the geometric location. The type of touch may be a tap, a click characterized by greater finger pressure on the surface than a tap, a double tap, a long push characterized by pressure against an area of the surface for a period exceeding a threshold period, or a pinch.
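By way of non-limiting illustration, the touch-type determination just described might be sketched as follows; the threshold values and function name below are hypothetical and are not taken from the application itself:

```python
# Hypothetical sketch of touch-type classification on the RC processor.
# All threshold values are illustrative assumptions, not from the application.

TAP_MAX_DURATION = 0.2      # seconds; brief touches are taps
LONG_PUSH_THRESHOLD = 1.0   # seconds; the "threshold period" for a long push
CLICK_MIN_PRESSURE = 0.7    # normalized pressure distinguishing a click from a tap
DOUBLE_TAP_WINDOW = 0.3     # seconds between taps to count as a double tap

def classify_touch(duration, pressure, num_contacts, time_since_last_tap):
    """Return the type of touch based on the raw touch signal."""
    if num_contacts >= 2:
        return "pinch"          # two fingers moving toward or apart
    if duration >= LONG_PUSH_THRESHOLD:
        return "long push"      # pressure held beyond the threshold period
    if pressure >= CLICK_MIN_PRESSURE:
        return "click"          # greater finger pressure than a tap
    if time_since_last_tap is not None and time_since_last_tap <= DOUBLE_TAP_WINDOW:
        return "double tap"
    return "tap"
```

In such a sketch the classification runs on the RC processor, and only the resulting type (together with the touch location) need be transmitted to the video device.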
In another aspect, a remote control (RC) includes a portable hand held housing, a touch sensitive surface on the housing, and a processor in the housing communicating with the surface. A wireless transmitter is controlled by the processor. A computer readable storage medium is accessible to the processor and bears instructions executable by the processor to configure the processor to send touch-generated signals to a video device. The housing supports, in addition to the touch sensitive surface, a navigation rocker manipulable to move a screen cursor up, down, left, and right, a home key, a play key, a pause key, and a guide key.
In another aspect, a remote control (RC) has a touch pad, and user touches on the pad are correlated to pad positions. The positions are sent to a remote display device and mapped to corresponding locations on the display of the display device as though the user were touching the display of the display device rather than the touch pad of the RC.
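The absolute pad-to-display mapping described above amounts to a simple proportional scaling of coordinates. A minimal sketch, with purely illustrative pad and screen dimensions:

```python
def pad_to_screen(pad_x, pad_y, pad_w, pad_h, screen_w, screen_h):
    """Map an absolute touch position on the RC track pad to the
    geometrically equivalent position on the display screen."""
    screen_x = pad_x * screen_w / pad_w   # scale horizontally
    screen_y = pad_y * screen_h / pad_h   # scale vertically
    return (round(screen_x), round(screen_y))

# A touch at the center of a 100x60 pad lands at the center of a 1920x1080 screen:
# pad_to_screen(50, 30, 100, 60, 1920, 1080) -> (960, 540)
```

Because the mapping is absolute rather than relative, touching a given region of the pad always addresses the same region of the screen, which is what lets the pad stand in for a touch screen the display does not have.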
The details of the present invention, both as to its structure and operation, can be best understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:
Referring initially to the exemplary embodiment shown in
The device 16 also includes an audio-visual (A/V) interface 28 to communicate with other devices such as the game console 12 and disk player 14 in accordance with present principles. The A/V interface may be used, e.g., in a high definition multimedia interface (HDMI) context for communicating over an HDMI cable through an HDMI port on the display device 16 with, e.g., the game console 12. However, other A/V interface technologies may be used in lieu of or in conjunction with HDMI communication to implement/execute present principles, as may be appreciated by those within the art. For instance, e.g., cloud computing, IP networks, national electrical code (NEC) communication, coaxial communication, fiber optic communication, component video communication, video graphics array (VGA) communication, etc., may be used.
Still in reference to
Furthermore, it is to be understood that the processor 18 and processor 36, in addition to any other processors in the system 10 such as in the game console 12 and disk player 14, are capable of executing all or part of the logic discussed herein as appropriate to undertake present principles. Moreover, software code implementing present logic executable by, e.g., the processors 18 and 36 may be stored on one of the memories shown (the computer readable storage mediums 20 and 34) to undertake present principles.
Continuing in reference to
The RCs 42 and 44 have respective processors 46 and 48, respective computer readable storage mediums 50 and 52, and respective one or more input devices 54 and 56 such as, but not limited to, touch screen displays and/or cameras (for sensing user gestures on a touch surface or imaged by a camera that are then correlated to particular commands, such as scroll left/right and up/down, etc.), keypads, accelerometers (for sensing motion that can be correlated to a scroll command or other command), and microphones for voice recognition technology for receiving user commands. The RCs 42 and 44 also include respective transmitters/receivers 58 and 60 (referred to herein simply as transmitters 58 and 60 for convenience) for transmitting user commands, received through the input devices 54 and 56, under control of the respective processors 46 and 48.
It is to be understood that the transmitters 58 and 60 may communicate not only with transmitters on their associated devices via wireless technology such as RF and/or infrared (i.e. the transmitter 58 under control of the processor 46 may communicate with a transmitter 62 on the display device 16 and the transmitter 60 under control of the processor 48 may communicate with a transmitter 64 on the AVAM 30), but may also communicate with the transmitters of other devices in some embodiments. The transmitters 58 and 60 may also receive signals from either or both the transmitter 62 on the display device 16 and transmitter 64 of the AVAM 30. Thus, it is to be understood that the transmitters/receivers 58 and 60 allow for bi-directional communication between the remote commanders 42 and 44 and respective display device 16 and AVAM 30.
Now in reference to
Then, at block 74 the type of touch along with the location(s) of the touch on the pad are sent to a video device (VD) such as the display device 16 or AVAM 30. The location is a geometric location on the pad and in one implementation is a location on a matrix grid system, and the signal sent to the video device indicates the geometric location.
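One way such a matrix-grid signal could be formed, offered purely as an assumed encoding (the grid dimensions and field names below are hypothetical), is to quantize the pad position into a grid cell before transmission:

```python
def to_grid_cell(pad_x, pad_y, pad_w, pad_h, grid_cols, grid_rows):
    """Quantize an absolute pad position into a (row, col) cell of a
    matrix grid system; grid dimensions are illustrative only."""
    col = min(int(pad_x * grid_cols / pad_w), grid_cols - 1)   # clamp right edge
    row = min(int(pad_y * grid_rows / pad_h), grid_rows - 1)   # clamp bottom edge
    return (row, col)

def make_touch_signal(touch_type, pad_x, pad_y, pad_w=100, pad_h=60):
    """Build the signal representing the type and grid location of a touch,
    as sent wirelessly to the video device (field names are hypothetical)."""
    row, col = to_grid_cell(pad_x, pad_y, pad_w, pad_h, grid_cols=16, grid_rows=9)
    return {"type": touch_type, "row": row, "col": col}
```

Quantizing to a coarse grid keeps the over-the-air message small while still letting the video device resolve which on-screen region was addressed.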
Complementary logic that is executed by a video device receiving signals from the RC is shown in
Proceeding to block 80, based on the type of touch and geometrically equivalent display location, the video device correlates the touch signal to a command, which is executed at block 82 by the video device. Thus, for example, knowing a tap was received and knowing what selector element of a user interface corresponds to the geometrically equivalent display position determined at block 78, the video device knows what the user manipulating the RC and viewing the display intended to select by the touch, and by the nature (type) of the touch knows which one of potentially multiple commands, each associated with a type of touch, the user intended by the selection of the selector element.
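The correlation at blocks 78-82 can be pictured as two lookups on the video device side: first the selector element occupying the geometrically equivalent display location, then the command that element defines for the received touch type. The element names, commands, and grid layout below are hypothetical illustrations, not part of the application:

```python
# Hypothetical video-device-side tables. Each on-screen selector element
# occupies one or more grid cells; each (element, touch type) pair maps
# to a command. All names here are illustrative assumptions.
UI_ELEMENTS = {
    (4, 8): "play_button",
    (0, 0): "menu_icon",
}
COMMANDS = {
    ("play_button", "tap"): "toggle_playback",
    ("play_button", "long push"): "show_playback_options",
    ("menu_icon", "tap"): "open_menu",
}

def correlate(signal):
    """Find the selector element at the geometrically equivalent display
    location, then pick the command associated with the touch type."""
    element = UI_ELEMENTS.get((signal["row"], signal["col"]))
    if element is None:
        return None          # touch did not land on any selector element
    return COMMANDS.get((element, signal["type"]))
```

This illustrates how the same display location can yield different commands (here, toggling playback versus showing options) depending solely on the type of touch reported by the RC.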
Also supported on the housing is a subtitle key 96 manipulable to cause a video device in wireless communication with the RC to present subtitles on a display. Moreover, an input key 98 is manipulable to cause a video device in wireless communication with the RC to change a content input to a display. A microphone 99 may be supported on the housing for voice command input. Above the input key 98 are side-by-side power keys 100 for energizing and deenergizing a controlled display device and an associated amplifier. Additional keys may include a back key 102 for causing a controlled device to return to a previous menu or screen and letter keys A-D 104, each with a distinctive geometric boundary as shown, for inputting respective control signals typically in response to a display prompting input of a particular letter for a particular command or service. All of these keys are also contained on the RC 44 in
As also shown in
While the particular REMOTE TOUCH GESTURES is herein shown and described in detail, it is to be understood that the subject matter which is encompassed by the present invention is limited only by the claims.
This application claims priority to U.S. provisional application 61/621,658, filed Apr. 9, 2012, incorporated herein by reference.