Various devices can be used by users to provide input to different systems. Input devices such as mice, keyboards, keypads, touch pads, joysticks, and other devices, for example, allow users to control one or more devices by interacting with the input devices. In addition, as such technology improves, touch screens have become more prevalent as input devices. In a typical application, a touch screen and a display are integrated so that a graphical user interface (GUI) displayed on the touch screen provides visual indicators of how user input can be provided to interact with the GUI. The GUI may, for instance, include selectable options so that a user can see on the display where to touch the touch screen to select a displayed option. In many instances, input devices that incorporate touch are separated from a display on which a GUI is displayed. Many notebook computers, for example, include touch pads. In typical uses, a user touches a touch pad and moves one or more fingers to cause a cursor displayed on the GUI to move accordingly. A button proximate to the touch pad, or sometimes the touch pad itself, can be tapped to cause a GUI element to be selected. Other ways of interacting with the touch pad and any associated buttons may also be used to interact with the GUI.
Despite the numerous ways users are able to interact with devices using various input devices, existing devices, whether devices being controlled or input devices used to control other devices, do not take full advantage of technologies that have been developed. In addition, many devices are designed in a manner that makes use of developed technologies cumbersome. Televisions, for example, often are configured to provide high-quality displays. At the same time, the use of touch-based input devices with televisions can be awkward. Users, for example, often view televisions from a large enough distance that incorporation of a touch screen input device with the television display is impractical. Similar issues exist for many display devices, such as computer monitors. Thus, while touch-based input has proven to be beneficial, many benefits of touch-based input are often unattained.
The following presents a simplified summary of some embodiments of the invention in order to provide a basic understanding of the invention. This summary is not an extensive overview of the invention. It is not intended to identify key/critical elements of the invention or to delineate the scope of the invention. Its sole purpose is to present some embodiments of the invention in a simplified form as a prelude to the more detailed description that is presented later.
Techniques of the present disclosure provide for interaction with graphical user interfaces using input devices that incorporate touch and/or proximity sensing. Such techniques provide advantages, including, in some embodiments, allowing users to obtain a touch user-input experience with displays that are not necessarily touch-input enabled. In an embodiment, a computer-implemented method of manipulating a display is described. The method includes detecting a set of one or more user appendages proximate to and out of contact with a set of one or more corresponding sensor locations of a sensing region of a remote control device; determining, based at least in part on a mapping of the sensing region to a display region of a remote display device, a set of one or more display locations of the display region; and transmitting a signal that causes the display device to display a set of one or more representations of the detected set of one or more user appendages according to the determined set of one or more display locations. The mapping may be an absolute mapping.
Variations of the method are also considered as being within the scope of the present disclosure. For example, in an embodiment, the method further includes calculating at least one measurement that corresponds to a distance of a detected appendage from the sensing region. In this embodiment, displaying the representation of the detected appendage may include displaying the representation of the detected appendage with one or more color characteristics that are based at least in part on the measurement. The color characteristics may be, for instance, brightness, hue, opacity, and the like. The display may display a graphical user interface and displaying the set of one or more representations may include overlaying the one or more representations on the graphical user interface or otherwise visually distinguishing locations corresponding to user interaction with the sensing region from other locations. In an embodiment, the graphical user interface includes one or more selectable options that each correspond to a selection region of the sensing region. In this embodiment, the method may further comprise detecting a contact event by at least one appendage of the set of one or more appendages. The detected contact event may correspond to a contact location of the sensing region. When the contact location corresponds to a selection region of a corresponding selectable option of the graphical user interface, the graphical user interface may be updated according to the corresponding selectable option.
The displayed set of one or more representations may be changed upon detection of the contact event for which the contact location corresponds to the selection region of the corresponding selectable option. Changing the displayed set of one or more representations may include removing the set of one or more representations from the display. At least one of the representations may resemble the corresponding appendage and the displayed set of one or more representations may include at least two representations of different forms, such as two different fingers.
In accordance with another embodiment, a computer-implemented method of manipulating a display is disclosed. The method, in this embodiment, includes calculating measurements that correspond to distances of a user appendage from a sensing region of a remote control device as the user moves the user appendage relative to the sensing region; and taking one or more actions that cause a display device to display a representation of the appendage such that the representation has one or more color characteristics that vary based at least in part on the calculated measurements.
As with all methods disclosed and suggested herein, variations are considered within the scope of the present disclosure. For example, taking the one or more actions may include transmitting remotely generated signals to the display device. As another example, the representation may have a transparent appearance when the user appendage is out of contact with the sensing region and an opaque appearance when the user appendage is in contact with the sensing region. The method may also include determining location changes of the sensing region with which the user appendage is proximate or in contact. In such instances, taking the one or more actions may include updating locations of the representation on the display.
In accordance with yet another embodiment, a user input system is described. The user input system may be a set of one or more devices that collectively operate to change a display according to user input. In this embodiment, the user input system includes one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to update according to user input. For instance, the display device may display a representation of a user appendage on a display of the display device and change one or more color characteristics of the representation based at least in part on changes in distances of the user appendage from a sensing region of a remote control device.
Variations of the user input system are also considered as being within the scope of the present disclosure. For example, the instructions may further cause the user input system to cause the display device to change a location of the representation based at least in part on movement of the user appendage relative to the sensing region. The instructions may also further cause the user input system to cause the display device to update the display according to a predefined action of multiple user appendages in connection with the sensing region. The display device may be separate from the user input system. For instance, the display device may be a television and the user input system may be a remote control device (or remote control system) that operates the television.
In accordance with another embodiment, another user input system is described. The user input system allows for user interaction with a graphical user interface. The user input system may include, for example, one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the user input system to cause a display device to display information according to user input. The display device may, for instance, display a representation of a user appendage at a display location of a display of the display device where the display location is based at least in part on an absolute mapping of display locations of the display device to sensing locations of a sensing region of a sensing device. At least when the appendage moves relative to and out of contact with the sensing device, the display device may change the display location based at least in part on the absolute mapping.
Variations of the user input system considered as being within the scope of the present disclosure include, but are not limited to, the instructions further causing the user input system to cause the display device to change a location of the representation based at least in part on movement of the user appendage relative to the sensing region. The instructions may further cause the user input system to identify the user appendage from a set of potential user appendages and/or cause the user input system to update the display based at least in part on detection of an event that is uncausable using at least one other of the potential user appendages. The sensing device may be a component of a device that is physically disconnected from the display device and/or the sensing device may be a remote control device for the display device.
In accordance with another embodiment, a display device is disclosed. The display device includes one or more processors and memory including instructions that, when executed collectively by the one or more processors, cause the display device to display information according to user input. The display device may, for instance, display a graphical user interface, receive signals corresponding to user interaction with a sensing region of a sensing input device, the signals being based at least in part on a number of dimensions of user interaction that is greater than two, and change the graphical user interface according to the received signals. The signals may be generated by an intermediate device that receives other signals from a remote control device. The sensing input device may be separate from the display device. Changing the graphical user interface may include updating an appearance characteristic of a representation of an object used to interact with the sensing input device. Changing the graphical user interface may also include updating, on the graphical user interface, a location of a representation of an object used to interact with the sensing input device. The user interaction may include contactless interaction with the sensing input device.
For a fuller understanding of the nature and advantages of the present invention, reference should be made to the ensuing detailed description and accompanying drawings.
In the following description, various embodiments of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiment being described.
When a user utilizes an environment, such as the environment 100, one or more devices may utilize the content appliance 102 in some manner. To accomplish this, the various devices shown in
As another example of how the content appliance 102 is able to communicate utilizing various protocols, the content appliance 102 includes various ports which may be used to connect with various devices. For example, in an embodiment, the content appliance 102 includes an HDMI OUT port 120 which may be used to provide content through an HDMI cable to another device. For example, as illustrated in
An ethernet port 124 may be provided with the content appliance 102 to enable the content appliance 102 to communicate utilizing an appropriate networking protocol, such as illustrated in
In an embodiment, the content appliance 102 includes one or more universal serial bus (USB) ports 126. The USB ports 126 may be utilized to communicate with various accessories that are configured to communicate utilizing a USB cable. For example, as shown in
Other ports on the content appliance 102 may include RCA ports 130 in order to provide content to devices that are configured to communicate using such ports, and an HDMI IN port 132 which may be used to accept content from another device, such as from the set top box 114. Generally, the content appliance 102 may have ports in addition to those discussed above and, in some embodiments, may include fewer ports than illustrated.
Various devices in communication with the content appliance may be used to control the content appliance and other devices in the environment 100. For example, the remote control 116 may communicate with the content appliance 102 utilizing radio frequency (RF) communication. As described in more detail below, the remote control 116 may include a touch screen that may be used in accordance with the various embodiments described herein.
A keyboard 118 may also communicate with the content appliance 102 utilizing RF or another method (and possibly with one or more other devices, either directly or through the content appliance 102). The keyboard may be used for various actions, such as navigation of an interface displayed on the television 104, user input by a user typing utilizing the keyboard 118, and general remote control functions. For example, an interface displayed on the television 104 may include options for text entry. The user may type text utilizing the keyboard 118. Keystrokes that the user makes on the keyboard 118 may be communicated to the content appliance 102, which in turn generates an appropriate signal to send over an HDMI cable connecting the HDMI OUT port 120 to the AV receiver 110. The AV receiver 110 may communicate with the television 104 over HDMI or another suitable connection to enable the television to display text or other content that corresponds to the user input. The keyboard 118 may include other features as well. For example, the keyboard 118 may include a touch pad, such as described below, or generally any touch pad that allows for user navigation of an interface displayed on a display device. The touch pad may have proximity-sensing capabilities to enable use of the keyboard in various embodiments of the present disclosure.
In an embodiment, the mobile device 108 is also able to control the content appliance 102 (and possibly other devices, either directly, or through the content appliance 102). The mobile device may include a remote control application that provides an interface for controlling the content appliance 102. In this particular example from
In an embodiment, the content appliance 102 is also configured to utilize various services provided over a public communications network, such as the Internet 128. As an example, the content appliance 102 may communicate with a router 134 of a home network. The content appliance 102 and the router 134 may communicate utilizing a wired or wireless connection. The router 134 may be directly or indirectly connected to the Internet 128 in order to access various third-party services. For example, in an embodiment, a code service 136 is provided. The code service, in an embodiment, provides codes that the content appliance 102 can use to control various devices, enabling the content appliance to translate codes received from another device (such as the remote control 116, the keyboard 118, and/or the mobile device 108). The various devices to control may be identified to the content appliance 102 by user input or through automated means. The content appliance 102 may submit a request through the router 134 to the code service 136 for appropriate codes. The codes may be, for example, IR codes that are used to control the various devices that utilize IR for communication. Thus, for example, if a user presses a button on the remote control 116, the keyboard 118, or an interface element of the mobile device 108, a signal corresponding to the selection by the user may be communicated to the content appliance 102. The content appliance 102 may then generate a code based at least in part on information received from the code service 136. As an illustrative example, if the user presses a play button of the remote control 116, a signal corresponding to selection of the play button may be sent to the content appliance 102, which may generate a play IR code that is then transmitted to the television 104 or to another suitable appliance, such as generally any appliance that is able to play content.
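To illustrate the flow just described, the following is a minimal, hypothetical sketch of a content appliance translating a button-press signal into an IR code obtained from a code service. The function names, the service endpoint, and the JSON response format are assumptions made for illustration; the disclosure does not specify a particular protocol.

```python
# Hypothetical sketch of a content appliance translating a button press into an IR
# code obtained from a code service. The endpoint, function names, and JSON format
# are illustrative assumptions, not an actual API of any particular service.
import json
import urllib.request

CODE_SERVICE_URL = "https://codes.example.com/lookup"  # placeholder endpoint

def fetch_codes(device_model: str) -> dict:
    """Request the IR code table for a device model from the code service."""
    with urllib.request.urlopen(f"{CODE_SERVICE_URL}?model={device_model}") as resp:
        return json.load(resp)  # e.g. {"play": "0xA90", "pause": "0xA91", ...}

def transmit_ir(ir_code: str) -> None:
    """Stand-in for driving an IR emitter toward the controlled device."""
    print(f"transmitting IR code {ir_code}")

def handle_button_press(button: str, codes: dict) -> None:
    """Translate a button press from a remote or keyboard into an IR transmission."""
    ir_code = codes.get(button)
    if ir_code is not None:
        transmit_ir(ir_code)

# Offline example using a canned code table instead of calling fetch_codes().
handle_button_press("play", {"play": "0xA90"})
```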
Other services that may be accessed by the content appliance 102 over the Internet 128 include various content services 138. The content services may be, for example, any information resource, such as websites, video-streaming services, audio-streaming services and generally any services that provide content over the Internet 128.
It should be noted that the environment illustrated in
In an embodiment, the television 204 may display an interface which is navigable by a user utilizing the remote control 202. For example, the television 204 in
In this example, the television 204 (or a network of devices that includes the television 204) is configured to utilize one or more other devices, such as a DVD player, music player, a gaming device, and devices that allow communication over the Internet. For example, in an embodiment the users are able to utilize the television 204 to check an email account and/or stream a movie from a remote streaming service. Generally, the television 204 or a network of devices that includes the television 204 may be configured for use with any device involved in providing content, either from the devices themselves, or from other sources, including remote sources accessible over the Internet or other communications network.
In an embodiment, the remote control 202 includes a touch screen 210. As noted, while the remote control is described as having a touch screen 210, embodiments may utilize a remote control with a touch pad instead of or in addition to a touch screen. The touch screen 210 or touch pad in an embodiment operates using capacitive proximity sensing. The touch screen 210 (or touch pad) may be configured, for example, according to the disclosure of U.S. application Ser. No. 13/047,962, referenced above. The touch screen 210 (or touch pad), however, may utilize any technique for proximity sensing. As discussed below, the touch screen 210 (or touch pad) may be used as an input device by a user. The input may be input for controlling the remote control device 202 and/or the television 204. Other mechanisms for input may be used in addition to the touch screen 210. For instance,
In an embodiment, there is an absolute mapping between locations on the touch screen or touch pad 210 and locations on the graphical user interface on the television (or other display device). Thus, each location on the touch screen or touch pad 210 is mapped to at least one location on the user interface on the display. The absolute mapping may be a surjective mapping from the locations of the user interface on the display to the locations of the touch screen. The mapping may instead be a surjective mapping from the locations of the touch screen to the locations of the user interface if there are more locations on the touch screen or touch pad than on the user interface. A mapping may also be a one-to-one mapping between touch screen sensor locations and pointer locations on the interface. Generally, the mapping may be any mapping that is configured such that, from the user perspective, each location on the touch screen has a corresponding location on the user interface. It should be noted, however, that some embodiments of the present disclosure may utilize a relative mapping between the touch screen and the user interface.
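To make the absolute mapping concrete, the following is a minimal sketch of a mapping from touch-sensor locations to display locations. The sensor and display resolutions are assumed example values, not values from the disclosure.

```python
# Minimal sketch of an absolute mapping from touch-sensor locations to display
# locations. The sensor and display resolutions are assumed example values.

SENSOR_W, SENSOR_H = 600, 400      # assumed touch screen / touch pad sensing resolution
DISPLAY_W, DISPLAY_H = 1920, 1080  # assumed display (user interface) resolution

def sensor_to_display(sx: float, sy: float) -> tuple[int, int]:
    """Map a sensor location to its corresponding display location.

    Each sensor location always maps to the same display location, so hovering or
    touching a given spot on the sensing region always points at the same spot on
    the interface (unlike a relative, cursor-style mapping).
    """
    dx = round(sx / (SENSOR_W - 1) * (DISPLAY_W - 1))
    dy = round(sy / (SENSOR_H - 1) * (DISPLAY_H - 1))
    return dx, dy

assert sensor_to_display(0, 0) == (0, 0)
assert sensor_to_display(599, 399) == (1919, 1079)
```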
Mappings between a sensing region of a remote control device and a display device may be determined in various ways. For example, in an embodiment, a device in connection with the display device utilizes extended display identification data (EDID) or enhanced EDID (E-EDID) received over a high-definition multimedia interface (HDMI) or other connection to determine display parameters for the display device. The data may, for example, specify a maximum horizontal image size and a maximum vertical image size, which allows for a mapping of the sensing region to the display. Other ways of determining the display size for generating a mapping may be used. For instance, a user may input the display size during a setup process. The user may alternatively enter an identifier (a model number, for example) for a display device and a database (which may be a remote database) may be referenced to determine the display size based on the model identifier. Alternatively, some other method may be used to determine an identifier for the display device (e.g., knowledge of the legacy remote control used by the display device), and the identifier may be used to determine the display size (possibly using a remote database). Generally, any suitable method for determining the display dimensions and generating a mapping may be used.
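As one illustration of the EDID-based approach, the sketch below reads the maximum image size fields of a 128-byte EDID base block (in EDID 1.3/1.4, bytes 21 and 22 carry the maximum horizontal and vertical image size in centimeters) and signals a fallback to other methods when those fields are zero. How the block is obtained over the HDMI/DDC connection is outside the scope of this sketch.

```python
# Sketch of deriving a display's physical size from a 128-byte EDID base block,
# assuming the block has already been read over the HDMI/DDC connection. In EDID
# 1.3/1.4, bytes 21 and 22 give the maximum horizontal and vertical image size in
# centimeters; a zero in either byte means the size is undefined and another method
# (user input, a model-number database lookup) would be needed.

def display_size_cm(edid: bytes) -> tuple[int, int] | None:
    """Return (horizontal_cm, vertical_cm) from an EDID base block, if available."""
    if len(edid) < 128:
        return None
    h_cm, v_cm = edid[21], edid[22]
    if h_cm == 0 or v_cm == 0:
        return None  # fall back to user input or a database lookup
    return h_cm, v_cm
```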
In an embodiment, the user interacts with the touch screen or touch pad 210 in order to navigate the user interface displayed on the television 204. Generally, the user may interact with the touch screen 210 (or touch pad) by using one or more appendages (such as fingers) to touch and/or hover over the touch screen 210 and/or move the one or more appendages relative to the touch screen 210. The manner in which the user interacts with the touch screen 210 may be sensed by the touch screen 210 (or touch pad) to generate signals. The generated signals may be interpreted by one or more processors of the remote control 202 to generate one or more other signals corresponding to user input, which are transmitted by the remote control 202 to another device, such as directly to the television 204, to a content appliance such as described above, or in any suitable manner. For example, if the user touches the touch screen 210 (or touch pad) with his or her finger and moves the finger upward while in contact with the touch screen 210 (or touch pad), one or more processors of the remote control 202 may interpret signals generated according to such touch and movement and generate and transmit one or more other signals that enable another device to update a device on which a GUI is displayed accordingly. Alternatively, signals generated by the touch screen 210, or signals derived therefrom, may be transmitted to another device (such as the television 204 or a content appliance) and interpreted by the other device. Generally, any manner in which signals generated by the touch screen 210 (or touch pad) are interpreted as user input may be used.
It should be noted that
In addition,
As mentioned, various embodiments of the present disclosure utilize touch and proximity sensing technology to allow a user to interact with a graphical user interface.
In an embodiment, when the user interacts with the touch screen 310, visual indicators of such interaction appear on the interface displayed on the television 304. For example, interaction with the touch screen 310 in an embodiment includes touching the touch screen 310 and, generally, performing actions in close proximity to the touch screen 310. For example, as shown in
Accordingly, in
In the illustrative example of
The visual feedback of user interaction with the touch screen 310 may be provided in a varying manner. For example, as shown in
The representation 322 of the right thumb 318, on the other hand, appears bright and opaque, thereby indicating to the user that the right thumb is touching the touch screen 310. In this manner, the user can hover over the touch screen with an appendage and, based on the location of the representation on the interface 306, know where to move his or her appendage to navigate the interface as desired. Because, in this example, the representation is transparent when the corresponding appendage is hovering, the representation does not obscure the user interface. For example, as shown in
Other indications of user interaction with an interface may also be shown in addition to the representations of the appendages. For example, in
Other variations not illustrated in this figure, but described below, may also be used. For example, the amount by which a representation is transparent may vary according to a distance by which an appendage is hovering over the touch screen 310. Further, while the example in
In addition, while the illustrative examples in
In an embodiment, the process 400 includes displaying 402 an interface, such as described above. As used herein, displaying an interface may mean actually displaying the interface or taking one or more actions that cause an interface to be displayed. For example, referring to
In an embodiment, a proximate appendage is detected 404. Detecting a proximate appendage may be done in any suitable manner, such as utilizing the techniques in U.S. application Ser. No. 13/047,962 and U.S. application Ser. No. 12/840,320, described above. The appendage may be detected, for example, upon the user moving the appendage within a certain distance of a touch screen. Depending on a particular environment in which the process 400 is performed, detecting the appendage may be performed in various ways. For example, detecting the appendage may be performed by detecting the appendage directly or by receiving a signal from another device, such as a remote control device that detected the appendage.
Once the appendage is detected 404 in an embodiment, an interface display location is determined 406 based at least in part on a mapping of input device locations to interface display locations, which may be an absolute mapping, as described above. Upon determining the interface display location, in an embodiment, the representation of the appendage is overlaid 408 on the interface display at the determined interface display location. As discussed above, other ways of providing representations may be performed, although overlays of representations are used for the purpose of illustration. As the user moves his or her appendage relative to the touch screen, the position of the representation on the interface may be updated 410 according to movement of the appendage. For example, if the user moves the appendage to the left, the representation of the appendage may move to the left as well. If the user moves the appendage up or down relative to the touch screen, then the representation may remain in the same place, but change color characteristics such as described above. Determining how to update the position of the representation may include multiple detections of the appendage and corresponding determinations of the interface display location based on the mapping.
In an embodiment, at some point during interaction with the touch screen, the user may make contact with the touch screen. In such instances, when the user touches the touch screen, a touch event of the appendage is detected 412. When a touch event is detected 412, an operation according to the touch event type and/or location of the touch is performed 414. The way in which the user touches a touch screen may indicate, for example, how the user wishes to navigate a user interface. For instance, touching the touch screen and moving the appendage while in contact with the touch screen may indicate a drag operation in the user interface. If the initial touch was on an element that is draggable in the user interface, the element may move accordingly. Similarly, if the user touches the touch screen and subsequently raises the appendage away from the touch screen, losing contact with the touch screen, such an event may indicate selection of an option at the location that was touched. A double tap on the touch surface 210 may also be appropriately interpreted (for example, as a double click on an icon on the display).
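The steps of the process 400 might be organized along the following lines. This is a simplified sketch: the SensorReading fields, the assumed resolutions, and the print statements stand in for real sensor and display interfaces, and are not part of the disclosure.

```python
# Simplified sketch of the process described above: detect a proximate appendage,
# determine the display location via an absolute mapping, overlay a representation,
# and act on touch events. The SensorReading fields, the assumed resolutions, and
# the print statements stand in for real sensor and display interfaces.
from dataclasses import dataclass

SENSOR_W, SENSOR_H = 600, 400      # assumed sensing resolution
DISPLAY_W, DISPLAY_H = 1920, 1080  # assumed display resolution

@dataclass
class SensorReading:
    sx: float        # sensor x location
    sy: float        # sensor y location
    distance: float  # estimated hover distance (0.0 when touching)
    touching: bool

def handle_reading(reading: SensorReading | None) -> None:
    if reading is None:
        return  # no appendage detected (404); keep monitoring
    # Determine the interface display location based on the mapping (406).
    dx = round(reading.sx / (SENSOR_W - 1) * (DISPLAY_W - 1))
    dy = round(reading.sy / (SENSOR_H - 1) * (DISPLAY_H - 1))
    # Overlay or reposition the representation at the determined location (408/410).
    print(f"representation at ({dx}, {dy}), hover distance {reading.distance}")
    if reading.touching:
        # Touch event detected (412): perform an operation for that location (414).
        print(f"touch event at ({dx}, {dy})")

handle_reading(SensorReading(sx=300.0, sy=200.0, distance=0.0, touching=True))
```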
Determining whether an appendage is detected may be performed in various ways. For example, in some embodiments, the determination may simply be a determination of whether signals from a touch screen indicate the presence of an appendage proximate to the touch screen. However, the determination may be more complex and may include other determinations. For example, determining whether an appendage is detected may include determining how many appendages are detected. In addition, in an embodiment, for any appendages detected, the detected appendages may be matched to actual appendages. For instance, referring to
Matching detected appendages to actual appendages may be done in various ways. For example, in an embodiment, when an appendage is detected, the appendage will generally cause different portions of the touch screen to generate different signals. For example, with capacitive proximity sensors, the capacitance measurements for the touch screen may increase for locations that are proximate to the detected appendage. The locations for which the capacitance changes may be used to determine which appendage a detected appendage corresponds to. For example, a thumb will generally affect a larger region of the touch screen than other fingers due to the thumb's larger relative size. In addition, regions of affected locations in the touch screen may generally be oriented differently depending on the appendage being sensed. Referring to FIG. 3, for example, a region of locations on the touch screen 310 for the left thumb would generally point upward and to the right, whereas a region for the right thumb would point upward and to the left. Some suitable techniques for matching detected appendages to fingers are described in U.S. application Ser. No. 13/047,962 and U.S. application Ser. No. 12/840,320, referenced above.
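A crude illustration of such matching is sketched below: the size of the affected region separates thumbs from other fingers, and the tilt of the region distinguishes a left thumb from a right thumb. The area threshold, the blob representation, and the left/right assignment are assumptions chosen for clarity, not values taken from the disclosure.

```python
# Illustrative sketch of matching a detected region of affected sensor locations to
# an appendage using the region's size and orientation. The area threshold and the
# left/right tilt assignment are assumptions, not values from the disclosure.

def classify_appendage(points: list[tuple[float, float]]) -> str:
    """Guess which appendage produced a set of affected sensor locations."""
    area = len(points)  # crude size proxy: number of affected sensor cells
    if area < 40:       # assumed threshold; thumbs typically affect a larger region
        return "finger"
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    # The sign of the x/y covariance indicates which way the elongated region tilts.
    # Which tilt corresponds to the left or right thumb depends on the sensor's
    # coordinate convention; the assignment below is an assumption.
    cov = sum((x - mx) * (y - my) for x, y in points) / len(points)
    return "left thumb" if cov > 0 else "right thumb"

print(classify_appendage([(0.0, 0.0), (1.0, 1.0)]))  # small region -> "finger"
```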
Returning to the process 500, in an embodiment, if no appendage is detected, the touch screen continues to be monitored, as illustrated. However, if it is determined 506 that an appendage is detected, then a determination may be made whether touch event criteria have been satisfied. (The touch screen may continue to be monitored when a touch event is detected, for example, to detect further touch events.) Touch event criteria may be criteria that, when met, indicate a touch event. For example, criteria for a touch event corresponding to selection of a user interface element may be that the user contacts the touch screen and then loses contact within a predetermined period of time. Other criteria may be simpler, of the same complexity, or more complex. Criteria may take into account information about the timing of various activities, such as how long the user has touched the touch screen, how many appendages or other objects touched the screen, whether or not the user moved a certain amount while in contact with the touch screen, and the like.
In an embodiment, the touch event criteria take into account matches to appendages that have been detected. Different touch events may correspond to different actions by a user using different subsets of his or her fingers (or other appendages or objects). For example, in an embodiment, when a middle finger and an index finger are detected, a right click event may be generated at a user interface (UI) location corresponding to a touch screen location selected by the index finger. The location of the UI may correspond to a particular right-click menu of selectable interface options that may be displayed upon detection of the right click event. As another example, when a thumb and right index finger are detected, a scroll left event may be generated for a UI element (scroll bar, icon, etc.) selected by the right index finger. Similarly, when a thumb and right ring finger are detected, a scroll right event may be generated for a UI element selected by the right ring finger. When the thumb is detected, a scroll left or right event may be generated for a UI element selected by the index finger. The direction of scroll may be determined in many ways, such as by whether the left or right thumb is detected, by the direction of movement of the detected thumb, and the like. Generally, any way of matching actions of sets of one or more fingers (or other objects) to events may be used.
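The finger-combination examples above could be expressed as a simple lookup from detected finger sets to events, as in the following sketch; the event names and the table itself are illustrative, and a real implementation would also consider touch locations, timing, and movement direction.

```python
# Sketch of mapping detected finger combinations to user interface events, following
# the examples above. The event names and the table itself are illustrative; a real
# system would also use touch locations, timing, and movement direction.

EVENTS_BY_FINGERS = {
    frozenset({"middle finger", "index finger"}): "right click",
    frozenset({"thumb", "index finger"}): "scroll left",
    frozenset({"thumb", "ring finger"}): "scroll right",
}

def event_for(detected: set[str]) -> str | None:
    """Return the event for a detected finger combination, if one is defined."""
    return EVENTS_BY_FINGERS.get(frozenset(detected))

print(event_for({"index finger", "middle finger"}))  # right click
print(event_for({"index finger"}))                   # None -> default handling
```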
If it is determined that the touch event criteria are satisfied, a determination may be made 510 of the touch event type. If, for example, it is determined that the touch event type was an object selection, then the interface may be updated 512 according to the selection. As noted, updating the interface may be done in any suitable way, such as changing a display of the interface on a display device, providing a completely new display, or generally changing the interface in accordance with its programming logic. An overlay of representations on the interface may be removed in accordance with an embodiment. If the touch event type is another type, such as a drag, scroll, or other event, then the user interface may be updated if applicable. For example, if the user interface is scrollable, the user interface may be scrolled. If a scroll event is detected with an interface object selected, the object may be moved on the interface accordingly. As another example, if the user touches the touch screen at a location corresponding to a portion of the user interface that is not manipulable through user interaction and moves within that area while in contact with the touch screen, the user interface may be left as is, although representations of detected appendages may be updated with the movement accordingly.
As illustrated in
As noted, representations of user appendages or other devices used to interact with a touch screen may change according to the manner in which the interaction is performed. One way of reflecting the manner in which the interaction is performed is to change the representation based at least in part on a distance of a detected appendage from the touch screen and, in particular, based at least in part on the distances of multiple locations of a detected appendage (or other object). As described above, the color characteristics of a representation of an appendage may be changed based at least in part on the distance the appendage is from the touch screen.
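One straightforward way to realize this, sketched below, is to compute an opacity value from the hover distance: fully opaque on contact, increasingly transparent with distance, and invisible beyond the sensing range. The maximum sensing distance used here is an assumed value.

```python
# Minimal sketch of varying a representation's opacity with hover distance: fully
# opaque on contact, increasingly transparent as the appendage moves away, and not
# drawn at all beyond the sensing range. The maximum sensing distance is assumed.

MAX_SENSE_DISTANCE_MM = 30.0  # assumed range beyond which no appendage is detected

def opacity_for_distance(distance_mm: float) -> float:
    """Return an opacity in [0.0, 1.0] for a given hover distance in millimeters."""
    if distance_mm <= 0.0:
        return 1.0  # in contact with the sensing region: fully opaque
    if distance_mm >= MAX_SENSE_DISTANCE_MM:
        return 0.0  # out of sensing range: representation not shown
    return 1.0 - distance_mm / MAX_SENSE_DISTANCE_MM

print(opacity_for_distance(0.0))   # 1.0
print(opacity_for_distance(15.0))  # 0.5
```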
The representation 608 of the finger 604 appears on the interface at a location corresponding to the location at which the finger 604 hovers over the touch screen 602. In addition, in this particular example, the representation 608 resembles an outline of a finger and this outline is oriented according to the orientation of the finger 604 over the touch screen 602. As shown in
Further, while various embodiments of the present disclosure are described in terms of touch screens, any input device that is able to detect proximity and touch may be used. For example, touch screens used in some embodiments may not themselves display any information or may display information different from that which is displayed on another device, such as a television. In addition, proximity sensing techniques may be used in other contexts, not just planar touch-sensitive areas, such as those illustrated above. For example, a remote control device with buttons (such as physically displaceable buttons) may incorporate proximity sensing technology. Proximity sensors may be incorporated with the remote control device. When a user's appendage becomes close to a button (such as by making contact with the button), a representation of the appendage may appear on a display. When the user presses the button, an action may be taken. The action may correspond to the button, the location of the representation on the display, or otherwise. As an illustrative example, if a user's appendage becomes close to a “play” button on a remote control, a representation of the appendage may appear over a “play” button on a display, such as described above. When the user presses the button, a play function may be performed. For instance, if watching a DVD, a DVD player may be put in a play state. In an embodiment where displaceable or other buttons are used in connection with proximity sensing techniques, movement of the representation of an appendage on a display may correspond to user movement of the appendage relative to the remote control, manipulation of an input device (such as a joystick), or otherwise.
In addition to the above, other variations are within the scope of the present disclosure. For example, additional techniques may be incorporated with those above. As an example, buttons of a remote control (such as physically displaceable buttons) may be force sensing. The above described hovering effects may be produced upon light presses of a button and selection of an object may be performed upon more forceful presses of the button. As other examples, sound, vibration, or other additional feedback may be provided to the user based on the user's interaction with a remote control device. A remote control device may, for instance, vibrate upon making a selection using the techniques described herein. The remote control or another device (such as a television or audio system) may make a sound upon selection.
Other variations are within the spirit of the present invention. Thus, while the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific form or forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention, as defined in the appended claims.
The use of the terms “a” and “an” and “the” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. The term “connected” is to be construed as partly or wholly contained within, attached to, or joined together, even if there is something intervening. Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
Preferred embodiments of this invention are described herein, including the best mode known to the inventors for carrying out the invention. Variations of those preferred embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. The inventors expect skilled artisans to employ such variations as appropriate, and the inventors intend for the invention to be practiced otherwise than as specifically described herein. Accordingly, this invention includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the invention unless otherwise indicated herein or otherwise clearly contradicted by context.
All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
This application incorporates by reference for all purposes the full disclosure of U.S. application Ser. No. 13/284,668, entitled “Remote Control System for Connected Devices,” and filed concurrently herewith. This application also incorporates by reference the full disclosure of U.S. Application No. 61/480,849, entitled “Remote Control for Connected Devices,” and filed on Apr. 29, 2011. This application also incorporates by reference for all purposes the full disclosure of U.S. Provisional Application No. 61/227,485, filed Jul. 22, 2009, U.S. Provisional Application No. 61/314,639, filed Mar. 17, 2010, U.S. application Ser. No. 12/840,320 entitled “System and Method for Remote, Virtual On Screen Input,” and filed on Jul. 21, 2010, and U.S. application Ser. No. 13/047,962, entitled “System and Method for Capturing Hand Annotations” and filed on Mar. 15, 2011.