Remote control with feedback for blind navigation

Information

  • Patent Grant
  • Patent Number
    9,591,250
  • Date Filed
    Wednesday, October 19, 2011
  • Date Issued
    Tuesday, March 7, 2017
Abstract
A user is provided with an indication of a button's functionality on a viewing device before the button is actually selected. The user receives haptic/tactile feedback to let the user know that they are about to select a button and that its functionality indicator is shown on the viewing device. Thus, the haptic or tactile feedback in combination with on-screen display elements is utilized in a remote control device to eliminate the need for shifting focus away from a viewing device.
Description

This application claims the benefit, under 35 U.S.C. §365, of International Application PCT/US2011/056836, filed Oct. 19, 2011, which was published in accordance with PCT Article 21(2) on Apr. 25, 2013, in English.


BACKGROUND

Current television remote controls employ labeled buttons and/or screen displays to inform the user of the corresponding control functions. This requires the user to look down at the remote device. Other remote control designs incorporate different button shapes or “feels,” or adjust the spacing between buttons, to allow the user to differentiate buttons without looking at the remote. The disadvantage of this approach is that the button functions may still have to be memorized, especially when the functions are used relatively infrequently.


SUMMARY

Haptic or tactile feedback in combination with on-screen display elements is utilized in a remote control device to eliminate the need for shifting focus away from a viewing device. A user is provided with an indication of a button's functionality on a viewing device before the button is actually selected. The user receives haptic/tactile feedback to let the user know that they are about to select a button and that its functionality indicator is shown on the viewing device. In this manner, the user keeps their eyes focused on the viewing device, never needing to look down at the remote control device to control the viewing experience.


The above presents a simplified summary of the subject matter in order to provide a basic understanding of some aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.


To the accomplishment of the foregoing and related ends, certain illustrative aspects of embodiments are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the subject matter can be employed, and the subject matter is intended to include all such aspects and their equivalents. Other advantages and novel features of the subject matter can become apparent from the following detailed description when considered in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an example remote control system using a remote control device that is populated with capacitive-touch buttons.



FIG. 2 illustrates an example where a user has switched from hovering over/touching a button for a “home” button to hovering over/touching a “record” button.



FIG. 3 shows an example of selecting functionality of a remote control device having a programmable button.



FIG. 4 is a flow diagram of a method of controlling a viewing experience with a remote control device that employs feedback prior to a button selection.





DETAILED DESCRIPTION

The subject matter is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject matter. It can be evident, however, that subject matter embodiments can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the embodiments.


As used in this application, the term “component” is intended to refer to hardware, software, or a combination of hardware and software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, and/or a microchip and the like. By way of illustration, both an application running on a processor and the processor can be a component. One or more components can reside within a process and a component can be localized on one system and/or distributed between two or more systems. Functions of the various components shown in the figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.


When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, read-only memory (“ROM”) for storing software, random access memory (“RAM”), and non-volatile storage. Moreover, all statements herein reciting instances and embodiments of the invention are intended to encompass both structural and functional equivalents. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).


Haptic or tactile feedback on a remote control device allows a user to concentrate on what they are viewing rather than on the control device. The remote control device can include, but is not limited to, traditional dedicated remote controls, mobile devices such as cell phones, and/or mobile tablets/computers and the like. Haptics allow a user to receive sensory information, such as, for example, a buzzing sensation in the fingertips. It could similarly include hot and/or cold sensations via the fingertips. Induced pain sensations and other sensations could also be used by one skilled in the art to provide feedback sensations to the user. A remote control device can provide a smooth surface with a haptic-type “buzz” feedback as the button area is touched. That is, the button gives a sensory type of feedback to the user before it is selected. Alternatively, a remote control device can provide physical buttons, indentations, or other variations in the surface to facilitate physically locating button areas.


In one example 100 illustrated in FIG. 1, a surface 102 of a remote control device 104 is populated with capacitive-touch buttons. This allows a user 106 to be able to touch the buttons to find them and then use a separate “click” action to activate them. The haptic feedback can occur twice in this sequence: once for an initial touch and then once for a selection “click.” The initial touch, like a soft buzz or mild click for example, can be used to find the button by feel (i.e., sensory feedback). This only indicates to the user 106 that the button has been found but not actuated. When the button is pressed, the select action results in a distinctive “click” and then the action is taken.
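The two-stage interaction above (a soft buzz on the initial touch, a distinct click on selection) can be sketched as a small state model. This is a minimal illustration only; the `CapacitiveButton` class and the feedback names are assumptions for the sketch, not part of the patent:

```python
from enum import Enum

class Feedback(Enum):
    SOFT_BUZZ = "soft buzz"            # sensory cue: button found, not actuated
    DISTINCT_CLICK = "distinct click"  # confirmation: button actuated

class CapacitiveButton:
    """Hypothetical model of the two-stage touch/select sequence."""
    def __init__(self, function):
        self.function = function

    def touch(self):
        # Initial touch: haptic cue only; the button's function is NOT invoked.
        return Feedback.SOFT_BUZZ

    def press(self):
        # Select "click": distinct feedback, and the action is taken.
        return Feedback.DISTINCT_CLICK, self.function

btn = CapacitiveButton("record")
assert btn.touch() is Feedback.SOFT_BUZZ      # found by feel, nothing happens
feedback, action = btn.press()
assert feedback is Feedback.DISTINCT_CLICK and action == "record"
```

The key design point the sketch captures is that touching and selecting are separate events with separate, distinguishable feedback.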


An on-screen icon/text indicator 108 can be displayed on a content viewing device 110 as a button is “hovered” over by the user 106. As buttons (haptic or tactile) are touched the remote control system provides an icon and/or text indicator 108 related to the function of the button currently touched. This allows the user to find button functions without shifting focus off of the viewing device 110. The touch function is distinct from a “select” function. Once a button is discovered by the above steps, the user 106 can select the current option by pressing or selecting. A distinct haptic feedback or click can, for example, indicate a selection.
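The hover/select protocol between the remote and the viewing device might be modeled as below. `RemoteControl`, the button-to-function mapping, and the `screen` list standing in for the viewing device's indicator area are hypothetical names for illustration:

```python
class RemoteControl:
    """Sketch: hovering shows an indicator; selecting executes the function."""
    def __init__(self, buttons, screen):
        self.buttons = buttons   # maps button id -> function name
        self.screen = screen     # stand-in for the viewing device's indicator

    def hover(self, button_id):
        # Touch/hover is distinct from select: only the indicator is shown.
        self.screen.clear()
        self.screen.append(self.buttons[button_id])

    def select(self, button_id):
        # Pressing executes the function currently indicated on screen.
        return self.buttons[button_id]

screen = []
remote = RemoteControl({"btn1": "home", "btn2": "record"}, screen)
remote.hover("btn1")
assert screen == ["home"]    # indicator shown, nothing executed
remote.hover("btn2")         # user slides to a different button
assert screen == ["record"]  # indicator follows the touched button
assert remote.select("btn2") == "record"
```

The usage lines also illustrate the situation in FIG. 2: the indicator tracks whichever button is currently touched, and only an explicit select runs the function.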



FIG. 2 illustrates an example 200 where the user 106 has switched from hovering over/touching a button for a “home” button (and its on-screen indicator 108) to hovering over/touching a “record” button with its on-screen indicator 202. The user, who desires to record a program, then clicks or presses downward, etc. 204 on the remote control device 104 to execute the function indicated on the viewing device 110. In this manner, the user 106 can maintain visual contact with the viewing device 110 and does not need to divert their attention to the remote control device 104 either 1) to determine which button they are about to select or 2) to determine whether they have successfully selected the button's functionality.


Oftentimes, a remote control device 308 can have programmable buttons. The on-screen aspect of this system can be used to configure programmable buttons as well. FIG. 3 shows an example 300 of selecting functionality of a programmable button 302. In the case of an unprogrammed button, an on-screen assistant on the viewing device 110 provides a menu of functions 304 to choose from. The user can then select a function 306, and the button is then configured to execute that function 306 on subsequent presses. To reconfigure the button, it can be pressed and held to bring up the options once again.
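One simple interpretation of the programmable-button flow (an on-screen menu on the first press of an unconfigured button, execution on subsequent presses, press-and-hold to reconfigure) is sketched below. The class name, the menu entries, and the choice to clear the assignment on press-and-hold are assumptions of this sketch:

```python
# Illustrative menu the on-screen assistant could offer (not from the patent).
MENU = ["STOP", "PLAY", "PAUSE", "RECORD"]

class ProgrammableButton:
    def __init__(self):
        self.function = None  # unprogrammed initially

    def press(self):
        if self.function is None:
            return ("menu", MENU)          # assistant offers a menu of functions
        return ("execute", self.function)  # subsequent presses run the function

    def assign(self, function):
        self.function = function           # user picks a function from the menu

    def press_and_hold(self):
        self.function = None               # reconfigure: bring up options again
        return ("menu", MENU)

btn = ProgrammableButton()
assert btn.press() == ("menu", MENU)       # unprogrammed: menu shown
btn.assign("RECORD")
assert btn.press() == ("execute", "RECORD")
btn.press_and_hold()
assert btn.press()[0] == "menu"            # back to the configuration menu
```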


In view of the exemplary embodiments shown and described above, methodologies that can be implemented in accordance with the embodiments will be better appreciated with reference to the flow charts of FIG. 4. While, for purposes of simplicity of explanation, the methodologies are shown and described as a series of blocks, it is to be understood and appreciated that the embodiments are not limited by the order of the blocks, as some blocks can, in accordance with an embodiment, occur in different orders and/or concurrently with other blocks from that shown and described herein. Moreover, not all illustrated blocks may be required to implement the methodologies in accordance with the embodiments.



FIG. 4 is a flow diagram of a method 400 of controlling a viewing experience. The method starts 402 by interacting with a button on a remote control device that is in communication with a viewing device, the interaction occurring before selection of the button 404. The interaction can include, but is not limited to, a user hovering over the button and/or a user lightly touching the button and the like. The interaction need only be such that the user knows which button they are in proximity with, without invoking the function of the button. This can include some type of haptic feedback and/or other tactile feedback such as a button shape, depression, etc. A function associated with the button is then displayed on the viewing device 406, ending the flow 408. In another instance, haptic feedback can also be provided to the user when the button function is selected. This feedback can be of a different type than the feedback provided for interacting with the button. If the feedback types are different, the user can easily determine whether the function has been selected.
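The flow of method 400 reduces to two steps with distinguishable feedback types. In the sketch below, the function names and the use of plain lists to record haptic events and the on-screen display are illustrative, not from the patent:

```python
def interact(button_function, haptics, screen):
    """Proximity interaction, prior to selection (blocks 404/406)."""
    haptics.append("buzz")          # touch-stage haptic feedback
    screen.append(button_function)  # function indicator shown on viewing device

def select(button_function, haptics):
    """Optional extension: distinct feedback on actual selection."""
    haptics.append("click")         # a different type than the touch buzz
    return button_function          # the function is now executed

haptics, screen = [], []
interact("volume up", haptics, screen)
assert screen == ["volume up"] and haptics == ["buzz"]   # shown, not executed
assert select("volume up", haptics) == "volume up"
assert haptics == ["buzz", "click"]  # two distinguishable feedback types
```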


In yet another example, the above method can be extended for unprogrammed buttons on a remote control device. In this scenario, a list of functionality indicators (e.g., text, icons, etc.) is displayed on a viewing device when a button is being interacted with but has not been previously programmed. The list could, for example, include such functions as STOP, PLAY, REVERSE, FAST FORWARD, PAUSE, VOLUME UP, VOLUME DOWN, CHANNEL UP, CHANNEL DOWN and the like. The methods disclosed herein can be applied to any type of functionality. The functionality can then be selected via the remote control device or some other means. This programs the button to that functionality.


In other words, the above techniques can be applied as the user moves and touches buttons to cause an indicator to show up on a viewing device, the indicator denoting the assigned function of the remote control device's button. Thus, icons and/or textual prompts are displayed on the viewing device as buttons are touched on the remote control device. This allows the user to “see” the button labels on the viewing device as they touch specific button areas on the remote. The button action is not executed until the user clicks the button. If the button is not currently programmed, the user is able to assign a function at that time.


The remote can be constructed such that there is a single button with an internal processor that detects the particular area that has been touched to discriminate between buttons. As the user touches button areas, a distinct vibration can be created, allowing the user to sense when their finger is in position. When the user presses the button, a different “click” is offered as feedback. Physical dimples or bumps in the remote's surface can substitute for the touch feedback, and a physical switch for the “select” feedback. This allows the user to detect where they are on the remote control device by feel. Employing haptic touch to “feel” the buttons, along with the on-screen assist, allows the user to maintain focus on the viewing device without shifting attention to the remote control device itself.
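The single-surface area discrimination described above amounts to mapping a touch coordinate to a button region. The rectangular regions and button names below are illustrative assumptions of this sketch:

```python
# Hypothetical button regions on one smooth touch surface: (x0, y0, x1, y1).
REGIONS = {
    "home":   (0, 0, 50, 50),
    "record": (50, 0, 100, 50),
}

def button_at(x, y):
    """Discriminate between button areas from a single touch coordinate."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name   # finger in position: trigger the distinct vibration
    return None           # between button areas: no feedback

assert button_at(10, 10) == "home"
assert button_at(75, 25) == "record"
assert button_at(10, 90) is None
```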


What has been described above includes examples of the embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the embodiments, but one of ordinary skill in the art can recognize that many further combinations and permutations of the embodiments are possible. Accordingly, the subject matter is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term “includes” is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term “comprising” as “comprising” is interpreted when employed as a transitional word in a claim.

Claims
  • 1. A control device that controls a viewing experience, the control device comprising: at least one button and a plurality of interaction modes for interacting with the at least one button, the plurality of interaction modes including a select mode and a proximity mode, the control device is configured to: initiate a display of at least one on-screen indicator on a viewing device, when the proximity mode is detected via a button on the control device, wherein the at least one on-screen indicator represents a function associated with the button; generate a first haptic feedback, when the proximity mode is detected via the button; and generate a second haptic feedback different from the first haptic feedback and execute the function associated with the button, when activation of the button is detected; wherein when a given button is programmable and not configured, and the proximity mode or select mode of the given button has been activated, the control device is further configured to display on the viewing device one or more possible functions of the given button in a programmed state.
  • 2. The control device of claim 1, wherein detection of the proximity mode includes detection of a hovering over the button or a touch of the button without depressing the button.
  • 3. The control device of claim 1, wherein the at least one on-screen indicator on the viewing device indicates one or more functions for controlling the viewing experience, the one or more functions being associated with the button of the control device.
  • 4. A method for controlling a viewing experience with a control device having at least one button, which comprises a plurality of interaction modes for interacting with the at least one button, including a select mode and a proximity mode, the method comprising: initiating, by the control device, a display of at least one on-screen indicator on a viewing device when the proximity mode of a button on the control device is detected, wherein the at least one on-screen indicator represents a function associated with the button; generating, by the control device, a first haptic feedback, when the proximity mode is detected via the button; generating, by the control device, a second haptic feedback different from the first haptic feedback, when activation of the button is detected; executing, by the control device, the function associated with the button, when activation of the button is detected; and displaying on the viewing device one or more possible functions of a given button in a programmed state, when the given button is programmable and not configured, and the proximity mode or select mode of the given button has been activated.
  • 5. A system that controls a viewing experience, the system comprising: a control device for selecting a function associated with controlling the viewing experience, the control device having at least one button and a plurality of interaction modes for interacting with the at least one button, the plurality of interaction modes including a select mode and a proximity mode, the control device is configured to: initiate a display of at least one on-screen indicator on a viewing device when the proximity mode is detected via a button on the control device, wherein the at least one on-screen indicator represents a function associated with the button; generate a first haptic feedback, when the proximity mode is detected via the button; generate a second haptic feedback different from the first haptic feedback and execute the function associated with the button, when activation of the button is detected; and when the proximity mode is activated and the button is programmable and not configured, display on the viewing device a list of possible functionality indicators of the button, wherein the proximity mode is activated without depressing the button.
  • 6. The system of claim 5, wherein the control device initiates the on-screen indicator on the viewing device and the haptic feedback when hovering is detected over the button.
  • 7. The control device of claim 1, wherein the viewing device is configured to display, when the proximity mode has been detected, the at least one on-screen indicator associated with the button on the control device.
  • 8. The system of claim 5, wherein a hovering input initiates the proximity mode, the at least one on-screen indicator providing feedback to indicate a function of a plurality of functions for controlling the viewing experience without depressing the button, and wherein the select mode executes the function associated with the controlling of the viewing experience.
  • 9. The method of claim 4, further comprising detecting a hovering over the button or a touch of the button without depressing the button.
  • 10. The method of claim 4, wherein the at least one on-screen indicator on the viewing device indicates one or more functions for controlling the viewing experience, the one or more functions being associated with the button of the control device.
PCT Information
Filing Document Filing Date Country Kind 371c Date
PCT/US2011/056836 10/19/2011 WO 00 4/4/2014
Publishing Document Publishing Date Country Kind
WO2013/058748 4/25/2013 WO A
US Referenced Citations (43)
Number Name Date Kind
5311175 Waldman May 1994 A
5635958 Murai et al. Jun 1997 A
5691878 Ahn et al. Nov 1997 A
6215417 Krass et al. Apr 2001 B1
6223815 Shibasaki May 2001 B1
6317325 Patel et al. Nov 2001 B1
6574083 Krass Jun 2003 B1
6680677 Tiphane Jan 2004 B1
6881077 Thorum Apr 2005 B2
7050305 Thorum May 2006 B2
7265984 Numata Sep 2007 B2
7362578 Hornung Apr 2008 B2
7839630 Hwang et al. Nov 2010 B2
7961471 Odanaka et al. Jun 2011 B2
8245158 Schrick Aug 2012 B1
8278880 Nakajima et al. Oct 2012 B2
8545322 George et al. Oct 2013 B2
8902588 Ritter et al. Dec 2014 B2
9007773 Warren et al. Apr 2015 B2
9043725 Wakefield et al. May 2015 B2
20020007487 Matsumoto et al. Jan 2002 A1
20030169231 Rekimoto Sep 2003 A1
20050066370 Alvarado et al. Mar 2005 A1
20060181859 Thorum Aug 2006 A1
20070129046 Soh et al. Jun 2007 A1
20070236474 Ramstein Oct 2007 A1
20070285284 Matteo et al. Dec 2007 A1
20090002218 Rigazio et al. Jan 2009 A1
20090106655 Grant Apr 2009 A1
20090165073 Stallings Jun 2009 A1
20090167694 Tan Jul 2009 A1
20090167704 Terlizzi et al. Jul 2009 A1
20090169070 Fadell Jul 2009 A1
20090213066 Hardacker et al. Aug 2009 A1
20100231541 Cruz-Hernandez Sep 2010 A1
20110001697 Mao Jan 2011 A1
20110084867 Friedlander Apr 2011 A1
20110205710 Kondo et al. Aug 2011 A1
20110206239 Wada et al. Aug 2011 A1
20120243166 Burton et al. Sep 2012 A1
20120249474 Pratt Oct 2012 A1
20120327025 Huska Dec 2012 A1
20130063895 Ritter et al. Mar 2013 A1
Foreign Referenced Citations (16)
Number Date Country
1826046 Aug 2006 CN
1917755 Feb 2007 CN
101828380 Sep 2010 CN
101859488 Oct 2010 CN
0399763 Mar 1997 EP
1511314 Mar 2005 EP
H07-86471 Mar 1995 JP
199865385 Mar 1998 JP
2000-269671 Sep 2000 JP
200122506 Jan 2001 JP
2001352497 Dec 2001 JP
2002247670 Aug 2002 JP
200370082 Mar 2003 JP
2005-005424 Jan 2005 JP
2005-078642 Mar 2005 JP
2008-131512 Jun 2008 JP
Related Publications (1)
Number Date Country
20140232944 A1 Aug 2014 US