A variety of control mechanisms may be used to control electronic devices. For example, touch screen inputs allow a user to interact directly with a screen on which commands and/or controls are displayed. Touch screens have been incorporated into an increasing number of electronic devices, which may be due, at least in part, to decreases in manufacturing costs as well as increases in functionality of the touch screen inputs. The touch screen inputs may be configured to provide a variety of layouts having a number of different functions. For example, a touch screen may visually represent and functionally perform actions associated with an application program. Furthermore, a touch screen layout may be quickly adjusted in response to an adjustment of the electronic device, increasing the interactivity and ease of use of the device.
However, touch screen inputs do not provide haptic feedback, such as that provided by directional pads, joysticks and the like. Such haptic feedback may be helpful to enable quick and accurate interaction with a control mechanism, as haptic feedback may allow a user to learn to associate various inputs with specific haptic responses. This may allow the user to operate the control mechanism without visual observation, and to determine the timing of various inputs with more accuracy than with touch screen inputs.
Accordingly, various embodiments related to the provision of haptic feedback with a touch screen input are disclosed. For example, one disclosed embodiment provides a control apparatus for an electronic device. The control apparatus comprises a haptic input mechanism configured to provide haptic feedback responsive to a push input and an integrated touch sensitive display forming a surface of the haptic input mechanism, the touch sensitive display comprising a touch-sensing mechanism.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Haptic feedback allows a user of the control apparatus 12 to associate a haptic response with a specific function associated with actuation of the control apparatus 12, allowing for quick and accurate operation. In some examples, the user may operate the control apparatus 12 without visual aid (i.e. direct observation), utilizing haptic feedback as a tactile aid.
The control apparatus 12 further comprises an integrated touch sensitive display 16 forming a surface of the haptic input mechanism 14. In one example, as illustrated in
Continuing with
In some configurations, the touch input triggers a first function in the electronic device and the push input triggers a second function in the electronic device. Triggering a function in an electronic device may include sending command signals, implementing commanded actions in an application program executed by the electronic device, activating one or more electronic components, etc.
In one example, the application program is a gaming application program. During operation of the gaming application program, an orbital touch input may control a steering function, while a push input may control a toggle function (e.g. firing of a weapon, actuation of a horn, etc.), a scrolling function, etc. In other examples, the touch and push inputs may trigger the same function.
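The disclosure describes this input-to-function mapping only at a functional level. As one illustrative sketch in Python (all names here are hypothetical, not part of the disclosure), an orbital touch input might update a steering state while a push input toggles a weapon state:

```python
# Hypothetical sketch: separate handlers for touch and push inputs in a
# gaming application program, per the example above.

def make_dispatcher():
    state = {"steering_angle": 0.0, "weapon_fired": False}

    def on_touch(orbital_delta_degrees):
        # An orbital touch input controls the steering function.
        state["steering_angle"] = (state["steering_angle"] + orbital_delta_degrees) % 360
        return state["steering_angle"]

    def on_push():
        # A push input controls a toggle function, e.g. firing a weapon.
        state["weapon_fired"] = not state["weapon_fired"]
        return state["weapon_fired"]

    return on_touch, on_push

on_touch, on_push = make_dispatcher()
```

In the alternative noted above, both handlers could simply invoke the same function.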
Returning to
The electronic device 10 may further include a second display 22 spaced apart from the control apparatus 12. The second display 22 may be used as a primary display to present a game or other interactive content to a user. The second display 22 may be any suitable display, including but not limited to an OLED display, a liquid crystal display (LCD), a light emitting diode (LED) display, as well as any of the other displays mentioned above. The second display may be configured to receive data (e.g. image data) from a controller 24, which may comprise a processor and memory containing instructions executable by the processor to perform this and other functions described herein.
It will be appreciated that the second display 22, controller 24, and control apparatus 12 may form a single electronic device sharing a common housing, each component being in electronic communication. For example, the second display 22, controller 24, and control apparatus 12 may be included in a portable electronic device such as a handheld gaming console, portable media player, etc. Likewise, other configurations of these components are possible. For example, each of the electronic components may be separate components having distinct housings, while maintaining electronic communication. Examples of such configurations include, but are not limited to, desktop computing systems, gaming consoles, etc.
Among other functions, the controller 24 may be configured to control the display of indicia on the touch sensitive display 16.
Referring to
In this manner, specific symbols that inform a user of an action invoked by a direction on a directional pad (or other haptic input device) may be displayed for specific application programs used with the control apparatus 12. This may help a user to learn a control layout for the application program more easily than where generic indicia are used to identify actuatable haptic controls, especially where the control apparatus 12 is used to control a wide variety of application programs. In this case, each program may have different actions associated with the haptic input mechanism 14 and the touch sensitive display 16. It may be difficult for the user to keep track of all the actions performed by the control apparatus 12 in many different application programs. Therefore, the displayed indicia and the layout of the displayed indicia may be changed with each application to correspond to the specific actions a particular application program assigns to the control apparatus 12.
As a specific example, referring to
The displayed indicia may be modified in response to any suitable event or occurrence. For example, the displayed indicia may be modified in response to a physical manipulation of the electronic device 10 and/or a change of an application program executed by the electronic device 10. Furthermore, the displayed indicia 28 may be actively updated during a touch or push input. As an example, in a driving video game, the displayed indicia may take the form of a steering wheel that visibly rotates upon input of an orbital touch motion by a user.
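The disclosure specifies no particular data structure for these per-application layouts. As a minimal, hypothetical sketch (the application names, labels, and orientation handling below are assumptions for illustration only), a controller might look up a layout keyed by the active application program and remap it when the device is physically rotated:

```python
# Hypothetical per-application indicia layouts; the controller swaps the
# layout shown on the touch sensitive display when the application program
# changes, and remaps it on physical manipulation (e.g. rotation).

INDICIA_LAYOUTS = {
    "driving_game": {"up": "accelerate", "down": "brake",
                     "left": "steer left", "right": "steer right"},
    "media_player": {"up": "volume up", "down": "volume down",
                     "left": "previous", "right": "next"},
}

def indicia_for(application, orientation="portrait"):
    layout = dict(INDICIA_LAYOUTS[application])
    if orientation == "landscape":
        # Rotating the device 90 degrees remaps the displayed directions
        # so the labels still match the user's physical frame of reference.
        layout = {"up": layout["left"], "down": layout["right"],
                  "left": layout["down"], "right": layout["up"]}
    return layout
```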
Returning to
An exemplary portable electronic device 70 is illustrated in
The multi-directional control mechanism 72 has a plurality of actuatable components 74, each actuatable component configured to provide haptic feedback in response to an input. The portable electronic device 70 also includes a second display 75 configured to display images or objects such as a video game, other audio/video content, etc. While shown in the context of a directional pad, it will be understood that the multi-directional control mechanism 72 may additionally or alternatively include a joystick, one or more buttons, etc. Furthermore, it will be understood that the second display may comprise a haptic feedback mechanism as described herein.
The portable electronic device 70 further comprises a touch sensitive display 76, forming a surface 79 of the multi-directional control mechanism 72 and configured to display one or more indicia 78.
The portable electronic device 70 further may include a controller 86 electronically coupled to the multi-directional control mechanism 72 and/or the touch sensitive display 76. Among other functions, the controller 86 may be configured to modify the indicia on the touch sensitive display 76 and modify the functionality of the multi-directional control mechanism 72 in response to an event such as a physical manipulation of the portable electronic device 70 and/or a change in an application or program executed by the device. In
The touch sensitive display 76 and the multi-directional control mechanism 72 are each configured to receive input from a user, and to provide a control signal to the controller 86 to control an action of an application, such as a game, media player, etc., executed on the portable electronic device. In some applications, the touch sensitive display 76 and the multi-directional control mechanism 72 may be configured to be actuated substantially concurrently and in response to actuation generate a single command signal (i.e. one or more output signals that are interpreted as a single command by the controller 86), while in other applications the touch sensitive display 76 and multi-directional control mechanism 72 may provide separate control signals even when activated in a temporally overlapping manner (i.e. a plurality of output signals that are interpreted as separate commands by the controller 86).
Continuing with
While
First, at 110, method 100 includes displaying one or more indicia on the touch sensitive display. In one example, each indicium may specify a function of a corresponding actuatable component of the directional pad, as previously discussed. At 112, method 100 further includes receiving a push input via the directional pad. A push input may include depression of the moveable control mechanism, or in other embodiments, an input via a joystick or the like. Next at 114, method 100 includes providing haptic feedback in response to the push input. Such feedback may be a “snap” sensation as a directional actuator on the directional pad is pressed, or may be any other suitable feedback.
Next, as shown at 116, method 100 includes receiving a touch input via a touch sensitive display. Next at 118, the method includes providing feedback in response to the touch input. Such feedback may be visual and/or aural feedback provided by an application, such as a game or media player, or may be any other suitable feedback.
Next, at 120, the method includes altering the displayed indicia on the touch-sensitive display. The displayed indicia may be altered in response to physical manipulation of the electronic device, at 122. Additionally or alternatively, the displayed indicia may be altered in response to a change in the functionality of the directional pad, at 124. Altering the displayed indicia may include altering an appearance, location, quantity, and/or functionality of the indicia. Further, the displayed indicia may be altered in response to a change of an application program associated with the control apparatus; in one example, the indicia may be altered upon start up of an application.
Additionally, in some embodiments, the method 100 may include generating a first command signal and a second command signal in response to substantially concurrent touch and push inputs. Alternatively, in other embodiments, the method 100 may include generating a single command signal in response to substantially concurrent touch and push inputs, as described above. Further still, in other embodiments the method may include generating a first command signal and a second command signal in response to asynchronous touch and push inputs.
The embodiments described herein may be implemented to enable a user to efficiently and accurately operate an electronic device. Further, the above systems and methods allow for a haptic control apparatus for an electronic device to be adapted such that the control apparatus can display indicia specifically tailored to inform a user of the exact function assigned to the directions of a haptic directional control apparatus. This is in contrast to other directional pads, joysticks, etc., that may use static, generic labeling to signify the actuatable directions of the controller.
It will be understood that the embodiments described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are contemplated. Accordingly, the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various systems and methods disclosed herein, as well as any and all equivalents thereof.
Number | Date | Country
---|---|---
20100066681 A1 | Mar 2010 | US