Information handling devices (“devices”) come in a variety of forms, for example laptop computing devices, tablet computing devices, smart phones, and the like. Many devices now provide touch input functionality. That is, a user may touch a portion of the device, e.g., a touch screen, and provide inputs in lieu of or in addition to more conventional modes of input such as a keyboard, mouse, etc.
Certain touch inputs have functionality mapped or associated therewith depending on which portion of the touch sensitive surface a user provides input to. An increasingly common example is an edge input, e.g., a swipe or touch input provided to the edge or peripheral portion of a touch screen is associated with a special function or set of functions.
For example, the WINDOWS 8 operating system includes edge swipe gestures for a touch interface, including touch screens and touch pads. Edge swipe gestures provide functions that enhance the user experience, such as closing an application, switching between applications, displaying the system menu or toolbar, etc. Edge swipe gestures are considered to be among the most useful gestures recently added.
In summary, one aspect provides a method, comprising: capturing, using an image sensor, an image of a user; detecting, using a processor, a user gesture forming an edge within the image; capturing, using the image sensor, at least one additional image of the user; detecting, using the processor, a user gesture relating to the edge of the image; and committing, using the processor, a predetermined action according to the user gesture relating to the edge of the image.
Another aspect provides an information handling device, comprising: an image sensor; a processor; and a memory device that stores instructions accessible to the processor, the instructions being executable by the processor to: capture, using the image sensor, an image of a user; detect a user gesture forming an edge within the image; capture, using the image sensor, at least one additional image of the user; detect a user gesture relating to the edge of the image; and commit a predetermined action according to the user gesture relating to the edge of the image.
A further aspect provides a product, comprising: a storage device having code stored therewith, the code comprising: code that captures, using an image sensor, an image of a user; code that detects, using a processor, a user gesture forming an edge within the image; code that captures, using the image sensor, at least one additional image of the user; code that detects, using the processor, a user gesture relating to the edge of the image; and code that commits a predetermined action according to the user gesture relating to the edge of the image.
The foregoing is a summary and thus may contain simplifications, generalizations, and omissions of detail; consequently, those skilled in the art will appreciate that the summary is illustrative only and is not intended to be in any way limiting.
For a better understanding of the embodiments, together with other and further features and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying drawings. The scope of the invention will be pointed out in the appended claims.
It will be readily understood that the components of the embodiments, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations in addition to the described example embodiments. Thus, the following more detailed description of the example embodiments, as represented in the figures, is not intended to limit the scope of the embodiments, as claimed, but is merely representative of example embodiments.
Reference throughout this specification to “one embodiment” or “an embodiment” (or the like) means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” or the like in various places throughout this specification are not necessarily all referring to the same embodiment.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments. One skilled in the relevant art will recognize, however, that the various embodiments can be practiced without one or more of the specific details.
While increasingly common, edge gestures can be challenging for users in many circumstances, particularly because conventional edge gestures rely on physically touching a touch sensitive surface, e.g., a touch pad or touch screen display. For example, it is challenging for a user to perform edge swipe gestures when the user is sitting some distance away from the system, when the system does not have a touch interface, e.g., an edge input compatible operating system implemented on a non-touch system, or when the user is sitting in front of a touch screen system but performs or wants to perform a gesture on another surface, e.g., a table top, for ergonomic benefit.
Accordingly, embodiments provide methods, systems and products that allow a user to provide image-based gestures, e.g., captured with an image sensor such as a camera, and have these image-based gestures mapped to edge gestures for execution by the system. This allows users to perform edge gestures when not physically interfacing, i.e., touching, the system. For example, an embodiment captures an image of a user, detects a user gesture forming an edge within the image, e.g., with a hand forming an edge shape, captures at least one additional image of the user, and detects a user gesture relating to the edge of the image.
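By way of illustration only (this sketch is not part of the disclosure), such a pipeline might be assembled from off-the-shelf components. The sketch below assumes the MediaPipe Hands landmark model and OpenCV for capture, and further assumes that the first detected hand forms the edge while a second hand performs the gesture:

```python
# Illustrative sketch of the capture/detect loop described above, assuming
# MediaPipe Hands for landmarks and OpenCV for camera capture.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def palm_edge(hand):
    """Approximate the edge formed by a flat hand as the segment from the
    wrist to the tip of the middle finger (normalized image coordinates)."""
    wrist = hand.landmark[mp_hands.HandLandmark.WRIST]
    tip = hand.landmark[mp_hands.HandLandmark.MIDDLE_FINGER_TIP]
    return (wrist.x, wrist.y), (tip.x, tip.y)

with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
    camera = cv2.VideoCapture(0)
    edge, trail = None, []
    while camera.isOpened():
        ok, frame = camera.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        detected = result.multi_hand_landmarks or []
        if edge is None and detected:
            # Register the first detected hand as the edge-forming hand
            # (a real system would verify the "flat palm" pose first).
            edge = palm_edge(detected[0])
        elif edge is not None and len(detected) > 1:
            # Track the other hand's index fingertip; its motion relative to
            # the registered edge over successive frames is matched to
            # edge gestures (see the sketches further below).
            tip = detected[1].landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            trail.append((tip.x, tip.y))
    camera.release()
```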
The illustrated example embodiments will be best understood by reference to the figures. The following description is intended only by way of example, and simply illustrates certain example embodiments.
While various other circuits, circuitry or components may be utilized in information handling devices, with regard to smart phone and/or tablet circuitry 100, an example illustrated in FIG. 1 includes a system on a chip design found, for example, in tablet or other mobile computing platforms. Software and processor(s) are combined in a single chip 110, and essentially all of the peripheral devices 120 may attach to the single chip 110.
There are power management chip(s) 130, e.g., a battery management unit, BMU, which manage power as supplied, for example, via a rechargeable battery 140, which may be recharged by a connection to a power source (not shown). In at least one design, a single chip, such as 110, is used to supply BIOS like functionality and DRAM memory.
System 100 typically includes one or more of a WWAN transceiver 150 and a WLAN transceiver 160 for connecting to various networks, such as telecommunications networks and wireless Internet devices, e.g., access points. Additionally, devices 120 are commonly included, e.g., an image sensor such as a camera. System 100 often includes a touch screen 170 for data input and display/rendering. System 100 also typically includes various memory devices, for example flash memory 180 and SDRAM 190.
The example of FIG. 2 depicts a block diagram of another example of information handling device circuits, circuitry or components, e.g., of the type found in personal computers. As is apparent from the description herein, embodiments may include other features or only some of the features of the example illustrated in FIG. 2.
In FIG. 2, the architecture includes a core and memory control group 220 and an I/O controller hub 250 that exchange information (for example, data, signals, commands, etc.) via a direct management interface (DMI) 242 or a link controller 244. The core and memory control group 220 includes one or more processors 222 (for example, single or multi-core) and a memory controller hub 226 that exchange information via a front side bus (FSB) 224.
In FIG. 2, the I/O controller hub 250 includes a variety of interfaces, for example a SATA interface 251, a PCI-E interface 252, a USB interface 253 (for example, for devices such as cameras, keyboards, and the like), a network interface 254, an audio interface 263, and SPI Flash 266, which can include BIOS 268 and boot code 290.
The system, upon power on, may be configured to execute boot code 290 for the BIOS 268, as stored within the SPI Flash 266, and thereafter process data, e.g., under the control of an operating system.
Information handling device circuitry, as for example outlined in FIG. 1 or FIG. 2, may be used in devices that provide image gesture detection functionality, e.g., tablets, smart phones, and personal computer devices generally. For example, the circuitry outlined in FIG. 1 may be implemented in a tablet or smart phone embodiment, whereas the circuitry outlined in FIG. 2 may be implemented in a laptop or personal computer embodiment.
An embodiment allows edge gestures, e.g., a touch edge swipe, to be mimicked using camera/image based gestures. This facilitates use of the edge gesture capability of certain operating systems in situations where the user does not wish to reach out and touch the system, since the user is likely already familiar with touch edge swipe gestures. Offering touch-style gestures via camera gestures may also save the cost of providing a touch interface in the system.
A challenge is that camera gestures need to be defined in order to invoke edge gestures. In particular, conventional edge gestures are triggered around the edge of the touch interface, whereas an image captured by a camera offers no such physical edge for the user to interact with.
Accordingly, an embodiment defines various image based gestures that the user can perform that are keyed to an edge formed by the user. For example, as illustrated in FIG. 3, a user may form an edge using one hand, e.g., the left hand held vertically, and an embodiment may detect the edge thus formed, e.g., the edge of the palm, within the captured image and register it as a virtual touch interface edge.
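As a rough illustration of how such an edge might be registered (an assumption-laden sketch, not the disclosed implementation), one could fit a line to the landmark points of the edge-forming hand and test which side of that line the gesturing finger is on:

```python
# Illustrative geometry for registering a hand-formed edge; all names and
# coordinates are hypothetical examples.
import numpy as np

def fit_edge(points):
    """Fit a line (origin + s * direction) to 2D landmark points along the
    edge-forming hand; the principal direction of the points approximates
    the orientation of the virtual edge."""
    pts = np.asarray(points, dtype=float)
    origin = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - origin)  # principal axis of the points
    direction = vt[0]
    return origin, direction

def signed_distance(point, origin, direction):
    """Signed distance of a point from the edge line; a sign change between
    frames means the gesturing finger has crossed the virtual edge."""
    normal = np.array([-direction[1], direction[0]])  # perpendicular to edge
    return float(np.dot(np.asarray(point, dtype=float) - origin, normal))

# Example: fingertip positions on opposite sides of the edge produce
# signed distances of opposite sign, i.e., an edge crossing.
origin, direction = fit_edge([(0.40, 0.20), (0.40, 0.45), (0.41, 0.70)])
print(signed_distance((0.30, 0.45), origin, direction),
      signed_distance((0.55, 0.45), origin, direction))
```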
Referring specifically to FIG. 3A, once the edge has been registered, the user may perform a gesture with the other hand, e.g., the right hand, relative to the edge thus formed, in much the same way a touch edge gesture is performed relative to the physical edge of a touch interface.
As another example, referring to FIG. 3B, the user may perform a single finger swipe across the edge formed by the opposite hand, which an embodiment may map to a touch edge swipe function, e.g., switching between applications.
As illustrated in FIG. 3C, a single finger gesture relative to the edge may likewise be mapped to another edge function, e.g., closing an application, as described further herein.
Many such gestures may be implemented. For example, referring to FIG. 4A, a further gesture performed relative to the edge formed by the user's hand may be mapped to displaying the system menu or toolbar. Illustrated in FIG. 4B is yet another example in which a gesture relative to the edge is mapped to a further edge gesture function of the operating system.
In an embodiment, a user may perform the image-based gestures in a different orientation than heretofore described. For example, as illustrated in FIG. 5, a user may perform the gestures on another surface, e.g., a table top, rather than holding the hands up toward the camera, which may offer an ergonomic benefit.
As shown in FIG. 5A, a user may form an edge with the left hand resting on the table top and perform a single finger table top gesture with the right hand relative to the edge thus formed.
Likewise, a user may form an edge with the opposite hand in a table top gesture, e.g., using the right hand, and perform a single finger table top gesture with the left hand, as illustrated in FIG. 5B.
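One way such gestures could be made orientation independent (again, an illustrative sketch rather than the disclosed method) is to interpret fingertip motion in the coordinate frame of the hand-formed edge itself:

```python
# Illustrative decomposition of fingertip motion relative to the edge; the
# direction vector could come from the fit_edge() sketch above.
import numpy as np

def edge_relative_motion(tip_prev, tip_now, direction):
    """Decompose fingertip motion into components along and across the
    hand-formed edge so the same gesture is recognized whether the edge is
    held vertically in the air or lies flat on a table top."""
    delta = np.asarray(tip_now, dtype=float) - np.asarray(tip_prev, dtype=float)
    normal = np.array([-direction[1], direction[0]])
    return float(np.dot(delta, direction)), float(np.dot(delta, normal))

# A vertical edge: rightward motion is almost entirely "across" the edge.
along, across = edge_relative_motion((0.30, 0.45), (0.55, 0.45),
                                     np.array([0.0, 1.0]))
print(along, across)  # 0.0, -0.25 -> a candidate edge swipe
```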
As may be understood from the foregoing, and referring to FIG. 7, an embodiment provides a method of image gesture detection. An embodiment may capture, using an image sensor such as a camera, an image of a user at 701 and detect, e.g., using gesture recognition technology, a user gesture forming an edge within the image at 702.
Having the edge identified, an embodiment may proceed to detect gestures relative thereto. For example, an embodiment may capture or otherwise access another image at 703, e.g., of a user performing a single finger gesture to close an application. An embodiment may likewise use gesture recognition technology to match the movement of the user's finger relative to the edge formed by the user to a predetermined edge gesture at 704. If such a known edge gesture is detected, an embodiment may output a corresponding command, e.g., to an edge enabled operating system, such that the corresponding (touch edge gesture) action is performed at 705.
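A minimal sketch of this flow as a state machine follows; the detect_edge, match_gesture, and dispatch callables are hypothetical stand-ins for the gesture recognition and operating system command output described above:

```python
# Illustrative state machine for the 701-705 flow; gesture names, action
# names, and the supplied callables are hypothetical.
from enum import Enum, auto

class State(Enum):
    AWAIT_EDGE = auto()     # steps 701/702: capture image, detect edge gesture
    AWAIT_GESTURE = auto()  # steps 703/704: capture more images, match gesture

ACTIONS = {  # example mapping of recognized edge gestures to commands
    "single_finger_swipe_across_edge": "switch_application",
    "single_finger_close": "close_application",
}

def run(frames, detect_edge, match_gesture, dispatch):
    """Drive the 701-705 flow over a stream of captured images; detect_edge,
    match_gesture, and dispatch are supplied by the caller (e.g., the
    landmark-based helpers sketched earlier and an OS-specific injector)."""
    state, edge = State.AWAIT_EDGE, None
    for frame in frames:                           # 701/703: captured images
        if state is State.AWAIT_EDGE:
            edge = detect_edge(frame)              # 702: edge formed by user
            if edge is not None:
                state = State.AWAIT_GESTURE
        else:
            gesture = match_gesture(frame, edge)   # 704: match edge gesture
            if gesture in ACTIONS:
                dispatch(ACTIONS[gesture])         # 705: commit mapped action
                state, edge = State.AWAIT_EDGE, None
```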
Thus, an embodiment permits a user to perform gestures detectable by an image sensor, such as a camera of a device, in order to operate an edge gesture enabled device. As will be understood from the description and figures herein, a user need not actually touch or physically interface with the device. This frees the user to perform edge gestures in additional contexts and orientations, and without the need for a touch enabled device component, opening up the edge gesture capability of certain operating systems to additional devices, users and contexts.
As will be appreciated by one skilled in the art, various aspects may be embodied as a system, method or device program product. Accordingly, aspects may take the form of an entirely hardware embodiment or an embodiment including software that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a device program product embodied in one or more device readable medium(s) having device readable program code embodied therewith.
It should be noted that the various functions described herein may be implemented using instructions stored on a device readable storage medium such as a non-signal storage device that are executed by a processor. A storage device may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, et cetera, or any suitable combination of the foregoing.
Program code for carrying out operations may be written in any combination of one or more programming languages. The program code may execute entirely on a single device, partly on a single device, as a stand-alone software package, partly on a single device and partly on another device, or entirely on the other device. In some cases, the devices may be connected through any type of connection or network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made through other devices (for example, through the Internet using an Internet Service Provider), through wireless connections, e.g., near-field communication, or through a hard wire connection, such as over a USB connection.
Example embodiments are described herein with reference to the figures, which illustrate example methods, devices and program products according to various example embodiments. It will be understood that the actions and functionality may be implemented at least in part by program instructions. These program instructions may be provided to a processor of a general purpose information handling device, a special purpose information handling device, or other programmable data processing device to produce a machine, such that the instructions, which execute via a processor of the device, implement the functions/acts specified.
It is worth noting that while specific blocks are used in the figures, and a particular ordering of blocks has been illustrated, these are non-limiting examples. In certain contexts, two or more blocks may be combined, a block may be split into two or more blocks, or certain blocks may be re-ordered or re-organized as appropriate, as the explicit illustrated examples are used only for descriptive purposes and are not to be construed as limiting.
As used herein, the singular “a” and “an” may be construed as including the plural “one or more” unless clearly indicated otherwise.
This disclosure has been presented for purposes of illustration and description but is not intended to be exhaustive or limiting.
Thus, although illustrative example embodiments have been described herein with reference to the accompanying figures, it is to be understood that this description is not limiting and that various other changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the disclosure.