An electronic device may be attached to an input device to receive input from a user of the electronic device. For example, an input device may be a keyboard or a computer mouse. Some electronic devices, such as notebook computers, may have an integrated input device, such as an integrated keyboard.
One type of input device that is becoming more common than the keyboard or the computer mouse is the touch-sensitive input device. A touch-sensitive input device may translate the movements of a writing tool (e.g., a user's finger or a stylus) to a relative position in a graphical user interface (GUI). An example of a touch-sensitive input device may be a touchpad. Another example may be a touchscreen.
A touch-sensitive input device may include an input axis to translate a movement of a writing tool to a relative position in a GUI. However, the input axis may be a fixed input axis designed to face a sole, individual user. When a user is not directly facing the input axis, the touch-sensitive input device may not translate the movements of the user's finger to the correct position in the GUI. Thus, the convenience of using the touch-sensitive input device may be decreased.
Examples described herein provide a touch-sensitive input device that rotates an input axis to align the input axis to a user of the touch-sensitive input device. For example, a non-transitory computer-readable storage medium may comprise instructions that, when executed, cause a controller of an electronic device to receive, via a touch-sensitive input device of the electronic device, a touch input. The instructions also cause the controller to determine whether the touch input corresponds to a re-orientation input. The instructions further cause the controller to, in response to a determination that the touch input corresponds to the re-orientation input, rotate an input axis associated with the touch-sensitive input device from a first orientation to a second orientation based on the re-orientation input.
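The instruction flow above can be illustrated with a short sketch. All names (TouchInput, InputAxis, handle_touch) and the choice of a two-finger contact as the distinct characteristic are assumptions of this illustration, not part of the examples described herein:

```python
from dataclasses import dataclass

@dataclass
class TouchInput:
    x: float            # contact position on the touch surface
    y: float
    finger_count: int   # number of simultaneous contacts

@dataclass
class InputAxis:
    angle_deg: float = 0.0   # current orientation of the input axis

def is_reorientation_input(touch: TouchInput) -> bool:
    # Placeholder for the "distinct characteristic" check; here a
    # two-finger contact is assumed to signal a re-orientation input.
    return touch.finger_count == 2

def handle_touch(touch: TouchInput, axis: InputAxis) -> None:
    # Rotate the input axis only when the touch qualifies as a
    # re-orientation input; otherwise the touch is ordinary input.
    # A real device would derive the new orientation from the input
    # itself (see the anchor-point sketch later in this description).
    if is_reorientation_input(touch):
        axis.angle_deg = (axis.angle_deg + 90.0) % 360.0  # example step
```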
As another example, a non-transitory computer-readable storage medium may comprise instructions that, when executed, cause a controller of an electronic device to detect, via a light-sensitive sensor of the electronic device, a direction of a user gesture relative to a side of a touch-sensitive input device of the electronic device. The instructions also cause the controller to rotate an input axis associated with the touch-sensitive input device from a first orientation to a second orientation based on the detected direction, where a vertical axis of the input axis is aligned with the side in the second orientation. Thus, the convenience of using the touch-sensitive input device may be increased.
Controller 102 may be a central processing unit (CPU), a semiconductor-based microprocessor, and/or other hardware devices suitable for retrieval and execution of instructions stored in computer-readable storage medium 104. Controller 102 may fetch, decode, and execute instructions 108, 110, and 112 to control a process of rotating an input axis of touch-sensitive input device 106. As an alternative or in addition to retrieving and executing instructions, controller 102 may include at least one electronic circuit that includes electronic components for performing the functionality of instructions 108, 110, 112, or a combination thereof.
Computer-readable storage medium 104 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, computer-readable storage medium 104 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, etc. In some examples, computer-readable storage medium 104 may be a non-transitory storage medium, where the term “non-transitory” does not encompass transitory propagating signals. As described in detail below, computer-readable storage medium 104 may be encoded with a series of processor executable instructions 108-112.
Touch-sensitive input device 106 may be a touchpad, a touchscreen, or any other electronic device suitable to translate the movements of a writing tool (e.g., a user's finger or a stylus) to a relative position in a GUI based on an input axis. An example of an input axis is described in more detail below.
Touch input reception instructions 108 may receive a touch input 114 via touch-sensitive input device 106. Re-orientation input determination instructions 110 may determine whether touch input 114 corresponds to a re-orientation input. A re-orientation input may have a distinct characteristic. For example, a re-orientation input may correspond to a tapping gesture on touch-sensitive input device 106. As another example, a re-orientation input may correspond to a two finger swipe from one side of touch-sensitive input device 106 to another side of touch-sensitive input device 106.
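As an illustration of recognizing the two-finger swipe described above, the sketch below tracks where each contact starts and ends; the Contact structure and the travel threshold are assumptions of this illustration:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    start: tuple   # (x, y) where the finger first touched
    end: tuple     # (x, y) where the finger lifted

def is_two_finger_swipe(contacts, min_travel=50.0):
    """True when exactly two contacts each travel at least min_travel
    units across the touch surface. A fuller check could also verify
    that the swipe runs from one side of the device to another."""
    if len(contacts) != 2:
        return False
    for c in contacts:
        dx = c.end[0] - c.start[0]
        dy = c.end[1] - c.start[1]
        if (dx * dx + dy * dy) ** 0.5 < min_travel:
            return False
    return True
```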
When re-orientation input determination instructions 110 detect the distinct characteristic, re-orientation input determination instructions 110 may determine that a re-orientation input has been received. In response to a determination that touch input 114 corresponds to a re-orientation input, input axis rotation instructions 112 may rotate the input axis of touch-sensitive input device 106 from a first orientation to a second orientation based on the re-orientation input. For example, input axis rotation instructions 112 may identify a location of the re-orientation input in touch-sensitive input device 106. Input axis rotation instructions 112 may determine an anchor point based on the location and rotate the input axis from the first orientation to the second orientation based on the anchor point. A vertical axis of the input axis may be aligned with the anchor point in the second orientation.
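One plausible realization of instructions 112 is to point the vertical axis of the input axis from the center of the touch surface toward the anchor point. The sketch below is an assumption of this description, not a prescribed implementation; the pad dimensions and center-based geometry are illustrative:

```python
import math

def orientation_toward_anchor(anchor, width, height):
    """Angle (in radians) that points the vertical axis of the input
    axis from the center of the touch surface toward the anchor point,
    where anchor is an (x, y) pair on a width-by-height surface."""
    cx, cy = width / 2.0, height / 2.0
    return math.atan2(anchor[1] - cy, anchor[0] - cx)
```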
Input axis 202 may determine how movements detected on touch-sensitive input device 106 are translated to a relative position in a GUI 208. GUI 208 may be shown in a display attached to electronic device 100. For example, input axis 202 may initially be in a first orientation.
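Translating movements through input axis 202 amounts to a plane rotation. The sketch below is a minimal illustration, assuming a counterclockwise-positive angle convention that the description above does not specify:

```python
import math

def to_gui_delta(dx, dy, axis_angle):
    """Map a raw movement (dx, dy) on the touch surface into GUI
    coordinates by rotating it through the current axis angle (radians),
    so that 'up' along the rotated input axis stays 'up' in the GUI."""
    cos_a, sin_a = math.cos(axis_angle), math.sin(axis_angle)
    return (dx * cos_a - dy * sin_a, dx * sin_a + dy * cos_a)
```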
When a second user 218 is to use touch-sensitive input device 106, second user 218 may re-orient input axis 202 to better suit the position of second user 218 relative to touch-sensitive input device 106. For example, second user 218 may provide a re-orientation input 220 via touch-sensitive input device 106.
Touch-sensitive input device 106 may identify that re-orientation input 220 is received at a location 222 in touch-sensitive input device 106. Location 222 may correspond to a region of touch-sensitive input device 106 where re-orientation input 220 initially makes physical contact with touch-sensitive input device 106. Touch-sensitive input device 106 may determine an anchor point 224 based on location 222. Anchor point 224 may be a location in touch-sensitive input device 106 that serves as a reference point to re-orient input axis 202. For example, anchor point 224 may be a center of location 222.
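As a small sketch, the "center of location 222" could be approximated as the centroid of the initial contact points. Representing the contact region as a non-empty list of (x, y) points is an assumption of this illustration:

```python
def anchor_from_contact(points):
    """Anchor point approximated as the centroid of the points where a
    re-orientation input first contacts the touch surface. Assumes a
    non-empty list of (x, y) pairs."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```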
User gesture direction detection instructions 306 may detect a direction of a user gesture relative to touch-sensitive input device 106 based on a triggering of a sensor in the set of sensors 304. Input axis rotation instructions 308 may determine an anchor point based on the detected direction. Input axis rotation instructions 308 may also rotate an input axis of touch-sensitive input device 106 based on the anchor point.
Touch-sensitive input device 402 may have a square shape or a rectangular shape. Touch-sensitive input device 402 may have a first side 412, a second side 414, a third side 416, and a fourth side 418. Each of sides 412-418 may be aligned with a distinct sensor from the plurality of sensors 404-410. For example, a first sensor 404 is aligned with first side 412, a second sensor 406 is aligned with second side 414, a third sensor 408 is aligned with third side 416, and a fourth sensor 410 is aligned with fourth side 418.
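The side-to-sensor alignment can be captured in a simple lookup structure. The dictionary below is purely illustrative; the string identifiers merely echo the reference numerals above:

```python
# Illustrative mapping of each side of the device to its aligned sensor.
SIDE_TO_SENSOR = {
    "side_412": "sensor_404",
    "side_414": "sensor_406",
    "side_416": "sensor_408",
    "side_418": "sensor_410",
}

# Reverse lookup: which side a triggered sensor corresponds to.
SENSOR_TO_SIDE = {sensor: side for side, sensor in SIDE_TO_SENSOR.items()}
```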
During operation, sensors 404-410 may detect a direction of a user gesture relative to touch-sensitive input device 402 so that touch-sensitive input device 402 may rotate to align with the direction of the user gesture, as described in more detail below.
Touch-sensitive input device 402 may determine an anchor point 504 based on the detected direction. For example, since second sensor 406 is triggered, touch-sensitive input device 402 may determine that the hand or finger of user 502 is approaching from second side 414. In some examples, anchor point 504 may be located at a mid-point of second side 414. In some examples, anchor point 504 may be located anywhere in a region of touch-sensitive input device 402 that is aligned with second sensor 406. Touch-sensitive input device 402 may rotate input axis 202 from the first orientation to a second orientation to align vertical axis 206 with anchor point 504. In the second orientation, proximal side 226 may intersect with anchor point 504.
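Putting the pieces together, a triggered side sensor can map directly to an anchor point and a rotation angle. The sketch below assumes a rectangular touch surface and 90-degree rotation steps, which the examples above do not mandate:

```python
def reorient_to_side(side, width, height):
    """Return (anchor point, axis angle in degrees) so that the vertical
    axis of the input axis meets the mid-point of the triggered side of
    a width-by-height touch surface. Angles are illustrative, with 0
    degrees meaning the axis faces the first side."""
    midpoints = {
        "side_412": ((width / 2.0, 0.0), 0.0),
        "side_414": ((width, height / 2.0), 90.0),
        "side_416": ((width / 2.0, height), 180.0),
        "side_418": ((0.0, height / 2.0), 270.0),
    }
    return midpoints[side]
```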
Indicators 508-514 may be implemented using light-emitting diodes. Indicators 508-514 may provide a visual indication of which side of touch-sensitive input device 402 proximal side 226 is aligned with. Thus, a user of touch-sensitive input device 402 may determine the current orientation of touch-sensitive input device 402 using indicators 508-514. For example, proximal side 226 may initially be aligned with third side 416, and third indicator 512 may light up. When input axis 202 is rotated as described above, the indicator corresponding to the newly aligned side may light up instead.
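A sketch of driving such indicators follows; the Led class stands in for whatever LED interface the device actually exposes and is hypothetical:

```python
class Led:
    """Hypothetical stand-in for an LED indicator handle."""
    def __init__(self):
        self.on = False
    def set(self, on: bool):
        self.on = on

def update_indicators(aligned_side, indicators):
    """Light only the indicator for the side that the proximal side of
    the input axis is currently aligned with; indicators maps a side
    name to its Led."""
    for side, led in indicators.items():
        led.set(side == aligned_side)
```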
In some examples, when contiguous sensors of band of sensors 602 detect a shadow, touch-sensitive input device 600 may determine an anchor point in different manners based on the number of the contiguous sensors. When an odd number of contiguous sensors of band of sensors 602 detect a shadow, the middle sensor among the contiguous sensors may define the anchor point. When an even number of contiguous sensors detect a shadow, a halfway point between the middle two sensors may define the anchor point.
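A minimal sketch of that rule, assuming the shadowed sensors are identified by integer indices along band of sensors 602:

```python
def anchor_from_shadow(shadowed_indices):
    """Anchor position (in sensor-index units) from a contiguous run of
    shadowed sensors: the middle sensor for an odd count, the halfway
    point between the two middle sensors for an even count."""
    s = sorted(shadowed_indices)
    n = len(s)
    if n % 2 == 1:
        return float(s[n // 2])
    return (s[n // 2 - 1] + s[n // 2]) / 2.0
```

For example, shadowed sensors {3, 4, 5} would anchor at sensor 4, while {3, 4, 5, 6} would anchor halfway between sensors 4 and 5.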
The use of the terms “comprising,” “including,” or “having,” and variations thereof herein, is meant to be inclusive or open-ended and does not exclude additional unrecited elements or method steps.