This disclosure relates generally to electronic devices having one or more touch sensitive surfaces, and more specifically to methods and devices for multi-surface gesture interaction.
Electronic devices include smartphones, tablets, laptop computers, desktop computers, and smart watches. Electronic devices may have touchscreen displays on which content is displayed. The content to be displayed may not all fit in the viewing area of the touchscreen display, and thus requires scrolling. When the content to be displayed is large, scrolling through it may be time consuming and inconvenient. A first solution for scrolling involves the use of touch flicks. A user touches the screen, typically with the tip of a finger, swipes up or down depending on the desired direction of scrolling, then lifts the finger off the screen. The speed at which the user swipes determines the amount of scrolling effected. For example, a slow swipe, and accordingly a slow touch flick, may scroll the contents of the viewing area by a few lines, whereas a faster touch flick may scroll the contents of the viewing area by tens of lines in a short period of time. However, faster touch flicks, and the resulting fast scrolling of the viewing area contents, render the content unreadable during scrolling, making it difficult to locate information within the content. Additionally, repeated touch flicks cause wear to the touch sensing system of the touchscreen display.
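For illustration only, the relationship between flick speed and scroll distance described above may be sketched as a simple velocity-to-lines mapping. The function name, the momentum window, and the line height are assumptions for this sketch, not taken from any particular platform:

```python
def lines_to_scroll(swipe_velocity_px_per_s: float, line_height_px: int = 20) -> int:
    """Map a touch-flick velocity to a number of lines to scroll.

    A slow flick scrolls the viewing area by a few lines; a fast flick
    scrolls it by tens of lines. The 0.5 s momentum window is an assumed
    tuning constant for this illustration.
    """
    decay_window_s = 0.5  # assumed window over which flick momentum decays
    distance_px = swipe_velocity_px_per_s * decay_window_s
    return max(1, int(distance_px // line_height_px))
```

Under these assumed constants, a slow flick of 100 px/s scrolls about 2 lines, while a fast flick of 2000 px/s scrolls about 50 lines, which illustrates why fast flicks render content unreadable during scrolling.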
Another option to scroll the contents of a viewing area of a touchscreen display is a scrollbar, in which a user can drag a scrollbar thumb along a scrollbar track to scroll display contents. Due to the limited viewing area size on the display, the scrollbar is typically thin, which can cause unintended touches on the screen when the user intends to touch the scrollbar. In some cases, the scrollbar may contain jump buttons to provide the function of jumping to the top or bottom of the content. However, this may not be helpful if the user is interested in information which is in the middle of the content, or is looking for specific content.
Slider user interface controls are typically associated with system parameters which change in value in response to swiping a finger along a track of the slider control between a first end corresponding to a minimum value and a second end corresponding to a maximum value of the system parameter. On touchscreen displays it is sometimes difficult to change the system parameters with accuracy using slider controls due to limited display real estate. Furthermore, slider controls may be accidentally actuated if there is an unintended swipe along the track thereof.
There is a need for a system and method for scrolling screen contents which address at least some of the aforementioned problems. There is also a need for a system and a method for controlling slider controls on touchscreen displays which address at least some of the aforementioned problems.
The present disclosure generally relates to the use of finger gestures on devices with at least two touch sensitive surfaces to allow improved scrolling of display contents and accurate manipulation of slider user interface controls.
According to an aspect of the present disclosure, there is provided a method for controlling an electronic device. The method comprises recognizing a two finger input gesture on at least two touch sensitive surfaces of the electronic device, including detecting a first input gesture, including a first direction, on a first touch sensitive surface; and detecting a second input gesture, including at least one of a second direction or a second location, on a second touch sensitive surface. The first input gesture and the second input gesture happen in a same time period. The method also comprises altering a content rendered on a display in response to recognizing the two finger input gesture.
The method enables efficient altering of content rendered on a display of an electronic device using finger gestures on two touch sensitive surfaces. This may, for example, allow altering display content with fewer user interactions, thereby reducing possible wear or damage to the electronic device and possibly reducing battery power consumption. User experience may also be enhanced because unintended actions are less likely to be triggered, since simultaneous gestures on two touch sensitive surfaces are unlikely to occur accidentally.
In some examples of the present disclosure, the first input gesture includes a first swipe gesture in the first direction, and the second input gesture includes a second swipe gesture in the second direction at the second location.
In some examples of the present disclosure, the first touch sensitive surface may comprise a touchscreen display which is the display on which the content is rendered.
In some examples of the present disclosure, altering the content rendered on the display may comprise scrolling a viewing area of the display in the first direction of the first swipe gesture when the second direction of the second swipe gesture is consistent with the first direction of the first swipe gesture.
In some examples of the present disclosure, altering the content rendered on the display may comprise scrolling a viewing area of the display in the first direction of the first swipe gesture with a scrolling speed determined by the second location of the second swipe gesture when the second direction of the second swipe gesture is opposite to the first direction of the first swipe gesture.
In some examples of the present disclosure, altering the content rendered on the display may comprise scrolling a viewing area of the display in the first direction with a scrolling speed associated with the first touch sensitive surface.
In some examples of the present disclosure, altering the content may comprise rotating an object on the display in response to the first swipe gesture and the second swipe gesture when the first direction is opposite to the second direction.
In some examples of the present disclosure, the first input gesture includes a first swipe gesture in the first direction, and the second input gesture includes a static touch at a location.
In some examples of the present disclosure, altering the content rendered on the display may comprise manipulating a user interface control in response to the first swipe gesture and the location of the static touch gesture.
In some examples of the present disclosure, altering the content rendered on the display may comprise automatically scrolling the content rendered on the display at a preconfigured magnitude when the second direction of the second swipe gesture is opposite to the first direction of the first swipe gesture.
According to another aspect of the present disclosure, there is provided an electronic device comprising a processor and a non-transitory memory coupled to the processor and storing instructions. The instructions when executed by the processor configure the processor to recognize a two finger input gesture on at least two touch sensitive surfaces of the electronic device, including detecting a first input gesture, including a first direction, on a first touch sensitive surface and detecting a second input gesture, including at least one of a second direction or a second location, on a second touch sensitive surface. The first input gesture and the second input gesture happen in a same time period. The instructions further configure the processor to alter a content rendered on a display in response to recognizing the two finger input gesture.
In some examples of the present disclosure, the first touch sensitive surface may comprise a touchscreen display which is the display on which the content is rendered.
In some examples of the present disclosure, the first input gesture includes a first swipe gesture in the first direction, and the second input gesture includes a second swipe gesture in the second direction at the second location.
In some examples of the present disclosure, to alter the content rendered on the display, the instructions may further configure the processor to scroll a viewing area of the display in the first direction when the second direction of the second swipe gesture is consistent with the first direction of the first swipe gesture.
In some examples of the present disclosure, to alter the content rendered on the display, the instructions may further configure the processor to scroll a viewing area of the display in the first direction with a scrolling speed determined by the second location of the second swipe gesture when the second direction of the second swipe gesture is opposite to the first direction of the first swipe gesture.
In some examples of the present disclosure, to alter the content rendered on the display, the instructions may further configure the processor to rotate an object on the display in response to the first swipe gesture and the second swipe gesture.
In some examples of the present disclosure, the first input gesture includes a first swipe gesture in the first direction; and the second input gesture includes a static touch gesture at a location.
In some examples of the present disclosure, to alter the content rendered on the display, the instructions may further configure the processor to scroll a viewing area of the display in the first direction with a scrolling speed associated with the first touch sensitive surface.
In some examples of the present disclosure, to alter the content rendered on the display, the instructions may further configure the processor to manipulate a user interface control in response to the first swipe gesture and the location of the static touch gesture.
According to yet another aspect of the present disclosure, there is provided a non-transitory computer readable medium storing instructions which, when executed by a processor of an electronic device, configure the electronic device to recognize a two finger input gesture on at least two touch sensitive surfaces of the electronic device, including detecting a first input gesture, including a first direction, on a first touch sensitive surface and detecting a second input gesture, including at least one of a second direction or a second location, on a second touch sensitive surface. The first input gesture and the second input gesture happen in a same time period. The instructions further configure the processor to alter a content rendered on a display in response to recognizing the two finger input gesture.
The presented methods and devices provide for efficient scrolling of display content without the use of touch flicks, thus reducing pressure on the touch sensing system and reducing wear on the device. The devices and methods also reduce the need to display a dedicated scrollbar, and thus the need for a larger display to display the same content. Using a smaller display reduces battery power consumption and lowers overall electronic device cost. Using synchronous gestures on two touch sensitive surfaces prevents accidental activation of actions, which may otherwise require additional corrective actions to cancel. The gestures described are simple, comprising swipes which are easy to recognize without complex computations, thus reducing processing resources. Multiple slider controls may be controlled without the need to display all of them simultaneously. This conserves display area and reduces the need to make the display larger, thus reducing cost and power consumption.
Reference will now be made, by way of example, to the accompanying drawings which show example embodiments of the present application, and in which:
Example embodiments are described herein that may, in some applications, mitigate the shortcomings of the existing methods. The embodiments presented herein utilize two finger input gestures which involve simultaneous contact between the user's fingers and two touch sensitive surfaces. In some embodiments, the first touch sensitive surface and the second touch sensitive surface are touchscreen displays. In other embodiments, the first touch sensitive surface is a touchscreen display and the second touch sensitive surface is a touchpad. The two finger input gesture may in some cases comprise two swipes and in other cases one swipe and a static touch. The two finger input gesture may be used to scroll content on a touchscreen display, or to manipulate a slider user interface control.
In this disclosure the term “electronic device” refers to an electronic device having computing capabilities. Examples of electronic devices include but are not limited to: personal computers, laptop computers, tablet computers (“tablets”), smartphones, surface computers, augmented reality gear, automated teller machines (ATM)s, point of sale (POS) terminals, and the like.
In this disclosure, the term “display” refers to a hardware component of an electronic device that has a function of displaying graphical images, text, and video content thereon. Non-limiting examples of displays include liquid crystal displays (LCDs), light-emitting diode (LED) displays, and plasma displays.
In this disclosure, a “screen” refers to the outer user-facing layer of a touchscreen display.
In this disclosure, the term “touchscreen display” refers to a combination of a display together with a touch sensing system that is capable of acting as an input device by receiving touch input. Non-limiting examples of touchscreen displays are: capacitive touchscreens, resistive touchscreens, infrared touchscreens, and surface acoustic wave touchscreens.
In this disclosure, the term “touch sensitive surface” refers to one of: a touchscreen display, a touchpad, or any other peripheral which detects touch by a finger or a touch input tool.
In this disclosure, the term “touch sensitive surface driver” refers to one of: a touchscreen driver and a touchpad driver.
In this disclosure, the term “viewing area” or “view” refers to a region of a display, which may for example be rectangular in shape, which is used to display information and receive touch input.
In this disclosure, the term “main viewing area” or “main view” refers to the single viewing area that covers all or substantially all (e.g., greater than 95%) of the viewable area of an entire display area of a touchscreen display.
In this disclosure, the term “application” refers to a software program comprising a set of instructions that can be executed by a processing device of an electronic device.
In this disclosure, the terms “top”, “bottom”, “right”, “left”, “horizontal” and “vertical” when used in the context of viewing areas of a display are relative to the orientation of the display when content currently displayed on the display is presented in an orientation that the content is intended to be viewed in.
Those skilled in the art will understand that the user can hold the electronic device in the right or left hand and perform two finger input gestures with any two fingers, for the user's convenience. For example, the user may hold the electronic device in the right hand and scroll the display with the left thumb on the front touchscreen area and another finger of the left hand (for example, the left index finger or the left middle finger), or even the right index or middle finger, on the back touchscreen area, depending on the user's preference; there is no limitation in this regard. The example embodiments of this disclosure describe the user holding the electronic device in the left hand and performing the two finger input gesture with the left thumb and the left index finger.
The front touchscreen display 140 comprises a front display 142, on which content is rendered, coupled with a front touch sensing system 144 which senses touch on the screen of the front touchscreen display 140.
The front touchscreen display 140 has a front touchscreen display main viewing area 146 for displaying content. Often, the content to be displayed does not fit in its entirety in the front touchscreen display main viewing area 146.
One method of scrolling up or down display content such as the list of elements 200, includes using at least one touch flick 34. With reference to
Another method of scrolling content rendered on a display involves the use of a scrollbar, such as the scrollbar 50 shown in
In one example embodiment, shown in
In one embodiment, the electronic device has a touch sensitive surface on a back region thereof. For example, as shown in
Unintended scrolling of the front display contents, such as the list of elements 200, may take place when the user accidentally swipes or touches the front touchscreen display gesture response area 148. In order to prevent the unintended scrolling, or any other unintended action for that matter, the electronic device 10 is configured to only respond to gestures in which two or more gesture response areas are engaged by the user. In some example embodiments, a pinch gesture 31, in which the user engages both the front touchscreen display gesture response area 148 and the back touchscreen display gesture response area 158, is used to trigger action on the electronic device 10.
In one example embodiment, shown in
During the movement of the right thumb 30A and the right index finger 32A, the front touch sensing system 144 and the back touch sensing system 154 detect the touch of the fingers along the respective touchscreen display gesture response areas (148, 158). The touches detected by the touch sensing systems (144, 154) are interpreted by the touchscreen driver 114, which produces touch events. Each touch event contains a number of input parameters, such as the spatial location of the touch on the respective touchscreen display gesture response area (148, 158) and the time stamp, and may optionally contain other information such as the pressure force magnitude of the fingers on the screen during the same direction pinch slide gesture 40. The touch events produced by the touchscreen driver 114 are provided to the user interface (UI) module 316. The UI module 316 is configured to recognize any one of a plurality of predetermined gestures from the touch events. The UI module 316 can track a user's finger movements across one or more of the displays of the electronic device 10. In some embodiments, the detected input gesture may be determined based on the input parameters of the plurality of touch events which comprise the detected input gesture.
Through the display screen, at any given point in time (e.g. each millisecond or any other suitable time unit), the UI module 316 tracks and stores at least a timestamp and a location (e.g. pixel position) of each detected touch event provided by the touchscreen driver 114. Based on at least the timestamp and location of each detected touch event over a given period, the UI module 316 can determine a type of the detected input gesture. For example, if a plurality of touch events is detected for only one second and the events center around the same location on the display screen, the input gesture is likely a tap gesture. As another example, if a plurality of detected touch events lingers over two seconds and appears to move across a small distance on the display screen, the input gesture is likely a touch flick. If a plurality of detected touch events lingers over more seconds and appears to move across a larger distance on the display screen, the input gesture is likely a swipe gesture. If a plurality of detected touch events lingers over more seconds and appears to remain in substantially the same location, the gesture is likely a static touch gesture.
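For illustration only, the duration-and-distance heuristic described above may be sketched as follows. The `TouchEvent` structure, the function name, and the numeric thresholds are illustrative assumptions for this sketch, not part of the disclosed embodiments:

```python
import math
from dataclasses import dataclass


@dataclass
class TouchEvent:
    """A single touch event: timestamp (seconds) and pixel location."""
    t: float
    x: float
    y: float


def classify_gesture(events: list[TouchEvent]) -> str:
    """Classify a stream of touch events for one finger as a tap,
    touch flick, swipe, or static touch, based on the overall duration
    and the distance moved. Thresholds are assumed values.
    """
    duration = events[-1].t - events[0].t
    distance = math.hypot(events[-1].x - events[0].x,
                          events[-1].y - events[0].y)
    if distance < 10:  # events center around substantially the same location
        return "tap" if duration <= 1.0 else "static touch"
    # Moving events: a small distance suggests a flick, a larger one a swipe.
    return "flick" if distance < 100 else "swipe"
```

For example, a short stationary contact classifies as a tap, a long stationary contact as a static touch, and a long contact traversing a large distance as a swipe.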
The UI module 316 may determine that a user performed a swipe gesture by detecting a plurality of touch events that indicate that a finger has moved across a touch sensitive surface without losing contact with that surface. The UI module 316 may determine that a user performed a pinch or zoom gesture by detecting two separate swipe gestures that have occurred simultaneously or concurrently and have been dragged toward each other (pinch close slide gesture) or away from each other (pinch open slide gesture). The UI module 316 may determine that a user performed a rotation gesture by detecting two swipes that have occurred simultaneously, forming either a pinch open slide gesture or a pinch close slide gesture.
For example, a plurality of touch events may form a single gesture. The UI module 316 compares the formed gesture against the plurality of predetermined gestures. Accordingly, the UI module 316 may recognize a swiping gesture on the front touchscreen display gesture response area 148 in the upward direction 39A and recognize a swiping gesture on the back touchscreen display gesture response area 158, also in the upward direction 39A. The UI module 316 recognizes the two swiping gestures that happened in a same time period, on the front and back touchscreen display gesture response areas (148, 158), as a same direction pinch slide gesture 40. In response to recognizing the same direction pinch slide gesture 40, the OS 310 or any one of the applications 350, as the case may be, can perform an action. For example, an application 350 may scroll up the contents of the front touchscreen display main viewing area 146 in response to receiving an indication from the UI module 316 that a same direction pinch slide gesture 40 in the upward direction has been performed. Conversely, if the UI module 316 recognizes that the same direction pinch slide gesture was in the scroll down direction, then the contents of the front touchscreen display main viewing area 146 are scrolled down in response to the gesture. Advantageously, the contents of the front touchscreen display main viewing area 146 are scrolled controllably by the user while the contents are displayed. Averting or reducing the need for touch flicks reduces the wear on the front touch sensing system 144, which would otherwise need to handle many flicks, each involving touching and applying some force to the front touch sensing system 144. Additionally, no display area is consumed by a scrollbar, thus reducing the need to use a larger front touchscreen display 140 to display the same content. A larger front touchscreen display 140 not only costs more, but is bulkier and consumes more battery power when in use.
Furthermore, the possibility of accidental scrolling is reduced since no scrolling action is carried out unless the user is performing a swiping gesture on both the front and back touchscreen display gesture response areas (148, 158) simultaneously. Otherwise, accidental scrolling would require the user to perform unnecessary scrolls to adjust the screen contents back to their original state.
In some instances, it may be desired to jump to the top or bottom of the content to be displayed. In other instances, it may be desired to scroll the screen contents at a much higher rate. And in some other instances, it may be desired to enable auto-scrolling. Different gestures, other than the same direction pinch slide gesture 40, may be detected by the UI module 316 in which two fingers of the user simultaneously engage the front and back touchscreen display gesture response areas (148, 158). As an example, opposite direction pinch slide gestures are contemplated with reference to
Considering a case where the electronic device 10 was originally held in the user's left hand 35B with the right hand 35A in a pinch gesture 31, as described earlier with respect to
With reference to
In some example embodiments, the opposite direction pinch open slide gesture 42 may be recognized by the UI module 316 and used to scroll the contents of the front touchscreen display 140 upward at a faster rate than the same direction pinch slide gesture 40. In one example, the UI module 316 recognizes the swipe of the right thumb 30A on the screen of the front touchscreen display 140 in the upward direction 39A. In response, the UI module 316 may cause upward scrolling to be performed. Changing the position of the right index finger 32A may change the speed of scrolling. For example, the position of the right index finger 32A at or near the middle of the back touchscreen display gesture response area 158 may indicate that the scrolling is to be done at normal speed. As the right index finger 32A is swiped in the downward direction 39B and is at a position closer to the lower end of the back touchscreen display gesture response area 158, an application 350 may interpret that position of the right index finger 32A to indicate that scrolling is to be done at a slower speed, such as half the speed of normal scrolling. Similarly, the opposite direction pinch close slide gesture 44 may be used to scroll the contents of the front touchscreen display 140 down at a faster rate than the same direction pinch slide gesture 40. In this case, the right thumb 30A starts in a position near the top of the front touchscreen display gesture response area 148 and the right index finger 32A starts in a position near the bottom of the back touchscreen display gesture response area 158. The right thumb 30A is swiped along the front touchscreen display gesture response area 148 in the downward direction 39B towards the middle of the front touchscreen display gesture response area 148.
Concurrently with the swiping of the right thumb 30A, the right index finger 32A is swiped along the back touchscreen display gesture response area 158 in the upward direction 39A towards the middle of the back touchscreen display gesture response area 158. The position of the right index finger 32A indicates, to the UI module 316, the scrolling speed to be used when scrolling down. For example, the initial position of the right index finger 32A near the bottom of the back touchscreen display gesture response area 158 may indicate that scrolling down is to be done at slow speed. As the right index finger 32A is swiped towards the middle of the back touchscreen display gesture response area 158, the scrolling down speed is increased. The scrolling down speed is highest when the right index finger 32A is at or near the top of the back touchscreen display gesture response area 158. Being able to dynamically control the scrolling speed between a slow, a normal, and a fast scrolling speed allows a user to easily locate specific content. For example, the user may position the right index finger 32A in a position on the back touchscreen display gesture response area 158 which causes faster scrolling, and swipe the right thumb 30A a few times. Following that, the right index finger 32A may be moved to a lower position which corresponds to a slower scrolling speed for locating the specific content.
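For illustration only, the mapping from the second finger's position along the back gesture response area to a scrolling speed may be sketched as follows. The normalized-coordinate convention, the region boundaries, and the speed multipliers are assumptions for this sketch:

```python
def scroll_speed_multiplier(finger_pos: float) -> float:
    """Map the second finger's normalized position along the back
    gesture response area (0.0 = bottom end, 1.0 = top end) to a
    scrolling speed factor.

    Near the middle, scrolling runs at normal speed; toward the lower
    end it slows to half speed; toward the upper end it doubles.
    All boundaries and factors are illustrative assumptions.
    """
    finger_pos = min(1.0, max(0.0, finger_pos))  # clamp to the response area
    if finger_pos < 0.25:
        return 0.5   # slow scrolling near the lower end
    if finger_pos > 0.75:
        return 2.0   # fast scrolling near the upper end
    return 1.0       # normal speed around the middle
```

This reflects the workflow described above: the user parks the index finger high on the response area to scroll quickly, then moves it lower to slow down and locate specific content.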
In another example, an application 350 may be configured to scroll the contents of the front touchscreen display main viewing area 146, at a predetermined scrolling speed and by a predetermined amount, in the upward direction in response to receiving a notification that an opposite direction pinch open slide gesture 42 has been detected by the UI module 316. In this case, the UI module 316 may detect a pinch gesture 31 followed by an opposite direction pinch open slide gesture 42. In response, the application 350 scrolls the contents of the front touchscreen display main viewing area 146 by a predetermined amount. If the UI module 316 detects a release of the right thumb 30A and/or the right index finger 32A from the front touchscreen display gesture response area 148 or back touchscreen display gesture response area 158, followed by a pinch gesture 31 and another opposite direction pinch open slide gesture 42, the application 350 scrolls the contents of the front touchscreen display main viewing area 146 by the predetermined amount. Conversely, the application 350 scrolls down the display contents by a predetermined amount in response to detecting a pinch gesture 31 followed by an opposite direction pinch close slide gesture 44.
In yet another example embodiment, the UI module 316 recognizes an opposite direction pinch open slide gesture 42 and in response causes the contents of the front touchscreen display 140 to scroll up until the top of the content to be displayed is rendered on the front touchscreen display. For example with reference to
In a further example embodiment, the recognition of an opposite direction pinch open slide gesture 42 by the UI module 316 causes it to trigger auto-scrolling. In this case when the right thumb 30A is swiped up until it is at or near the top of the front touchscreen display gesture response area 148 and the right index finger 32A is swiped down until it is at or near the bottom of the back touchscreen display gesture response area 158, then the front touchscreen display 140 starts scrolling up line-by-line at a preconfigured magnitude without further intervention by the user. The automatic scrolling may continue until the right thumb 30A and right index finger 32A are swiped in the opposite directions until they are generally aligned in a pinch gesture 31. Conversely, the opposite direction pinch close slide gesture 44 can be used to trigger automatic scrolling down of the front touchscreen display 140 line-by-line at the preconfigured rate.
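For illustration only, the auto-scrolling trigger and release conditions described above may be sketched as follows. The normalized positions, the edge and alignment thresholds, and the state labels are assumptions for this sketch:

```python
def auto_scroll_state(thumb_pos: float, index_pos: float,
                      edge: float = 0.05, align: float = 0.1) -> str:
    """Decide the auto-scrolling state from normalized finger positions
    (0.0 = bottom, 1.0 = top of each gesture response area).

    The thumb near the top of the front area with the index finger near
    the bottom of the back area (a full pinch open slide) starts upward
    auto-scrolling; the mirrored full pinch close slide starts downward
    auto-scrolling; roughly aligned fingers (a pinch gesture) stop it.
    """
    if thumb_pos >= 1.0 - edge and index_pos <= edge:
        return "auto_scroll_up"
    if thumb_pos <= edge and index_pos >= 1.0 - edge:
        return "auto_scroll_down"
    if abs(thumb_pos - index_pos) <= align:
        return "stopped"  # fingers generally aligned in a pinch gesture
    return "idle"
```

Once an auto-scroll state is entered, scrolling would proceed line-by-line at the preconfigured magnitude without further intervention until the fingers are realigned.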
In some embodiments, the right thumb 30A and the right index finger 32A are swiped independently and at different speeds, and may each be used to manipulate different functions in an application 350. In some example embodiments, one finger is swiped on one touch sensitive surface while the other finger merely touches another touch sensitive surface, i.e. performs a static touch. For example, in a video playback application, swipes by the right thumb 30A on the front touchscreen display gesture response area 148 may be recognized by the UI module 316 and used for adjusting a playback slider control. Adjusting a playback slider control can cause forwarding or rewinding video playback by a small time increment, such as seconds. In this case, the UI module 316 recognizes a swipe of the right thumb 30A in the upward direction 39A with the right index finger 32A remaining (touching) on the back touchscreen display gesture response area 158, and advances the video by the granularity of 1 second or 10 seconds. In other words, the right index finger 32A is performing a static touch. Similarly, swiping the right thumb 30A in the downward direction rewinds the video by the granularity of 1 second or 10 seconds. Conversely, swipes by the right index finger 32A on the back touchscreen display gesture response area 158, with the right thumb 30A performing a static touch on the front touchscreen display gesture response area 148, may cause forwarding or rewinding video playback by a large time increment, such as minutes. In this example, a swipe by the right index finger 32A in the upward direction 39A advances the video playback by 1 minute or 5 minutes. Similarly, swiping the right index finger 32A in the downward direction 39B may rewind the video playback by 1 minute or 5 minutes. Advantageously, both coarse and fine adjustment of a control are provided.
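For illustration only, the coarse/fine playback adjustment described above may be sketched as follows. The function name and the default step sizes (10 seconds fine, 60 seconds coarse, two of the granularities mentioned above) are assumptions for this sketch:

```python
def seek_delta_seconds(swiped_surface: str, direction: str,
                       fine_step: int = 10, coarse_step: int = 60) -> int:
    """Return the playback seek offset, in seconds, for a swipe on one
    touch sensitive surface while the other finger holds a static touch
    on the opposite surface.

    A swipe on the front surface makes a fine adjustment (seconds);
    a swipe on the back surface makes a coarse adjustment (minutes).
    An upward swipe advances playback; a downward swipe rewinds it.
    """
    step = fine_step if swiped_surface == "front" else coarse_step
    return step if direction == "up" else -step
```

For example, an upward thumb swipe on the front surface advances playback by the fine step, while a downward index-finger swipe on the back surface rewinds by the coarse step.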
For a control which is associated with system parameters, this permits both coarse and fine adjustment of the system parameter associated with the control as described below.
The opposite direction pinch (open or close) slide gestures (42, 44) may be recognized by the UI module 316 and used to rotate an object on the display. In some examples, the pinch open or close slide gestures (42, 44) may be recognized by the UI module 316 and used to manipulate user interface controls which control system parameters, such as any rotating dial control, including a volume control, a brightness control, and the like. As an example,
While the above-described electronic device 10 has a front touchscreen display 140 and a back touchscreen display 150, the above-described methods may be performed on an electronic device having a front touchscreen display 140 and a back touchpad 136. For example, with reference to
There has been increasing research and commercial interest in the development of electronic devices, such as mobile phones, that have a flexible display screen which can be folded or formed into different form factors (hereinafter referred to as foldable electronic devices). A foldable device can have two or three touchscreen displays. With reference to
The above-described gestures, used with electronic devices 10 and 10′ may also be used with foldable electronic device 20. For example, with reference to
In another example embodiment, shown in
In yet another embodiment, shown in
While specific gestures were shown in the figures, it would be apparent to persons skilled in the art that other variations of such gestures are possible. For example, gestures involving the fingers touching all three touchscreen displays 140, 150 and 160 are possible. In other examples, gestures involving touching only the edge touchscreen display 160 and the back touchscreen display 150 are also contemplated.
In some examples of the embodiments, a two finger input gesture, recognized by the UI module 316, may be used to manipulate a user interface control associated with a system configuration parameter. For example, manipulating a user interface control may allow the adjustment of an audio volume, a display brightness, a display contrast, or any other system configuration parameter. The user interface control may be a linear slider control or a rotary slider control. In the example shown in
The rotary slider control 80 has a first end 84, a second end 86, and a track 82 extending between the first end 84 and the second end 86. The rotary slider control 80 is associated with a system parameter. The rotary slider control 80 may receive a finger touch on the track 82, and the position of the finger on the track 82 determines a corresponding value for the system parameter. For example, the system parameter is at its minimum value when the UI module 316 recognizes that a finger is touching the track at or near the first end 84. Conversely, the system parameter is at its maximum value when a finger is recognized to touch the track 82 at or near the second end 86. While the thumb 30 is on the soft button 70B, an index finger 32 may be placed on and moved (swiped) along the track 82 to adjust the system parameter associated with the rotary slider control 80.
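The mapping from a touch position on the track 82 to a value of the associated system parameter can be sketched as a simple linear interpolation. This is a hedged illustration, not the disclosed implementation; the normalized track coordinate and function name are assumptions.

```python
# Illustrative sketch (assumed parameterization): the position of a finger
# along the track 82 is normalized to t in [0, 1], where t = 0 corresponds
# to the first end 84 (minimum value of the system parameter) and t = 1
# corresponds to the second end 86 (maximum value).
def track_to_value(t, param_min, param_max):
    """Map a normalized track position to a system parameter value."""
    t = max(0.0, min(1.0, t))  # clamp touches at or past either end
    return param_min + t * (param_max - param_min)
```

For example, a finger halfway along the track of a volume control ranging from 0 to 100 would set the volume to 50.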
Advantageously, the embodiments of
The processing unit 100 may include one or more processors 102, such as a microprocessor, a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), dedicated logic circuitry, or combinations thereof. The processing unit 100 may also include one or more input/output (I/O) interfaces 104, which may enable interfacing with one or more appropriate input devices 110 and/or output devices 120.
The input devices 110 may include a front touch sensing system 144 associated with the front touchscreen display 140 and a back touch sensing system 154 associated with the back touchscreen display 150. Optionally, for some devices, such as the foldable electronic device 20, the input devices 110 may also include an edge touch sensing system 164 associated with an edge touchscreen display 160. For other devices, such as electronic device 10′, the input devices 110 include a touchpad sensing system 138 associated with the touchpad 136.
The output devices 120 may include a front display 142 which is part of a front touchscreen display 140. In some embodiments the output devices 120 may include a back display 152 which is part of a back touchscreen display 150. In some embodiments, the output devices 120 include an edge display 162 which is part of an edge touchscreen display 160.
The processing unit 100 may include one or more network interfaces 106 for wired or wireless communication with a network (e.g., an intranet, the Internet, a peer-to-peer (P2P) network, a wide area network (WAN), and/or a local area network (LAN)) or another node. The network interfaces 106 may include wired links (e.g., Ethernet cable) and/or wireless links (e.g., one or more antennas) for intra-network and/or inter-network communications.
The processing unit 100 may also include one or more storage unit(s) 178, which may include a mass storage unit such as a solid state drive, a hard disk drive, a magnetic disk drive and/or an optical disk drive. The processing unit 100 may include one or more memories 180, which may include volatile memories (e.g., random access memory (RAM)) and non-volatile or non-transitory memories (e.g., a flash memory, magnetic storage, and/or a read-only memory (ROM)). The non-transitory memory(ies) of memories 180 store programs 300 that include software instructions for execution by the processor 102, such as to carry out examples described in the present disclosure. In example embodiments, the programs 300 include software instructions for implementing an operating system (OS) 310 and applications 350.
The OS 310 can include a kernel 320 for task switching, a touchscreen driver 314 coupled with the touch sensing systems 144, 154 and 164 for generating touch events as discussed above, and a UI module 316 for recognizing gestures formed by the touch events. The OS 310 also includes a touchpad driver 317 for devices including a touchpad, a display driver 318 coupled with the displays 142, 152 and 162, and other device drivers 312 for various peripherals. The memory 180 also stores one or more applications 350 which render content on any one of the displays 142, 152 and 162 via the display driver 318.
In some examples, memory 180 may include software instructions of the processing unit 100 for execution by the processor 102 to carry out the display content modifications described in this disclosure. In some other examples, one or more data sets and/or modules may be provided by an external memory (e.g., an external drive in wired or wireless communication with the processing unit 100) or may be provided by a transitory or non-transitory computer-readable medium. Examples of non-transitory computer readable media include a RAM, a ROM, an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a flash memory, a CD-ROM, or other portable memory storage.
There may be a bus 108 providing communication among components of the processing unit 100, including the processor(s) 102, I/O interface(s) 104, network interface(s) 106, storage unit(s) 178 and/or memory(ies) 180. The bus 108 may be any suitable bus architecture including, for example, a memory bus, a peripheral bus or a video bus.
The input device(s) 110 may include other components which are not shown, such as a keyboard, a mouse, a microphone, an accelerometer, and/or a keypad. The output device(s) 120 may include other components which are not shown, such as an LED indicator and a tactile generator.
The plurality of touch events provided by the touchscreen driver 314 (or the touchpad driver 317 in the case of the touch sensitive surface being a touchpad) contain both location information and a time stamp. A swipe is comprised of a plurality of touch events which differ in location and time. Accordingly, the UI module 316, which receives the touch events, can compute a velocity for each of a first swipe and a second swipe. For example, the velocity of a first swipe detected on a first touch sensitive surface may be denoted V1, and the velocity of a second swipe detected on a second touch sensitive surface may be denoted V2. The UI module 316 then classifies the gestures as follows:
When V1=V2±delta, i.e. the two velocities are equal to within a tolerance delta, then the second swipe is in the same direction as the first swipe, and the gesture is a same direction pinch slide gesture.
When V1=−V2±delta, then the first swipe is in an opposite direction to the second swipe, and the gesture is an opposite direction pinch slide gesture.
When V1=0 and V2>0, then the first swipe is a static touch while the second swipe is used to manipulate a slider user interface control.
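The classification rule above can be sketched as follows. This is an illustrative sketch only; the function name, the sign convention for velocities, and the value of the tolerance delta are assumptions, not details taken from the disclosure.

```python
# Illustrative sketch of the velocity-based gesture classification.
# v1 and v2 are the signed swipe velocities on the first and second
# touch sensitive surfaces; DELTA is an assumed tolerance for treating
# two measured velocities as equal.
DELTA = 0.05  # assumed tolerance, in arbitrary velocity units

def classify_gesture(v1, v2, delta=DELTA):
    """Classify a two-surface gesture from the swipe velocities V1 and V2."""
    if v1 == 0 and v2 > 0:
        # First finger holds a static touch; second swipe drives a slider.
        return 'static touch + slider manipulation'
    if abs(v1 - v2) <= delta:
        # V1 = V2 (within delta): same direction pinch slide gesture.
        return 'same direction pinch slide'
    if abs(v1 + v2) <= delta:
        # V1 = -V2 (within delta): opposite direction pinch slide gesture.
        return 'opposite direction pinch slide'
    return 'unrecognized'
```

In practice the velocities would be computed from the locations and time stamps of successive touch events, and a recognizer would likely also check that both swipes overlap in time.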
Certain adaptations and modifications of the described embodiments can be made. Therefore, the above discussed embodiments are considered to be illustrative and not restrictive.