The present application relates to touch-based user interfaces, and more particularly to providing visual feedback for cross-device touch gestures.
Gesture man-machine interfaces involve detection of defined motions made by a user. The various gestures each have an associated user-interface semantic.
Co-pending, co-owned U.S. patent application Ser. No. 15/175,814, the entire contents of which are hereby incorporated by reference, discloses interpreting cross-device gestures and providing access to resources. The gestures may be used to, for example, pair a device with an adjacent device, or to request specific resources of an adjacent device.
Visual feedback may be provided during gesture input to notify a user that a gesture is being recognized. For example, the above noted application discloses embodiments in which a button image, displayed on a touchscreen, tracks the gesture across devices.
Further methods of providing cross-device visual feedback are desirable.
The present application discloses novel ways of providing visual feedback during a cross-device gesture. These may be used independently of, or in addition to, other methods of providing feedback.
In an aspect, there is provided a computer implemented method comprising, at a first electronic device having a touch sensitive display, the touch sensitive display displaying a user interface: detecting that a second electronic device is in a defined position proximate the first electronic device; detecting a touch gesture comprising a swipe tracing a path across the touch sensitive display between a first position proximate the second electronic device and a second position away from the second electronic device; and, in response to the detecting of the touch gesture, updating regions of the displayed user interface to include visual attributes of a user interface of the second electronic device.
Conveniently, in this way a user may be provided with visual feedback during gesture completion.
In an aspect, there is provided a non-transitory computer readable medium storing instructions that, when executed by a processor of a first electronic device having a touch sensitive display, cause the device to: detect that a second electronic device is in a defined position proximate the first electronic device; detect a touch gesture comprising a swipe tracing a path across the touch sensitive display between a first position proximate the second electronic device and a second position away from the second electronic device; and, during the detection of the touch gesture, update regions of a user interface displayed on the touch sensitive display to include visual attributes of a user interface of the second electronic device.
In another aspect, there is provided a first electronic device comprising: a touch sensitive display; a processor in communication with the touch sensitive display; and a non-transitory computer-readable medium coupled to the processor and storing instructions that, when executed by the processor, cause the device to: detect that a second electronic device is in a defined position proximate the first electronic device; detect a touch gesture comprising a swipe tracing a path across the touch sensitive display between a first position proximate the second electronic device and a second position away from the second electronic device; and, in response to the detection of the touch gesture, update regions of a user interface displayed on the touch sensitive display to include visual attributes of a user interface of the second electronic device.
Embodiments are described in detail below, with reference to the following drawings.
As illustrated, electronic device 12 includes a sensing surface in the form of a touch screen display 14 and includes magnetic connectors 20 for mechanically interconnecting one or more proximate devices.
Electronic device 12 is illustrated as a smartphone, however this is by no means limiting. Instead, as will become apparent, electronic device 12 may be any suitable computing device such as, for example, a smartphone, a tablet, a smart appliance, a peripheral device, etc.
Touch screen display 14 may be, for example, a capacitive touch display, a resistive touch display, etc. Touch screen display 14 may include a display element and a touch sensing element integrated as a single component. Alternatively, touch screen display 14 may include suitably arranged separate display and touch components. Touch screen display 14 may be adapted for sensing a single touch at once, or alternatively, multiple touches simultaneously. Touch screen display 14 may sense touch by, for example, a finger, a stylus, or the like.
As illustrated, magnetic connectors 20 of electronic device 12 permit electronic device 12 to be mechanically coupled to other suitable devices. An example of a possible magnetic connector is described in International Patent Application Publication No. WO 2015/070321 and U.S. Pat. No. 9,312,633. Each connector 20 offers a mechanical coupling function and, optionally, provides an electrical connection to a mechanically interconnected device. For example, a USB 2.0/3.0 bus may be established through the electrical connection.
Additionally or alternatively, electronic device 12 may have non-magnetic connectors for mechanical and/or electrical coupling with other suitable devices.
As illustrated, electronic device 12 includes one or more processor(s) 21, a memory 22, a touch screen I/O interface 23 and one or more I/O interfaces 24, all in communication over bus 25.
Processor(s) 21 may be one or more Intel x86, Intel x64, AMD x86-64, PowerPC, ARM processors or the like. In some embodiments, the one or more processor(s) 21 may be mobile processor(s) and/or may be optimized to minimize power consumption such as where, for example, electronic device 12 is battery operated.
Memory 22 may include random-access memory, read-only memory, or persistent storage memory such as a hard disk, a solid-state drive or the like. Read-only memory or persistent storage is a computer-readable medium. A computer-readable medium may be organized as a file system, controlled and administered by an operating system governing overall operation of the computing device.
Touch screen I/O interface 23 serves to interconnect the computing device with touch screen display 14. Touch screen I/O interface 23 is adapted to allow rendering of images on touch screen display 14 and is also operable to sense touch input thereat. Network controller 26 serves to interconnect the computing device with one or more computer networks such as, for example, a local area network (LAN) or the Internet.
One or more I/O interfaces 24 may serve to interconnect the computing device with peripheral devices, such as for example, keyboards, mice, and the like. Optionally, network controller 26 may be accessed via the one or more I/O interfaces 24.
Software including instructions is executed by processor(s) 21 from a computer-readable medium. For example, software may be loaded into random-access memory from persistent storage of memory 22 or from one or more devices via I/O interfaces 24 for execution by one or more processors 21. As another example, software may be loaded and executed by one or more processors 21 directly from read-only memory.
OS software 31 may be, for example, Android OS, Apple iOS, Microsoft Windows, UNIX, Linux, Mac OS X, or the like. OS software 31 allows software 32 to access one or more processors 21, memory 22, touch screen I/O interface 23, and one or more I/O interfaces 24 of electronic device 12.
OS software 31 may provide an application programming interface (API) to allow for the generation and display of graphics on touch screen display 14. Likewise, OS software 31 may generate messages, callbacks, interrupts or other indications to application software representative of sensed input at touch screen I/O interface 23. Gesture UI software 32 adapts electronic device 12, in combination with OS software 31, to provide a gesture enabled UI (user interface).
OS software 31 may provide a user interface (UI) environment that allows OS software 31 and application software (not shown) to present one or more visible user interfaces on touch screen display 14. OS software 31 may manage how applications are presented, how human-computer interactions are managed, and the like. To that end, OS software 31 may include a UI manager that controls the appearance and behaviour of the UI. The UI manager may include a window manager, components for image composition, and the like. UI behaviour and appearance may be controlled by one or more parameters stored within memory 22. The UI manager may be controlled through an interface presented to a user of the device, and/or an application programming interface (API), to vary UI parameters including, for example, one or more of the appearance of the visible user interface (e.g. screen background; font size; application appearance; icon appearance; etc.), notifications, and certain application and UI behaviours (e.g. screen behaviour). In some embodiments, the UI manager may comprise a theming engine that adapts the user interface, including the display thereof, to correspond to a defined visual appearance. For example, the theming engine may utilize one or more packages of UI behaviour configuration settings, visual elements, and the like that serve to group together elements associated with a particular visual appearance.
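By way of illustration only, such a theme package might be modelled as a simple collection of named assets and parameters. The following sketch is hypothetical; the type and field names are assumptions rather than an actual UI-manager API:

```kotlin
// Hypothetical sketch of a theme package grouping the visual elements
// and settings associated with a particular appearance.
data class ThemePackage(
    val name: String,                     // e.g. "device-10-default"
    val wallpaper: ByteArray,             // background image, e.g. PNG bytes
    val accentColour: Int,                // packed ARGB colour value
    val fontFamily: String,               // e.g. "Roboto"
    val iconPack: Map<String, ByteArray>  // icon name -> image bytes
)

// A theming engine, as described above, might then expose a call such as:
interface ThemingEngine {
    fun applyTheme(theme: ThemePackage)   // re-skins the visible UI
}
```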
Second electronic device 10 is similar or identical to electronic device 12, and includes hardware and software components as detailed above. As will be appreciated, second electronic device 10 need not be identical to electronic device 12, but may include functional components allowing device 12 to interact with device 10, as described herein. Second electronic device 10 includes magnetic connectors 20 and a touch screen display 16.
As illustrated, electronic device 12 and second electronic device 10 may be mechanically coupled by way of magnetic connectors 20. As noted above, magnetic connectors may optionally offer an electrical connection.
Optionally, electronic device 12 and second electronic device 10 may communicate wirelessly, in which case connectors 20 need not, but still may, establish an electrical connection. Wireless communication may be, for example, by way of an 802.11x connection or, additionally or alternatively, using another technology such as, for example, Zigbee™, Bluetooth™, TransferJet™, or the like.
Electronic device 12 and second electronic device 10 each display a respective user interface. In particular, electronic device 10 displays a visible user interface 100 on touch screen 16. Similarly, electronic device 12 displays visible user interface 120 on touch screen 14.
As illustrated, user interface 100 and user interface 120 differ in appearance.
In other embodiments, user interface 100 and user interface 120 may be similar or identical in appearance. Additionally or alternatively, user interface 100 and user interface 120 may offer similar or varied functionality.
In some embodiments, differences between user interface 100 and user interface 120 may result from or may be represented as different theme packages such as described above.
Once the two devices are connected by way of connectors 20, a user may initiate a cross-device request by inputting a gesture that begins on touch screen 14, proximate an edge of device 10, and extends across touch screen 14 towards the far edge of device 12. Of course, this is merely exemplary, and a cross-device gesture could equally be performed in the opposite direction (i.e. towards device 10). In some embodiments, a cross-device gesture may begin on touch screen 16 before extending across touch screen 14 (or vice versa).
In some embodiments, the direction of the gesture may itself at least partially identify the meaning of the gesture. For example, the direction of the cross-device gesture may dictate the relative assignment of devices as master and slave in a master-slave relationship. For example, a cross-device gesture from device 12 to device 10 may indicate a user intention for device 10 to gain control of resources of device 12, and allow device 10 to act as a master to device 12. For example, device 10 may act as a host to device 12 so that device 10 can utilize the display of device 12, allowing the display elements of touchscreens 14 and 16 to be "stitched" (i.e. treated as a single display) for use by applications executing at one or both of device 10 and device 12.
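A minimal sketch of how gesture direction might map to this master/slave assignment follows; the names and the pair ordering are illustrative assumptions only:

```kotlin
enum class Role { MASTER, SLAVE }

// Hypothetical sketch: per the example above, the device the gesture
// points toward becomes the master, and the device it points away from
// becomes the slave. Returns (this device's role, other device's role).
fun assignRoles(gestureTowardOtherDevice: Boolean): Pair<Role, Role> =
    if (gestureTowardOtherDevice)
        Role.SLAVE to Role.MASTER
    else
        Role.MASTER to Role.SLAVE
```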
As a cross-device gesture is detected, visual feedback may be provided as, for example, detailed herein. Conveniently, in this way, a user may understand that the gesture is being detected across devices (e.g. devices 12, 10). Additionally, the visual feedback may provide an indication as to the effect of the gesture. Conveniently, such visual feedback may yield a more intuitive user interface.
As will become apparent, visual feedback may include causing the user interface (or attributes of it) of one device to appear to propagate to the other device. For example, the look of user interface 100 may appear to propagate to touch screen 14 of device 12, replacing all or a portion of user interface 120. This may be accomplished by appropriate API calls to or configuration of, for example, a UI manager as described above.
The operation of exemplary gesture UI software 32 is described with reference to the accompanying flowchart.
At block S502, processor(s) 21 detect that another device is connected. For example, in some embodiments, processor(s) 21 may receive an indication, such as, for example, over bus 25, that another electronic device, such as second electronic device 10, is mechanically connected to electronic device 12 by way of connectors 20. Processor(s) 21 of device 12 may also determine the relative spatial relationship of the interconnected device. Methods of detecting a connection state may be utilized such as, for example, those disclosed in U.S. Provisional Patent Application No. 62/327,826.
Additionally, a communications link may be established between electronic device 12 and the connected device such as via magnetic connectors 20 as discussed above. Additionally or alternatively, a wireless communications link may be established such as is discussed above.
At block S504, the start of a swipe gesture is detected originating at a region of touch screen display 14 of electronic device 12 proximate the other connected electronic device 10.
A swipe gesture may be detected as a first detection of a touch caused by an implement, such as a finger, stylus, or the like, touching down on touch screen 14. The gesture may continue, without lifting the implement, as the implement is pulled across touch screen 14 in contact therewith, thereby tracing a path across touch screen 14 before being lifted off touch screen 14 at a second point; the lift-off may also be part of the gesture, and detected as such. Processor(s) 21 may receive indications of all of these events such as, for example, over bus 25 from touch screen I/O interface 23. In some embodiments, multiple indications may be received or generated corresponding to each event. For example, a gesture may result in a series of touch events, each of which may be detected, and the collection of touch events, where appropriate, may be interpreted as a gesture. Multiple touch events may result in the generation of a message indicative of the gesture.
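By way of a concrete, non-limiting sketch, on an Android-based embodiment such a series of touch events could be collected as follows; this is a simplified illustration, not the actual implementation of gesture UI software 32:

```kotlin
import android.graphics.PointF
import android.view.MotionEvent
import android.view.View

// Simplified sketch: accumulate the touch events of one swipe into a
// traced path, then hand the whole path off for interpretation.
class SwipeCollector(private val onSwipe: (List<PointF>) -> Unit) :
    View.OnTouchListener {

    private val path = mutableListOf<PointF>()

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {            // implement touches down
                path.clear()
                path.add(PointF(event.x, event.y))
            }
            MotionEvent.ACTION_MOVE ->              // implement pulled across
                path.add(PointF(event.x, event.y))
            MotionEvent.ACTION_UP -> {              // lift-off ends the gesture
                path.add(PointF(event.x, event.y))
                onSwipe(path.toList())              // interpret as a gesture
            }
        }
        return true
    }
}
```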
Alternatively, the swipe gesture may start with contact outside the touch sensitive area of the display such as, for example, on the screen of the other connected device (e.g. device 10) or on a bezel of electronic device 12. In such cases, processor(s) 21 of device 12 may not receive any indication of the implement touching down and may only receive an indication of the implement being pulled across touch screen 14 of device 12. Alternatively, an indication may be received that a touchdown occurred at an appropriate position along the extreme edge of touch screen 14 of device 12.
Additionally or alternatively, the swipe gesture may end with contact outside the touch sensitive area of the display such as, for example, on a bezel of electronic device 12. In such cases, processor(s) 21 may not receive any indication of the implement lifting off and may only receive an indication of the implement being pulled across touch screen 14. Alternatively, an indication may be received that a lift-off occurred at an appropriate position along the extreme edge of touch screen 14.
In some embodiments, electronic device 12 may receive an indication, such as, for example, by way of the communication link, of a first portion of the gesture detected by the other electronic device 10. The communication may, for example, be a message passed along any electrical or other communication interconnection between devices 10 and 12. Optionally, electronic device 12 may perform steps to ensure that the portion of the gesture performed/sensed on it complements the portion of the gesture performed on the other electronic device 10. For example, software may be executed to ensure that the portions are spatially aligned, as in, for example, a single gesture spanning the two devices. For example, if electronic device 12 is coupled to second electronic device 10, the devices may communicate to determine whether a single gesture spans touch screen display 14 of device 12 and touch screen display 16 of device 10.
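One way device 12 might verify that its portion of the gesture spatially and temporally complements the portion reported by device 10 is sketched below; the message fields, coordinate convention, and tolerances are assumptions for illustration:

```kotlin
import kotlin.math.abs

// Hypothetical message from the other device describing where its
// portion of the gesture exited its screen, in shared edge coordinates
// (e.g. millimetres along the joined edge).
data class GesturePortion(val exitAlongEdgeMm: Float, val exitTimeMs: Long)

// Sketch: treat the two portions as one spanning gesture only if the
// local entry point and time line up with the remote exit, within
// illustrative tolerances.
fun isSingleSpanningGesture(
    remote: GesturePortion,
    localEntryAlongEdgeMm: Float,
    localEntryTimeMs: Long,
    positionToleranceMm: Float = 5f,
    timeToleranceMs: Long = 250
): Boolean =
    abs(remote.exitAlongEdgeMm - localEntryAlongEdgeMm) <= positionToleranceMm &&
        (localEntryTimeMs - remote.exitTimeMs) in 0..timeToleranceMs
```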
At block S506, the current touch position on touch screen display 14 during the touch gesture is determined. In some embodiments, this may involve periodic receipt of a touch position such as, for example, via touch screen I/O interface 23. Alternatively, one or more processors 21 may periodically poll touch screen I/O interface 23 for an updated touch position.
At block S508, visual feedback is provided to the user by updating attributes of user interface 120 based on the current touch position so that regions touched along the gesture resemble a user interface of the other electronic device. For example, if electronic device 12 is coupled to second electronic device 10, regions of user interface 120 may be updated to resemble user interface 100. For example, an application executing on device 12 may expand its window as the gesture is inputted. This application would be drawn to mimic the user interface of device 10, possibly based on UI assets received at device 12, for example by including visual features (e.g., background, colour, font, wallpaper, etc.) of the user interface of device 10.
As will become apparent, in some embodiments, once regions of user interface 120 are updated to resemble the user interface of an interconnected device, the appearance of these regions may be maintained in that state at least until the completion of the detection of the gesture.
Device 12 may update user interface 120 based on configuration parameters of user interface 100. For example, configuration parameters may be received via a wired connection such as, for example, an electrical connection established over connectors 20. Additionally or alternatively, configuration parameters may be received via a wireless connection such as may be established with device 10 as described above.
User interface configuration parameters may be received from device 10. Additionally or alternatively, some or all of the configuration parameters may be obtained from a remote server, such as, for example, based on a lookup using a characteristic of device 10 such as, for example, a device identifier. This may allow device 12 to obtain the parameters from a trusted server before a trusted communication channel has been established between devices 10 and 12. For example, a server could transmit the configuration parameters of a user interface of device 10 to device 12, or permit access by device 12 thereto, upon receiving verification that devices 10 and 12 are connected. For example, device 10 could communicate (directly or indirectly) with the server to indicate that the server should communicate with device 12 regarding UI assets corresponding to device 10.
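A sketch of such a server lookup follows; the endpoint URL, identifier format, and response handling are hypothetical:

```kotlin
import java.net.URL

// Hypothetical sketch: fetch the connected device's UI configuration
// parameters from a trusted server, keyed by its device identifier.
// The URL scheme and response format are illustrative assumptions.
fun fetchUiConfiguration(deviceId: String): ByteArray =
    URL("https://example.com/ui-config/$deviceId").readBytes()
```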
User interface configuration parameters may include graphic images (e.g. wallpaper, backgrounds, or other graphic elements), colour schemes, icons, fonts, etc. In some embodiments, device 12 may receive an image for display. For example, device 12 may receive an image in bitmap, JPEG, or other format corresponding to the current screen display of user interface 100. Additionally or alternatively, device 12 may receive a bitmap corresponding to a wallpaper (e.g. background image) of user interface 100. Additionally or alternatively, UI assets may be or may comprise, for example, a theme package as described above.
Additionally or alternatively, device 12 may receive more complex graphic parameters. Device 12 may also receive instructions for rendering graphic assets. For example, device 12 could receive a description of a screen display using a description format such as, for example, Portable Document Format (PDF), Display PostScript (DPS), Quartz 2D, Extensible Application Markup Language (XAML), HTML5, or similar. Such a description may, for example, be parsed and used to modify the screen display by appropriate API calls to a UI manager. Additionally or alternatively, the description of the screen display may be supplied as received to OS software 31 such as, for example, for processing by a window manager component of a UI manager as described above.
At block S510, a determination is made as to whether the gesture is complete. A gesture may be considered complete if, for example, it has passed a pre-defined threshold such as disclosed in, for example, above-noted co-pending application Ser. No. 15/175,814. In another example, a gesture may be considered complete if it has entered a pre-defined trigger region.
If the gesture is not complete, control flow returns to block S506 so that further visual feedback may be provided. Alternatively, if the gesture is complete, control flow may terminate at block S510.
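For instance, with the pre-defined threshold expressed as a fraction of the screen width, the determination at block S510 might reduce to a comparison along the following lines; the threshold value is an illustrative assumption:

```kotlin
// Sketch of the completion test at block S510: the gesture is treated
// as complete once its furthest extent passes a pre-defined fraction
// of the screen width. The 0.6 fraction is an illustrative assumption.
fun isGestureComplete(
    rightmostTouchX: Float,
    screenWidthPx: Float,
    thresholdFraction: Float = 0.6f
): Boolean = rightmostTouchX >= thresholdFraction * screenWidthPx
```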
As will become apparent, following completion of the illustrated process, the request associated with the gesture may be processed.
As noted above, visual feedback is provided during the gesture by updating regions of user interface 120 based on the current touch position so that those regions resemble user interface 100.
As illustrated, as gesture 32 traverses across touchscreen 14, a region 62 of touch screen 14 is updated to correspond to user interface 100 of device 10. Notably, region 62 spans the entire vertical extent of touchscreen 14 and is bounded on the left by the edge of touchscreen 14 most proximate device 10. On the right, region 62 is bounded by a straight-line boundary 60 (denoted as a stippled line for the purposes of illustration only) parallel to the aforementioned edge. Straight-line boundary 60 is positioned according to the extent of travel of gesture 32 across touchscreen 14. Put differently, region 62 is, mathematically speaking, the locus of points having a perpendicular distance from the aforementioned edge of touchscreen 14 that is less than or equal to the perpendicular distance from that edge of the rightmost extent of gesture 32. As gesture 32 sweeps rightward, its rightmost extent is, put differently, the current touch point of gesture 32 at a given time.
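Expressed in code, this definition of region 62 reduces to tracking the rightmost extent reached by the gesture; the following sketch assumes the edge nearest device 10 lies at x = 0:

```kotlin
import android.graphics.Rect

// Sketch: region 62 spans the full vertical extent of the screen and
// extends from the edge nearest device 10 (x = 0 here) out to the
// rightmost extent the gesture has reached so far.
fun regionToRestyle(
    currentTouchX: Float,
    previousRightmostX: Float,
    screenHeightPx: Int
): Pair<Rect, Float> {
    val rightmost = maxOf(currentTouchX, previousRightmostX)
    return Rect(0, 0, rightmost.toInt(), screenHeightPx) to rightmost
}
```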
Of course, a straight-line boundary between region 62 and the rest of touchscreen 14 is in no way required. For example, region 62 could take a different shape, with boundary 60 taking a correspondingly different form.
Other definitions of region 62 are possible. For example, in some embodiments, region 62 may not span the vertical extent of touchscreen 14.
As illustrated, region 62 may expand in a direction that follows a vector of a user's gesture 32. The extent of such a region 62 may be defined, for example, by stretching a defined shape to encompass the path of gesture 32 or otherwise along the vector of gesture 32.
Following the completion of gesture 32, device 12 processes the request. Optionally, device 12 may ask the user to confirm the request.
As illustrated, a user may be presented with user interface display 90 having options to accept or reject the device pairing.
Following completion of the gesture and, if provided, an approval of the confirmation, device 12 may grant the request from device 10.
Subsequently, device 10 can utilize the granted resources of device 12. For example, if device 10 has been granted access to touchscreen display 14, device 10 may cause device 12 to display some or all of an interface 1100 of an application executing at device 10.
Notably, as illustrated, displayed graphic portion 1200A is only a portion of a much larger bitmap.
As illustrated, portion 1200B has, in effect, a straight-line boundary akin to boundary 60 described above.
In some embodiments, the entire bitmap may be displayed as graphic portion 1200A prior to the start of the cross-device gesture and may be stretched across touchscreen displays 16 and 14 as the gesture is performed by the user. In such embodiments, both device 10 and device 12 update their respective displays to show the stretching.
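The stretching described above amounts to scaling the bitmap's displayed width in proportion to gesture progress across the combined displays; a minimal sketch, with assumed dimensions and a normalized progress value:

```kotlin
// Sketch: as the gesture progresses from 0.0 (start) to 1.0 (complete),
// the bitmap originally sized to touchscreen 16 is stretched so that it
// also covers a growing share of touchscreen 14.
fun stretchedWidthPx(
    screen16WidthPx: Int,
    screen14WidthPx: Int,
    progress: Float
): Int = screen16WidthPx + (progress.coerceIn(0f, 1f) * screen14WidthPx).toInt()
```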
In some embodiments, dynamic content may be displayed in lieu of a bitmap. For example, the bitmap may be replaced with a video.
Devices 10 and 12 may each run different operating systems or different operating system versions. Moreover, the devices need not be alike: as illustrated, a second device may take the form of a device 10′ having a much more limited user interface.
Device 10′ is equipped with one or more magnetic connectors 20.
The user interface of device 10′ may be very limited. As illustrated, device 10′ includes a button 1400. Button 1400 may be, for example, a mechanical switch, a capacitive button, etc. Button 1400 may be used to indicate the start of a gesture.
As illustrated, device 12 has been connected with device 10′ by way of their respective magnetic connectors 20.
Notably, the above embodiments have been described with devices, such as the requesting and responding devices, devices having displays, and devices not equipped with a touch sensitive region, being in particular relative positions. Of course, this is by way of illustration only and is in no way limiting. The devices may, for example, be rotated into various positions. Similarly, gestures need not proceed left-to-right, or even horizontally at all. For example, where the devices are placed one above the other, gestures may be, in effect, vertical rather than horizontal.
Of course, the above described embodiments are intended to be illustrative only and in no way limiting. The described embodiments are susceptible to many modifications of form, arrangement of parts, details and order of operation. The invention is intended to encompass all such modifications within its scope, as defined by the claims.
This application claims the benefit of U.S. Provisional Application No. 62/293,296, filed Feb. 9, 2016, the entire contents of which are hereby incorporated by reference.