1. Field of the Invention
The present invention relates to an apparatus and method for intuitive user interaction between multiple devices, and more particularly to a plurality of intuitive user interaction methods in which an electronic device (a remote device) without intuitive user interaction and user interface capability is connected to another electronic device (a local device) configured with user interface capability and intuitive user interaction methods, for facilitating sharing therebetween.
2. Description of Related Art
In recent years, an integrated multimedia presentation or display system featuring multiple display devices connected together has become popular among consumers. The typical usage includes having one mobile electronic device configured with user interaction capability and another electronic device without user interaction capability, such as the instance when one mobile phone is connected with one HDTV, in which the connection can be achieved via a wired cable (e.g., RGB or HDMI cables) or via a wireless connection (e.g., WiFi display, WiDi, etc.).
In a usage example of the aforementioned integrated multimedia display system, one application (which can be referred to as an application A) can be operating in one mobile phone while having its displayed images mirrored and rendered in a remotely connected HDTV; meanwhile, another application (which can be referred to as an application B) can be operated in the mobile phone and have its displayed images rendered in the local screen of the mobile phone. Since typically the HDTV is not equipped with any touch-based graphical user interface peripheral device or capability, providing user interaction with the application A becomes a problem. The usage scenarios of the aforementioned integrated multimedia display system can also include the situation of having one application (application A) operating on the HDTV (i.e. the HDTV being either a smart HDTV or a set-top box) and having its generated display images rendered in the respective HDTV; meanwhile, another application (application B) operating in the mobile phone has its displayed images rendered in the display screen thereof.
In a situation where the application A and the application B are not operating in the same electronic device, there is a need to configure the two separate electronic devices to be operatively connected together so as to enable the exchange of relevant control information therebetween. In addition, there is another usage scenario where one application (application A) is running in one mobile device A while having its respective display images rendered in an HDTV; meanwhile, another application (application B) is running on another mobile device B while having its respective display images rendered in the display screen thereof. In the usage scenario where the application A and the application B are not running in the same electronic device, there should be a way to have these two electronic devices connected so that control and other information can be exchanged therebetween. As a result, there is room for improvement within the art.
An objective of the present disclosure is to provide user interface capability to an electronic device that is without any such user interface capability through the connection to another electronic device that is configured with the user interface capability so as to form an integrated multiple-device multimedia display system featuring connected multiple multimedia electronic devices configured with displays.
Another objective of the present disclosure is to provide a set of methods for intuitive user interaction between multiple devices, and to provide a plurality of intuitive user interaction methods in which a remote device without intuitive user interaction is connected to a local device configured with intuitive user interaction methods for facilitating sharing therebetween.
To achieve the above-said objectives, the present invention provides an application to run in one electronic device, such as a local device (e.g. a mobile phone) having a full range of touch-enabled user interaction capability, while one or more multimedia or application contents can be rendered in another electronic device, such as a remote device (e.g. an HDTV or a regular TV) without a full range of touch-enabled user interaction capability.
FIGS. 12a-12b show the mirrored content in a plurality of designated areas of interest from the HDTV, without layout reconfiguration and with layout reconfiguration in the mobile phone, respectively.
In the embodiments as described herein, the operating system (OS) for the mobile phone can be Apple iOS, Google Android, Samsung Bada, BlackBerry OS, Windows Phone, Linux, etc. The operating system (OS) for the smart TV or HDTV can be Google Android, Apple iOS, Linux, etc.
Thumbnail Control Method:
Referring to
Step S10: Displayed images for an application A are rendered in a remote HDTV; the application A can run in a local mobile phone device while the displayed content is streamed to the remote HDTV, or the application A itself can run in the remote device side, such as a set-top box or a smart TV.
Step S20: Displayed images for an application B can be rendered in the screen of the local mobile phone device under a landscape orientation (see
Step S30: During the application B's execution, a user can invoke an operation on the application A so as to achieve a quick interaction with it. To do so, the user can quickly invoke a widget panel (which can be invoked via a system gesture, for example) in the screen of the local mobile phone. This widget contains a "mirrored area" that mirrors or replicates the displayed content of the display screen of the remote device (typically the whole screen, or one selected rectangular area, of the remote display is resized to a local window area with the content stretched). The user can interact with that widget panel in the local mobile phone, and the user interaction events (the touch events) occurring inside the "mirrored area" of the widget may be re-mapped according to the different screen resolutions of the local mobile phone and the remote display device, and then sent to the application A for processing. More generally, the mirroring process for a pair of specified local and remote areas can be, mathematically speaking, any predefined 2-D area-to-area coordinate transformation (typically a linear translation) together with visual effects on the contents.
Step S40: (this is an optional step) Referring to
In the aforementioned thumbnail control implementation method embodiment, the mirror area 200 is not a static thumbnail snapshot, but a real-time mirroring of the display images rendered at the remote device; in other words, the mirror area 200 provides a live stream of video thumbnail image captures. The mirror area 200 does not have to contain an image capture of the whole screen of the remote device; it can also cover just part of the screen of the remote device. In actual implementation, a button can be defined in the menu area 100; once the user clicks that button, a user mode can be entered in which the user can select a particular desired rectangular area to be mapped to the local electronic device. This allows easier interaction in that desired rectangular area, especially when the local mirror area window is much smaller than the remote device screen, in which case it may be difficult to accurately touch areas that become too small if the full larger screen were mirrored to the smaller screen of the local device.
The user can quickly dismiss the widget activity so that he can continue working in the application B in the local electronic device. In one possible implementation embodiment of the thumbnail control method, when the widget window is displayed, the user is not allowed to interact with anything other than that particular widget window content; in such an implementation, any touch by the user in areas outside the widget window can dismiss the widget activity. In another implementation embodiment, the user is allowed to interact with both the widget window content (if the touch takes place inside the widget window) and other applications (if the touch takes place outside the widget window); in that case, the dismiss command can be issued via a command button located in the menu area 100.
In the above implementation embodiments for the thumbnail control method, the thumbnail control mode activity is always invoked in the electronic device which possesses user interaction capability, and it can be treated as a mirrored application of the original application A that is rendered in the remote electronic device. Once the activity is invoked, it will remap the touch events occurring in its mirrored region and send them to the application A for processing; thus, as perceived from the application A's point of view, the user interaction occurs on itself directly.
Touch Event Mapping:
In this embodiment, referring to the thumbnail control method, the touch event mapping can be illustrated as follows: the mapping is performed between the source area and the destination area via a coordinate transformation to ensure that a touch event sensed in one area can be mapped to the other area. There can be various algorithms or methods to implement the coordinate transformation, such as, for example, a) as shown in
After the touch event mapping, for any touch event that has occurred inside the mirror area 200 of the widget panel, the touch position is determined and calculated so that it is mapped to the corresponding relative position in the rendering area of the application A. The mirror does not have to cover the entire rendering area of the application A; it can also mirror just part of it. For any touch event, the sensing information is detected and sent to the application A directly.
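The area-to-area remapping described above amounts to a linear scale-and-offset transformation between two rectangles. The following is a minimal illustrative sketch; the function name and the (left, top, width, height) rectangle encoding are assumptions for illustration, not the disclosed implementation.

```python
def remap_touch(x, y, src, dst):
    """Map a touch point (x, y) from a source rectangle to a
    destination rectangle by linear scaling.

    src and dst are (left, top, width, height) tuples, e.g. the
    mirror area on the local phone and the rendering area of
    application A on the remote HDTV."""
    sx, sy, sw, sh = src
    dx, dy, dw, dh = dst
    # Normalize the point inside the source area, then scale it
    # into the destination area.
    rx = (x - sx) / sw
    ry = (y - sy) / sh
    return (dx + rx * dw, dy + ry * dh)

# Example: a 320x180 mirror area at (0, 0) on the phone mapped to a
# full 1920x1080 HDTV screen; the center maps to the center.
print(remap_touch(160, 90, (0, 0, 320, 180), (0, 0, 1920, 1080)))
```

The same transformation applies whether the mirror covers the whole remote screen or only a selected rectangular sub-area; only the source rectangle changes.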
If the application A and the application B are running in the same electronic device, the display content mirroring, user interaction remapping and insertion can be conducted via a cross-process communication method; if the application A and the application B are running in multiple electronic devices at the same time, there should be a physical data connection established between the multiple electronic devices, so as to allow display content streaming and user touch event exchange between them.
Generic Remote Control Method:
Another method to allow for additional user interaction is to treat the mobile phone as an external user interaction peripheral device of the HDTV, for extending the user interaction/interface capability of the HDTV itself, similar to a mouse or a touch pad, etc. Some possible implementation embodiments under this method of the present disclosure are described below.
Mouse/Track Pad Operating Mode:
Referring to
Touch Screen Operating Mode:
In another embodiment under the generic remote control method of the present disclosure, a touch screen operating mode, similar to the mouse operating mode, is provided. The only difference in this embodiment is that the user interface event is a touch event instead. Due to the different screen resolutions, the position needs to be remapped when injected into the application A's execution environment. The re-mapping mechanism for this operating mode is the same as that found in the thumbnail control mode. In the touch screen operating mode, the application B can also display a mirrored image of the application A in real time so that the user can capture the accurate position as displayed on the mobile phone.
Enhanced Touch Screen Operating Mode:
The previously described two operating modes have been implemented in existing embodiments and are described herein to serve as the basis for the enhanced touch screen operating mode, which provides a new embodiment. In another embodiment under the generic remote control method of the present disclosure, an enhanced touch screen operating mode is further provided. Today most mobile electronic devices are best suited for touch-based operations instead of mouse or track-pad based operations; however, when a user tries to interact with any application displayed on the HDTV, there is no touch-based peripheral for control of the HDTV while being used alongside the HDTV.
In the "mouse/track pad operating mode", the user can use a mobile phone to simulate a mouse, which shows a mouse pointer in the HDTV to guide user operation; however, mouse-based screen operation is not that common for many modern mobile device applications, which are touch-based instead.
In the "touch screen operating mode", the user can use a mobile phone to simulate a touch panel device; the user's finger now slides on top of the mobile phone touch screen instead of the HDTV screen, which makes it difficult to capture the accurate position unless the user is looking at the mobile phone screen and, at the same time, a mirrored image is being displayed in the mobile phone. In other words, if the user keeps looking at the screen of the HDTV only, there is no way to indicate where the user's fingers should be placed, since the user cannot see his fingers anymore.
Under the enhanced touch screen operating mode, the problems of the aforementioned first and second implementations are resolved by combining the advantages of both the mouse-based and touch-based implementations, so as to allow the finger position to be displayed in the remote device, like the mouse pointer, while using touch-based operation. It is assumed that the application A is the one rendered in the TV screen and the application B is the one that provides a user interface (UI) in the local mobile phone screen to allow the user to interact with the application A.
Finger Position Visual Hint Generation Method:
In one of the implementations, the application B needs to manipulate both screens, namely the mobile phone screen and the TV screen; it provides the UI to allow the user to operate, and in the TV screen it needs to display the position of the finger placement when the user is interacting from the mobile phone's touch screen, so as to provide user-friendly, intuitive feedback. Due to the screen size differences between the mobile phone and the TV, touch position remapping is required, as previously described for the thumbnail control mode operation. In an implementation scenario where the TV is connected to the mobile phone directly (either via an MHL/HDMI cable or a WiFi connection), in other words, where the application A runs in the same device as the application B, the application B itself can generate the finger indicator image and render it to the TV directly. In another implementation scenario where the TV is connected to an electronic device other than the mobile phone, in other words, where the application A runs in a different device from the one where the application B runs, the application B needs a way to communicate with that other electronic device so that the finger indicator image can be shown timely and accurately. Usually a utility will run in the same OS environment as the application A and communicate with the application B to retrieve the necessary information, such as where to display the finger indicator image and which finger mode applies; this utility can then generate and display the proper finger indicator images in the screen to help the user operate the application A. In a typical implementation, the finger indicator image is displayed in the remote electronic device in a separate overlay layer for better performance.
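The information exchanged between the application B and the utility beside the application A can be sketched as a simple position message. The JSON framing, the field names, and the linear resolution remapping below are illustrative assumptions only, not the disclosed protocol.

```python
import json

def make_indicator_message(x, y, finger_mode="down",
                           local_size=(1080, 1920),
                           remote_size=(1920, 1080)):
    """Build a message that application B could send to the utility
    running beside application A, telling it where and how to draw
    the finger indicator image on the TV overlay layer."""
    lw, lh = local_size
    rw, rh = remote_size
    # Remap the local touch position into remote-screen coordinates
    # to account for the different resolutions.
    msg = {
        "type": "finger_indicator",
        "x": round(x / lw * rw),
        "y": round(y / lh * rh),
        "mode": finger_mode,  # e.g. hovering vs. pressed indicator image
    }
    return json.dumps(msg)

# A touch at the center of a 1080x1920 phone screen maps to the
# center of a 1920x1080 TV screen.
print(make_indicator_message(540, 960))
```

The utility would parse such messages and redraw the indicator in its overlay layer; the transport (socket, pipe, etc.) is left to the established physical data connection.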
In another implementation, it is also possible that a utility runs in the same device where the application A resides, and this utility communicates with the application B to draw the finger position visual hint image accordingly, based on the touch position captured in the application B.
Normal Mode:
In the enhanced touch screen operating mode, there is a normal mode of operation for the interconnected multiple devices. While operating in the normal mode, whenever a touch operation occurs in the "touch" area of the application B, the finger position indicator is always displayed in the TV, and meanwhile the touch events are issued directly to the application A. When operating in the normal mode, the application B can also display the mirrored content of the application A in its operation window to allow for easier position capturing.
Hovering Mode:
In the enhanced touch screen operating mode, there is a hovering mode of operation for the interconnected multiple devices. While operating in the hovering mode, the finger movement or up/down gesture events are not issued to the application itself; instead, the position of the finger placement is captured, and the finger indicator image is displayed in the TV accordingly. When the user exits the hovering mode of operation, touch events start to be issued to the application A. Below are several illustrative examples of this embodiment.
Some typical implementation methods to enter into and exit from the hovering mode include:
Referring to
An exemplary embodiment of the hovering mode is provided as follows: the user has moved his finger to the right position, and upon exiting the hovering mode, it is as if the user has issued a touch depress event at that position; this touch depress event is issued to the application A directly for processing. In the hovering mode, it appears that the user is operating a mouse to move the mouse pointer around the TV screen (although the position mapping is not the same as for a mouse), and upon exiting the hovering mode, the user clicks the mouse button.
Referring to
Yet another exemplary embodiment of the hovering mode is provided as follows: multiple functional areas can be further defined to serve purposes other than entering and exiting the hovering mode. For example, the left-bottom corner can be defined as the hover area and the left-top corner can be defined as the "full screen" area to show/hide the local controls. In short, such special-purpose functional modes, such as those for entering and exiting the hovering mode, can and should be achieved easily and without interrupting the normal touch operation.
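The hovering-mode behavior described above (indicator-only updates while hovering, a touch depress event issued on exit) can be sketched as a small state machine. This is an illustrative sketch; the class and event names are assumptions, not the disclosed implementation.

```python
class HoverController:
    """Illustrative state machine for the hovering mode: while
    hovering, touches only move the finger indicator on the remote
    TV; leaving hovering mode issues the pending touch depress
    event to application A."""

    def __init__(self):
        self.hovering = False
        self.last_pos = None
        self.sent = []       # events forwarded to application A
        self.indicator = []  # indicator updates shown on the TV

    def touch(self, x, y):
        self.last_pos = (x, y)
        if self.hovering:
            self.indicator.append((x, y))  # visual hint only
        else:
            self.sent.append(("tap", x, y))

    def enter_hover(self):
        self.hovering = True

    def exit_hover(self):
        # On exit, the last captured position is issued as a touch
        # depress event to application A.
        self.hovering = False
        if self.last_pos is not None:
            self.sent.append(("tap", *self.last_pos))

ctrl = HoverController()
ctrl.enter_hover()
ctrl.touch(10, 20)   # moves the indicator on the TV, nothing sent yet
ctrl.touch(30, 40)
ctrl.exit_hover()    # a tap at (30, 40) is now issued to application A
print(ctrl.sent)
```

In the normal mode, the same controller simply never enters the hovering state, so every touch is forwarded immediately while the indicator is drawn.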
Air Mode Indicator Method:
In another embodiment, an air mode indicator method is provided to detect the finger position of a user even when his finger is hovering in the air above an electronic device. The application B can also generate the corresponding finger indicator image for each detected finger gesture position and display those finger indicator images in the remote device, which helps guide the user and facilitates more intuitive further actual user interactions in the application B.
Region Enhancement Mode:
When operating under a region enhancement mode, the user can pre-define certain areas in the remote electronic device and allow these pre-defined areas to have their respective layouts resized, reconfigured (including zoom operation capability) and mirrored in the mobile phone device; meanwhile, the touch events that occur in the mapped areas in the mobile phone will be remapped and sent to the application A that is rendered in the TV.
As shown in
An example usage application can be provided in the mobile phone to ease the selection of areas of interest and the layout reconfiguration.
For each area-of-interest re-mapping between the local electronic device (mobile phone) and the remote electronic device (HDTV), the touch event remapping mechanism is the same as the one described in the discussion of the thumbnail control mode herein, so that when the user operates in the reconfigured layout regions in the local electronic device, it looks as if the associated corresponding regions in the mirrored remote electronic device (HDTV) are operated accordingly.
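With several areas of interest mirrored side by side, each local touch must first be attributed to the region that contains it before the per-region remapping is applied. The following sketch is illustrative only; the function name, the rectangle encoding, and the sample region layout are assumptions.

```python
def dispatch_region_touch(x, y, regions):
    """Find which reconfigured local region contains the touch and
    remap the point into the corresponding pre-defined remote area.
    Each region is a (local_rect, remote_rect) pair, with rects
    given as (left, top, width, height)."""
    for (lx, ly, lw, lh), (rx, ry, rw, rh) in regions:
        if lx <= x < lx + lw and ly <= y < ly + lh:
            # Same linear area-to-area remapping as in the
            # thumbnail control mode.
            return (rx + (x - lx) / lw * rw,
                    ry + (y - ly) / lh * rh)
    return None  # touch outside every mapped region

# Two HDTV areas of interest, resized and laid out side by side on
# the phone screen.
regions = [
    ((0, 0, 200, 200),   (100, 100, 400, 400)),
    ((200, 0, 200, 200), (1400, 700, 300, 300)),
]
print(dispatch_region_touch(250, 100, regions))
```

Touches landing outside every mapped region can simply be ignored or handled by the application B's own controls.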
Remapping Standard Skin Method:
In other alternative embodiments, as found for gaming consoles, a full mapping between the two areas sometimes does not have to be ensured; there may only be a need to recognize certain user keyboard events which can be translated to normal gaming console events such as LEFT, RIGHT, UP, DOWN, etc. In such a situation, the two mapped regions do not have to contain the same or decorated contents. For example, certain standard skins can be pre-defined, such as standard game console skins, to help create the region mapping. The following is an implementation embodiment. One can pre-define certain standard game console skins (e.g. a four-direction joystick pad, an eight-direction joystick pad, a 360-degree round pad, a round button, etc.) and allow the user to easily re-map the standard skin to the actual control components in the remote electronic device (HDTV). During the mapping operation, the user can pick up one standard component in the local electronic device and move it around the screen to ensure the standard component can be mapped to the actual control components via a simple resizing or rotation operation. The captured mapped area will be treated as a source area; later on, the user can resize the layout of the standard component in the local electronic device. Any touch event occurring in the standard component will then be mapped to the source area of the application A accordingly, which will trigger the relevant events properly.
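Translating a touch on a four-direction joystick-pad skin into a discrete console event can be sketched as follows. The pad geometry, the dominant-axis rule, and the event names are illustrative assumptions, not the disclosed mapping.

```python
def dpad_event(x, y, pad):
    """Translate a touch on a four-direction joystick-pad skin into
    a console event. pad is (center_x, center_y, radius)."""
    cx, cy, r = pad
    dx, dy = x - cx, y - cy
    if dx * dx + dy * dy > r * r:
        return None  # touch landed outside the pad skin
    # Pick the dominant axis; screen y grows downward.
    if abs(dx) >= abs(dy):
        return "RIGHT" if dx >= 0 else "LEFT"
    return "DOWN" if dy >= 0 else "UP"

# A pad skin centered at (100, 100) with radius 80 on the phone.
pad = (100, 100, 80)
print(dpad_event(160, 110, pad))  # dominant +x axis -> RIGHT
print(dpad_event(100, 40, pad))   # dominant -y axis -> UP
```

An eight-direction pad or a 360-degree round pad would replace the dominant-axis rule with an angle-based sector lookup, but the overall skin-to-event translation is the same.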
Power Saving Mode:
In a power saving mode, when the local electronic device is used as an extended console to interact with the remote screen of the remote electronic device, there is no need to maintain the original backlighting or even keep continuously rendering images to the screen of the local electronic device, since it can also enter various power saving modes, such as by performing one or more of the following:
Meanwhile, to exit the power saving mode, the user can easily restore the original working state by various methods, such as by clicking one pre-defined area (the right-top area, for example) or by clicking the power button, etc.
Referring to
As shown in
The OS application framework needs to provide the capabilities for the following functional modules as shown in
Referring to
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the present invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
201310049782.3 | Feb 2013 | CN | national |