This disclosure relates generally to electronic devices, and more particularly to user interfaces for electronic devices.
“Intelligent” portable electronic devices, such as smart phones, tablet computers, and the like, are becoming increasingly powerful computational tools. Moreover, these devices are becoming more prevalent in today's society. For example, not too long ago mobile telephones were simplistic devices with twelve-key keypads that only made telephone calls. Today, “smart” phones, tablet computers, personal digital assistants, and other portable electronic devices not only make telephone calls, but also manage address books, maintain calendars, play music and videos, display pictures, and surf the web.
As the capabilities of these electronic devices have progressed, so too have their user interfaces. Prior keypads having a limited number of keys have given way to sophisticated user input devices such as touch sensitive screens or touch sensitive pads. Touch sensitive systems, including touch sensitive displays, touch sensitive pads, and the like, include sensors for detecting the presence of an object such as a finger or stylus. By placing the object on the touch sensitive system, the user can manipulate and control the electronic device without the need for a physical keypad.
One drawback associated with these touch sensitive systems concerns the user experience. Many applications today are being designed to primarily function with an electronic device having a touch sensitive surface. When one wants to operate such an application with a non-touch sensitive device, adapting the user interface for the non-touch sensitive device can be problematic. An improved electronic device would offer an enhanced user experience by making control of applications more intuitive.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present disclosure.
Before describing in detail embodiments that are in accordance with the explanatory disclosure, it should be observed that the embodiments reside primarily in combinations of method steps and apparatus components related to detecting interactive applications operating on a remote device, presenting a control interface on the display to receive user input for interactive regions of the interactive application, and communicating the user input to the remote device to control the interactive application. Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved. Accordingly, the apparatus components and method steps have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the several embodiments so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
It will be appreciated that embodiments described herein may be comprised of one or more conventional processors and unique stored program instructions that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of methods for detecting interactive applications operating on remote devices, presenting control interfaces to control the interactive applications on a local device, and communicating control input received at the local device to the remote device as described herein. The one or more conventional processors may additionally implement and execute an operating system, with the methods described below being configured as an application operating in the environment of the operating system. For example, one or more of the embodiments described below are well suited for configuration as an application adapted to operate in the Android™ operating system developed by Google, Inc.
The non-processor circuits may include, but are not limited to, a radio receiver, a radio transmitter, signal drivers, clock circuits, power source circuits, and user input devices. As such, these functions may be interpreted as steps of a method to perform control of an interactive application operating on a remote device by presenting a control interface on a local device, receiving user input, and communicating the user input to the remote device. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used. Thus, methods and means for these functions have been described herein. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
One or more embodiments are now described in detail. Referring to the drawings, like numbers indicate like parts throughout the views. As used in the description herein and throughout the claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise: the meaning of “a,” “an,” and “the” includes plural reference; the meaning of “in” includes “in” and “on.” Relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, reference designators shown herein in parentheses indicate components shown in a figure other than the one in discussion. For example, a reference to a device (10) while discussing figure A refers to an element, 10, shown in a figure other than figure A.
Embodiments described herein provide an electronic device, referred to colloquially as a “target device,” configured to execute a method of controlling an interactive application operating on a remote device. In one embodiment, the electronic device includes a display, a communication circuit, and a control circuit. The control circuit executes instructions configured in the form of executable code to detect, with the communication circuit, the interactive application operating on the remote device. The control circuit then presents a control interface on the display of the electronic device. The control interface is configured to receive user input for interactive regions of the interactive application. In one embodiment, the user input comprises gestures detected on a touch sensitive surface of the display. When user input is received, the control circuit causes the communication circuit to communicate the user input to the remote device to control the interactive application. In one embodiment, the user input communicated to the remote device is mapped to one or more interactive regions of the interactive application.
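The detect, present, and communicate sequence described above can be sketched in code. The following is a minimal illustrative sketch in Python; the discovery reply format, field names, and helper functions are assumptions made for illustration and are not part of the disclosure.

```python
# Illustrative sketch of the detect / present / communicate flow.
# The message formats below are assumptions, not drawn from the disclosure.

def detect_interactive_application(discovery_replies):
    """Return the first discovery reply advertising an interactive application."""
    for reply in discovery_replies:
        if reply.get("interactive"):
            return reply
    return None

def build_user_input_message(gesture, region_id):
    """Package a gesture detected at the local control interface for
    transmission to the remote device, mapped to one interactive region."""
    return {"type": "user_input", "region": region_id, "gesture": gesture}

# Example: the remote device reports a running browser application.
replies = [{"device": "tv", "interactive": False},
           {"device": "tv", "app": "browser", "interactive": True}]
app = detect_interactive_application(replies)
msg = build_user_input_message("tap", region_id="scroll_bar")
```

In this sketch the local device never renders the remote content; it only discovers the application and forwards packaged gestures, mirroring the three steps of the embodiment.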
In another embodiment, the control circuit activates an application configured for interactive operation on a single display. Despite the fact that the application is designed to work only on a single display that is local to the electronic device, in one embodiment the control circuit causes the communication circuit to communicate presentation data of the application for presentation on a remote display device. This can be done in one embodiment by mapping the application to a display region of the electronic device that exceeds the presentation area of the display, and then communicating data presented in areas of the display region outside the presentation area to the remote device.
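The mapping of an application to a display region larger than the local presentation area can be sketched as a routing decision per coordinate. The dimensions below are illustrative assumptions; the disclosure does not specify particular resolutions.

```python
# Sketch of a display region that exceeds the local presentation area:
# content falling inside the presentation area stays local, while content
# in the enlarged region outside it is communicated to the remote device.
# All dimensions are assumptions for illustration.

LOCAL_W, LOCAL_H = 720, 1280        # local presentation area
VIRTUAL_W, VIRTUAL_H = 1920, 1280   # enlarged display region

def route_pixel(x, y):
    """Decide whether a point of the virtual display region is drawn
    locally or communicated to the remote display device."""
    if 0 <= x < LOCAL_W and 0 <= y < LOCAL_H:
        return "local"
    return "remote"
```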
Once the application, or portions thereof, is being communicated to the remote device, in one embodiment the control circuit presents a control interface for the application on the display. When user input is received at the control interface, the control circuit causes the communication circuit to communicate user input received at the control interface to control the presentation data of the application on the remote display device.
In one embodiment, the communication circuit is operable with a local Wi-Fi network and allows a user to display content from the “single display” application on external large screen devices, one example of which may be a wide screen, high definition television. The screen of the television may not be touch sensitive, and attempting to control interactive applications operating on a television with a remote control is inconvenient due to the cumbersome user interface and the lack of correspondence between remote control keys and the interactive regions of the interactive application. Embodiments described herein therefore allow the user to employ the control interface presented, automatically in one or more embodiments, on the display of a mobile device. Rather than “mirroring” the entire wide screen television screen on the mobile device, which causes the text to become illegible due to the small size of the display on the mobile device and further requires extensive computing power in the mobile device, one or more embodiments provide a control interface that offers an “easy to use” user interface that does not require the single display application to be reconfigured in any way. Furthermore, there is no requirement for the remote screen to be mirrored.
Turning now to
The explanatory electronic device 100 is shown illustratively in
The illustrative electronic device 100 of
In one or more embodiments, the communication circuit 103 can be configured for data communication with at least one wide area network. For illustration, where the electronic device 100 is a smart phone with cellular communication capabilities, the wide area network can be a cellular network being operated by a service provider. Examples of cellular networks include GSM, CDMA, W-CDMA, CDMA-2000, iDEN, TDMA, 2.5 Generation 3GPP GSM networks, 3rd Generation 3GPP WCDMA networks, 3GPP Long Term Evolution (LTE) networks, and 3GPP2 CDMA communication networks, UMTS networks, E-UTRA networks, and other networks. It should be understood that the communication circuit 103 could be configured to communicate with multiple wide area networks as well.
The communication circuit 103 can also be configured to communicate with a local area network 105, such as a Wi-Fi network being supported by a router, base station, or access point. In the illustrative embodiment of
The remote device 106 is shown in
Another example of an interactive application would be a gaming application where a user can control the actions of the game by delivering user input to the remote device 106. Still another example of an interactive application would be a virtual sketching or painting application where a user could create virtual drawings or paintings on the display of the remote device 106. Interactive applications are frequently identified by the use of a cursor or other actuation object that the user can move along the display to actuate or control various interactive regions of the interactive application.
In this illustrative embodiment, the electronic device 100 includes a control circuit 107, which in
The control circuit 107 can be configured to process and execute executable software code to perform the various functions of the electronic device 100. A storage device, such as memory 108, stores the executable software code used by the control circuit 107 for device operation. The executable software code used by the control circuit 107 can be configured as one or more modules 109 that are operable with the control circuit 107. Such modules 109 can comprise instructions, such as control algorithms, that are stored in a computer-readable medium such as the memory 108 described above. Such computer instructions can instruct processors or the control circuit 107 to perform methods described below in
Turning now to
At step 201, an interactive application (110) is detected operating on a remote device (106). In one embodiment, the remote device (106) is in communication with a communication circuit (103) of the electronic device (100).
At step 202, a control interface is presented on a user interface (101) of the electronic device (100). In one embodiment, the control interface is configured to allow a user to control the interactive application (110) operating on the remote device (106) from the display (102) or other user interface (101) of the electronic device (100). User input can be received at the display (102) or user interface (101) of the electronic device (100) at the control interface.
In one or more embodiments, this step 202 can optionally include mapping a portion of the information of the interactive application (110) that is visible on the remote device (106) in the control interface. For example, where the interactive application (110) is a web browser having a scroll bar with which a user may move the displayed website up and down, the scroll bar would constitute an interactive region of the interactive application (110). Accordingly, in one embodiment, step 202 can include mapping the scroll bar or a portion thereof in the control interface.
In one embodiment, at optional step 203, a portion of the control interface can be mapped to the display of the remote device (106) as well. For example, when the control interface corresponds to only a portion of the displayed content, such as an interactive region (which may be one of many interactive regions) of the interactive application (110), a user may want a visual indicator of what portion of the content is presently controllable with the control interface. Accordingly, in one embodiment a portion of the control interface or an indicator thereof can be mapped to the content on the remote device (106) so that the user can identify and/or adjust the portion or interactive region of the content being controlled.
At step 204, the user input received at the control interface can be communicated to the remote device (106) to control the interactive application (110). In one or more embodiments, this step 204 comprises mapping the user input to interactive regions of the interactive application.
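The mapping of step 204 can be sketched as a hit test against the known interactive regions of the interactive application. The region geometry below is a hypothetical example; actual regions would be derived from the content of the interactive application.

```python
# Hypothetical hit test mapping user input received at the control
# interface to an interactive region of the interactive application.
# Region geometry (x, y, width, height on the remote display) is an
# assumption for illustration.

REGIONS = {
    "scroll_bar": (1880, 0, 40, 1080),
    "play_button": (900, 980, 120, 60),
}

def map_input_to_region(x, y):
    """Return the name of the interactive region containing (x, y), if any."""
    for name, (rx, ry, rw, rh) in REGIONS.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return name
    return None
```

Input that falls outside every interactive region can simply be discarded rather than communicated, which keeps traffic to the remote device minimal.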
The method 200 of
Many prior art devices attempt to facilitate control of a remote device by mirroring the display of the remote device on a local device. As noted above, this method creates distinct problems. First and foremost, attempting to mirror a remote display on a local device requires significant computing and memory resources. Second, mirroring causes power consumption in the local device to increase, thereby decreasing the operable run time. Third, there can be significant latency between display updates on the local and remote devices. As one example, the remote display may have one hundred milliseconds or more of additional communication and/or processing delay relative to the local display. This may cause less than desirable user experiences, especially when the interactive application is a gaming application. Finally, the remote display may have a different resolution from the local display, which results in the mirrored content not presenting the finer details on the local display due to its small size.
Embodiments of the present disclosure, such as the method 200 shown in
In one or more embodiments, the control interface presented at step 202 can be associated with expected types of user input or interactions. For example, if the mapped portion of the content of the interactive application (110) is a scroll bar, expected interactions may be dragging motions. Accordingly, the control interface may be uniquely designed to allow the user to perform dragging operations. Similarly, if the mapped portion of the content of the interactive application (110) corresponds to, for example, a hyperlink, the expected interaction may be a touch input. The control interface presented at step 202 can be uniquely configured to permit simple touch inputs. Further, in one or more embodiments, the control interface presented at step 202 can change as the mapped portion of the content of the interactive application (110) changes. While touch and drag interactions are two examples of expected interactions, it should be obvious to those of ordinary skill in the art having the benefit of this disclosure that other interactions could be expected as well, including extended touch, gestures, patterns, and so forth.
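Selecting a control interface from the expected interaction of the mapped content can be sketched as a simple lookup. The interaction and interface names below are illustrative assumptions.

```python
# Sketch of choosing a control interface based on the expected interaction
# of the mapped portion of content. The mapping tables are assumptions
# made for illustration.

EXPECTED_INTERACTION = {
    "scroll_bar": "drag",
    "hyperlink": "touch",
    "map_view": "pinch",
}

INTERFACE_FOR_INTERACTION = {
    "drag": "scrolling_control_interface",
    "touch": "press_control_interface",
    "pinch": "pinch_control_interface",
}

def select_control_interface(mapped_region):
    """Pick the control interface suited to the mapped region, defaulting
    to a simple press interface for unknown regions."""
    interaction = EXPECTED_INTERACTION.get(mapped_region, "touch")
    return INTERFACE_FOR_INTERACTION[interaction]
```

Because the lookup is driven by the currently mapped region, re-running it whenever the mapped portion of the content changes yields the dynamic interface changes described above.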
In another embodiment, the method 200 of
While one example of an electronic device (100) and one illustrative method (200) have been described, operation of the various aspects of embodiments of the disclosure may become clearer with the illustration of the electronic device (100) performing various method steps in an example or two. One such example is illustrated in
Turning now to
Upon detecting that the interactive application 110 is operating on the remote device 106, the control circuit of the electronic device in one embodiment presents a control interface 300 on the display 102 of the electronic device 100. In one embodiment, this presentation occurs automatically. In one embodiment, the control interface 300 comprises a selected one of a plurality of predefined control interfaces stored in a memory of the electronic device 100, where each of the plurality of predefined control interfaces is associated with a predetermined touch control interaction as previously described. The control interface 300 in one embodiment is specific to the interactive application 110, i.e., it includes a shape, control, or actuation target that is specifically configured to control the interactive application 110. In other embodiments, the control interface 300 is general in that it can be used to control different interactive applications. The explanatory control interface 300 of
The control interface 300 can take a variety of forms. In one embodiment, the control interface 300 is in the form of a graphical “widget” that is presented on only a portion of the display 102 of the electronic device 100 and that “floats” above other information 302 being presented on the display 102. In one embodiment, the portion of the display 102 upon which the control interface 300 is presented is user configurable. While the control interface 300 can be configured based upon the input control needs of the interactive application 110, in one embodiment the control interface 300 can be one of a plurality of control interfaces. For example, one control interface may be a scrolling control interface, while another may be a press control interface. One control interface may be a pinch control interface, while another control interface is a stretch control interface, and so forth. The various control interfaces may be visually different so that an associated expected user interaction is evident to a user 303 by the shape, contour, color, or other visually distinguishing identifier of the control interface.
Illustrating by example, a scrolling control interface may be configured as a lengthy rectangle, while a press control interface may be configured as a small circle. Similarly, a pinch control interface may be a geometric shape having one or more concave sides, while a stretch control interface may be a geometric shape having one or more convex sides. In one embodiment, the plurality of control interfaces can be stored in a library of interactive elements resident in a memory of the electronic device (100). The control circuit may select the proper control element based upon the mapping occurring between the content of the interactive application (110) and the electronic device 100, or upon other criteria. In other embodiments, a user may select the proper control interface. As noted above, as the mapped region of the content of the interactive application changes, the control interface can dynamically change in real time as well.
In one or more embodiments, the control circuit of the electronic device 100 detects the interactive application 110 operating on the remote device 106 and presents a preconfigured control interface designed to control interactive regions of the content presented by the interactive application 110. Examples of preconfigured control interfaces include a scrolling control interface for a web browsing interactive application, a media controlling control interface comprising multiple buttons or virtual user actuation targets for a media player interactive application, or a keyed control interface for a gaming interactive application.
In one or more embodiments, the graphical appearance and/or layout of the control interface 300 does not “match” or otherwise mirror the user interface of the interactive application 110 visible on the remote device 106. In such embodiments, the control circuit of the electronic device 100 translates user input applied to the control interface 300 into preconfigured input events for communication to the remote device 106 to control the interactive application 110. Said differently, the logic employed by the control circuit of the electronic device 100 in presenting the control interface 300 need not understand the logic used by the interactive application 110 to control its content. Instead, the control interface 300 functions as a receiver of user input. This user input is then communicated to the remote device 106 after a predefined transformation, which in one embodiment is performed by the control circuit of the electronic device 100. Advantageously, when using such an embodiment, there is absolutely no change required for the interactive application 110, i.e., no reconfiguration or reprogramming, because the interactive application 110 needs only to react to communicated user input in the same way it would if the interactive application 110 were operating on the electronic device 100 itself.
While one control interface 300 is shown on the display 102 of the electronic device 100 in the illustrative embodiment shown in
As previously mentioned, in one or more embodiments the control interface 300 can perform a translation prior to communicating user input to the remote device 106 for controlling the interactive application 110. The following examples illustrate such translations: A touch control interface can be configured to deliver selection or touch input to a mapped portion of content presented by the interactive application 110. Accordingly, the touch control interface can receive touch input and translate it to a predetermined location along the content.
A scrolling control interface configured as a scroll bar can receive dragging or scrolling user input and can translate that user input into a predefined curve that corresponds to an interaction region of the content presented by the interactive application 110. A rotating control interface, which can appear as a “ball” on the display 102 of the electronic device 100 in one embodiment, can translate an amount of rotation of the ball to an amount of rotation for the content presented by the interactive application 110. A stretch control interface, which can allow two fingers to stretch the ball, can translate an expansion input to interactive portions of the content presented by the interactive application 110.
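The translations above can be sketched as small pure functions that convert a gesture on the local control interface into an input event for the remote device. The numeric conventions (drag fraction, degrees, gain) are assumptions for illustration.

```python
# Illustrative translations of control-interface gestures into input
# events communicated to the remote device. Numeric conventions are
# assumptions, not drawn from the disclosure.

def translate_touch(region_origin, region_size):
    """Touch control: deliver a selection at the center of the mapped region."""
    (x, y), (w, h) = region_origin, region_size
    return {"event": "touch", "x": x + w // 2, "y": y + h // 2}

def translate_scroll(drag_fraction, curve):
    """Scrolling control: map a drag (0..1 along the widget) onto a point
    of a predefined curve through the content's interaction region."""
    index = round(drag_fraction * (len(curve) - 1))
    return {"event": "scroll", "position": curve[index]}

def translate_rotation(ball_degrees, gain=2.0):
    """Rotating control: scale rotation of the on-screen ball to rotation
    of the content presented by the interactive application."""
    return {"event": "rotate", "degrees": ball_degrees * gain}
```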
In one or more embodiments, the control interface 300 is customized for the interactive application 110. For example, in one embodiment the control circuit of the electronic device 100 uses a plurality of control templates stored in memory that are common with popular applications to configure the control interface 300. In one or more embodiments, the control circuit of the electronic device 100 can be configured to change the control interface 300 when the interactive application 110 operating on the remote device 106 changes.
In one or more embodiments, the control circuit of the electronic device 100 allows the user 303 to control the design of the control interface 300 as well. In one embodiment, the user 303 can launch the interactive application 110 on the remote device 106 and then, using a camera of the electronic device 100 or other means, can capture a screen shot of a portion of the display of the remote device 106. A configuration module operating on the electronic device 100 then searches the library of control templates to determine whether a particular control interface has been designed for the interactive application 110. If not, the configuration module allows a new entry to be created in the control template library that will be associated with the interactive application 110. In one embodiment, the screen shot can be presented on the display 102 of the electronic device 100 so that the user 303 can confirm that the desired control interface will be used. The user 303 can then select the size and orientation of the control interface, and can move the control interface along the display 102 of the electronic device 100. In one embodiment, the user 303 may employ the initially captured screen shot as a starting point for the control interface. In one or more embodiments, the user 303 can also define a relative speed and scale factor for location transformation. Every control interface can optionally have a name displayed proximally thereto, so the user 303 can easily remember what the control interface does. When the user 303 closes the control interface, it can be saved into the library. When the interactive application 110 is launched subsequently, the control interface may automatically appear on the display 102 of the electronic device 100.
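The user-defined relative speed and scale factor for location transformation mentioned above might be applied as follows. The default factor values and the offset parameter are hypothetical.

```python
# Hypothetical location transformation using a user-defined scale factor
# and relative speed: a point on the local control interface is scaled
# into remote-display coordinates, and gesture motion is amplified so
# short strokes on the small local display cover the larger remote one.
# Parameter values are assumptions for illustration.

def transform_location(local_x, local_y, scale=2.5, offset=(0, 0)):
    """Scale a local touch point into remote-display coordinates."""
    ox, oy = offset
    return (round(local_x * scale) + ox, round(local_y * scale) + oy)

def transform_motion(dx, dy, speed=1.5):
    """Amplify gesture motion by the user-defined relative speed."""
    return (dx * speed, dy * speed)
```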
In one or more embodiments, the control interface 300 can be a singularly configured control interface that provides different control input to different interactive applications. Since the control circuit of the electronic device 100 may not be aware of the logic state of the interactive application 110 operating on the remote device 106, in one embodiment the control circuit of the electronic device 100 receives runtime feedback from the interactive application 110 running on the remote device 106. For example, an event callback application programming interface (API) can be designed for the interactive application 110 to provide current runtime status information back to the control circuit of the electronic device 100 via the communication circuit in one embodiment.
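One way such an event callback path could look is sketched below. The class and method names are illustrative assumptions; the disclosure only specifies that runtime status flows back from the interactive application to the control circuit.

```python
# Sketch of the runtime-feedback path: the remote interactive application
# posts status events back to the electronic device so a single control
# interface can adapt its behavior to the application's current state.
# The callback registry shown is an assumption for illustration.

class RuntimeFeedback:
    def __init__(self):
        self._handlers = []
        self.state = None

    def register(self, handler):
        """Register a callable invoked on each remote status event."""
        self._handlers.append(handler)

    def on_remote_event(self, status):
        """Called when the remote application reports state,
        e.g. "video_playing" or "menu_open"."""
        self.state = status
        for handler in self._handlers:
            handler(status)

# Example: the control interface logic subscribes to status updates.
feedback = RuntimeFeedback()
seen = []
feedback.register(seen.append)
feedback.on_remote_event("video_playing")
```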
Turning now to
In this illustrative embodiment, the user input 400 has been mapped to the interactive region 301 of the content presented by the interactive application 110. Accordingly, the content presented by the interactive application 110 has moved just as if the user 303 had touched the interactive region 301 on the remote device 106 and made the rotational input. However, since in this embodiment the display of the remote device 106 is not touch sensitive, the user 303 may make a simple touch gesture on the electronic device 100 to control the content. Advantageously, there is no need to operate user interface devices with inputs that do not correspond to the actions normally used to control the content of the interactive application 110.
As noted above, in one or more embodiments, the control interface 300 can comprise a mapping of a portion of information 401 of the interactive application 110 visible on the remote device 106 on the display 102 of the electronic device 100. This occurs in
In some embodiments, a portion of the control interface 300 can be mapped to the interactive application 110 as well. In the illustrative embodiment of
One of the advantages of presenting the control interface 300 on only a portion of the display 102 of the electronic device 100 is that other portions of the display 102 are available for other uses. Turning now to
While detecting an interactive application 110 operating on a remote device 106, and providing a control interface 300 on an electronic device 100 to receive user input for controlling the interactive application 110 is one method of operating the electronic device 100 in accordance with embodiments of the disclosure, the various embodiments can be used to communicate application data from the electronic device 100 to a remote device 106 and correspondingly control the application data using a control interface 300 as well. Turning now to
Beginning with
In the illustrative embodiment of
In one embodiment, the control circuit 707 is operable to activate an application 710 configured for interactive operation on a single display, i.e., the display 702 of the electronic device 700. It is contemplated that many operating systems of portable electronic devices presently do not allow for multiple applications to be operable on the display 702 of the electronic device 700 concurrently. Accordingly, applications are frequently designed to operate on only a single display. Embodiments of the disclosure are adapted to allow such applications to be presented on a remote device, yet controlled with the electronic device 700, without any reconfiguration of the application itself. Accordingly, embodiments of the disclosure can be used with “off the shelf” applications to provide superior user experiences by allowing those off the shelf applications to be used with remote devices having larger, and often better, displays.
In one embodiment, the control circuit 707 then causes the communication circuit 703 to communicate presentation data of the application 710 for presentation on the remote device. In one embodiment, the control circuit 707 accomplishes this by presenting the presentation data in the presentation region 773 that is complementary to the presentation region 772 of the display 702. When this occurs, the control circuit 707 can present a control interface in the presentation region 772 of the display 702 to allow the user to control the presentation data with the control interface. The communication circuit 703 can communicate the user input received at the control interface to control the presentation data of the application 710 on the remote device. This will be illustrated in
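The complementary presentation regions described above can be sketched as a split of each rendered frame: the part inside the local presentation area stays on the electronic device (where the control interface is shown), while the complementary part is communicated to the remote device. The frame representation below is an assumption for illustration.

```python
# Sketch of splitting an application's display region into complementary
# presentation regions. Representing a frame as a list of rows is an
# assumption made for illustration only.

def split_frame(frame, local_height):
    """Split one rendered frame into the part kept for the local
    presentation area and the complementary part communicated to the
    remote display device."""
    local_part = frame[:local_height]
    remote_part = frame[local_height:]
    return local_part, remote_part

# Example: a six-row frame with a two-row local presentation area.
frame = [f"row{i}" for i in range(6)]
local_part, remote_part = split_frame(frame, local_height=2)
```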
Turning to
At
At
At
At
At
At
In the foregoing specification, specific embodiments of the present disclosure have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. For instance, it has been explained that various control interfaces configured in accordance with embodiments of the disclosure can comprise touch sensitive interfaces that are visually indicative of one or more predetermined touch interactions operable to control an interactive application operating on, or having presentation data displayed on, a remote device. Examples of predetermined inputs have been described to include a touch input, a drag input, an extended touch input, a gesture input, or combinations thereof. When a control interface comprises a scrolling interface, it can be configured to receive a predetermined touch interaction that comprises a drag input in one embodiment. To provide even more examples of control interfaces,
Turning now to
Embodiment 1701 is configured as a QWERTY keypad. A full QWERTY keypad can be implemented. Alternatively, variations or subsets of keys from a QWERTY keypad can be implemented to save space. Alternatively, multiple languages can be supported by dedicated user input attachments as previously described.
Embodiment 1702 is referred to as a “jelly bean” in that a user 803 can squeeze, stretch, rotate, slide, or otherwise manipulate a control interface configured as a virtual spongy ball to control interactive applications operating on, or presentation data being presented on, a remote device in accordance with one or more embodiments of the disclosure. Other variants of geometric shapes may also be created to receive gesture input.
Embodiment 1703 is a game control interface. Each piece can be presented on a touch sensitive display of an electronic device. As shown, one piece includes buttons and the other piece includes a D-pad. The embodiment 1703 can be user configurable to accommodate either a right-handed configuration (as shown) or a left-handed configuration.
Embodiment 1704 is a numerically specific control interface. Embodiment 1705 is an application specific control interface that includes features such as a navigational wheel, page back/forward keys, an enter key, and a D-pad.
Embodiment 1706 is a multifunction control interface keypad illustrating some of the varied user actuation targets that can be included in a control interface presented on a touch sensitive display. Such controls include virtual sliders (suitable for scrolling operations and for receiving drag or slide user input), virtual rockers, and virtual joysticks.
Thus, while preferred embodiments of the disclosure have been illustrated and described, it is clear that the disclosure is not so limited. Numerous modifications, changes, variations, substitutions, and equivalents will occur to those skilled in the art without departing from the scope of the present disclosure as defined by the following claims.
To this point, the device with the smaller display, i.e., the target device, has been described as receiving the user input to control an interactive application being presented on a device with a larger display, i.e., a remote device. It will be obvious to those of ordinary skill in the art having the benefit of this disclosure that embodiments of the disclosure could work in the opposite direction, with the user input being received at the remote device to control an interactive application operating on the target device. For example, in an interactive classroom, each student may be operating an educational application on their respective target devices. A teacher may be operating a remote device that is in communication with each of the devices. In such a use case, the teacher may want to provide user input at the remote device that can control the interactive application operating on a particular student's target device. Turning now to
In
The explanatory remote device 1800 is shown illustratively in
The illustrative remote device 1800 of
In one or more embodiments, the communication circuit 1803 can be configured for data communication with at least one local area network. For illustration, the local area network can be a Wi-Fi network being supported by a router, base station, or access point. In the illustrative embodiment of
It should also be noted that the communication circuit 1803 could be configured to communicate with a plurality 1880 of target devices. Continuing with the teacher-student example from above, the teacher may desire her remote device be in communication with the target devices of each student. Accordingly, in one embodiment the communication circuit 1803 is configured to communicate with a plurality 1880 of target devices. As shown in
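The one-to-many arrangement just described, where one remote device addresses a plurality of target devices either individually or all at once, can be sketched as follows. The class and method names are assumptions made for illustration only and are not part of the disclosure.

```python
# Illustrative sketch of a remote device (e.g., a teacher's device)
# communicating with a plurality of target devices (e.g., students'
# devices). TargetRegistry is a hypothetical name.

class TargetRegistry:
    def __init__(self):
        self._targets = {}  # target id -> callable that delivers a message

    def register(self, target_id, deliver):
        """Add one target device's delivery channel to the plurality."""
        self._targets[target_id] = deliver

    def send_to(self, target_id, message):
        """Address one particular target device, e.g., one student."""
        self._targets[target_id](message)

    def broadcast(self, message):
        """Address every registered target device at once."""
        for deliver in self._targets.values():
            deliver(message)
```

The registry makes the choice between per-student control and classroom-wide control a one-line difference at the call site.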
In this illustrative embodiment, the remote device 1800 includes a control circuit 1807, which in
The control circuit 1807 can be configured to process and execute executable software code to perform the various functions of the remote device 1800. A storage device, such as memory 1808, stores the executable software code used by the control circuit 1807 for device operation. Such computer instructions can instruct processors or the control circuit 1807 to perform methods described below in
Turning now to
In another embodiment, as shown in
In one embodiment, the user input 2001 received at the remote device 1800 can be communicated to the target device 1806 to control the interactive application 1810. Accordingly, a teacher can select, open, control, launch, or close the interactive application 1810 operating on the target device 1806. In one or more embodiments, the user input 2001 can be associated with expected types of user input or interactions. For example, the user input 2001 may be associated with dragging motions. Similarly, the user input 2001 may be associated with touch input. While touch and drag interactions are two examples of expected interactions, it should be obvious to those of ordinary skill in the art having the benefit of this disclosure that other interactions could be expected as well, including extended touch, gestures, patterns, and so forth. In another embodiment, the communication circuit (1803) of the remote device 1800 can communicate user input 2001 received at the remote device 1800 to a runtime component operating on the target device 1806 to control the interactive application 1810.
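One simple way to associate raw contact events with the expected interaction types named above (touch, drag, extended touch) is to compare the contact's movement and duration against thresholds. The sketch below is a hedged illustration; the thresholds, event format, and function name are assumptions, not part of the disclosure.

```python
# Hypothetical classifier mapping one touch contact to an expected
# interaction type. Thresholds are illustrative assumptions only.

DRAG_DISTANCE_PX = 10    # movement beyond this is treated as a drag
EXTENDED_TOUCH_MS = 500  # contact longer than this is an extended touch

def classify_interaction(down, up):
    """down/up are (x, y, timestamp_ms) tuples for one contact."""
    (x0, y0, t0), (x1, y1, t1) = down, up
    moved = abs(x1 - x0) + abs(y1 - y0)  # Manhattan distance suffices here
    if moved > DRAG_DISTANCE_PX:
        return "drag"
    if t1 - t0 > EXTENDED_TOUCH_MS:
        return "extended touch"
    return "touch"
```

Gestures and patterns would extend the same idea, matching the contact's trajectory against templates rather than a single distance or duration threshold.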
In one embodiment, the control circuit of the target device 1806 can communicate a control mapping 2002 of control interface data to the remote device 1800 to be superimposed on a portion of information of the interactive application 1810, thereby indicating which information, namely that falling within the control mapping 2002, will be visible on the target device 1806. This control mapping 2002 allows the user to easily identify what portion of the information of the interactive application 1810 will be seen on the display of the target device 1806. In one embodiment, this control mapping 2002 is user definable in that a user may expand and shrink the control mapping 2002 based upon a desired resolution. Accordingly, in the teacher-student use case, the teacher may resize the control mapping 2002 to determine what portion of the output of the interactive application 1810 is visible on the target device 1806.
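A resizable control mapping of this kind behaves like a viewport rectangle over the application's full output: points inside the rectangle are visible on the target device, points outside are not. The sketch below illustrates that behavior; the `ControlMapping` name and its fields are hypothetical.

```python
# Illustrative sketch of a user-resizable control mapping: a viewport
# over the application output marking what the target device shows.

from dataclasses import dataclass

@dataclass
class ControlMapping:
    x: int       # top-left corner within the application output
    y: int
    width: int
    height: int

    def resize(self, width, height):
        # Expanding or shrinking the mapping changes how much of the
        # application output the target device displays.
        self.width, self.height = width, height

    def contains(self, px, py):
        """True if a point of the application output falls inside the
        mapping, i.e., would be visible on the target device."""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)
```

Resizing the mapping at the remote device is then enough for the teacher to see exactly which region of the application's output the student's display covers.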
Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims.