1. Field of the Invention
The invention relates to a mobile device, more particularly to a mobile device that is capable of interacting with an electronic device having a display function.
2. Description of the Related Art
Smart phones have evolved rapidly as a result of increasing demands and competition among manufacturers, providing a wide variety of features such as internet browsing, video games and video conferencing. However, the size of the screen of the smart phone is limited for the sake of portability, and thus may not meet the needs of users who play video games on the smart phone.
Therefore, a conventional interactive system 900 has been provided, as shown in
Nonetheless, the interactive system 900 does not satisfactorily display a virtual button set 912 associated with video game control on the screen of the smart phone 910 while the display device 920 displays the gaming screen 911. As a result, users must pay attention to both the screen of the smart phone 910 and the display device 920 concurrently when playing the video game, which causes some difficulty. Additionally, the rotate function of the smart phone 910 generally cannot be configured according to different software, such as game or office applications, which may cause inconvenience to users.
Therefore, one object of the present invention is to provide a method for a mobile device to interact with an electronic device having a display function.
According to one aspect, a method of the present invention is to be implemented by the mobile device for interacting with the electronic device. The mobile device is operable to display at least one image that is generated by a processor of the mobile device that executes a program. The image includes a primary image portion and a first secondary image portion that is superimposed on the primary image portion. The method comprises the following steps of:
configuring the mobile device to transmit the primary image portion to the electronic device for display by the electronic device;
configuring the mobile device that executes the program to transform the first secondary image portion into a second secondary image portion that conforms with a specified presentation;
configuring the mobile device to display the second secondary image portion;
configuring the mobile device that executes the program to generate a new primary image portion in response to a control signal generated as a result of user operation; and
configuring the mobile device to transmit the new primary image portion to the electronic device for display by the electronic device.
According to another aspect, a method of the present invention is to be implemented by a mobile device for interacting with an electronic device having a display function. The mobile device is operable to display at least one image generated by a processor of the mobile device that executes a program. The method comprises the following steps of:
configuring the mobile device to transmit the image to the electronic device for display by the electronic device;
configuring the mobile device to generate a new image in response to a control signal from a peripheral device that is operatively coupled to the mobile device; and
configuring the mobile device to transmit the new image to the electronic device for display by the electronic device.
Yet another object of the invention is to provide a mobile device that is operable to implement the aforementioned methods.
Still another object of the invention is to provide an interactive system that comprises an electronic device having a display function and a mobile device that is operable to implement the aforementioned methods, such that the mobile device is capable of interacting with the electronic device in real time.
Other features and advantages of the present invention will become apparent in the following detailed description of the preferred embodiments with reference to the accompanying drawings, of which:
Before the present invention is described in greater detail, it should be noted that like elements are denoted by the same reference numerals throughout the disclosure.
The mobile device 100 includes a display unit 1, a control unit 2 that is coupled to the display unit 1, an image transforming unit 3 that is coupled to the control unit 2, an output unit 4, a signal generator 5 and a storage unit 6. In this embodiment, the control unit 2 may be a processor or CPU or GPU of the mobile device 100, and is operable to control operations of the various components of the mobile device 100.
The display unit 1 may be a touch screen of the mobile device 100, for displaying an image as shown in
The image transforming unit 3 is operable to transform the first secondary image portion 11 into a second secondary image portion 12 that conforms with a specified presentation, that serves as a user interface, and that is displayed by the display unit 1 of the mobile device 100 (as shown in
The output unit 4 is operable to, upon being instructed by the control unit 2, transmit the primary image portion 10 to the electronic device 200 for display by the electronic device 200 (as shown in FIG. 5). The image transmission can be wired or wireless, and the primary image portion 10 can be processed by known image codec transformation techniques for more efficient transmission. The signal generator 5 is a touch sensing electronic circuit disposed at the display unit 1, and is operable to generate a control signal as a result of user operation (i.e., a result of a touch event).
The storage unit 6 is provided for storing a plurality of setup values that are set by the user and associated with the mobile device 100. In addition, the control unit 2 can be a processor that handles all or part of the circuitry operations.
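For illustration only, the components enumerated above can be modeled by the following minimal Java sketch; every class and method name is an assumption introduced for this example and is not part of the disclosure.

// Hypothetical model of the mobile device 100 and its units (names are illustrative only).
public class MobileDevice {
    private final DisplayUnit displayUnit = new DisplayUnit();                     // display unit 1 (touch screen)
    private final ImageTransformingUnit transformer = new ImageTransformingUnit(); // image transforming unit 3
    private final OutputUnit outputUnit = new OutputUnit();                        // output unit 4 (wired or wireless)
    private final SignalGenerator signalGenerator = new SignalGenerator();         // signal generator 5 (touch sensing)
    private final StorageUnit storageUnit = new StorageUnit();                     // storage unit 6 (user setup values)

    // The control unit 2 is modeled here by the coordinating method below.
    public void transmitPrimaryPortion(byte[] primaryImagePortion) {
        outputUnit.send(primaryImagePortion);                                      // e.g., after codec transformation
    }
}

class DisplayUnit { void show(byte[] image) { /* draw the image on the touch screen */ } }
class ImageTransformingUnit { byte[] transform(byte[] portion) { return portion; } }
class OutputUnit { void send(byte[] image) { /* encode and transmit to the electronic device 200 */ } }
class SignalGenerator { /* generates a control signal as a result of a touch event */ }
class StorageUnit { /* stores setup values set by the user */ }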
Referring to
In step S11, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the primary image portion 10 to the electronic device 200 for display by the electronic device 200, upon receiving the request from the user.
In step S12, the control unit 2 of the mobile device 100 is operable to transform the first secondary image portion 11 (see
In step S13, the control unit 2 of the mobile device 100 is operable to configure the display unit 1 to display the second secondary image portion 12. That is, the primary image portion 10 and the second secondary image portion 12 are displayed by the electronic device 200 and the mobile device 100, respectively.
In step S14, the control unit 2 of the mobile device 100 is operable to generate a new primary image portion 10 in response to a control signal generated by the signal generator 5 as a result of user operation. In this embodiment, the user operation involves the user touching the second secondary image portion 12 on the display unit 1 (e.g., touching the second operation button (A) 121), prompting the signal generator 5 to generate a control signal indicating a touch event on the associated operation button. The control unit 2 is then operable to generate a new primary image portion 10 in response (e.g., a swing of a golf club).
Then, in step S15, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the new primary image portion 10 to the electronic device 200 for display by the electronic device 200 in real time. This serves as a response to the user operation.
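The flow of steps S11 to S15 can be summarized, purely for illustration, by the following Java sketch; the interfaces and method names are assumptions chosen to mirror the step numbering, not elements of the disclosure.

// Illustrative outline of steps S11 to S15.
interface Image { Image primaryPortion(); Image firstSecondaryPortion(); }
interface ControlSignal { }
interface Screen { void display(Image image); }
interface Mobile extends Screen {
    Image transform(Image firstSecondary);       // image transforming unit 3 (step S12)
    ControlSignal awaitUserOperation();          // signal generator 5 (touch event)
    Image generateNewPrimary(ControlSignal s);   // control unit 2 executing the program (step S14)
    boolean running();
}

public class InteractionFlow {
    public void run(Mobile mobile, Screen electronicDevice, Image image) {
        electronicDevice.display(image.primaryPortion());                 // step S11
        mobile.display(mobile.transform(image.firstSecondaryPortion()));  // steps S12 and S13
        while (mobile.running()) {
            ControlSignal signal = mobile.awaitUserOperation();           // touch on an operation button
            electronicDevice.display(mobile.generateNewPrimary(signal));  // steps S14 and S15, in real time
        }
    }
}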
In addition, in order to make the user aware of a successful touch event on the operation buttons of the second secondary image portion 12, the mobile device 100 may further include a vibration unit 7 (see
Setup of the interactive system 300 before performing step S11 and the flow of step S14, in which the control unit 2 of the mobile device 100 generates the new primary image portion 10, will now be described in detail in the following, with respect to a new game (i.e., a newly developed application that is compatible with the interactive system 300, and parameters thereof are adjustable) and an existing game (i.e., an application that has been developed commercially and parameters thereof are not adjustable).
The following paragraphs are directed to the case in which a new game is executed, and each of the first secondary image portion 11 and the second secondary image portion 12 includes objects built in the Android operating system (OS).
Referring to
It should be noted that, while this invention is exemplified using the Android operating system (OS) as a development platform, other operating systems may be employed in other embodiments of this invention.
Referring to
In step S102, the control unit 2 is operable to configure the display unit 1 to display the image and the first objects, the latter serving as main trigger objects. For example, the user may only desire to use the mobile device rather than the electronic device for playing the video game.
In step S103, the control unit 2 is operable to configure the display unit 1 to display the second objects that serve as the main trigger objects, and the flow goes to step S11, in which the control unit 2 is operable to transmit the primary image portion 10 and optionally the first secondary image portion 11 that is superimposed on the primary image portion 10 to the electronic device 200 for display by the electronic device 200 (as shown in
In step S141, the control unit 2 is operable to detect the control signal from the signal generator 5 in the physical tier. The flow goes to step S142 when the control signal is detected, and goes back to step S141 when otherwise. The control signal is a result of a touch event in this embodiment, and the signal generator 5 can generate the control signal from events of other components in the physical tier, such as at least one of a motion detector and a peripheral device.
In step S142, the control unit 2 is operable to transmit the control signal to the kernel layer 80. The kernel layer 80 is configured to process the control signal and to obtain a spatial point that is associated with a location of the display unit 1 touched by the user.
In step S143, the spatial point is then transmitted to the framework layer 81.
Then, in step S144, the spatial point serves as a reference in a callback operation, in which the control unit 2 transmits the spatial point from the framework layer 81 to the application layer 82. For example, the framework layer 81 may include a program library that is operable to link the kernel layer 80 with the application layer 82 in the Android operating system, and to associate the spatial point with a specific operation button on the user interface, which is further associated with a specific button parameter that is defined by the application in the application layer 82. Thus, the application layer 82 is notified that the specific operation button (e.g., the second D-pad 120, the second operation button (A) 121 or the second operation button (B) 122) is touched.
The flow then goes to step S145, in which the control unit 2 is operable to change appearances of the second object that is touched and the associated first object (e.g., change in color and/or size), through the framework layer 81. The first and second objects thus altered are then displayed respectively by the display unit 1 and the electronic device 200 for informing the user of the detected touch event on the specific operation button.
In step S146, the control unit 2 is operable to generate a new primary image portion 10 based on the first and second objects through the application layer 82. It is noted that the kernel layer 80 is responsible for steps S141 through S143, the framework layer 81 for steps S144 and S145, and the application layer 82 for step S146.
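A minimal, self-contained Java sketch of how a touch might travel from the physical tier through the kernel, framework and application layers, in the spirit of steps S141 to S146; the layer classes, method names and hit-test coordinates below are assumptions for illustration and are not the Android APIs themselves.

// Hypothetical layered dispatch mirroring steps S141 to S146.
import java.awt.Point;

class KernelLayer {                        // steps S141 to S143
    Point toSpatialPoint(int rawX, int rawY) { return new Point(rawX, rawY); }
}

class FrameworkLayer {                     // steps S144 and S145
    private final ApplicationLayer app;
    FrameworkLayer(ApplicationLayer app) { this.app = app; }

    void callback(Point spatialPoint) {
        String button = hitTest(spatialPoint);   // associate the spatial point with an operation button
        highlight(button);                       // change appearances of the first and second objects
        app.onButtonTouched(button);             // notify the application layer
    }
    private String hitTest(Point p) { return p.x < 200 ? "D_PAD" : "BUTTON_A"; } // illustrative mapping
    private void highlight(String button) { System.out.println("highlight " + button); }
}

class ApplicationLayer {                   // step S146
    void onButtonTouched(String button) {
        System.out.println("generate new primary image portion for " + button); // e.g., a swing of a golf club
    }
}

public class TouchDispatchDemo {
    public static void main(String[] args) {
        FrameworkLayer framework = new FrameworkLayer(new ApplicationLayer());
        framework.callback(new KernelLayer().toSpatialPoint(350, 120)); // a detected touch event
    }
}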
In this embodiment, the electronic device 200 can be configured to only display the primary image portion 10, as shown in
The following paragraphs are directed to the case in which a new game is executed, and each of the first secondary image portion 11 and the second secondary image portion 12 includes objects having new button layouts designed by the user or application developer, as shown in
Referring to
In step S112, the control unit 2 is operable to configure the display unit 1 to display the game image and the first button images serving as the main trigger objects. In step S113, the control unit 2 is operable to configure the display unit 1 to display the second button images that serve as the main trigger objects, and the flow goes to step S11 (see
The sub-steps of step S14 will now be described in detail with reference to
In step S151, the control unit 2 is operable to detect the control signal from the signal generator 5. In this case, the control signal is a touch control signal as a result of a touch event. The flow goes to step S152 when the touch control signal is detected, and goes back to step S151 when otherwise.
In step S152, the control unit 2 is operable to transmit the touch control signal to the kernel layer 80. The kernel layer 80 is configured to process the touch control signal and then to obtain a spatial point that is associated with a location of the display unit 1 touched by the user.
In step S153, the spatial point is then transmitted to the framework layer 81.
In step S154, the control unit 2 is operable to map the spatial point spatially onto one of the first button images on the first secondary image portion 11 through the framework layer 81, so as to establish a link between the spatial point and the corresponding one of the first button images.
Then, in step S155, the spatial point serves as a reference in a callback operation, in which the control unit 2 transmits the spatial point from the framework layer 81 to the application layer 82. Thus, the application layer 82 is notified that the specific operation button (e.g., the second D-pad 120, the second operation button (A) 121 or the second operation button (B) 122) is touched.
The flow then goes to step S156, in which the control unit 2 is operable to change appearance of the touched second button image through the framework layer 81. The second button image thus altered is then displayed by the display unit 1.
In step S157, the control unit 2 is operable to transmit a flag to the application layer 82 indicating that one of the second button images is touched, so as to enable triggering of the associated first button image. Thus, appearance of the associated first button image can be changed concurrently for informing the user of a detected touch event on the specific operation button.
In step S158, the control unit 2 is operable to generate a new primary image portion 10 based on the first and second button images through the application layer 82. It is noted that the kernel layer 80 is responsible for steps S151 through S153, the framework layer 81 for steps S154 and S157, and the application layer 82 for step S158.
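For the new-button-layout case, the distinguishing steps are S154 (mapping the spatial point onto one of the first button images) and S157 (a flag that triggers the linked first button image). The following Java sketch is a hedged illustration of such a mapping; the button identifiers and rectangle coordinates are invented for the example.

// Illustrative mapping of a spatial point to a button image and flagging of the touch (steps S154 to S158).
import java.util.LinkedHashMap;
import java.util.Map;

public class ButtonImageMapper {
    // Each rectangle {x, y, width, height} stands for one first button image on portion 11.
    private final Map<String, int[]> firstButtonImages = new LinkedHashMap<>();

    public ButtonImageMapper() {
        firstButtonImages.put("FIRST_D_PAD_110", new int[] {0, 300, 150, 150});
        firstButtonImages.put("FIRST_BUTTON_A_111", new int[] {500, 320, 80, 80});
        firstButtonImages.put("FIRST_BUTTON_B_112", new int[] {600, 320, 80, 80});
    }

    // Step S154: link the spatial point to the corresponding first button image.
    public String map(int x, int y) {
        for (Map.Entry<String, int[]> e : firstButtonImages.entrySet()) {
            int[] r = e.getValue();
            if (x >= r[0] && x < r[0] + r[2] && y >= r[1] && y < r[1] + r[3]) return e.getKey();
        }
        return null;
    }

    public static void main(String[] args) {
        String touched = new ButtonImageMapper().map(530, 350);
        if (touched != null) {
            System.out.println("flag to application layer: " + touched + " touched"); // step S157
            System.out.println("generate new primary image portion 10");              // step S158
        }
    }
}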
When each of the first secondary image portion 11 and the second secondary image portion 12 includes objects built in the Android operating system (see
In this embodiment, the electronic device 200 can be configured to only display the primary image portion 10, as shown in
The following paragraphs are directed to the case in which an existing game is executed, and the game parameters cannot be changed. In such case, the second secondary image portion 12 includes objects having new button layouts, and the first secondary image portion 11 may include either objects having new button layouts or objects built in the Android operating system. For illustration purposes, in this embodiment, the first secondary image portion 11 includes objects built in the Android operating system. Thus, each of the first D-pad 110, the first operation button (A) 111 and the first operation button (B) 112 is defined as a distinct first object, while the second D-pad 120, the second operation button (A) 121 and the second operation button (B) 122 are three distinct second button images each associated with one of the first objects.
Referring to
In step S122, the control unit 2 is operable to configure the display unit 1 to display the game image and the first objects, the latter serving as main trigger objects. In step S123, the control unit 2 is operable to configure the display unit 1 to display the second button images that serve as the main trigger objects, and the flow goes to step S11 (see
The sub-steps of step S14 will now be described in detail with reference to
In step S161, the control unit 2 is operable to detect the touch control signal from the signal generator 5. The flow goes to step S162 when the touch control signal is detected, and goes back to step S161 when otherwise.
In step S162, the control unit 2 is operable to transmit the touch control signal to the kernel layer 80. The kernel layer 80 is configured to process the touch control signal and then to obtain a spatial point that is associated with a location of the display unit 1 touched by the user.
In step S163, the control unit 2 then transmits the spatial point to the framework layer 81.
Then, in step S164, a control process in the framework layer 81 is executed by the control unit 2 in a callback operation in the kernel layer 80. The control process is configured for establishing a link between each of the second button images and a corresponding one of the first objects, such that a touch event on one of the second button images leads to a simultaneous trigger event of the linked one of the first objects.
In this embodiment, an area of the second secondary image portion 12 is larger than that of the first secondary image portion 11 (the second D-pad 120 and the first D-pad 110 are illustrated in
Referring back to
In step S172, the spatial point serves as a reference to the callback operation, such that the control process can be notified that the specific operation button (e.g., the second D-pad 120, the second operation button (A) 121 or the second operation button (B) 122) is touched.
The flow then goes to step S173, in which the control process is operable to change appearance of the touched second button image through the framework layer 81. The second button image thus altered is then displayed by the display unit 1.
In step S174, the control process is operable to create the touch event associated with the specific second button image so as to establish the link between the specific operation button and the corresponding first object.
The control process is then operable to execute the callback operation toward the application layer 82 and the existing game in step S175 and step S176, respectively.
The flow then goes to step S181, in which the callback operation from the framework layer 81 is to allow execution of the existing game. When executed, the flow goes to step S182. Otherwise, the flow goes back to step S181 to await execution.
The control unit 2 is operable to change appearance of the linked first object in step S182, and is operable, in step S183, to generate a new primary image portion 10 based on the linked first object through the application layer 82.
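Because the second secondary image portion 12 occupies a larger area than the first secondary image portion 11, a proportional scaling of the touched coordinates is assumed in the following sketch of the control process described above; the scaling factors, sizes and names are illustrative assumptions only.

// Hypothetical control process: scale the touched point and trigger the linked first object.
public class ControlProcess {
    private final double scaleX;
    private final double scaleY;

    public ControlProcess(int secondWidth, int secondHeight, int firstWidth, int firstHeight) {
        this.scaleX = (double) firstWidth / secondWidth;   // portion 12 is larger than portion 11
        this.scaleY = (double) firstHeight / secondHeight;
    }

    // Steps S172 to S176: on a touch of a second button image, create the linked touch
    // event on the corresponding first object and call back toward the existing game.
    public int[] linkTouch(int xOnSecond, int yOnSecond) {
        int xOnFirst = (int) Math.round(xOnSecond * scaleX);
        int yOnFirst = (int) Math.round(yOnSecond * scaleY);
        System.out.println("trigger linked first object at (" + xOnFirst + ", " + yOnFirst + ")");
        return new int[] {xOnFirst, yOnFirst};
    }

    public static void main(String[] args) {
        // Assumed sizes: the second D-pad 120 drawn at 300x300, the first D-pad 110 at 150x150.
        new ControlProcess(300, 300, 150, 150).linkTouch(210, 90);
    }
}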
In this embodiment, the electronic device 200 can be configured to only display the primary image portion 10, as shown in
In step S183 (see
It has been shown that the aforementioned procedures in combination with the multilayer architecture are capable of processing the interaction between the mobile device 100 and the electronic device 200 in all cases involving both the new game and the existing game, and involving both objects built in the Android operating system and objects having new button layouts designed by the user or application developer.
Reference is now made to
Further referring to
The display unit 1 is for displaying an image, as shown in
The output unit 4 is operable to, upon being instructed by the control unit 2, transmit the primary image portion 10 to the electronic device 200 for display by the electronic device 200.
The communication interface 8 is for communication with a peripheral device 400 that is operatively coupled to the mobile device 100. In this embodiment, the peripheral device 400 includes a press key unit having a D-pad key 410, a first operation key 411 and a second operation key 412 that correspond respectively to the first D-pad 110, the first operation button (A) 111 and the first operation button (B) 112. The vibration unit 7 is operable to vibrate at a specified frequency when one of the operation keys of the press key unit is pressed. It is worth noting that, while each operation key of the press key unit is assigned a specific vibration frequency in this embodiment, the vibration frequency associated with each operation key can be user-configured.
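As an illustration of the per-key vibration setting mentioned above, the following Java sketch keeps a user-configurable table from key identifiers to vibration frequencies; the key names and default frequencies are invented for the example.

// Hypothetical per-key vibration table for the press key unit of the peripheral device 400.
import java.util.HashMap;
import java.util.Map;

public class VibrationSettings {
    private final Map<String, Integer> frequencyHz = new HashMap<>();

    public VibrationSettings() {
        frequencyHz.put("D_PAD_KEY_410", 50);        // default frequencies (illustrative only)
        frequencyHz.put("OPERATION_KEY_411", 80);
        frequencyHz.put("OPERATION_KEY_412", 120);
    }

    public void configure(String key, int hz) { frequencyHz.put(key, hz); }    // user-configured value

    public int frequencyFor(String key) { return frequencyHz.getOrDefault(key, 60); }

    public static void main(String[] args) {
        VibrationSettings settings = new VibrationSettings();
        settings.configure("OPERATION_KEY_411", 100);                    // e.g., stored in the storage unit 6
        System.out.println(settings.frequencyFor("OPERATION_KEY_411"));  // vibrate at 100 Hz when pressed
    }
}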
Referring to
In step S21, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the primary image portion 10 to the electronic device 200 for display by the electronic device 200.
In step S22, the control unit 2 of the mobile device 100 is operable to generate a new primary image portion 10 in response to a control signal from the peripheral device 400 that is operatively coupled to the mobile device 100. Unlike the first preferred embodiment, the user is operating the peripheral device 400, rather than the mobile device 100, to interact with the electronic device 200. Accordingly, the control signal is a press signal generated by the peripheral device 400, upon pressing the press key unit of the peripheral device 400.
Then, in step S23, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the new primary image portion 10 to the electronic device 200 for display by the electronic device 200 in real time. This serves as a response to user operation.
Setup of the interactive system 300 before performing step S21 and the flow of step S22, in which the control unit 2 of the mobile device 100 generates the new primary image portion 10, will now be described in detail in the following, with respect to both a new game and an existing game.
The following paragraphs are directed to the case in which a new game is executed. In this case, the first secondary image portion 11 includes objects having new button layouts designed by the user or application developer. Hence, the first D-pad 110, the first operation button (A) 111 and the first operation button (B) 112 of the first secondary image portion 11 are three distinct first button images.
Referring to
In step S202, the control unit 2 is operable to configure the display unit 1 to display the game image and the first button images that serve as the main trigger objects.
In step S203, the control unit 2 is operable to use the press signal from the peripheral device 400 as the main trigger object, and the flow goes to step S21, in which the control unit 2 is operable to transmit the primary image portion 10 and the first secondary image portion 11 that is superimposed on the primary image portion 10 to the electronic device 200 for display by the electronic device 200 (as shown in
The sub-steps of step S22 are described with further reference to
In step S241, the control unit 2 is operable to detect the press signal from the peripheral device 400. The flow goes to step S242 when the press signal is detected, and goes back to step S241 when otherwise.
In step S242, the control unit 2 is operable to transmit the press signal to the kernel layer 80.
In step S243, the kernel layer 80 is then configured to transmit the press signal to the framework layer 81.
In step S244, the press signal serves as a reference in a callback operation, in which the control unit 2 transmits the press signal from the framework layer 81 to the application layer 82. Thus, the application layer 82 is notified that the specific operation key (e.g., the D-pad key 410, the first operation key 411 or the second operation key 412) is pressed.
The flow then goes to step S245, in which the control unit 2 is operable to change appearance of the corresponding first button image in the framework layer 81.
In step S246, the control unit 2 is operable to generate a new primary image portion 10 based on the first button image through the application layer 82. It is noted that the kernel layer 80 is responsible for steps S241 through S243, the framework layer 81 for steps S244 and S245, and the application layer 82 for step S246.
In this embodiment, the electronic device 200 can be configured to only display the primary image portion 10. In such case, step S245 becomes redundant and can be omitted.
The following paragraphs are directed to the case in which an existing game is executed. Similar to the case with the new game, the first secondary image portion 11 includes objects having new button layouts designed by the user or application developer. Hence, the first D-pad 110, the first operation button (A) 111 and the first operation button (B) 112 of the first secondary image portion 11 are three distinct first button images.
Referring to
In step S212, the control unit 2 is operable to configure the display unit 1 to display the game image and the first button images that serve as main trigger objects.
In step S213, the control unit 2 is operable to use the press signal as the main trigger object, and the flow goes to step S21, in which the control unit 2 is operable to transmit the primary image portion 10 and the first secondary image portion 11 that is superimposed on the primary image portion 10 to the electronic device 200 for display by the electronic device 200 (as shown in
The sub-steps of step S22 are now described with further reference to
In step S251, the control unit 2 is operable to detect the press signal from the peripheral device 400. The flow goes to step S252 when the press signal is detected, and goes back to step S251 when otherwise.
In step S252, the control unit 2 is operable to transmit the press signal to the kernel layer 80.
In step S253, the control unit 2 is further operable to create a touch event that can be subsequently mapped onto a corresponding one of the first button images.
In step S254, the touch event is then transmitted to the kernel layer 80.
In step S255, the touch event is transmitted to the framework layer 81 from the kernel layer 80.
In step S256, the touch event serves as a reference in a callback operation, in which the control unit 2 transmits the press signal from the framework layer 81 to the application layer 82. Thus, the application layer 82 is notified that the specific operation key (e.g., the D-pad key 410, the first operation key 411 or the second operation key 412) is pressed.
The flow then goes to step S257, in which the control unit 2 is operable to change appearance of the corresponding first button image through the framework layer 81.
In step S258, the control unit 2 is operable to generate a new primary image portion 10 based on the first button image in the application layer 82.
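The distinguishing point of steps S251 to S258 is that a key press is translated into a touch event before it reaches the framework layer. The following Java sketch is a hedged illustration of that translation; the key-to-button association is assumed for the example.

// Illustrative translation of a press signal into a touch event (steps S253 to S256).
import java.util.Map;

public class KeyToTouchTranslator {
    // Assumed association between operation keys and first button images.
    private static final Map<String, String> KEY_TO_BUTTON = Map.of(
            "D_PAD_KEY_410", "FIRST_D_PAD_110",
            "OPERATION_KEY_411", "FIRST_BUTTON_A_111",
            "OPERATION_KEY_412", "FIRST_BUTTON_B_112");

    // Step S253: create a touch event that can be mapped onto the corresponding first button image.
    public String createTouchEvent(String pressedKey) {
        return KEY_TO_BUTTON.get(pressedKey);
    }

    public static void main(String[] args) {
        String buttonImage = new KeyToTouchTranslator().createTouchEvent("OPERATION_KEY_411");
        System.out.println("notify application layer: " + buttonImage + " touched"); // steps S255 and S256
        System.out.println("generate new primary image portion 10");                 // step S258
    }
}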
In this embodiment, the electronic device 200 can be configured to only display the primary image portion 10. In such case, step S257 becomes redundant and can be omitted (see
It has been shown that the aforementioned procedures in combination with the multilayer architecture are capable of processing the interaction between the mobile device 100 and the electronic device 200 in all cases involving both the new game and the existing game. The second preferred embodiment has the same advantages as those of the first preferred embodiment.
Reference is now made to
Further referring to
Further referring to
In step S31, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the primary image portion 10 and the first secondary image portion 11 to the electronic device 200 for display by the electronic device 200.
In step S32, the control unit 2 of the mobile device 100 is operable to transform the first secondary image portion 11 into a second secondary image portion 12 that conforms with a specified presentation. In this embodiment, the configuration of the second secondary image portion 12 (e.g., size, location and shape) can be specified by the user via the user interface of the mobile device 100 (examples of which are shown in
In step S33, the control unit 2 of the mobile device 100 is operable to configure the display unit 1 to display the second secondary image portion 12. That is, the primary image portion 10 and the second secondary image portion 12 are displayed by the electronic device 200 and the mobile device 100, respectively.
In step S34, the control unit 2 of the mobile device 100 is operable to generate a new primary image portion 10 in response to the control signal generated by the signal generator 5 and/or by the peripheral device 400 as a result of user operation (e.g., touch event and/or press event).
Then, in step S35, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the new primary image portion 10 to the electronic device 200 for display by the electronic device 200 in real time. This serves as a response to the user operation.
Setup of the interactive system 300 before performing step S31 and the flow of step S34, in which the control unit 2 of the mobile device 100 generates the new primary image portion 10, will now be described in detail in the following, with respect to both a new game and an existing game.
The following paragraphs are directed to the case in which a new game is executed. In this case, as shown in
Referring to
In step S302, the control unit 2 is operable to configure the display unit 1 to display the image and the first button images that serve as the main trigger objects.
In step S303, the control unit 2 is operable to use the second button images and the D-pad key 410 as the main trigger objects, and the flow goes to step S31, in which the control unit 2 is operable to transmit the primary image portion 10 and the first secondary image portion 11 that is superimposed on the primary image portion 10 to the electronic device 200 for display by the electronic device 200 (as shown in
The sub-steps of step S34 are now described with further reference to
In step S321, the control unit 2 is operable to detect the press signal from the peripheral device 400. The flow goes to step S322 when the press signal is detected, and goes back to step S321 when otherwise.
In step S322, the control unit 2 is operable to transmit the press signal to the kernel layer 80.
In step S323, the kernel layer 80 is then configured to transmit the press signal to the framework layer 81.
In step S324, the press signal serves as a reference in a callback operation, in which the control unit 2 transmits the press signal from the framework layer 81 to the application layer 82. Thus, the application layer 82 is notified that the specific operation key (e.g., the D-pad key 410) is actuated.
In addition to detecting the press event of the peripheral device 400, the control unit 2 is further operable to detect the touch event of the display unit 1 concurrently. The procedure is described below.
In step S331, the control unit 2 is operable to detect the touch control signal from the signal generator 5. The flow goes to step S332 when the touch control signal is detected, and goes back to step S331 when otherwise.
In step S332, the control unit 2 is operable to transmit the touch control signal to the kernel layer 80. The kernel layer 80 is configured to process the touch control signal and to obtain the spatial point that is associated with a location of the display unit 1 touched by the user.
In step S333, the control unit 2 then transmits the spatial point to the framework layer 81.
Then, in step S334, the control unit 2 is operable to map the spatial point spatially onto one of the first button images on the first secondary image portion 11 through the framework layer 81, so as to establish a link between the spatial point and the corresponding first button image.
Then, in step S335, the spatial point serves as a reference in a callback operation, in which the control unit 2 transmits the spatial point from the framework layer 81 to the application layer 82. Thus, the application layer 82 is notified that the specific operation button (e.g., the second operation button (A) 121 or the second operation button (B) 122) is touched.
After step S324 and/or S335, the flow goes to step S336, in which the control unit 2 is operable to change appearances of the first and second button images through the framework layer 81. The second button image thus altered is then displayed by the display unit 1.
In step S337, the control unit 2 is operable to generate a new primary image portion 10 based on the first and second button images through the application layer 82.
In this embodiment, the electronic device 200 can be configured to only display the primary image portion 10. In such case, the control unit 2 changes appearance of the touched second button image in step S336, and generates the new primary image portion 10 based on the touched second button image in step S337.
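Steps S321 to S337 detect the press event and the touch event concurrently. The following Java sketch models the two detection paths with two threads feeding one queue, which is an implementation assumption introduced for illustration rather than a disclosed detail.

// Illustrative concurrent handling of press and touch control signals (third preferred embodiment).
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class ConcurrentInputDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> controlSignals = new LinkedBlockingQueue<>();

        Thread pressDetector = new Thread(() -> controlSignals.add("PRESS:D_PAD_KEY_410"));  // steps S321 to S324
        Thread touchDetector = new Thread(() -> controlSignals.add("TOUCH:BUTTON_A_121"));   // steps S331 to S335
        pressDetector.start();
        touchDetector.start();

        for (int i = 0; i < 2; i++) {
            String signal = controlSignals.take();
            System.out.println("change appearance for " + signal);        // step S336
            System.out.println("generate new primary image portion 10");  // step S337
        }
        pressDetector.join();
        touchDetector.join();
    }
}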
The following paragraphs are directed to the case in which an existing game is executed, and the game parameters cannot be changed. In such case, as shown in
Referring to
In step S312, the control unit 2 is operable to configure the display unit 1 to display the game image and the first button images that serve as the main trigger objects. In step S313, the control unit 2 is operable to use the second button images and the D-pad key 410 as the main trigger objects, and the flow goes to step S31, in which the control unit 2 is operable to transmit the primary image portion 10 and the first secondary image portion 11 that is superimposed on the primary image portion 10 to the electronic device 200 for display by the electronic device 200 (as shown in
The sub-steps of step S34 will now be described with further reference to
In step S341, the control unit 2 is operable to detect the press signal from the peripheral device 400. The flow goes to step S342 when the press signal is detected, and goes back to step S341 when otherwise.
In step S342, the control unit 2 is operable to transmit the press signal to the kernel layer 80. The control unit 2 is further operable to create a touch event that can be subsequently mapped onto a corresponding one of the first button images in step S343. The touch event is then transmitted to the kernel layer 80 in step S344 and transmitted to the framework layer 81 in step S345.
Then, in step S346, a control process is executed by the control unit 2 in a callback operation in the kernel layer 80.
In addition to detecting the press event of the peripheral device 400, the control unit 2 is further operable to detect the touch event of the display unit 1 concurrently. The procedure is described below.
In step S351, the control unit 2 is operable to detect the touch control signal from the signal generator 5. The flow goes to step S352 when the touch control signal is detected, and goes back to step S351 when otherwise.
In step S352, the control unit 2 is operable to transmit the touch control signal to the kernel layer 80. The kernel layer 80 is configured to process the touch control signal and to obtain a spatial point that is associated with a location of the display unit 1 touched by the user.
In step S353, the control unit 2 then transmits the spatial point to the framework layer 81.
Then, in step S354, the control process in the framework layer 81 is executed by the control unit 2 in a callback operation in the kernel layer 80.
In step S361, the callback operation from the kernel layer 80 is to allow execution of the control process. When executed, the flow goes to step S362. Otherwise, the flow goes back to step S361 to await execution.
In step S362, the spatial point serves as a reference to the callback operation, such that the control process can be notified that the specific operation button (e.g., the second operation button (A) 121 or the second operation button (B) 122) is touched.
The flow then goes to step S363, in which the control process is operable to change appearance of the touched second button image through the framework layer 81. The second button image thus altered is then displayed by the display unit 1.
In step S364, the control process is operable to create the touch event associated with the specific second button image so as to establish the link between the specific operation button and the corresponding first button image. The control process is then operable to execute the callback operation toward the application layer 82 and the existing game in step S365 and step S366, respectively. The flow then goes to step S371, in which the callback operation from the framework layer 81 is to allow execution of the existing game. When executed, the flow goes to step S372. Otherwise, the flow goes back to step S371 to await execution.
The control unit 2 is operable to change appearance of the corresponding first button image based on the touch event in step S372, and is operable, in step S373, to generate a new primary image portion 10 based on the linked first object through the application layer 82.
In this embodiment, the electronic device 200 can be configured to only display the primary image portion 10. In such case, step S372 becomes redundant and can be omitted. The control unit 2 then generates the new primary image portion 10 based on the second button image in step S373.
It has been shown that the aforementioned procedures in combination with the multilayer architecture are capable of processing the interaction between the mobile device 100 and the electronic device 200 in all cases involving both the new game and the existing game.
Reference is now made to
Reference is now made to
Compared with the first preferred embodiment, the mobile device 100 in this embodiment is operable to interact with the electronic device 200 in real time via movement of the mobile device 100, instead of the touch event. Such interaction can be further categorized into two modes, namely an air mouse mode and an axis transformation mode. In the air mouse mode, the mobile device 100 is operable as a mouse for controlling a cursor displayed on the electronic device 200 via movement of the mobile device 100. In the axis transformation mode, the mobile device 100 is operable to perform a change of page orientation (e.g., a change from portrait mode to landscape mode, or vice versa) for accommodating different game requirements.
Further referring to
In step S51, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the primary image portion 10 to the electronic device 200 for display by the electronic device 200.
In step S52, the control unit 2 of the mobile device 100 is operable to transform the first secondary image portion 11 into a second secondary image portion 12 that conforms with a specified presentation (see
In step S53, the control unit 2 of the mobile device 100 is operable to configure the display unit 1 to display the second secondary image portion 12.
In step S54, the control unit 2 of the mobile device 100 is operable to generate a new primary image portion 10 in response to a control signal generated by the signal generator 5 as a result of user operation.
Then, in step S55, the control unit 2 of the mobile device 100 is operable to configure the output unit 4 to transmit the new primary image portion 10 to the electronic device 200 for display by the electronic device 200 in real time. This serves as a response to the user operation.
The sub-steps of step S54, in which the control unit 2 of the mobile device 100 generates the new primary image portion 10, will now be described in detail in the following, with respect to the air mouse mode and the axis transformation mode.
The following paragraphs are directed to the case of the air mouse mode, with further reference to
In this case, the primary image portion 10 is an icon menu of a home screen of a smart phone, or a springboard of the iPhone OS, the first secondary image portion 11 is a cursor, and the second secondary image portion 12 includes a first click button 521, a second click button 522, a scroll control button 523 and a touchpad area 524.
In step S541, the control unit 2 is operable to detect the motion signal from the signal generator 5. In this embodiment, the motion signal includes at least one of an angular displacement and an angular acceleration of the mobile device 100 on at least one coordinate axis of a three-axis coordinate system, and the coordinate system includes three aircraft principal axes, namely a yaw axis, a roll axis and a pitch axis. The flow goes to step S542 when the motion signal is detected, and goes back to step S541 when otherwise.
In step S542, the control unit 2 is operable to detect the motion signal on the yaw axis. That is, the control unit 2 detects at least one of an angular displacement and an angular acceleration of the mobile device 100 on the yaw axis. The flow goes to step S543 when the motion signal is detected on the yaw axis, and goes to step S544 when otherwise.
In step S543, the control unit 2 is operable to create a horizontal motion event associated with the at least one of an angular displacement and an angular acceleration of the mobile device 100 on the yaw axis. The flow then goes to step S546.
In step S544, the control unit 2 is operable to detect the motion signal on the pitch axis in a manner similar to step S542. The flow goes to step S545 when the motion signal is detected on the pitch axis, and goes back to step S541 when otherwise.
In step S545, the control unit 2 is operable to create a vertical motion event associated with the at least one of an angular displacement and an angular acceleration of the mobile device 100 on the pitch axis. The flow then goes to step S546. A procedure starting from step S546 is implemented by the control unit 2 for detecting the touch control signal attributed to the first secondary image portion 11. The touch control signal cooperates with the motion signal for providing better interactive effects. For example, when only the motion signal is detected, the first secondary image portion 11 (i.e., the cursor) is moved accordingly on the electronic device 200, while the primary image portion 10 is held still. When the motion signal and a hold control signal associated with the first click button 521 (i.e., as if a left click button of a mouse is clicked and held) are both detected, the primary image portion 10 is instead moved as if being dragged in the direction of the motion signal. The procedure will be described in detail in the following.
In step S546, the control unit 2 is operable to detect the hold control signal associated with the first click button 521 from the signal generator 5. The control unit 2 can be operable to detect the hold control signal, or the touch control signal associated with other buttons in the second secondary image portion 12 in other embodiments. The flow goes to step S547 when the touch control signal is detected, and goes to step S553 when otherwise.
In step S547, the control unit 2 is operable to create a hold event associated with the first click button 521. The hold event is then transmitted, along with either the horizontal motion event or the vertical motion event, to the kernel layer 80 in step S548 and to the framework layer 81 in step S549.
Then, in step S550, the hold event serves as a reference in a callback operation, in which the control unit 2 transmits the hold event from the framework layer 81 to the application layer 82. Thus, the application layer 82 is notified that the first click button 521 is touched.
The flow then goes to step S551, in which the control unit 2 is operable to change appearance of the first click button 521 through the framework layer 81. The first click button 521 thus altered is then displayed by the electronic device 200 for informing the user of a detected touch event on the first click button 521.
In step S552, the control unit 2 is operable to generate a new primary image portion 10 based on either the horizontal motion event or the vertical motion event through the application layer 82. That is, the new primary image portion 10 is shifted accordingly, compared to the original primary image portion 10.
When the hold event is not detected in step S546, the control unit 2 is operable to transmit either the horizontal motion event or the vertical motion event to the kernel layer 80 in step S553 and to the framework layer 81 in step S554.
In step S555, the control unit 2 is operable to generate a new primary image portion 10 based on either the horizontal motion event or the vertical motion event through the application layer 82. That is, the new first secondary image portion 11 (cursor) is moved accordingly, compared to the original first secondary image portion 11.
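A hedged Java sketch of the air mouse behaviour of steps S541 to S555: yaw motion becomes horizontal movement, pitch motion becomes vertical movement, and a hold on the first click button 521 drags the primary image portion instead of moving the cursor. The sensitivity constant and all names are assumptions for the example.

// Illustrative air mouse logic (steps S541 to S555); angular values are in degrees.
public class AirMouse {
    private static final double PIXELS_PER_DEGREE = 10.0;   // assumed sensitivity
    private double cursorX, cursorY;                         // first secondary image portion 11 (cursor)
    private double pageX, pageY;                             // primary image portion 10

    public void onMotion(double yawDegrees, double pitchDegrees, boolean firstClickHeld) {
        double dx = yawDegrees * PIXELS_PER_DEGREE;          // step S543: horizontal motion event
        double dy = pitchDegrees * PIXELS_PER_DEGREE;        // step S545: vertical motion event
        if (firstClickHeld) {                                // steps S547 to S552: drag the primary image portion
            pageX += dx;
            pageY += dy;
        } else {                                             // steps S553 to S555: move only the cursor
            cursorX += dx;
            cursorY += dy;
        }
    }

    public static void main(String[] args) {
        AirMouse mouse = new AirMouse();
        mouse.onMotion(2.5, 0.0, false);   // cursor moves right, page stays still
        mouse.onMotion(2.5, 0.0, true);    // page is dragged as if the left click button were held
        System.out.println(mouse.cursorX + " " + mouse.pageX);
    }
}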
It is noted that, since the display unit 1 of the mobile device 100 is a touch screen, it can be configured such that a part of the display unit 1 serves as a touchpad area 524 which controls the movement of the first secondary image portion 11 (cursor), and that the scroll control button 523 is operable to control the movement of the primary image portion 10. Since the controlling mechanism is similar to that of the first preferred embodiment, further details are omitted herein for the sake of brevity. Alternatively, the second secondary image portion 12 can be configured to include a touch area 525 (as shown in
The following paragraphs are directed to the case (e.g., Angry Birds) of the axis transformation mode, with further reference to
In step S561, referring to
In step S562, the control unit 2 is operable to determine, based on the application being executed, whether or not the coordinate axis transform is required. The flow goes to step S563 when the coordinate axis transform is required, and goes to step S568 when otherwise.
In step S563, the control unit 2 is operable to interchange the pitch axis and the roll axis that correspond to the movement of the mobile device 100. Accordingly, a new axis coordinate system is obtained. The yaw axis is not changed in this procedure because the signal generator 5 detects the motion signal on the yaw axis in both portrait control and landscape control. In other embodiments, the axes can also be changed in other ways (e.g., rotating the coordinate system by 90 degrees about one of the axes), for obtaining other axis coordinate systems.
Then, in step S564, the control unit 2 is operable to create a motion event based on the new axis coordinate system. Alternatively, in step S568, the control unit 2 is operable to create the motion event based on the original axis coordinate system. The motion event is then transmitted to the kernel layer 80 in step S565 and transmitted to the framework layer 81 in step S566.
In step S567, the control unit 2 is operable to generate a new primary image portion 10 based on the motion event.
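The axis transformation of steps S561 to S568 essentially interchanges the pitch and roll readings when the executed application requires landscape control. A minimal Java sketch follows; the ordering of the motion-signal components is an assumption made for the example.

// Illustrative coordinate-axis transform (steps S561 to S568): swap pitch and roll, keep yaw.
public class AxisTransform {
    // motion[0] = yaw, motion[1] = pitch, motion[2] = roll (assumed ordering)
    public static double[] toLandscape(double[] motion, boolean transformRequired) {
        if (!transformRequired) return motion;                        // step S568: keep the original axes
        return new double[] { motion[0], motion[2], motion[1] };      // step S563: interchange pitch and roll
    }

    public static void main(String[] args) {
        double[] portraitReading = { 0.0, 5.0, -2.0 };                // yaw, pitch, roll
        double[] landscapeReading = toLandscape(portraitReading, true);
        System.out.println(landscapeReading[1] + " " + landscapeReading[2]); // -2.0 5.0
    }
}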
It has been shown that the aforementioned procedures in combination with the multilayer architecture are capable of processing the interaction between the mobile device 100 and the electronic device 200 in all cases involving both the air mouse mode and the axis transformation mode. The fifth preferred embodiment has the same advantages as those of the previous preferred embodiments.
To sum up, since the mobile device 100 of this invention is operable to transmit the primary image portion 10 to the electronic device 200 for display by the electronic device 200, and is operable to transform the first secondary image portion 11 into the second secondary image portion 12 that conforms with a specified presentation, interaction between the mobile device 100 and the electronic device 200 is achieved, thereby providing the user with a more user-friendly environment.
While the present invention has been described in connection with what are considered the most practical and preferred embodiments, it is understood that this invention is not limited to the disclosed embodiments but is intended to cover various arrangements included within the spirit and scope of the broadest interpretation so as to encompass all such modifications and equivalent arrangements.
This application claims priority of U.S. Provisional Application No. 61/478,945, filed on Apr. 26, 2011.