The present disclosure relates to an information processing apparatus, an information processing method, and a program.
Along with the spread of smartphones and the expansion in network connectivity, technologies for treating a touch panel-equipped terminal as a controller for controlling a TV set, a game console, or the like have been provided. However, since hardware buttons are not present on a touch panel, it is easy for users to press the wrong button, especially in situations where the user is not looking at the controller, such as when playing a game. For this reason, when a user makes operations using a touch panel, the user will sometimes momentarily glance at the controller to prevent large positional displacements from occurring between the position of the button the user wishes to press and the position actually touched by the user. Note that the “buttons” referred to here are one example of objects.
Technologies for preventing such positional displacements include technologies that provide the user with a virtual haptic sensation through the use of vibration, for example. However, such technologies can only be applied in devices with a function for generating vibration. For normal devices, there is also a technology for enlarging and reducing the recognition regions used to recognize objects selected using a touch panel and a technology for shifting such regions.
As one example, there is a technology that shifts a recognition region for recognizing a selected object according to the position where an object is disposed (see for example Japanese Laid-Open Patent Publication No. 2011-175456). Since it is difficult to select an object disposed at a lower end of a touch panel of a copier, for example, this technology shifts the recognition region upward. If the recognition region is shifted in this way, an object may be selected even if the position touched by the user is displaced from the position of the object the user intended to select.
Although the technology disclosed in the cited publication is effective when the displacement between the position touched by the user and the position of the object the user wishes to select is constant, such displacement is sometimes not constant. For example, when a touch panel is used as a controller, since it is necessary to select objects multiple times in a short time without looking at the touch panel, the displacement may change. This means it is difficult to recognize the object the user intended to select by merely shifting the recognition region in a uniform manner.
Also, even if the recognition region is shifted in a uniform manner, there is the possibility that the position touched by the user will become increasingly displaced from an object position and that the user will touch a position midway between a plurality of objects, for example. Such possibility is especially high when a controller equipped with a touch panel is used since it is common for users to make operations without looking at the panel.
For the reasons given above, there is demand for a technology for improving accuracy when selecting a user's intended object. As examples, it is desirable to improve such accuracy even when there is a large displacement between the position of an object and the position touched by the user, and even when the user touches a position midway between a plurality of objects.
According to an embodiment of the present disclosure, there is provided an information processing apparatus including a position detecting unit detecting a contact position at which an input object has touched a touch panel, and a selecting unit selecting any one of a plurality of objects based on the contact position and a last contact position.
Further, according to an embodiment of the present disclosure, there is provided an information processing method including detecting a contact position at which an input object has touched a touch panel, and selecting any one of a plurality of objects based on the contact position and a last contact position.
Further, according to an embodiment of the present disclosure, there is provided a program for causing a computer to function as an information processing apparatus, the information processing apparatus including a position detecting unit detecting a contact position at which an input object has touched a touch panel, and a selecting unit selecting any one of a plurality of objects based on the contact position and a last contact position.
According to the embodiments of the present disclosure described above, it is possible to improve accuracy when selecting a user's intended object.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
Also, in this specification and the drawings, in some cases a plurality of structural elements that have substantially the same function and structure are distinguished by different letters that have been appended to the same reference numerals. However, when it is not especially necessary to distinguish between such plurality of structural elements with effectively the same function and structure, only the same reference numerals are used.
Preferred embodiments of the present disclosure will now be described in the order indicated below.
According to the present embodiment of the disclosure, it is possible to improve accuracy when selecting a user's intended object out of a plurality of objects displayed on a touch panel. For example, it is possible to improve the accuracy when selecting a user's intended object even when a user makes an input on a touch panel without looking at the touch panel and there is a large displacement between the object the user wishes to select and the position touched by the user or the user selects a position midway between a plurality of objects. In particular, according to the present embodiment of the disclosure, by focusing on the relationship between a last contact position on a touch panel and the present button being pressed, a technology for improving accuracy when selecting a user's intended object is realized.
Note that although an example where a device selects which button has been pressed by a user in order to select an object is described below, the selection of an object is not limited to the selection of a button pressed by the user and may be selection of an object aside from a button, such as an icon. The relationship between a last contact position on a touch panel and the present button being pressed will be described first. When a touch panel-equipped terminal (for example, a terminal of approximately the same size as a smartphone) is used as a controller, although there are exceptions, such terminal will usually be held in both hands near the base of the index fingers and operated using the thumbs.
Also, although four buttons, button A, button B, button C, and button D, to be pressed by the user are displayed on the display unit 140 in the example in
Although the execution result of the application is displayed on the display unit 140 in the example shown in
Here, as one example, as shown in
That is, when moving from the state shown in
This completes the description of the relationship between a last contact position and the present button being pressed. Next, the functions of the information processing apparatus 10 according to an embodiment of the present disclosure will be described.
Characteristics showing how the contact area and/or pressure change according to conditions such as whether the user is making a gentle operation, a sudden reactive operation, or a rapid pounding operation can also be given as such characteristic information. Note that such characteristic information can be obtained in advance by having the user play a simple mini game or the like or can be obtained from the normal operations made by the user.
The storage unit 120 uses a storage medium such as a semiconductor memory or a hard disk drive and stores programs and data for processing by the control unit 110. As one example, it is possible for the storage unit 120 to also store an application to be executed by the application executing unit 114. It is also possible for the storage unit 120 to store a history of contact positions or the like. Although the storage unit 120 is incorporated in the information processing apparatus 10 in the example shown in
The touch panel 130 detects a contact position of an input object. Such detection result is outputted to the control unit 110. As described above, the touch panel 130 may be included in the information processing apparatus 10 or may be present outside the information processing apparatus 10.
The position detecting unit 111 detects the contact position of the input object on the touch panel 130. More specifically, the contact position of the input object outputted from the touch panel 130 is detected by the position detecting unit 111. The contact position detected by the position detecting unit 111 is outputted to the selection unit 116 and is used to select the button pressed by the user.
The viewing determining unit 117 determines whether a part or all of a predetermined region including a plurality of buttons (for example, button A, button B, button C, and button D) is being viewed. As one example, the viewing determining unit 117 may determine whether a part or all of the predetermined region is being viewed by analyzing a picked-up image picked up by the image pickup unit 150 and detecting that the user's line of sight is orientated toward a part or all of the predetermined region.
Meanwhile, the viewing determining unit 117 may determine for example that a part or all of the predetermined region is not being viewed if it is detected that the user's line of sight is not orientated toward a part or all of the predetermined region. In place of the user's line of sight, the orientation of the user's face may be used. So long as a plurality of buttons are included, there are no particular limitations on the predetermined region, which may be the entire region of the display unit 140 or may be a region that is part of the region of the display unit 140 and includes a plurality of buttons.
The region control unit 118 controls recognition regions respectively associated with each of the plurality of buttons. As one example, the region control unit 118 may control the recognition regions respectively associated with each of the plurality of buttons based on the result of determination performed by the viewing determining unit 117. As one example, there are also cases where the user can make operations while looking at a part or all of the predetermined region including the plurality of buttons. Examples of such a case include situations where instantaneous operations are not necessary and where buttons are pressed in accordance with audio without looking at the buttons.
In such cases, there is a high probability that the displacement in the contact position will exhibit different tendencies between when a part or all of the predetermined region is being viewed and when such region is not being viewed. This means that by controlling the recognition regions associated with each of the plurality of buttons according to whether a part or all of the predetermined region is being viewed, it is possible to improve the accuracy when selecting the user's intended button. In more detail, if a part or all of the predetermined region is not being viewed, the recognition regions associated with each of the plurality of buttons are moved, but if a part or all of the predetermined region is being viewed, the recognition regions associated with each of the plurality of buttons do not have to be moved.
Also, if a predetermined time (for example, five seconds) has elapsed from the last contact, it is expected that the last contact will have less influence on the present pressing of a button. For this reason, the region control unit 118 may determine whether the predetermined time has elapsed from the last contact and control the recognition regions associated with each of the plurality of buttons if such predetermined time has not elapsed from the last contact.
Meanwhile, if such predetermined time has elapsed from the last contact, the region control unit 118 does not have to control the recognition regions associated with each of the plurality of buttons. By carrying out such control, it is possible to improve the accuracy of determining the button that the user is trying to press. Note that there are no particular limitations on the predetermined time mentioned here.
The selection unit 116 selects any one of the plurality of buttons based on the contact position detected by the position detecting unit 111 (hereinafter sometimes referred to as the “present contact position”) and the contact position detected the last time by the position detecting unit 111 (hereinafter sometimes referred to as the “last contact position”). The selected button is outputted to the application executing unit 114 as a button pressed by the user. More specifically, the respective positions of the plurality of buttons on the touch panel 130 may be set by an operating system or may be set by an application.
The selection unit 116 selects one of the plurality of buttons based on the set positions of the respective buttons, the present contact position, and the last contact position. As one example, the selection unit 116 selects a button associated with a recognition region to which the contact position detected by the position detecting unit 111 belongs. In addition, the selection unit 116 may carry out calibration based on a history of contact positions and select any one of the plurality of buttons based on the result of the calibration.
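The selection described above can be pictured as a simple point-in-region lookup. The sketch below is illustrative only and not taken from the disclosure: the `Rect` helper, the button names, and the coordinate values are all assumptions.

```python
# Hypothetical sketch: recognition regions modeled as axis-aligned
# rectangles keyed by button name; the selection unit returns the
# button whose region contains the present contact position.
from dataclasses import dataclass

@dataclass
class Rect:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        # Half-open intervals so neighboring regions do not overlap.
        return self.x_min <= x < self.x_max and self.y_min <= y < self.y_max

def select_button(regions, x, y):
    """Return the button whose recognition region contains (x, y), or None."""
    for name, rect in regions.items():
        if rect.contains(x, y):
            return name
    return None

# Example layout (values are invented for illustration).
regions = {
    "A": Rect(-25, -25, 10, 10),
    "B": Rect(10, -10, 60, 10),
}
```

In a real implementation the regions would be supplied and continuously adjusted by the region control unit 118, as described in the following sections.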
The application executing unit 114 executes an application based on the button selected by the selection unit 116. As one example, in the case shown in
The display control unit 115 controls the display unit 140 so that various buttons are displayed on the display unit 140. In the example shown in
In accordance with control by the display control unit 115, the display unit 140 displays various buttons. Also in accordance with control by the display control unit 115, the display unit 140 may display an application execution screen. However, the application execution screen may be displayed by a display unit different from the display unit 140. As described above, the display unit 140 may be provided in the information processing apparatus 10 or may be present outside the information processing apparatus 10. Note that the display unit 140 is constructed for example of an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display apparatus.
The image pickup unit 150 picks up an eye region or a face region of the user and outputs a picked-up image obtained by the image pickup to the viewing determining unit 117. Note that although an example where the image pickup unit 150 is provided in the information processing apparatus 10 is shown in
One example of control of the recognition regions by the region control unit 118 will now be described in detail. Here, a case where x and y coordinates are set on the touch panel 130 as shown in
Although the recognition regions may be set in any way with the position of the button selected the last time as a standard, as one example it is possible to set recognition regions that are biased. As one example, if the user consecutively selects the same button, since the operation will continue without changing the contact position, it is believed that there will be little difference between the last contact position and the present contact position. This means that if the present contact position is a midpoint between the last selected button and a neighboring button, there is an extremely high probability that the user will have deliberately shifted the contact position, which means that there is a high probability that the button that the user presently wishes to press differs from the button selected the last time. For this reason, the region control unit 118 may set the recognition regions associated with each of the plurality of buttons so that a midpoint between the button selected the last time and a neighboring button is included in the recognition region associated with such neighboring button.
In the example in
In the same way, in the example shown in
Also, for the example shown in
Note that the region control unit 118 may set the recognition regions associated with each of the plurality of buttons based on the button selected the last time. In the example shown in
Although the region control unit 118 controls the recognition regions set as described above, it is also possible to control the recognition regions associated with each of the plurality of buttons based on the last contact position, for example. For example, the region control unit 118 may move the recognition regions associated with each of the plurality of buttons in parallel by a displacement of the last contact position with the position of the button selected the last time as a standard.
In the example shown in
Next, another example of control by the region control unit 118 will be described in detail. When doing so, an example of control of the recognition regions set as described above is described. However, there are no particular limitations on the setting of the recognition regions.
For example, the distance by which the regions are moved in parallel may be a value produced by multiplying the displacement of the last contact position with the position of the last selected button as a standard by a predetermined coefficient. For example, if the predetermined coefficient is set at “0.5”, the recognition regions associated with each of the plurality of buttons are moved in parallel by (3, 4). By using such predetermined coefficient, it is possible to prevent the recognition regions associated with each of the plurality of buttons from becoming largely displaced from their original positions while still giving consideration to the last contact position.
Since the predetermined coefficient may be influenced by the characteristics of how the buttons are laid out and characteristics of how the user presses a button, there is the possibility that an optimal value may not be universally decided. However, if a suitable value is set as the predetermined coefficient, this should improve the accuracy when determining the selection of a button. Control of the recognition regions based on the last contact position may change the forms of the recognition regions, may change the sizes of the recognition regions, and may change the orientations of the recognition regions.
However, there is the possibility that the positions of the recognition regions will be moved in parallel due to a position displaced from the position of the button to be selected being consecutively touched, resulting in the positions of the recognition regions becoming far from their original positions. In such case, there is the possibility that a determination will be made contrary to the user's intention. For this reason, it is preferable to provide an upper limit (in the example shown in
More specifically, when controlling the recognition regions associated with each of the plurality of buttons, the region control unit 118 may limit the range of movement of the recognition regions associated with each of the plurality of buttons to within the movable range. Since the movable range is influenced by the characteristics of how the buttons are laid out and characteristics of how the user presses a button, there is the possibility that an optimal range may not be universally decided. However, if a suitable value is set as the movable range, this should improve the accuracy when determining the selection of a button.
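Limiting the movement to a movable range amounts to clamping each component of the shift. The sketch below is illustrative; the symmetric per-axis limit is an assumption, since the disclosure does not specify the shape of the movable range.

```python
# Hypothetical sketch: cap each component of a region shift so the
# recognition regions never move further than `movable_range` from
# their original positions (symmetric limit assumed per axis).
def clamp_shift(shift, movable_range):
    """Limit each component of (dx, dy) to the range [-movable_range, movable_range]."""
    return tuple(max(-movable_range, min(movable_range, s)) for s in shift)
```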
This completes the description of control of the recognition regions. By carrying out such control of the recognition regions, it is possible to improve the accuracy when selecting the user's intended button. In addition, as described above, it is also possible to improve the accuracy of determinations by using characteristic information for an individual user and/or characteristic information for various usage states acquired by carrying out calibration. That is, the region control unit 118 is capable of correcting the recognition regions associated with each of the plurality of buttons based on a history of contact positions.
The history of contact positions corresponds to a history of contact positions detected in the past by the position detecting unit 111. As one example, the region control unit 118 may correct the recognition regions associated with each of the plurality of buttons based on an average value of the displacement between positions contacted in the past and the objects selected by such contact. Such correction may be carried out for each button that has been selected by last contact.
In more detail, as one example, if there is one or a plurality of contact positions that has/have been detected in the past, the region control unit 118 may calculate a displacement between an average value of such one or plurality of contact positions and the positions of the buttons selected by such contact as a correction amount and carry out correction by shifting the contact position (X, Y) by such correction amount.
As one example, the correction amount may be set at an amount produced by multiplying the displacement by a predetermined ratio. If such a ratio is used, it is possible to prevent the recognition regions associated with each of the plurality of buttons from becoming largely displaced from their original positions while still giving consideration to the past contact positions.
Since the predetermined ratio is influenced by the characteristics of how the buttons are laid out and characteristics of how the user presses a button, there is the possibility that an optimal value may not be universally decided. However, if a suitable value is set as the predetermined ratio, this should improve the accuracy when determining the selection of a button. Correction of the recognition regions may change the forms of the recognition regions, may change the sizes of the recognition regions, and may change the orientations of the recognition regions.
Note that such characteristic information can be obtained in advance by having the user play a simple mini game or the like or can be obtained from the normal operations made by the user. Also, instead of average values, maximum values and minimum values may be used. In the same way as when the respective recognition regions are controlled, by providing an upper limit (for the example shown in
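The history-based calibration described above can be sketched as follows. All names here are assumptions, the direction of the shift (subtracting the habitual displacement from the present contact position) is an interpretation, and the symmetric upper limit stands in for the movable range mentioned in the text.

```python
# Hypothetical sketch of history-based calibration: the correction
# amount is the average displacement between past contact positions
# and the positions of the buttons selected by those contacts, scaled
# by a predetermined ratio and capped by an upper limit.
def correction_amount(history, ratio=0.5, limit=10.0):
    """history: list of ((touch_x, touch_y), (button_x, button_y)) pairs."""
    if not history:
        return (0.0, 0.0)
    n = len(history)
    avg_dx = sum(t[0] - b[0] for t, b in history) / n
    avg_dy = sum(t[1] - b[1] for t, b in history) / n
    clamp = lambda v: max(-limit, min(limit, v))
    return (clamp(avg_dx * ratio), clamp(avg_dy * ratio))

def corrected_position(contact, history, ratio=0.5, limit=10.0):
    """Shift the present contact position (X, Y) by the correction amount
    (direction assumed: undo the user's habitual displacement)."""
    cx, cy = correction_amount(history, ratio, limit)
    return (contact[0] - cx, contact[1] - cy)
```

As the text notes, maximum or minimum values of the displacement could be used in place of the average.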
This completes the description of the functions of the information processing apparatus 10 according to the present embodiment of the disclosure. Next, the operation of the information processing apparatus 10 according to the present embodiment will be described. First, the flow of region control by the information processing apparatus 10 according to the present embodiment will be described.
First, the viewing determining unit 117 determines whether the user is looking at the buttons (S11). The “buttons” referred to here correspond to one example of a part or all of the predetermined region described above. If the viewing determining unit 117 determines that the user is looking at the buttons (“Yes” in S11), the recognition regions are set by the region control unit 118 (S17) and the operation proceeds to S15. The recognition regions set here may be recognition regions that are not biased. Meanwhile, if the viewing determining unit 117 determines that the user is not looking at the buttons (“No” in S11), the region control unit 118 sets the recognition regions based on the last selected button (S12). The region control unit 118 then controls the recognition regions based on the last contact position (S13).
After this, the region control unit 118 corrects the recognition regions based on the history of the contact position (S14). The position detecting unit 111 detects the input information (contact position) (S15) and the position detecting unit 111 informs the selection unit 116 of the input information (S16). After the selection unit 116 has been informed of the input information, the operation advances to the selection operation by the information processing apparatus 10.
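The S11 to S17 flow above can be sketched as a single control function. This is a structural sketch only: the parameters and the two callables standing in for S13 and S14 are invented placeholders.

```python
# Hypothetical sketch of the region-control flow (S11-S17):
#   viewing        -- result of the viewing determination (S11)
#   base_regions   -- unbiased recognition regions (S17)
#   biased_regions -- regions set based on the last selected button (S12)
#   shift, correct -- callables standing in for S13 and S14
def control_regions(viewing, base_regions, biased_regions, shift, correct):
    if viewing:                  # S11 "Yes" -> S17: unbiased regions
        return base_regions
    regions = biased_regions     # S12: set from the last selected button
    regions = shift(regions)     # S13: control by the last contact position
    regions = correct(regions)   # S14: correct by the contact-position history
    return regions
```

Detection of the contact position (S15) and notification of the selection unit (S16) would follow, leading into the selection operation.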
This completes the description of the flow of region control by the information processing apparatus 10 according to the present embodiment of the disclosure. A selection operation by the information processing apparatus 10 according to the present embodiment of the disclosure will be described next.
First, if the X value out of the contact position (X, Y) indicated by the position detecting unit 111 is smaller than “−25” (“Yes” in S21), the selection unit 116 ends the operation without informing the application executing unit 114 of the pressing of a button. Meanwhile, if the X value is at least “−25” (“No” in S21), the selection unit 116 proceeds to S22.
After this, if the conditions "|X|<10" and "|Y|<10" are satisfied ("Yes" in S22), the selection unit 116 sets the button C in the variable selected_button and then proceeds to S30. Meanwhile, if such conditions are not satisfied ("No" in S22), the selection unit 116 proceeds to S23.
Next, if the conditions “10≦X<60” and “|Y|<10” are satisfied (“Yes” in S23), the selection unit 116 sets the button B in the variable selected_button and then proceeds to S30. Meanwhile, if such conditions are not satisfied (“No” in S23), the selection unit 116 proceeds to S24.
After this, if the conditions "60≦X" and "|Y|<X−50" are satisfied ("Yes" in S24), the selection unit 116 sets the button B in the variable selected_button and then proceeds to S30. Meanwhile, if such conditions are not satisfied ("No" in S24), the selection unit 116 proceeds to S25.
Next, if the condition “10≦Y” is satisfied (“Yes” in S25), the selection unit 116 sets the button D in the variable selected_button and then proceeds to S30. Meanwhile, if the condition is not satisfied (“No” in S25), the selection unit 116 proceeds to S26.
After this, the selection unit 116 sets the button A in the variable selected_button and proceeds to step S30. The selection unit 116 informs the application executing unit 114 that the button set in the variable selected_button has been pressed (S30). This completes the selection operation, and after the selection operation has been completed, the application executing unit 114 executes an application based on the received information.
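The S21 to S30 sequence can be sketched as a chain of tests. Note that the conditions as printed appear mutually inconsistent, so this sketch reads the S22 condition as |X| < 10 (a center region) and the S24 condition as 60 ≦ X (a rightward-widening wedge); both readings are assumptions.

```python
# Hedged sketch of the S21-S30 selection sequence. Assumed readings:
# S22 tests |X| < 10, S24 tests 60 <= X; see the lead-in text.
def select(x, y):
    """Return the selected button name, or None when no button is pressed."""
    if x < -25:                       # S21: no button is informed
        return None
    if abs(x) < 10 and abs(y) < 10:   # S22: center region -> C
        return "C"
    if 10 <= x < 60 and abs(y) < 10:  # S23: right of center -> B
        return "B"
    if 60 <= x and abs(y) < x - 50:   # S24: widening wedge -> B
        return "B"
    if 10 <= y:                       # S25: upper region -> D
        return "D"
    return "A"                        # S26: everything else -> A
```

Because the tests run in sequence, later regions need not be disjoint from earlier ones; the first matching condition wins.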
This completes the description of the flow of a selection operation by the information processing apparatus 10 according to the present embodiment of the disclosure. Next a modification of the information processing apparatus 10 according to the present embodiment of the disclosure will be described.
Here, although in many cases operations using a touch panel are made using a single finger, it is also possible to press a plurality of buttons using a plurality of fingers. As one example, it would be conceivable for a user to simultaneously press different buttons with his/her index finger and middle finger. In such a situation also, although there are cases where the contact positions are displaced, it is possible to correct the recognition of the pressed buttons according to the relative positional relationship between the two fingers.
Accordingly, although the selection unit 116 is described above as selecting any one of a plurality of buttons based on the contact position detected by the position detecting unit 111 and the last contact position detected by the position detecting unit 111, it is also possible for the selection unit 116 to use another contact position detected by the position detecting unit 111 in place of the last contact position. That is, the selection unit 116 may select any one of the plurality of buttons based on two contact positions detected by the position detecting unit 111.
As described above, according to an embodiment of the present disclosure there is provided the information processing apparatus 10 including the position detecting unit 111 that detects the contact position of an input object on the touch panel 130 and the selection unit 116 that selects any one of a plurality of objects based on the contact position and the last contact position.
With the above configuration, since the last contact position is also considered when selecting any one of a plurality of objects, it is possible to improve the accuracy when selecting the user's intended object. If the user operates the touch panel without looking at the objects themselves, for example, there is a high probability that the user's intended object will be selected even if a position displaced from a recognition region is touched. A further effect is expected in that it also becomes unnecessary for the user to momentarily look at the controller when making operations on a touch panel.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
The steps in the operation of the information processing apparatus 10 according to the embodiment described above also do not have to be executed in a time series in the order indicated in the flowchart. For example, the steps in the operation of the information processing apparatus 10 may be executed in an order different from the order indicated in the flowchart or may be executed in parallel.
It is also possible to generate a computer program for causing hardware, such as a CPU, ROM, and RAM incorporated in the information processing apparatus 10, to realize the same functions as the configuration of the information processing apparatus 10 described above. A storage medium that stores such computer program may also be provided.
Additionally, the present technology may also be configured as below.
(1) An information processing apparatus including:
(2) The information processing apparatus according to (1), further including:
(3) The information processing apparatus according to (2),
(4) The information processing apparatus according to (2) or (3), further including:
(5) The information processing apparatus according to (2) or (3),
(6) The information processing apparatus according to (1),
(7) The information processing apparatus according to any one of (2) to (6),
(8) The information processing apparatus according to (7),
(9) The information processing apparatus according to any one of (2) to (8),
(10) The information processing apparatus according to (9),
(11) An information processing method including:
(12) A program for causing a computer to function as an information processing apparatus, the information processing apparatus including
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2012-021892 filed in the Japan Patent Office on Feb. 3, 2012, the entire content of which is hereby incorporated by reference.
Number | Date | Country | Kind |
---|---|---|---|
2012-021892 | Feb 2012 | JP | national |