The present disclosure relates to an information processing device, a program, and a method.
An information processing device including a touchscreen display, such as a smartphone, is routinely operated with one hand. Meanwhile, touchscreen displays have increased in resolution and size, which makes it difficult to operate the touchscreen display with one hand in some cases (e.g., a case where a touch operation is performed on the upper right or lower right of the screen while the device is held in the left hand).
Patent Literature 1 discloses that, when a user performs a touch operation as if to pull a screen with his/her finger (e.g., a swipe operation from the upper left to the lower right), the entire screen or a part of the screen is moved as if pulled, which allows the user to easily operate a position far from the finger. Patent Literature 2 discloses that, when a user tilts a device in order to enlarge or reduce a screen with one hand, a part of the screen is enlarged or reduced.
In the related arts, however, a touch operation is still required to allow the user to easily operate the position far from the finger, and, even if the device is tilted to enlarge or reduce the screen, the position where the user attempts to perform operation is not always displayed appropriately. This impairs usability.
In view of this, the present disclosure proposes an information processing device, a program, and a method capable of determining a position where the user attempts to perform operation on a screen of the information processing device, without impairing usability.
Hereinafter, an embodiment of the present disclosure will be described in detail with reference to the drawings. In the present specification and the drawings, substantially the same parts are denoted by the same reference signs, and repeated description thereof will be omitted.
Description will be provided in the following order.
1. Embodiment
2. Modification examples of embodiment
3. Hardware configuration example
4. Summary
First, an information processing device 10 according to the present embodiment will be described. The information processing device 10 is a device that allows a touchscreen display to be operated with one hand and may be, for example, a mobile terminal such as a smartphone or a tablet personal computer (PC) or a digital camera. The information processing device 10 includes a display unit 110 such as a touchscreen display. The information processing device 10 may further include an imaging unit 170 such as a front-facing camera.
(Display Unit 110)
The display unit 110 according to the present embodiment displays various kinds of visual information under the control of the control unit 200. The display unit 110 may display, for example, an image, a character, and the like related to an application. For this purpose, the display unit 110 according to the present embodiment includes various display devices such as a liquid crystal display (LCD) device and an organic light emitting diode (OLED) display device. When the user operates the screen with one hand, the display unit 110 adjusts the displayed screen so that the user can easily operate it.
(Operation Unit 120)
The operation unit 120 according to the present embodiment detects various user operations, such as a device operation for an application. The device operations described above include, for example, a touch operation. Herein, the touch operation indicates various touch operations on the display unit 110 such as tapping, double tapping, swiping, and pinching. The touch operation also includes an operation of bringing an object such as a finger close to the display unit 110. The operation unit 120 according to the present embodiment includes, for example, a touchscreen, a button, a keyboard, a mouse, and a proximity sensor. The operation unit 120 according to the present embodiment inputs information regarding the detected user operations to the control unit 200.
(Storage Unit 130)
The storage unit 130 according to the present embodiment is a storage area for temporarily or permanently storing various programs and data. For example, the storage unit 130 may store programs and data for the information processing device 10 executing various functions. As a specific example, the storage unit 130 may store programs for executing various applications and management data for managing various settings. As a matter of course, the above is merely an example, and the type of data to be stored in the storage unit 130 is not particularly limited.
(Acceleration Sensor Unit 140)
The acceleration sensor unit 140 according to the present embodiment measures acceleration (an amount of change in velocity per unit time) of the information processing device 10.
(Gyroscope Sensor Unit 150)
The gyroscope sensor unit 150 according to the present embodiment measures angular velocity (an amount of change in angle per unit time) of the information processing device 10.
(Determination Unit 160)
In response to detection of a tilt (corresponding to a first tilt) of the information processing device 10 by the acceleration sensor unit 140, the determination unit 160 according to the present embodiment determines a position where the user attempts to perform operation on the display unit 110 on the basis of a gyro waveform obtained from the angular velocity measured by the gyroscope sensor unit 150 and a gyro waveform at each operation position of the information processing device 10 measured in advance. Alternatively, the determination unit 160 estimates (determines) the position where the user attempts to perform operation on the basis of the gyro waveform obtained from the angular velocity measured by the gyroscope sensor unit 150 by using a learning model generated by using the gyro waveform at each operation position as training data.
The graphs 302, 303, and 305 show gyro waveforms obtained in a case where the user operates operation positions “2”, “3”, and “5” of the display unit 110 with one hand, respectively. When the user operates each operation position of the display unit 110 with one hand, the way of tilting the information processing device 10 and the velocity thereof are different at each operation position. Thus, characteristics appear in each gyro waveform (angular velocity). Therefore, the information processing device 10 measures the gyro waveform (angular velocity) at each operation position in advance and then extracts and stores the characteristics of each gyro waveform (e.g., intensity of the angular velocity, an amount of temporal change (the way of rise of the waveform), and a ratio of ωx, ωy, and ωz). Then, the determination unit 160 analyzes the characteristics of a gyro waveform obtained when the user actually attempts to perform operation and can therefore determine at which position the user attempts to perform operation.
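The characteristic extraction and matching described above can be illustrated with a minimal Python sketch. The waveform is reduced to the characteristics named in the text (intensity of the angular velocity, the way of rise of the waveform, and the ratio of the ωx, ωy, and ωz components) and compared against pre-measured features per operation position. All function names and the 90% rise threshold are illustrative assumptions, not part of the disclosure.

```python
import math

def extract_features(waveform):
    """Reduce a gyro waveform (a list of (wx, wy, wz) angular-velocity
    samples) to the characteristics named above: intensity of the
    angular velocity, the way of rise of the waveform, and the ratio
    of the x, y, and z components."""
    mags = [math.sqrt(wx * wx + wy * wy + wz * wz) for wx, wy, wz in waveform]
    peak = max(mags)
    # "Way of rise": first sample index at which 90% of the peak is reached
    # (the 90% threshold is an illustrative choice).
    rise = next(i for i, m in enumerate(mags) if m >= 0.9 * peak)
    # Per-axis energy, normalized to a ratio.
    ex = sum(wx * wx for wx, _, _ in waveform)
    ey = sum(wy * wy for _, wy, _ in waveform)
    ez = sum(wz * wz for _, _, wz in waveform)
    total = (ex + ey + ez) or 1.0
    return (peak, rise, ex / total, ey / total, ez / total)

def nearest_position(waveform, templates):
    """Return the operation position whose pre-measured feature vector
    is closest (Euclidean distance) to the observed waveform."""
    features = extract_features(waveform)
    return min(templates, key=lambda pos: math.dist(features, templates[pos]))
```

For example, given pre-measured feature templates for two operation positions, a newly observed waveform is assigned to whichever template is nearer in feature space.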
The determination unit 160 can also use machine learning at this time. For example, it is possible to prepare about 100 gyro waveforms at each operation position in advance and perform machine learning by using those gyro waveforms as training data. In this way, a learned model for estimating a user operation position is generated. When a gyro waveform is input, the learned model outputs a corresponding operation position. The learned model is stored in the information processing device 10, and thus it is possible to estimate to which operation position a gyro waveform obtained when the user actually performs operation corresponds. Note that, for example, deep learning such as a convolutional neural network (CNN) is used as the machine learning.
The gyro waveforms of the graphs 310 and 311 indicate a state in which the information processing device 10 is strongly shaken or is slowly and greatly moved. Those are, for example, gyro waveforms (angular velocities) obtained in a case where the user moves while leaving the information processing device 10 in his/her bag or holding the information processing device 10 in his/her hand (i.e., a state in which the information processing device 10 is not operated). From such gyro waveforms, the determination unit 160 can determine that the user does not perform operation at any position of the display unit 110.
The gyro waveforms at each operation position are different for each user even though the operation position is the same (for example, one user tilts the information processing device 10 so as to pull it closer while another tilts it back; that is, the device is operated in various ways depending on the user even at the same operation position). Therefore, in order to optimize the determination for each user, it is possible to generate a new learned model by, for example, providing a measurement mode in the information processing device 10, measuring gyro waveforms at each operation position for each user, and relearning with the measured gyro waveforms as training data. This makes it possible to improve determination accuracy for each user.
The gyro waveforms at each operation position are also different depending on which hand the user uses, even though the operation position is the same. Therefore, in a case where the information processing device may be operated with either hand, it is possible to measure and store, in advance, both the gyro waveforms obtained by performing operation with the right hand and those obtained with the left hand at each operation position. Then, whichever hand the user uses when performing operation with one hand, it is possible to selectively use the gyro waveforms for the right hand and the gyro waveforms for the left hand, thereby improving the determination accuracy.
The determination unit 160 determines a rotation direction of the information processing device 10 on the basis of the angular velocity measured by the gyroscope sensor unit 150 (in which direction the information processing device 10 is tilted, that is, the rotation direction, can be found on the basis of the angular velocity). Then, the determination unit 160 determines the position where the user attempts to perform operation on the display unit 110 on the basis of the rotation direction.
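The rotation-direction determination can be sketched as follows: with a constant sampling interval, the plain sum of the angular-velocity samples about one axis is proportional to the accumulated rotation angle, so its sign gives the direction of the tilt. The axis choice and the noise threshold below are illustrative assumptions.

```python
def rotation_direction(omega, threshold=0.05):
    """Infer the direction in which the device is being tilted from
    angular-velocity samples (rad/s) about a single axis. With a
    constant sampling interval, the sum approximates the accumulated
    rotation, so its sign gives the direction. The threshold suppresses
    sensor noise and is an illustrative value."""
    total = sum(omega)
    if total > threshold:
        return "positive"   # e.g., tilted toward one side
    if total < -threshold:
        return "negative"   # e.g., tilted toward the other side
    return "none"
```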
In response to detection of a second tilt different from the first tilt after the first tilt of the information processing device 10 is detected, the determination unit 160 determines whether or not, with the second tilt, the user attempts to return the tilt of the information processing device 10 to an original state (i.e., an original angle), that is, the state before the first tilt was detected.
Further, the determination unit 160 determines a direction of the information processing device 10 on the basis of the tilt of the information processing device 10 and the direction of gravity with respect to the information processing device 10 measured and detected by the acceleration sensor unit 140. Details thereof will be described later.
The determination unit 160 determines whether the user holds the information processing device 10 with the right hand or left hand on the basis of a position and track of a swipe operation performed on the display unit 110. For example, in a case where the swipe operation is performed on the display unit 110 from left to right, generally, an arc is drawn from left to right when the information processing device is held in the right hand, whereas an arc is drawn from right to left when the information processing device is held in the left hand. The information processing device 10 also causes the user to perform such a swipe operation with the right hand and with the left hand, stores positions and tracks of the swipe operations for each user in advance, and can therefore use the positions and tracks thereof for the determination by the determination unit 160.
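The handedness determination from the swipe track can be sketched by examining which way the track bends. The sign of the summed cross products of successive segments distinguishes the two arcs described above; the mapping of sign to hand is an assumption and would in practice be calibrated from the per-user swipe tracks stored in advance.

```python
def holding_hand(track):
    """Guess which hand holds the device from the track of a
    left-to-right swipe, given as (x, y) touchscreen points with y
    increasing downward. The sign of the summed cross products of
    successive segments tells which way the arc bends; the mapping of
    sign to hand is an illustrative assumption."""
    cross_sum = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(track, track[1:], track[2:]):
        # Cross product of segment (p0 -> p1) with segment (p1 -> p2).
        cross_sum += (x1 - x0) * (y2 - y1) - (y1 - y0) * (x2 - x1)
    return "right" if cross_sum < 0 else "left"
```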
Based on the result of such a determination as to which hand holds the information processing device 10, the determination unit 160 can selectively use the gyro waveforms for the right hand and the gyro waveforms for the left hand in order to determine the position where the user attempts to perform operation on the display unit 110.
Based on an actual user operation performed after the determination of the position where the user attempts to perform operation on the display unit 110, the determination unit 160 determines whether or not that determination is correct. It is therefore possible to perform machine learning by using the result of this correctness determination as training data and to use the result of the machine learning for subsequent determination by the determination unit 160.
(Imaging Unit 170)
The imaging unit 170 according to the present embodiment images, for example, a face of the user who operates the information processing device 10 under the control of the control unit 200. For this purpose, the imaging unit 170 according to the present embodiment includes an imaging element. A smartphone, which is an example of the information processing device 10, includes a front-facing camera (front camera) for imaging the face or the like of the user on the display unit 110 side and a main camera for imaging a landscape or the like on the back side of the display unit 110.
(Control Unit 200)
The control unit 200 according to the present embodiment is a processing unit that controls the entire information processing device 10 and controls each configuration included in the information processing device 10. Details of the functions of the control unit 200 according to the present embodiment will be described later.
The functional configuration example of the information processing device 10 according to the present embodiment has been described above. The functional configuration described above is merely an example, and the functional configuration of the information processing device 10 according to the present embodiment is not limited to such an example.
The function of each component may be performed in such a way that an arithmetic unit such as a CPU reads a control program in which a processing procedure for achieving those functions is written from a storage medium such as a read only memory (ROM) or random access memory (RAM) storing the control program and interprets and executes the program. Therefore, it is possible to appropriately change the configuration to be used in accordance with the technical level at the time of carrying out the present embodiment. An example of a hardware configuration of the information processing device 10 will be described later.
Next, the function of the information processing device 10 according to the present embodiment will be described in detail. The control unit 200 of the information processing device 10 uses the acceleration and the angular velocity measured by the acceleration sensor unit 140 and the gyroscope sensor unit 150, respectively, to determine the position where the user attempts to perform operation on the display unit 110 (hereinafter, referred to as “user operation position determination”), without impairing usability. In a case where the user performs operation with one hand and the position where the user attempts to perform the operation on the display unit 110 is a position where it is difficult for the user to perform the operation, the control unit 200 controls the display unit 110 to adjust the screen so that the user can easily perform the operation (hereinafter, referred to as “user operation support”). Note that the position where it is difficult for the user to perform operation indicates, for example, positions other than the operation position “3”.
The user operation support is unnecessary in some cases depending on a usage state of the user, for example, in a case where the user operates the information processing device 10 in a lateral direction. In such a case, the adjustment of the screen by the user operation support may interrupt a user operation. In a case where the user operation support is unnecessary, the user operation position determination for determining whether or not the user operation support is necessary is also unnecessary. In this case, it is possible to stop the function of the gyroscope sensor unit 150 used for the user operation position determination, thereby reducing current consumption of the information processing device 10. In the present embodiment, an operation of each function is controlled by three operation modes.
The operation mode is switched by determining in which direction the user operates the information processing device 10 by using the acceleration and the angular velocity measured by the acceleration sensor unit 140 and the gyroscope sensor unit 150, respectively.
As described above, the direction of the information processing device 10 is determined on the basis of the tilt of the information processing device 10 and the direction of gravity with respect to the information processing device 10, and the operation mode is switched between the “within operation range” and the “out of operation range”.
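A minimal sketch of this mode switching follows, assuming the screen normal is the device z axis and an illustrative 60-degree limit for the operation range (the text does not specify a numeric threshold):

```python
import math

def operation_mode(ax, ay, az, max_tilt_deg=60.0):
    """Switch between the "within operation range" and "out of
    operation range" modes from the accelerometer's gravity reading
    (ax, ay, az), given in device coordinates with +z out of the
    screen. The 60-degree limit is an illustrative assumption."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    if norm == 0.0:
        return "out of operation range"
    # Angle between the measured gravity direction and the screen normal.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, az / norm))))
    if tilt <= max_tilt_deg:
        return "within operation range"  # gyroscope kept running
    return "out of operation range"      # gyroscope may be stopped
```

Keeping the gyroscope stopped in the "out of operation range" mode realizes the reduction of current consumption described above.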
Switching of the operation mode between the “within operation range” and the “one-hand mode” and the user operation support in the “one-hand mode” will be described.
For example, when detecting the tilt of the information processing device 10, the determination unit 160 of the information processing device 10 determines that the user attempts to perform operation at the operation position “4” on the basis of the gyro waveform obtained from the angular velocity measured by the gyroscope sensor unit 150 and the gyro waveform at each operation position of the information processing device 10 measured in advance or the learning model generated by using the gyro waveform as the training data.
In a case where the information processing device 10 is held with the left hand, the operation position “4” is a position where it is difficult to perform operation, and thus the operation mode is switched to the “one-hand mode”, and the screen is reduced so that the user can easily perform the operation.
When detecting a further tilt (corresponding to the second tilt) of the information processing device 10 after the screen is reduced, the determination unit 160 determines whether or not the user attempts to return the tilt of the information processing device 10 to the original angle.
Next, a procedure of the user operation position determination processing according to the present embodiment will be described with reference to
First, the control unit 200 determines whether or not the direction of gravity with respect to the information processing device 10, detected by the acceleration sensor unit 140, falls within a predetermined operation range (step S101).
In a case where the direction of gravity is out of the predetermined operation range (step S101: No) and the operation mode is not the “out of operation range”, the control unit 200 switches the operation mode to the “out of operation range” and stops the function of the gyroscope sensor unit 150 (step S111). After step S111, the present processing ends (strictly speaking, the processing returns to the start of the present processing and waits until the direction of gravity falls within the operation range again).
Meanwhile, in a case where the direction of gravity falls within the predetermined operation range (step S101: Yes) and the operation mode is not the “within operation range”, the control unit 200 switches the operation mode to the “within operation range”, activates the function of the gyroscope sensor unit 150 (step S102), and starts measurement of the angular velocity of the information processing device 10.
Next, the acceleration sensor unit 140 determines whether or not any first tilt has been detected (step S103). In a case where the first tilt is not detected (step S103: No), the present processing ends, and the processing returns to the start. In a case where the acceleration sensor unit 140 detects the first tilt (step S103: Yes), the determination unit 160 determines a position where the user attempts to perform operation on the basis of a gyro waveform obtained from the angular velocity whose measurement has been started in step S102 and a gyro waveform at each operation position measured and stored in advance (step S104). Alternatively, the determination unit 160 estimates (determines) the position where the user attempts to perform operation on the basis of the gyro waveform obtained from the angular velocity whose measurement has been started in step S102 by using a learning model generated by using the gyro waveform at each operation position as training data.
Next, the determination unit 160 determines whether or not an operation at the position where the user attempts to perform operation requires the one-hand mode (step S105). As described above, whether or not the operation requires the one-hand mode depends on whether or not the position where the user attempts to perform operation on the display unit 110 is a position where it is difficult to perform operation and whether or not the user operation support is necessary.
In a case where the operation does not require the one-hand mode (step S105: No), the present processing ends, and the processing returns to the start.
Meanwhile, in a case where the operation requires the one-hand mode (step S105: Yes), the control unit 200 switches the operation mode to the “one-hand mode”, and the display unit 110 adjusts the screen so that the user can easily perform the operation (step S106). The screen that allows the user to easily perform the operation is, for example, a screen in which the position where the user attempts to perform the operation on the display unit 110 is moved closer to the thumb of the user by reducing the size of the screen or moving the entire screen.
Next, the acceleration sensor unit 140 and the gyroscope sensor unit 150 determine whether or not any second tilt has been detected (step S107). In a case where the second tilt is not detected (step S107: No), the acceleration sensor unit 140 and the gyroscope sensor unit 150 wait until the second tilt is detected. In a case where the acceleration sensor unit 140 and the gyroscope sensor unit 150 detect the second tilt (step S107: Yes), it is determined whether or not the user attempts to return the tilt of the information processing device 10 to an original angle (step S108).
In a case where the user does not return the tilt of the information processing device 10 to the original angle (step S109: No), it is determined that the “one-hand mode” is still necessary, and the acceleration sensor unit 140 further waits until the second tilt is detected (returns to step S107). Note that, although the tilt of the information processing device 10 has not returned to the original angle, the user operation support is further required in some cases, for example, in a case where operation is performed at the operation position “4”.
Meanwhile, in a case where the user returns the tilt of the information processing device 10 to the original angle (step S109: Yes), it is determined that the “one-hand mode” is no longer necessary, the control unit 200 switches the operation mode to the “within operation range”, and the display unit 110 adjusts the screen again (step S110). The screen adjustment herein refers to returning the screen adjusted in step S106 to its original state.
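The flow of steps S101 to S111 described above can be summarized as one pass of a simple state machine, sketched below. The sensor units and the determination unit are stood in for by injected callables, and every name here is illustrative, not part of the disclosure.

```python
def user_operation_position_step(state, units):
    """One pass through the flow of steps S101 to S111. `state` holds
    the current operation mode; `units` maps step names to callables
    standing in for the sensor units and the determination unit."""
    if not units["gravity_in_range"]():                       # step S101
        if state["mode"] != "out of operation range":
            state["mode"] = "out of operation range"
            units["stop_gyro"]()                              # step S111
        return state
    if state["mode"] not in ("within operation range", "one-hand mode"):
        state["mode"] = "within operation range"
        units["start_gyro"]()                                 # step S102
    if state["mode"] == "one-hand mode":
        # Steps S107 to S109: wait for the second tilt and check whether
        # the device has been returned to the original angle.
        if units["second_tilt"]() and units["returned_to_original"]():
            state["mode"] = "within operation range"
            units["restore_screen"]()                         # step S110
        return state
    if not units["first_tilt"]():                             # step S103
        return state
    position = units["determine_position"]()                  # step S104
    if units["needs_one_hand_mode"](position):                # step S105
        state["mode"] = "one-hand mode"
        units["adjust_screen"](position)                      # step S106
    return state
```

Calling this function repeatedly from a sensor loop reproduces the mode transitions among the three operation modes described above.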
Next, modification examples of the present embodiment will be described. Note that the information processing device 10 may implement the modification examples described below instead of the above-described embodiment or may implement the modification examples in combination with the above-described embodiment.
The information processing device 10 images and stores a front-facing camera image at each operation position in advance, compares the front-facing camera image with a front-facing camera image obtained when the user actually attempts to perform operation, and can therefore determine at which position the user attempts to perform the operation. Note that the front-facing camera image does not need to be displayed on the display unit 110 and is obtained by internally converting a picture captured by the imaging unit 170 into digital data.
The position of the user appearing in the front-facing camera image, as well as the gyro waveform, is different for each user even though the operation position is the same. Further, even for the same user, the position of the user appearing in the front-facing camera image is different depending on with which hand the user operates the information processing device 10. Therefore, both front-facing camera images obtained by performing operation with the right hand and with the left hand at each operation position can be imaged and stored in advance for each user.
A plurality of front-facing camera images can be captured and stored in advance for one operation position for each user. The determination unit 160 can also determine the position where the user attempts to perform operation by using both the determination based on the gyro waveform (angular velocity) described above and the determination based on the front-facing camera image. This makes it possible to further improve the determination accuracy of the determination unit 160.
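The front-camera comparison can be sketched as a nearest-template lookup over the stored images. Mean absolute pixel difference is used here as a deliberately simple similarity measure for illustration; in practice a learned model could be used, as with the gyro waveforms.

```python
def closest_operation_position(image, stored_images):
    """Pick the operation position whose pre-captured front-facing
    camera image is most similar to the current one. Images are 2D
    lists of grayscale pixel values; mean absolute difference is an
    intentionally simple similarity measure."""
    def mean_abs_diff(a, b):
        height, width = len(a), len(a[0])
        return sum(abs(pa - pb)
                   for row_a, row_b in zip(a, b)
                   for pa, pb in zip(row_a, row_b)) / (height * width)
    return min(stored_images, key=lambda pos: mean_abs_diff(image, stored_images[pos]))
```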
As another modification example, the control unit 200 can perform control so that content displayed on the display unit 110 is switched to another content depending on a position where the user attempts to perform operation, the position being determined by the determination unit 160.
Further, the control unit 200 can perform control to switch a login user of the information processing device 10 to another login user depending on a position where the user attempts to perform operation, the position being determined by the determination unit 160.
Furthermore, the control unit 200 can perform control to switch a SIM of the information processing device 10 to another SIM depending on a position where the user attempts to perform operation, the position being determined by the determination unit 160.
Still further, the control unit 200 can perform control to switch an imaging mode of the imaging unit 170 depending on a position where the user attempts to perform operation, the position being determined by the determination unit 160.
The control in the other modification examples described above can further improve the usability.
Next, a hardware configuration example of the information processing device 10 according to an embodiment of the present disclosure will be described.
(Processor 871)
The processor 871 functions as, for example, an arithmetic processing device or a control device and controls all or part of the operation of each component on the basis of various programs recorded on the ROM 872, the RAM 873, the storage 880, or a removable recording medium 901. As a matter of course, the processor 871 may include a plurality of processors.
(ROM 872, RAM 873)
The ROM 872 is means for storing programs to be read by the processor 871, data to be used for calculation, and the like. The RAM 873 temporarily or permanently stores, for example, programs to be read by the processor 871 and various parameters that appropriately change when the programs are executed.
(Host Bus 874, Bridge 875, External Bus 876, and Interface 877)
The processor 871, the ROM 872, and the RAM 873 are mutually connected via, for example, the host bus 874 capable of transmitting data at a high speed. The host bus 874 is connected to, for example, the external bus 876 having a relatively low data transmission speed via the bridge 875. The external bus 876 is connected to various components via the interface 877.
(Input Device 878)
Examples of the input device 878 include a mouse, a keyboard, a touchscreen, a button, a switch, and a lever. The input device 878 can also include a remote control capable of transmitting a control signal by using infrared rays or other radio waves. The input device 878 also includes sound input devices such as a microphone and sensor devices such as an acceleration sensor and a gyroscope sensor.
(Output Device 879)
The output device 879 is a device capable of visually or audibly notifying the user of acquired information, and examples thereof include display devices such as a cathode ray tube (CRT) display, an LCD, and an organic EL display, audio output devices such as a speaker and a headphone, a printer, a mobile phone, and a facsimile. The output device 879 according to the present disclosure includes various vibration devices capable of outputting tactile stimulation.
(Storage 880)
The storage 880 is a device for storing various kinds of data. Examples of the storage 880 include a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, and a magneto-optical storage device.
(Drive 881)
The drive 881 is, for example, a device that reads information recorded on the removable recording medium 901 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory or writes information to the removable recording medium 901.
(Removable Recording Medium 901)
Examples of the removable recording medium 901 include a DVD medium, a Blu-ray (registered trademark) medium, an HD DVD medium, and various semiconductor storage media. As a matter of course, the removable recording medium 901 may be, for example, an IC card on which a non-contact IC chip is mounted or an electronic device.
(Connection Port 882)
The connection port 882 is a port for connecting the external connection device 902, such as a universal serial bus (USB) port, an IEEE 1394 port, a small computer system interface (SCSI) port, an RS-232C port, or an optical audio terminal.
(External Connection Device 902)
Examples of the external connection device 902 include a printer, a portable music player, a digital camera, a digital video camera, and an IC recorder.
(Communication Device 883)
The communication device 883 is a communication device to be connected to a network, and examples thereof include communication cards for a wired or wireless LAN, Bluetooth (registered trademark), and a wireless USB (WUSB), a router for optical communication, a router for asymmetric digital subscriber line (ADSL), and modems for various types of communication.
As described above, an information processing device (10) includes: a display unit (110); a gyroscope sensor unit (150) that measures angular velocity of the information processing device (10); and a determination unit (160) that, in response to detection of a first tilt of the information processing device (10), determines a position where a user attempts to perform operation on the display unit (110) on the basis of a first gyro waveform obtained from the angular velocity and a second gyro waveform at each operation position of the information processing device (10) measured in advance or a learning model generated by using the second gyro waveform as training data.
Therefore, it is possible to determine the position where the user attempts to perform operation on a screen of the information processing device, without impairing usability.
Although the preferred embodiments of the present disclosure have been described in detail with reference to the accompanying drawings, the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive various changes or modifications within the scope of the technical idea described in the claims, and it is naturally understood that these changes and modifications also belong to the technical scope of the present disclosure.
The effects described in the present specification are merely illustrative or exemplary and are not restrictive. That is, the technology according to the present disclosure can have other effects obvious to those skilled in the art from the description of the present specification in addition to or instead of the above effects.
The present technology can also have the following configurations.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/050993 | 12/25/2019 | WO |