The present disclosure relates to an information processing device and an information processing method.
Recently, a technology of supporting an operation by a finger of a user in a device having a touch panel has been developed. For example, Patent Literature 1 discloses a technology in which a position of a controller displayed on a screen is controlled on the basis of a position of a finger of a user detected by a sensor included in a back surface of the device.
However, in the technology disclosed in Patent Literature 1, the device needs to additionally include a sensor to detect a finger. In addition, the technology disclosed in Patent Literature 1 may not be able to optimize an operation in a small device, such as a smartphone, that is used with one hand.
Thus, the present disclosure proposes an information processing device and an information processing method capable of reducing an operational burden of a small terminal that has a touch panel and that is used with one hand.
According to the present disclosure, an information processing device is provided. The information processing device includes a control unit that determines, on the basis of a detection position of an operating finger of a user on a side surface with respect to a display screen, a region to display predetermined information on the display screen, and that determines, on the basis of the detection position, a region to receive an operation by the operating finger from a region that is a combination of the display screen and the side surface.
Moreover, according to the present disclosure, an information processing method is provided. The information processing method includes determining, on the basis of a detection position of an operating finger of a user on a side surface with respect to a display screen, a region to display predetermined information on the display screen, and determining, on the basis of the detection position, a region to receive an operation by the operating finger from a region that is a combination of the display screen and the side surface, the determining being performed by a processor.
As described above, according to the present disclosure, it becomes possible to reduce an operational burden of a small terminal that has a touch panel and that is used with one hand.
Note that the above effect is not necessarily limitative, and any of the effects described in the present description, or other effects that can be grasped from the present description, may be achieved in addition to or instead of the above effect.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that the same signs are assigned to components having substantially the same functional configurations in the present description and drawings, and overlapped description thereof is omitted.
Note that the description will be made in the following order.
1. Example of external configuration of information processing terminal 10
2. Example of functional configuration of information processing terminal 10
3. Detailed example
3-1. Example of screen display
3-2. Example of operation flow
4. Example of hardware configuration
5. Conclusion
The information processing terminal 10 includes a display 160. The display 160 covers a sensor unit 110 (described later). With such a configuration, the information processing terminal 10 can detect a finger of a user in contact with the display 160. Note that the display 160 may cover a part or whole of a side surface of the information processing terminal 10.
Also, the information processing terminal 10 can control contents of a display screen on the basis of a user input. For example, the information processing terminal 10 can display execution information related to function execution of an application on the display 160 by using a so-called graphical user interface (GUI). Here, the execution information means visual information such as an icon causing an application to execute predetermined processing, a result of execution of the predetermined processing by the application, and the like.
The information processing terminal 10 includes, for example, a camera 140 and a display 160. Here, the camera 140 may be provided in one or both of the front surface and the back surface of the information processing terminal 10, as illustrated in
Next, an example of the functional configuration of the information processing terminal 10 according to the present embodiment will be described.
(Sensor Unit 110)
The sensor unit 110 according to the present embodiment has a function of detecting a contact made by a finger of a user with the information processing terminal 10. Here, the sensor unit 110 can detect a position of the finger of the user in contact with the front surface and the side surface of the information processing terminal 10. Note that the sensor unit 110 is realized, for example, by a capacitance-type touch sensor or a pressure sensitive-type touch sensor. Note that in the case of a capacitance-type touch sensor, operation using a side surface may be realized without arranging a touch sensor immediately below the side surface portion of the information processing terminal 10, by raising the sensitivity, in the direction of the side surface, of the touch sensor arranged immediately below the front surface of the information processing terminal. In other words, when the sensitivity toward the side surface portion of the information processing terminal 10 is set higher than that of the touch sensor in other regions, the operation using the side surface is realized.
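As a rough, illustrative sketch of this idea (not taken from the disclosure), one way to treat readings from a front-surface capacitive grid as side-surface contacts is to apply a lower activation threshold to the sensor columns nearest the edges; the band width and threshold values below are assumptions.

```python
# Minimal sketch of side-surface detection with a front-surface capacitive grid.
# Assumption: readings near the left/right edges use a lower (more sensitive)
# threshold so that a finger touching the side raises them above it.

EDGE_BAND = 2            # number of sensor columns treated as the "side" band (illustrative)
FRONT_THRESHOLD = 0.60   # normalized capacitance needed to register a front touch
SIDE_THRESHOLD = 0.25    # lower threshold = higher effective sensitivity toward the side

def classify_touches(grid):
    """grid: 2D list of normalized capacitance values (rows x columns).
    Returns a list of (row, col, 'front' | 'side') for registered contacts."""
    touches = []
    cols = len(grid[0])
    for r, row in enumerate(grid):
        for c, value in enumerate(row):
            on_edge = c < EDGE_BAND or c >= cols - EDGE_BAND
            threshold = SIDE_THRESHOLD if on_edge else FRONT_THRESHOLD
            if value >= threshold:
                touches.append((r, c, "side" if on_edge else "front"))
    return touches

# Example: a weak reading in the rightmost column is still registered as a side contact.
sample = [[0.0] * 8 for _ in range(4)]
sample[1][7] = 0.3
print(classify_touches(sample))   # [(1, 7, 'side')]
```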
Also, the sensor unit 110 may detect a position of an operating finger that executes a trigger operation. Here, the trigger operation means an operation for causing an application to execute predetermined processing, or an operation for causing a result of the execution of the predetermined processing by the application to be displayed. Also, the trigger operation specifically refers to a tap, double-tap, push, and the like. In addition, the operating finger means, among a plurality of fingers of a user that holds the information processing terminal 10, a finger that executes an operation on the display 160 in order to cause the application to execute predetermined processing.
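A trigger operation such as a double-tap could, for example, be recognized from timestamped tap events on the side surface. The following minimal sketch assumes an illustrative time window and positional tolerance that are not specified in the disclosure.

```python
# Minimal sketch of recognizing a double-tap trigger operation from timestamped
# side-surface tap events. The time window and distance tolerance are illustrative.

DOUBLE_TAP_WINDOW_S = 0.35   # max interval between the two taps (assumption)
MAX_DISTANCE_PX = 40         # max positional drift between the two taps (assumption)

def detect_double_tap(taps):
    """taps: list of (timestamp_seconds, y_position_px) for taps on the side surface.
    Returns the y position of the detected double-tap, or None."""
    for first, second in zip(taps, taps[1:]):
        dt = second[0] - first[0]
        dy = abs(second[1] - first[1])
        if dt <= DOUBLE_TAP_WINDOW_S and dy <= MAX_DISTANCE_PX:
            return second[1]
    return None

print(detect_double_tap([(0.00, 410), (0.21, 402)]))  # 402 -> trigger detected
print(detect_double_tap([(0.00, 410), (0.80, 405)]))  # None -> taps too far apart in time
```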
(Input Unit 120)
The input unit 120 according to the present embodiment includes various sensors such as an acceleration sensor and a position information sensor, and has a function of acquiring data by using the various sensors. The input unit 120 provides the acquired data to the specification unit 170 (described later).
(Storage Unit 130)
The storage unit 130 according to the present embodiment is a storage area to temporarily or permanently store various kinds of programs and data. The storage unit 130 may store information related to various applications, for example. More specifically, for example, the storage unit 130 may store a program for executing an application, and management data to manage various settings and the like. Obviously, the above is merely an example, and a type of data stored in the storage unit 130 is not specifically limited.
(Camera 140)
The camera 140 according to the present embodiment is used for photographing processing of an image or moving image under control of a photographic application. Here, the photographic application means an application capable of photographing an image or a moving image. Note that the camera 140 is provided in one or both of the front surface and the back surface of the information processing terminal 10.
(Control Unit 150)
The control unit 150 according to the present embodiment has a function of determining a region to receive an operation by an operating finger on the basis of the detection position of the operating finger on the side surface of the information processing terminal 10, the position being detected by the sensor unit 110. Here, the detection position of the operating finger means the position, among positions at which the operating finger executes an operation on the information processing terminal 10, at which an operation on the side surface of the information processing terminal 10 is executed.
A detailed description will now be given. For example, the control unit 150 determines a region to display predetermined information on the display screen on the basis of the detection position of the operating finger that executes a trigger operation on the side surface, the position being detected by the sensor unit 110. Also, on the basis of that detection position, the control unit 150 determines a region to receive a subsequent operation by the operating finger from a region that is a combination of the display screen and the side surface. Here, on the basis of the detection position, the control unit 150 may determine a region assumed to be reached by the operating finger as the region to receive the operation.
Here, the subsequent operation means an operation executed on the region determined by the control unit 150 on the basis of the detection position of the operating finger that executes the trigger operation.
Note that, in some cases, the control unit 150 determines the region to receive an operation by the operating finger on the basis of a detection position at which the sensor unit 110 detects a finger of the user other than the operating finger.
In such a manner, by receiving the operation on the region assumed to be reached by a finger of the user, it is possible to reduce a burden on the user of an operation with respect to the information processing terminal 10.
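A minimal sketch of one possible way to derive such a reachable region from the detection position is shown below; the assumed reach distance and the rectangular shape of the region are illustrative assumptions, not values or shapes taken from the disclosure.

```python
# Minimal sketch of deriving an operation-receiving region from the detection
# position of the operating finger on the side surface. The reach radius is an
# illustrative assumption.

from dataclasses import dataclass

@dataclass
class Region:
    left: int
    top: int
    right: int
    bottom: int

THUMB_REACH_PX = 500   # assumed reach of the operating finger (illustrative)

def region_for_detection(y_detect, screen_w, screen_h, side="right"):
    """Return the display-screen region assumed to be reachable by the operating
    finger that touched the side surface at height y_detect."""
    top = max(0, y_detect - THUMB_REACH_PX)
    bottom = min(screen_h, y_detect + THUMB_REACH_PX)
    if side == "right":
        return Region(left=max(0, screen_w - THUMB_REACH_PX), top=top,
                      right=screen_w, bottom=bottom)
    return Region(left=0, top=top, right=min(screen_w, THUMB_REACH_PX), bottom=bottom)

print(region_for_detection(y_detect=900, screen_w=1080, screen_h=2160, side="right"))
```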
Also, the control unit 150 may cause the display 160 to display execution information related to function execution of an application in a case where the sensor unit 110 detects that the trigger operation is executed by the operating finger on the side surface.
The control unit 150 has a function of causing various applications to execute predetermined processing on the basis of an operation related to function execution of the application by the finger of the user. The operation related to the function execution of the application includes, for example, a touch operation on an icon of the application. Also, the function execution of an application includes execution of functions unique for various applications.
For example, the control unit 150 can control photographing processing of the photographic application. More specifically, the control unit 150 can perform control in such a manner that the photographic application photographs an image or moving image with the camera 140. Also, the control unit 150 may cause the display 160 (described later) to display information related to the photographing processing. Here, the information related to the photographing processing means a preview image expressing a result of the photographing processing, a controller to control the photographing processing, and the like.
Also, the control unit 150 has a function of controlling each configuration included in the information processing terminal 10. The control unit 150 controls, for example, starting and stopping of each component.
(Display 160)
The display 160 according to the present embodiment is a touch panel having a function of displaying execution information related to function execution of an application on a display screen under control by the control unit 150.
(Specification Unit 170)
The specification unit 170 according to the present embodiment has a function of specifying a predetermined application on the basis of predetermined criteria. Here, the predetermined criteria are, for example, current time, a current location, a frequency of use of various applications, and the like. For example, the specification unit 170 may specify an application used more frequently than other applications among various applications stored in the storage unit 130.
The specification unit 170 may specify an application by using a model constructed by machine learning of an operation by the user on the information processing terminal 10 or of predetermined criteria. The machine learning may be machine learning using a neural network such as deep learning. Note that the constructed model may be stored in the storage unit 130.
Also, the specification unit 170 may specify an application by using information acquired by machine learning from data that is related to terminal usage by a plurality of users and that is received from other devices via a network 20 (described later). Execution information related to the application may be displayed to the user by the display 160 controlled by the control unit 150.
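As a simple illustration of the kind of selection the specification unit 170 could perform, the following sketch ranks applications by recorded usage frequency; the application names and usage counts are illustrative, and a learned model as described above could stand in for the simple counting.

```python
# Minimal sketch of frequency-based application specification. The usage log,
# application names, and the number of applications returned are illustrative.

from collections import Counter

usage_log = ["camera", "mail", "camera", "browser", "camera", "mail", "music"]

def specify_applications(log, top_n=3):
    """Return the top_n most frequently used applications, most frequent first."""
    return [app for app, _count in Counter(log).most_common(top_n)]

print(specify_applications(usage_log))   # ['camera', 'mail', 'browser']
```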
(Communication Unit 180)
The communication unit 180 according to the present embodiment has a function of executing communication with other devices via the network 20 (described later). For example, the communication unit 180 may receive, from the other devices, information acquired by machine learning from the above-described data related to terminal usage by a plurality of users.
(Network 20)
The network 20 according to the present embodiment has a function of connecting the components included in the information processing system. The network 20 may include the Internet, a public line network such as a telephone line network or a satellite communication network, various local area networks (LAN) including Ethernet (registered trademark), a wide area network (WAN), and the like. Also, the network 20 may include a leased line network such as an Internet protocol-virtual private network (IP-VPN). Also, the network 20 may include a wireless communication network such as Wi-Fi (registered trademark) or Bluetooth (registered trademark).
In the above, an example of a functional configuration of the information processing terminal 10 according to the present embodiment has been described. Note that the functional configuration described above with reference to
Also, the functions of the components may be executed when an arithmetic unit such as a central processing unit (CPU) reads a control program describing a processing procedure for realizing these functions from a storage medium such as a read only memory (ROM) or a random access memory (RAM) storing the control program, and interprets and executes the program. Thus, the configuration to be used can be changed appropriately according to the technical level at the time of carrying out the present embodiment. An example of a hardware configuration of the information processing terminal 10 will be described later.
<3-1. Example of Screen Display>
Next, an operation of the control unit 150 according to the present embodiment will be described in detail with a detailed example.
On the left side of
In the example on the left side of
In the example on the right side of
For example, in a case where the user executes, as the subsequent operation, a touch operation on any one of the icons I1 of the plurality of applications, the control unit 150 may execute starting processing of an application corresponding to an icon on which the touch operation is executed.
Also, an application related to a displayed icon can be specified by the specification unit 170 on the basis of predetermined criteria. The control unit 150 causes the display 160 to display the icons I1 of the plurality of applications specified by the specification unit 170. Here, for example, the control unit 150 may cause the display 160 to display an icon of a more frequently used application at a higher position on the display screen compared to icons of other applications.
Note that in a case where the sensor unit 110 detects a slide operation by the operating finger F on the display screen, the control unit 150 may control the icons of the plurality of applications to scroll vertically, for example. By the control, even in a case where icons of many applications are displayed, the user can execute an operation related to the starting processing of an application.
In such a manner, the information processing terminal 10 can display icons of applications on the basis of a detection position of a finger of a user. With such a function, it becomes possible to reduce a burden on the user of the operation related to the starting processing of an application.
Note that the control unit 150 may also determine, on the basis of the detection position TP1 of the operating finger F, a region on the display 160 serving as the display screen to display execution information related to an application that does not require a subsequent operation, such as a clock application.
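The following sketch illustrates one possible way to lay out the icons inside the determined region so that more frequently used applications appear at higher positions; icon size, spacing, and the frequency-ordered input are illustrative assumptions.

```python
# Minimal sketch of laying out application icons inside the determined region,
# placing more frequently used applications at higher positions on the screen.

ICON_SIZE = 120     # icon edge length in pixels (illustrative)
ICON_SPACING = 16   # vertical gap between icons (illustrative)

def layout_icons(apps_by_frequency, region_top, region_right):
    """apps_by_frequency: application names ordered from most to least used.
    Returns (app, x, y) positions, the most used application appearing highest."""
    positions = []
    x = region_right - ICON_SIZE
    for i, app in enumerate(apps_by_frequency):
        y = region_top + i * (ICON_SIZE + ICON_SPACING)
        positions.append((app, x, y))
    return positions

for entry in layout_icons(["camera", "mail", "browser"], region_top=400, region_right=1080):
    print(entry)   # ('camera', 960, 400) appears above ('mail', ...) and ('browser', ...)
```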
In the above, an example in which the information processing terminal 10 displays icons of applications has been described. Here, the information processing terminal 10 may execute processing related to an active application in a case where a trigger operation on the side surface of the information processing terminal 10 is detected.
On a left side of
Here, for example, in a case where the sensor unit 110 detects execution of a double-tap operation, the control unit 150 may cause the photographic application to execute the photographing processing after a predetermined period. This is because the information processing terminal 10 shakes immediately after the double-tap operation by the user, and a blurred still image may be photographed if the photographing processing is executed immediately. Here, while standing by for the elapse of the predetermined period before the photographic application starts executing the photographing processing, the control unit 150 may cause the display 160 to display, on the basis of a detection position TP2, a figure or characters of a predetermined icon (such as an icon indicating that photographing processing is in progress) or the like. The displayed icon is hidden after the elapse of the predetermined period. Note that the standby and the display of an icon and the like after the double-tap operation are similar in moving image photographing processing (described later).
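A minimal sketch of such deferred photographing is shown below; the settling delay and the callback names are illustrative assumptions rather than values or interfaces defined in the disclosure.

```python
# Minimal sketch of deferring the photographing processing for a short period
# after the double-tap so that shake caused by the tap has settled.

import threading

SETTLE_DELAY_S = 0.5   # assumed settling time after the double-tap (illustrative)

def capture_after_double_tap(show_busy_icon, hide_busy_icon, take_photo):
    """show_busy_icon / hide_busy_icon / take_photo are callbacks supplied by the caller."""
    show_busy_icon()                      # e.g. an icon indicating photographing is in progress

    def fire():
        take_photo()                      # photographing runs only after the delay has elapsed
        hide_busy_icon()                  # the icon is hidden once photographing has started

    threading.Timer(SETTLE_DELAY_S, fire).start()

# Example with print stand-ins for the UI and camera callbacks:
capture_after_double_tap(lambda: print("busy icon shown"),
                         lambda: print("busy icon hidden"),
                         lambda: print("photo taken"))
```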
On a right side of
In the example on the right side of
Note that, for example, in a case where the sensor unit 110 detects an operation such as a touch by the operating finger F on the preview image W1, the control unit 150 may cause the image photographed by the photographic application to be displayed on the entire display 160. Note that the control unit 150 may cause the photographic application to stand by for a predetermined period after the trigger operation executed on the side surface by the operating finger is detected and before the photographic application is caused to execute the photographing processing.
In such a manner, the information processing terminal 10 can photograph an image and display a preview image on the basis of the trigger operation by the operating finger on the side surface. With such a function, it becomes possible for the user to perform photographing while holding the information processing terminal without difficulty.
In the above, an example in which the information processing terminal 10 causes the photographic application to photograph an image has been described. However, application to an example of executing photographing of a moving image is also possible.
On a left side of
In the middle of
On a right side of
In such a manner, the information processing terminal 10 can photograph a moving image and display a preview image on the basis of the trigger operation by the operating finger on the side surface. With such a function, it becomes possible for the user to perform moving image photographing while holding the information processing terminal without difficulty.
In the above, an example in which a region to receive an operation on the display screen is determined on the basis of a detection position of a trigger operation on the side surface of the information processing terminal 10 has been described. However, a region to receive an operation by a finger of a user on the side surface of the information processing terminal 10 may be determined on the basis of the detection position.
In
In an example on the right side of
Here, the control unit 150 may determine, as a region (side surface) to receive an operation related to function execution of an application, a side surface on which an operation is assumed to be executed by the operating finger of the user between the right and left side surfaces of the information processing terminal 10. Note that in a case of the example illustrated in
In such a manner, it is possible to determine a region to receive an operation on the side surface of the information processing terminal 10. With such a function, it is possible to prevent erroneous generation of processing.
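One simple way to choose the operation-receiving side surface is sketched below; the rule that the side with fewer detected contacts is the one operated by the thumb is an illustrative assumption about a one-handed grip, not a rule stated in the disclosure.

```python
# Minimal sketch of choosing which side surface receives operations, based on
# the number of finger contacts detected on each side.

def choose_operation_side(left_contacts, right_contacts):
    """left_contacts / right_contacts: numbers of finger contacts detected on
    each side surface. Returns 'left' or 'right' as the operation-receiving side."""
    # The side touched by fewer fingers is assumed to be operated by the thumb.
    return "right" if right_contacts <= left_contacts else "left"

print(choose_operation_side(left_contacts=4, right_contacts=1))  # 'right' (right-hand grip)
print(choose_operation_side(left_contacts=1, right_contacts=4))  # 'left'  (left-hand grip)
```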
Also, processing executed by the control unit 150 in a case where the sensor unit 110 detects execution of an operation on the side surface is not limited to the above example. For example, the control unit 150 may execute processing of adjusting an output volume of the information processing terminal 10 in a case where the sensor unit 110 detects a slide operation on the side surface.
In the above, an example of determining a region to receive an operation on the side surface has been described. However, it is also possible to prevent erroneous generation of processing by limiting a region to receive an operation on the side surface.
In
In the example of
In the example of
In such a manner, it is possible to limit a region to receive an operation on the side surface, that is, a region in which the sensor unit 110 detects an operation on the side surface. With such a function, it becomes possible to efficiently prevent erroneous generation of processing according to the way in which the information processing terminal 10 is held.
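The following sketch illustrates such a height-based limitation; the size of the active band around the detection height is an illustrative assumption.

```python
# Minimal sketch of limiting the side-surface region that receives operations,
# using the height of the operating finger's detection position as a reference.

ACTIVE_MARGIN_PX = 300   # assumed extent of the active band around the detection height

def side_region_accepts(y_event, y_detect):
    """Return True only for side-surface events near the detection height, so that
    contacts from the holding hand elsewhere along the edge are ignored."""
    return abs(y_event - y_detect) <= ACTIVE_MARGIN_PX

print(side_region_accepts(y_event=950, y_detect=900))    # True  -> operation received
print(side_region_accepts(y_event=1800, y_detect=900))   # False -> outside the band, ignored
```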
In
In the above, an example of a case where the information processing terminal 10 is used in a vertical direction has been described. However, even in a case where the information processing terminal 10 is used in a horizontal direction, an operation using the side surface of the information processing terminal 10 is possible.
In an example in an upper part of
In an example in a lower part of
With such a function, even in a case where the information processing terminal 10 is used in the horizontal direction with one hand, it becomes possible to reduce a burden of the operation.
Also, the information processing terminal 10 according to the present embodiment can select an application to be started only on the basis of an operation on the side surface.
In an upper left part of
As illustrated in an upper right part of
Note that the control unit 150 may cause the display 160 to display icons C2 of the plurality of applications, larger than the icons C1, in a case where the sensor unit 110 keeps detecting a pushing operation at the detection position TP9. Also, in a case where the sensor unit 110 continues to detect the pushing operation at the detection position TP9, the control unit 150 causes the display 160 to display icons C3 larger than the icons C2.
As described above, the information processing terminal 10 performs displaying in such a manner that icons of a plurality of applications become gradually larger in response to a pushing operation on the side surface by a finger of a user. Thus, the user can easily grasp a status of a series of operations related to the application starting processing.
In an upper left part of
As illustrated in an upper right part of
In the lower left part of
Also, in the lower left part of
In the lower right part of
In such a manner, the user can execute starting processing of an application without hiding the display 160, which functions as the display screen, with a finger, and without greatly changing the position of the operating finger. With such a function, it becomes possible to further reduce a burden of an operation by the finger of the user on the side surface of the information processing terminal 10.
An example in which application starting processing is executed only on the basis of an operation on the side surface has been described. Here, the information processing terminal 10 may execute screen switching processing of active applications only on the basis of an operation by an operating finger on the side surface. That is, icons displayed in a circular manner on the display 160 may be icons of active applications, and the control unit 150 can display a screen of a selected active application on the display 160.
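A minimal sketch of such a side-surface-only selection sequence is shown below, modeled as push, slide, and release events; the event names and candidate applications are illustrative assumptions, not interfaces defined in the disclosure.

```python
# Minimal sketch of selecting and starting an application using only side-surface
# input: a push shows candidate icons, sliding moves the selection, and releasing
# confirms it.

class SideSurfaceSelector:
    def __init__(self, candidates):
        self.candidates = candidates   # e.g. applications chosen by the specification unit
        self.index = 0
        self.visible = False

    def on_push(self):
        self.visible = True            # icons appear (and could grow while the push is held)

    def on_slide(self, steps):
        if self.visible:
            self.index = (self.index + steps) % len(self.candidates)

    def on_release(self):
        if not self.visible:
            return None
        selected = self.candidates[self.index]
        self.visible = False
        return selected                # the control unit would start this application

selector = SideSurfaceSelector(["camera", "mail", "browser"])
selector.on_push()
selector.on_slide(2)
print(selector.on_release())           # 'browser'
```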
<3-2. Example of Operation Flow>
Next, an example of a flow of an operation of the information processing terminal 10 according to the present embodiment will be described.
Referring to
Then, the control unit 150 causes an icon of an application to be displayed on the region determined in Step S1003 (S1004). Then, the sensor unit 110 determines whether an operation on the icon of the application displayed on the display screen of the information processing terminal 10 is detected (S1005). In a case where the operation on the icon is detected (S1005: YES), the control unit 150 executes starting processing of the application corresponding to the icon (S1006), and the operation is ended. On the other hand, in a case where the operation on the icon is not detected (S1005: NO), it is determined whether an operation such as a double-tap on the side surface of the information processing terminal 10 is detected (S1007). In a case where the operation on the side surface is not detected (S1007: NO), the operation returns to Step S1005. In a case where a double-tap operation on the side surface is detected (S1007: YES), the icon displayed in Step S1004 is hidden (S1008), and the operation is ended.
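The icon-display flow described above could be expressed roughly as the following sketch, which uses a stand-in sensor object that plays back a fixed event sequence; the class and method names are illustrative assumptions, not interfaces defined in the disclosure.

```python
# Minimal sketch of the icon-display flow (through Steps S1003 to S1008) as one
# loop over a simple stand-in sensor object.

class StubSensor:
    """Plays back a fixed sequence of (event, data) pairs for demonstration."""
    def __init__(self, events):
        self.events = list(events)
    def next_event(self):
        return self.events.pop(0) if self.events else ("none", None)

def run_icon_flow(sensor):
    event, position = sensor.next_event()
    if event != "side_trigger":                 # wait for the trigger operation on the side surface
        return "no trigger"
    region = ("near", position)                 # S1003: determine region from the detection position
    print("icons displayed in region", region)  # S1004
    while True:
        event, data = sensor.next_event()
        if event == "icon_touch":               # S1005: YES
            return f"started {data}"            # S1006
        if event == "side_double_tap":          # S1007: YES
            print("icons hidden")               # S1008
            return "ended"
        if event == "none":                     # stub has no more input; a real flow keeps waiting
            return "ended without input"

demo = StubSensor([("side_trigger", 900), ("icon_touch", "camera")])
print(run_icon_flow(demo))                      # icons displayed ... / started camera
```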
Next, an example of a flow of an image photographing operation by a photographic application according to the present embodiment will be described.
Referring to
Then, the control unit 150 causes a preview image of the image photographed in Step S1103 to be displayed in the region on the display 160 determined in Step S1104 (S1105). Then, it is determined whether the sensor unit 110 detects an operation on the preview image displayed on the display screen of the information processing terminal 10 (S1106). In a case where the sensor unit 110 detects an operation on the preview image (S1106: YES), the control unit 150 displays the image photographed in Step S1103 on the display screen (S1107), and the operation is ended.
On the other hand, in a case where the sensor unit 110 does not detect an operation on the preview image (S1106: NO), it is determined whether the sensor unit 110 detects an operation on the side surface of the information processing terminal 10 (S1108). In a case where the sensor unit 110 detects an operation on the side surface (S1108: YES), the control unit 150 hides the preview image (S1110), and the operation is ended. On the other hand, in a case where the sensor unit 110 does not detect an operation on the side surface (S1108: NO), the control unit 150 determines whether a certain period has elapsed after the preview image is displayed in Step S1105 (S1109). In a case where the certain period has not elapsed (S1109: NO), the operation returns to Step S1106. On the other hand, in a case where the certain period has elapsed (S1109: YES), the control unit 150 hides the preview image (S1110), and the operation is ended.
Next, an example of a flow of a moving image photographing operation by the photographic application according to the present embodiment will be described.
Referring to
Then, the control unit 150 causes the photographic application to start the moving image photographing processing (S1204). Then, the control unit 150 causes a controller to be displayed on the region determined in Step S1203 (S1205). Then, it is determined whether the sensor unit 110 detects an operation of ending the photographing, such as a predetermined operation on the side surface or an operation on a predetermined display region of the controller (S1206). In a case where the sensor unit 110 does not detect the operation of ending the photographing (S1206: NO), the operation returns to Step S1206. On the other hand, in a case where the sensor unit 110 detects the operation of ending the photographing (S1206: YES), the control unit 150 causes the photographic application to end the moving image photographing processing started in Step S1204 (S1207).
Then, the control unit 150 causes the region determined in Step S1203 to display a preview image of the moving image photographed by the photographic application in Step S1204 to Step S1207 (S1208). Then, the sensor unit 110 determines whether a subsequent operation is detected with respect to the preview image displayed on the display screen of the information processing terminal 10 (S1209). In a case where the sensor unit 110 detects the subsequent operation on the preview image (S1209: YES), the control unit 150 displays the moving image photographed in Step S1204 to Step S1207 on the display screen (S1210), and the operation is ended.
On the other hand, in a case where the sensor unit 110 does not detect the subsequent operation on the preview image (S1209: NO), it is determined whether an operation is detected on the side surface of the information processing terminal 10 (S1211). In a case where the operation on the side surface is detected by the sensor unit 110 (S1211: YES), the control unit 150 hides the preview image (S1213), and the operation is ended. On the other hand, in a case where the operation on the side surface is not detected (S1211: NO), the control unit 150 determines whether a certain period has elapsed since the preview image was displayed in Step S1208 (S1212). In a case where the certain period has not elapsed (S1212: NO), the operation returns to Step S1209. On the other hand, in a case where the certain period has elapsed (S1212: YES), the control unit 150 hides the preview image (S1213), and the operation is ended.
Next, an example of a flow of an operation of determining a region to receive an operation according to the present embodiment will be described.
Referring to
Next, an example of a hardware configuration of the information processing terminal 10 according to an embodiment of the present disclosure will be described.
(Processor 871)
The processor 871 functions, for example, as an arithmetic processing device or a control device, and controls the overall operation of each component or a part thereof on the basis of various programs recorded in the ROM 872, the RAM 873, the storage 880, or a removable recording medium 901.
(ROM 872 and RAM 873)
The ROM 872 is a means to store a program read by the processor 871, data used for calculation, and the like. The RAM 873 temporarily or permanently stores, for example, a program read by the processor 871, various parameters that appropriately change in execution of the program, and the like.
(Host Bus 874, Bridge 875, External Bus 876, and Interface 877)
The processor 871, the ROM 872, and the RAM 873 are connected to each other, for example, via the host bus 874 capable of high-speed data transmission. The host bus 874 is, in turn, connected to the external bus 876, which has a relatively low data transmission rate, via the bridge 875, for example. Also, the external bus 876 is connected to various components via the interface 877.
(Input Device 878)
As the input device 878, for example, a mouse, a keyboard, a touch panel, a button, a switch, a lever, or the like is used. Furthermore, a remote controller capable of transmitting a control signal by using infrared rays or other radio waves may be used as the input device 878. Also, the input device 878 includes a voice input device such as a microphone.
(Output Device 879)
The output device 879 is a device that can visually or audibly notify a user of acquired information and is, for example, a display device such as a cathode ray tube (CRT), an LCD, or an organic EL display, an audio output device such as a speaker or headphone, a printer, a mobile phone, a facsimile, or the like. Also, the output device 879 according to the present disclosure includes various vibration devices capable of outputting tactile stimulation.
(Storage 880)
The storage 880 is a device to store various kinds of data. As the storage 880, for example, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like is used.
(Drive 881)
The drive 881 is a device to read information recorded on the removable recording medium 901 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, or to write information on the removable recording medium 901.
(Removable Recording Medium 901)
The removable recording medium 901 is, for example, a DVD medium, a Blu-ray (registered trademark) medium, an HD DVD medium, various semiconductor storage media, or the like. Obviously, the removable recording medium 901 may be, for example, an IC card equipped with a non-contact type IC chip, an electronic device, or the like.
(Connection Port 882)
The connection port 882 is a port to connect an external connection device 902 and is, for example, a universal serial bus (USB) port, an IEEE 1394 port, a small computer system interface (SCSI), an RS-232C port, an optical audio terminal, or the like.
(External Connection Device 902)
The external connection device 902 is, for example, a printer, a portable music player, a digital camera, a digital video camera, an IC recorder, or the like.
(Communication Device 883)
The communication device 883 is a communication device for connection to a network and is, for example, a communication card for a wired or wireless LAN, Bluetooth (registered trademark), or a wireless USB (WUSB), a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), a modem for various kinds of communication, or the like.
As described above, the information processing terminal 10 according to an embodiment of the present disclosure has a function of determining regions to receive an operation on a display screen and on a side surface on the basis of a detection position of an operating finger on the side surface. With such a function, it becomes possible to reduce an operational burden of a small terminal that has a touch panel and that is used with one hand.
Preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings. However, the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can come up with various changes or modifications within the scope of the technical idea described in claims. Obviously, it is understood that these also belong to the technical scope of the present disclosure.
Also, the effects described in the present description are merely descriptions or examples, and are not limitations. That is, the technology according to the present disclosure may have other effects that are apparent to those skilled in the art from a description of the present description in addition to or instead of the above effects.
Note that the following configurations also belong to the technical scope of the present disclosure.
(1) An information processing device comprising:
a control unit that determines, on the basis of a detection position of an operating finger of a user on a side surface with respect to a display screen, a region to display predetermined information on the display screen, and that determines, on the basis of the detection position, a region to receive an operation by the operating finger from a region that is a combination of the display screen and the side surface.
(2) The information processing device according to (1), wherein
the control unit determines a region to display predetermined information on the display screen on the basis of the detection position of the operating finger executing a trigger operation on the side surface, and determines, on the basis of the detection position, a region to receive a subsequent operation by the operating finger from the region that is the combination of the display screen and the side surface.
(3) The information processing device according to (2), wherein
the control unit limits, on the basis of the detection position of the operating finger that executes the trigger operation on the side surface, the region to receive the subsequent operation to a region assumed to be reached by the operating finger.
(4) The information processing device according to (2) or (3), wherein
the predetermined information is execution information related to function execution of an application, and
in a case where the trigger operation by the operating finger is detected on the side surface, the control unit determines a region to display the execution information on the basis of the detection position of the trigger operation.
(5) The information processing device according to (4), wherein
the execution information includes an icon related to at least one application specified on the basis of a predetermined criterion, and
the control unit determines a region to display the icon on the basis of the detection position of the operating finger that executes the trigger operation on the side surface.
(6) The information processing device according to (5), wherein
the control unit determines a region to display the icon on the basis of the detection position of the operating finger that executes the trigger operation on the side surface, and determines, as the region to receive the subsequent operation by the operating finger, a region within a predetermined range with the detection position as a reference.
(7) The information processing device according to any one of (4) to (6), wherein
the control unit causes the application to execute predetermined processing on the basis of the subsequent operation by the operating finger on the region to receive the operation.
(8) The information processing device according to (7), wherein,
the application is a photographic application, and
the control unit causes the photographic application to execute photographing processing on the basis of the operation by the operating finger on the side surface.
(9) The information processing device according to (8), wherein,
the execution information includes a preview image of an image photographed by the photographic application, and
the control unit causes the preview image to be displayed on the basis of the detection position of the operating finger that executes the trigger operation on the side surface.
(10) The information processing device according to (8) or (9), wherein
the execution information includes a controller that controls the photographing processing, and
the control unit causes the controller to be displayed on the basis of the detection position of the operating finger that executes the trigger operation on the side surface.
(11) The information processing device according to any one of (8) to (10), wherein
the control unit causes the photographic application to execute the photographing processing after a predetermined period elapses from the detection of the trigger operation executed by the operating finger on the side surface.
(12) The information processing device according to any one of (2) to (11), wherein,
the control unit determines the region to receive the subsequent operation on the side surface on the basis of the detection position of a finger of the user on the side surface.
(13) The information processing device according to (12), wherein,
the control unit determines, on the basis of the detection position of the finger of the user on the side surface, a side surface on which it is assumed that the operation is executed by the operating finger among a plurality of side surfaces as the region to receive the subsequent operation on the side surface.
(14) The information processing device according to (1), wherein
the control unit limits the region to receive the operation by the operating finger on the side surface with a height at the detection position of the operating finger on the side surface as a reference.
(15) The information processing device according to any one of (1) to (14), further comprising
a touch panel.
(16) The information processing device according to any one of (1) to (15), wherein
the device is a mobile terminal.
(17) An information processing method comprising:
determining, on the basis of a detection position of an operating finger of a user on a side surface with respect to a display screen, a region to display predetermined information on the display screen, and determining, on the basis of the detection position, a region to receive an operation by the operating finger from a region that is a combination of the display screen and the side surface,
the determining being performed by a processor.