This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2020-098493 filed Jun. 5, 2020.
The present disclosure relates to an information processing apparatus and a non-transitory computer readable medium.
Wearable devices have recently been put to practical use. Examples of this type of device include devices that are worn on a wrist or the like. Some devices of this type have a structural limitation on the positional relationship between the wrist and the device worn around it, while others have no such limitation and may be worn freely. Japanese Unexamined Patent Application Publication No. 2015-179299 is an example of the related art.
For example, there is a device having a display surface that extends approximately halfway around a wrist when the device is worn around the wrist. When the display of such a device displays information, attention-grabbing information is often displayed near the center of the display surface in the longitudinal direction of the display surface. In many cases, the structure of such a device is designed in such a manner that the region of the display surface near its longitudinal center is located at a position where a user may easily look at the region when the device is worn by the user.
However, if the device is not fixedly worn on a body part, and the positional relationship between the body part and the display surface in the longitudinal direction changes, the central region of the display surface in the longitudinal direction will not always be located at a position where the user may easily look at the central region. In such a case, the user needs to change their posture and adjust the angle of the display surface in order to easily look at the display surface. In addition, in the case where the display surface has a ring-like shape, the center of the display surface in the longitudinal direction is not definable.
Aspects of non-limiting embodiments of the present disclosure relate to making it easier for a user to look at predetermined information compared with the case where a device to be used by being worn by a user displays information items in an arrangement that is set without taking into consideration the viewability of a display surface for a user when the user looks at the display surface.
Aspects of certain non-limiting embodiments of the present disclosure overcome the above disadvantages and/or other disadvantages not described above. However, aspects of the non-limiting embodiments are not required to overcome the disadvantages described above, and aspects of the non-limiting embodiments of the present disclosure may not overcome any of the disadvantages described above.
According to an aspect of the present disclosure, there is provided an information processing apparatus including a processor configured to detect a viewable region of a display surface on a user, the viewable region being viewable from the user, and display predetermined information in an area including the center of the viewable region.
Exemplary embodiments of the present disclosure will be described in detail based on the following figures, wherein:
The terminal 1 used in the first exemplary embodiment is used by being worn around a wrist. A body 10 of the terminal 1 has a substantially cylindrical shape. Note that a slit may be formed in a portion of the body 10, and the slit may be expanded when a user puts on or takes off the terminal 1.
A display 11 and a camera 12 are provided on the outer peripheral surface of the body 10. The display 11 is, for example, an organic electroluminescence (EL) display and has a shape curved along the outer peripheral surface of the body 10, that is, a shape having a curved surface. When the body 10 is deformed, the display 11 is also deformed integrally with the body 10.
In the case illustrated in
Alternatively, a plurality of displays 11 may be connected to each other so as to form a single display surface. Obviously, the plurality of displays 11 may be arranged in such a manner as to be spaced apart from each other. Note that the plurality of displays 11 may be spaced apart from each other in the circumferential direction of the body 10 or may be spaced apart from each other in the X-axis direction, which is a heightwise direction.
In the configuration example illustrated in
In the first exemplary embodiment, the single camera 12 is provided at a position near the center of the display 11 in the circumferential direction of the display 11.
Thus, when a user looks at an area near the center of the display 11 from the front, the user's face is located substantially at the center in an image captured by the camera 12.
It goes without saying that the camera 12 may at least be positioned outside the display 11. However, the camera 12 in the first exemplary embodiment is used for determining the location of a user who looks at the display 11, and thus, the camera 12 needs to be disposed in the vicinity of the display 11.
The terminal 1 in the first exemplary embodiment is an example of an information processing apparatus.
The CPU 101 in the first exemplary embodiment sets the arrangement of information items that are displayed on the display 11 through execution of a program. The CPU 101 is an example of a processor. The CPU 101 and the semiconductor memory 102 form a computer.
The semiconductor memory 102 includes a storage device that is used as a work area and a rewritable non-volatile storage device that is used for storing data. The former storage device is a so-called random access memory (RAM), and the latter storage device is a so-called flash memory. Firmware is stored in the flash memory.
The communication module 103 is, for example, a Bluetooth (Registered Trademark) module or a wireless local area network (LAN) module.
The six-axis sensor 104 is a sensor that measures the acceleration and the angular velocity of the terminal 1 and is formed of a three-axis acceleration sensor and a three-axis gyro sensor. Note that a nine-axis sensor that additionally includes a three-axis orientation sensor may be employed instead of the six-axis sensor 104.
The display panel 105 and the capacitive film sensor 106 are included in the above-mentioned display 11. The capacitive film sensor 106 is stacked on a surface of the display panel 105 so as to form a touch panel. The capacitive film sensor 106 is transparent so that a user can see information displayed on the display panel 105 through it.
In addition, the capacitive film sensor 106 detects, from a change in electrostatic capacitance, the position at which a user makes a tap or the like. The display panel 105 is a so-called output device, and the capacitive film sensor 106 is a so-called input device.
The camera 12 is, for example, a complementary metal oxide semiconductor (CMOS) sensor. The camera 12 is an example of an imaging device.
The microphone 107 is a device that converts a user's voice and ambient sound into electrical signals.
The speaker 108 is a device that converts an electrical signal into audio and outputs the audio.
In the first exemplary embodiment, the CPU 101 (see
For example, the CPU 101 determines the relative positional relationship between the user and the camera 12 by detecting the position or the size of a user's face in an image captured by the camera 12.
In a captured image, the size of the face of a user who is closer to the camera 12 is larger than the size of the face of a user who is farther from the camera 12.
When a user looks at the display 11 from the front, the user's face is located substantially at the center in an image captured by the camera 12. In contrast, when a user looks at the display 11 in an oblique direction, the user's face is located at the periphery of an image captured by the camera 12.
As described above, the CPU 101 determines the orientation of a user's face and the positional relationship between the user and the camera 12 on the basis of the size or the position of the user's face captured in an image.
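As a non-limiting illustration of this determination, the following Python sketch detects a face in a captured frame and computes its offset from the image center and its relative size. OpenCV, its bundled Haar cascade, and the normalization scheme are assumptions of this sketch, not part of the disclosure.

```python
# Sketch: estimate a user's position relative to the camera from a captured
# frame. Assumes OpenCV and its bundled frontal-face Haar cascade.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_offset_and_scale(frame):
    """Return the face center's offset from the image center (normalized to
    [-1, 1] per axis) and the face width as a fraction of the frame width."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face captured: fall back to other cues
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    img_h, img_w = gray.shape
    dx = ((x + w / 2) - img_w / 2) / (img_w / 2)  # < 0: face left of center
    dy = ((y + h / 2) - img_h / 2) / (img_h / 2)  # > 0: face below center
    scale = w / img_w                             # larger = user is closer
    return dx, dy, scale
```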
Alternatively, the orientation of a user's face may be determined from the positional relationship or the size relationship between the facial parts captured in an image.
For example, when a user's facial parts, such as the eyes, nose, mouth, and ears, are symmetrically located, the user is looking at the display 11 from the front. In other words, the user's face is oriented in the direction in which the user faces the display 11.
When a user's forehead is large, and the user's chin is small in an image, the user's face is presumed to be oriented in a direction in which the user looks up at the display 11. When the left side of a user's face is large and the right side of the user's face is small or is not visible in an image, the user's face is presumed to be oriented in a direction in which the user looks at the display 11 from the right-hand side.
The direction in which a user looks at the display 11 is presumable also from the position of a pupil in the user's eye. Here, the direction in which the user looks at the display 11 is the direction of the user's line of sight. For example, when a user's pupil is located on the upper side in the user's eye, it is understood that the user is looking up at the display 11, and when the pupil is located on the lower side in the eye, it is understood that the user is looking down at the display 11. Similarly, when the pupil is located on the left side in the eye, it is understood that the user is looking at the display 11 from the right-hand side, and when the pupil is located on the right side in the eye, it is understood that the user is looking at the display 11 from the left-hand side.
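As a non-limiting sketch of the pupil-position heuristic just described, the following classifies the gaze direction from the pupil's normalized position inside the eye region. The eye box and pupil coordinates would come from some landmark detector, and the margin value is an assumption.

```python
# Sketch: classify gaze direction from the pupil's position within the eye
# box, following the heuristics above. Inputs are normalized coordinates in
# [0, 1] within the eye box (hypothetical upstream detector).
def gaze_direction(pupil_x, pupil_y, margin=0.15):
    horizontal = "center"
    vertical = "center"
    if pupil_x < 0.5 - margin:
        horizontal = "from_right"  # pupil on left side: looking from the right
    elif pupil_x > 0.5 + margin:
        horizontal = "from_left"   # pupil on right side: looking from the left
    if pupil_y < 0.5 - margin:
        vertical = "up"            # pupil on upper side: looking up at display
    elif pupil_y > 0.5 + margin:
        vertical = "down"          # pupil on lower side: looking down at display
    return horizontal, vertical
```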
When the relationship between the orientation of a user's face and the position of the camera 12 is determined, the relationship between the orientation of the user's face and the position of the display 11 is also determined.
Note that a user's face does not need to be entirely captured in an image for the positional relationship to be detected. In addition, if a user's face is registered beforehand, the positional relationship can be determined with higher accuracy.
Faces other than the face of a user wearing the terminal 1 (see
Once the relationship between the orientation of the user's face and the position of the camera 12 has been determined, the CPU 101 determines an area of the display 11 that is viewable from the user (step 2).
In the first exemplary embodiment, the surface of the display 11 is curved. Thus, the entire display 11 is not always viewable depending on the relationship between the orientation of the user's face and the display 11. For example, a portion of the display 11 having the curved display surface, the portion being located in the user's blind spot, is not viewable from the user. Accordingly, the CPU 101 determines, from the determined relationship between the orientation of the user's face and the position of the camera 12, an area that is viewable from the user. More specifically, the CPU 101 determines a viewable area by also using the curvature of the display 11.
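The geometric idea may be sketched as follows: on a display curved around a cylinder, a point whose outward surface normal no longer faces the viewer lies in a blind spot. The half-angle below is an assumed practical limit, not a value from the disclosure.

```python
# Sketch of the geometric test behind step 2: a point on a cylindrical
# display at angle theta (measured around the body) has outward normal
# (cos(theta), sin(theta)). For a viewer located far away along direction
# theta_view, the point is viewable only while that normal still faces the
# viewer; beyond 90 degrees the curved surface occludes itself.
import math

def viewable_arc(theta_view_deg, half_fov_deg=60.0):
    """Return (start, end) angles in degrees of the arc of the display
    surface that faces a viewer located along theta_view_deg."""
    half = min(half_fov_deg, 90.0)  # 90 deg is the self-occlusion limit
    return (theta_view_deg - half) % 360.0, (theta_view_deg + half) % 360.0

def is_viewable(theta_deg, theta_view_deg, half_fov_deg=60.0):
    diff = (theta_deg - theta_view_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= min(half_fov_deg, 90.0)
```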
When the area that is viewable from the user is determined, the CPU 101 positions an information item regarding the time (hereinafter referred to as “time information item”) near the center of the determined area (step 3). Here, the time information item is an example of an information item that is specified beforehand by the user.
In the first exemplary embodiment, an information item that is specified beforehand by a user is positioned at a location on the display 11 where the user may easily look at the information item, that is, the information item is positioned near the center of an area that is viewable from the user. Although
An information item that is positioned near the center of an area viewable from a user is an information item that is desired to be preferentially viewed by the user.
In the first exemplary embodiment, an information item that is positioned near the center of an area viewable from a user will also be referred to as a high-priority information item. Note that the other information items that are not a high-priority information item will be referred to as low-priority information items. The priority of each information item is specified beforehand by a user. Note that a user may specify only the priority of an information item to be positioned near the center of a viewable area, and information items to which no priority is given may be regarded as low-priority information items.
In the first exemplary embodiment, there is one high-priority information item; however, there may be a plurality of high-priority information items. In that case as well, the high-priority information items are preferentially arranged near the center of a viewable area.
Note that, in the case where priorities are assigned to a plurality of predetermined information items, the information item having a higher priority may be positioned closer to the center of a viewable area.
In the case where priorities are not assigned to a plurality of predetermined information items, a region that is required for displaying these information items may be secured near the center of a viewable area, and the information items may be uniformly arranged in the region.
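As a non-limiting sketch of such an arrangement rule, the following places the highest-priority item at the center of the viewable area and fans the remaining items out on alternating sides; the item names and slot spacing are hypothetical.

```python
# Sketch of the priority-based placement: higher priority = nearer the center.
def arrange(items, center_deg, slot_deg=25.0):
    """items: list of (name, priority) pairs, larger priority = more
    important. Returns {name: angle_deg} around the display surface."""
    ordered = sorted(items, key=lambda it: it[1], reverse=True)
    placement = {}
    for i, (name, _) in enumerate(ordered):
        side = -1 if i % 2 else 1  # alternate sides of the center
        step = (i + 1) // 2        # 0, 1, 1, 2, 2, ...
        placement[name] = (center_deg + side * step * slot_deg) % 360.0
    return placement

# e.g. arrange([("time", 2), ("weather", 1), ("mail", 1), ("steps", 0)], 180.0)
# puts "time" at 180 deg and the rest progressively farther from the center.
```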
Arrangement of information items may be changed over time in accordance with a predetermined rule. For example, the positions of information items may be interchanged, or information items may be cyclically moved in a predetermined direction.
Note that the display size of an information item that is positioned near the center of an area viewable from a user may be changed in accordance with the size of an area of the display 11 that is viewable from the user. For example, the information item that is displayed near the center of the viewable area may be enlarged or reduced in size so as to correspond to the size of the viewable area. Here, an information item to be displayed is enlarged or reduced in size by changing, for example, the size of an icon or the font size.
In the first exemplary embodiment, the size of a viewable area is determined by the length or the angle of the display surface in the circumferential direction. Obviously, if an information item to be displayed is simply reduced in size, it may sometimes become difficult to see the information item. In such a case, the display size may be set so as not to be reduced to be smaller than a predetermined size. Similarly, the size of the information item to be displayed may be set so as not to be enlarged to be larger than a predetermined size.
Alternatively, the size of an information item to be displayed may be set to a fixed size regardless of an area that is viewable from a user. In this case, if the viewable area is too small for the size required for displaying the information item, the information item may be viewed by a scroll operation.
In addition, the number of information items to be displayed may be increased or decreased in accordance with the viewable area.
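A minimal sketch of the sizing behavior described above, with all numeric values assumed for illustration:

```python
# Sketch: scale the font with the viewable arc, clamped to minimum and
# maximum sizes so the item never becomes unreadably small or overly large.
def font_size_for_area(arc_deg, base_arc_deg=120.0, base_pt=24.0,
                       min_pt=12.0, max_pt=36.0):
    scaled = base_pt * (arc_deg / base_arc_deg)
    return max(min_pt, min(max_pt, scaled))
```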
Once the position of the time information item has been set, the CPU 101 arranges the other information items in the remaining region of the determined area in accordance with a predetermined rule (step 4).
The other information items that are arranged in step 4 may be individually set by a user separately from the information item that is positioned near the center of the viewable area or may be set by the terminal 1 in accordance with a predetermined rule. In the case where the other information items are set by a user, the settings made by the user are given priority over the settings made in accordance with the rule.
The CPU 101 sets the arrangement of the information items in such a manner as to, for example, uniformly arrange the other information items in the remaining region. The arrangement may be set in accordance with the area of the remaining region and the contents of the other information items.
After that, the CPU 101 causes the information items to be displayed in the set arrangement (step 5).
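Putting steps 1 through 5 together, a non-limiting sketch of the overall update loop might look as follows. `estimate_view_angle` and `display.render` are hypothetical helpers; the other functions reuse the earlier sketches.

```python
# End-to-end sketch of steps 1-5, stitching together the helpers above.
def update_layout(frame, items, display):
    result = face_offset_and_scale(frame)            # step 1: find the face
    if result is None:
        return                                       # no face: keep layout
    dx, dy, scale = result
    theta_view = estimate_view_angle(dx, dy, scale)  # step 1 (hypothetical)
    start, end = viewable_arc(theta_view)            # step 2: viewable area
    center = (start + (end - start) % 360.0 / 2) % 360.0
    placement = arrange(items, center)               # steps 3 and 4
    display.render(placement)                        # step 5
```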
Differences in arrangement of information items according to the positional relationship between a user looking at the terminal 1 and the display 11 of the terminal 1 will be described below with reference to
The user illustrated in
The CPU 101 (see
The user illustrated in
Thus, the user's face is located near the lower end of the image captured by the camera 12. The distance between the user's face and the camera 12 in the case illustrated in
The CPU 101 (see
Thus, a central region of the viewable area is located near an intermediate position between the camera 12 and the lower end of the display 11. In
An operation for changing an arrangement or the like of information items displayed on the display 11 and examples of arrangement change and so forth as a result of performing the operation will be described below.
In the case illustrated in
In the first modification, the CPU 101 determines whether an area touched and held by a user is a “central region of the area that is determined as viewable” or a “region of the viewable area other than the central region”. In the case illustrated in
In the first modification, the CPU 101 accepts changes of the positions of all the low-priority information items displayed on the display 11.
The information items in
Subsequently, the four low-priority information items are cyclically moved each time the user performs the touch-hold and drag operation. Note that the time information item, which is a high-priority information item, is displayed at a fixed position.
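A minimal sketch of this cyclic movement, with the slot and item representations assumed for illustration:

```python
# Sketch of the first modification: a touch-hold outside the central region
# followed by a drag cyclically shifts the low-priority items by one slot;
# the high-priority item keeps its fixed position.
def on_drag(slots, low_priority_items, direction):
    """slots: ordered display positions for the low-priority items.
    direction: +1 or -1, taken from the drag gesture."""
    n = len(low_priority_items)
    rotated = [low_priority_items[(i - direction) % n] for i in range(n)]
    return dict(zip(slots, rotated))
```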
In the case illustrated in
Although it is very likely that a central region of an area that is viewable from a user is easier for the user to look at than the other regions are, the user may sometimes desire to move a high-priority information item to a different position.
In the case illustrated in
In the second modification, since the area of the time information item, which is a high-priority information item, is touched and held, the CPU 101 accepts a change of the position of the high-priority information item.
The information items in
In the case illustrated in
In the case illustrated in
The information items in
Also in the case illustrated in
In
In
Note that, if the user double-taps the application image again, the display form of the time is changed back to the original display form.
When the terminal 1 receives a call or an e-mail, an image that represents the incoming call or e-mail may be displayed near the center of the display 11 without any user operation, and the time, which is a high-priority information item, may be displayed in the same area by reducing its font size as illustrated in
In the above-described first exemplary embodiment, an image captured by the camera 12 (see
The terminal 1A that is used in the second exemplary embodiment is used by being worn around a wrist. The body 10 has a substantially cylindrical shape.
Note that the inner diameter of the body 10 in the second exemplary embodiment is larger than the diameter of a wrist around which the terminal 1A is to be worn. More specifically, a user may wear the terminal 1A by passing their hand through the opening of the body 10. Thus, the terminal 1A is wearable on a wrist without deforming the body 10. In the state where a user is wearing the terminal 1A, the position of the body 10 and the position of the user's wrist are not fixed with respect to each other. In other words, the body 10 is freely rotatable in the circumferential direction of the wrist.
The display 11 of the terminal 1A in the second exemplary embodiment has a substantially ring-like shape. In other words, the display 11 is provided in such a manner as to extend over substantially the entire circumferential surface of the body 10, which has a substantially cylindrical shape. Thus, an area that is viewable from a user is limited to a region of the substantially cylindrical shape that is oriented toward the user. However, in the case of the terminal 1A of the second exemplary embodiment, which region is oriented toward the user is not fixed in advance.
In the second exemplary embodiment, contact sensors 13 are arranged in such a manner as to be equally spaced on the inner peripheral surface of the body 10, that is, a surface of the body 10 that is opposite to the outer peripheral surface of the body 10 on which the display 11 is provided. In
The terminal 1A includes the CPU 101 that performs overall control of the device, the semiconductor memory 102 that stores programs and data, the communication module 103 that is used in communication with the outside, the six-axis sensor 104 that detects the movement and posture of a user wearing the terminal 1A, the display panel 105 that displays information, the capacitive film sensor 106 that detects a user operation performed on the display panel 105, the contact sensors 13, the microphone 107, and the speaker 108.
The difference between the terminal 1A and the terminal 1 of the first exemplary embodiment is that the contact sensors 13 are used instead of the camera 12 (see
In the second exemplary embodiment, the CPU 101 (see
When the terminal 1A is not worn by a user, the CPU 101 keeps outputting a negative result in step 11. During the period when the negative result is obtained in step 11, the CPU 101 repeats the determination in step 11.
When a user wears the terminal 1A on their wrist, and any one of the contact sensors 13 is brought into contact with a part of the user's body, an affirmative result is obtained in step 11.
When the affirmative result is obtained in step 11, the CPU 101 determines the position of the contact sensor 13 that is in contact with the user's body (step 12). The number of contact sensors 13 detected to be in contact with the user's body is not limited to one and may sometimes be two or more.
Next, the CPU 101 determines an area that is viewable from the user on the basis of the position of the contact sensor 13 detected to be in contact with the user's body (step 13). In the second exemplary embodiment, the viewable area is determined on the assumption that the user looks down at the portion of the display surface located at the position corresponding to the position on the inner peripheral surface of the body 10 where the contact sensor 13 detects contact with the user's body.
Note that, in the case where two or more of the contact sensors 13 are detected to be in contact with the user's body, an intermediate position between the detected contact sensors 13 in the circumferential direction of the body 10 is calculated, and the viewable area is determined on the basis of the calculated position. The outer edge of the viewable area is calculated by using the curvature of the display 11.
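As a non-limiting sketch of steps 12 and 13, the circular mean of the contacting sensors' angular positions gives the point on the body 10 around which the viewable area is centered. The sensor count and indexing are assumptions.

```python
# Sketch: with n_sensors contact sensors equally spaced around the inner
# surface, compute the intermediate angular position of the contacting ones.
import math

def contact_center_deg(contact_indices, n_sensors=8):
    angles = [2 * math.pi * i / n_sensors for i in contact_indices]
    # circular mean avoids the wrap-around error of a plain average
    x = sum(math.cos(a) for a in angles)
    y = sum(math.sin(a) for a in angles)
    return math.degrees(math.atan2(y, x)) % 360.0

# e.g. sensors 7 and 0 of 8 contacting -> 337.5 deg, not the naive 157.5 deg
```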
Next, the CPU 101 positions the time information item near the center of the determined area (step 3). Subsequently, the CPU 101 arranges the other information items in the remaining region of the determined area in accordance with a predetermined rule (step 4) and causes the information items to be displayed in the set arrangement (step 5).
A specific example of a viewable area in the second exemplary embodiment will be described below with reference to
In the terminal 1A used in the second exemplary embodiment, substantially the entire circumferential surface of the body 10 serves as the display surface, and thus, a viewable area is set on the assumption that a portion of the body 10 that is in contact with a wrist is located on the upper side in the vertical direction.
The position of the printed mark illustrated in
In the second exemplary embodiment, the time is displayed near the center of the area viewable from the user regardless of the position of the portion on which the mark is printed with respect to the wrist.
In the second exemplary embodiment, although an area that is viewable from a user is determined on the basis of a position at which at least one of the contact sensors 13 (see
In the third exemplary embodiment, first, the CPU 101 determines whether any one of the contact sensors 13 detects contact (step 11), and if contact is detected, the CPU 101 determines the position of the contact sensor 13 that is in contact with a user's body (step 12).
After that, the CPU 101 determines whether there is a human face in an image captured by the camera 12 (step 21).
This determination is made because, in the third exemplary embodiment, only one camera 12 is provided even though the orientation of the body 10 with respect to the wrist is freely changeable.
If there is a human face in an image captured by the camera 12, the CPU 101 obtains an affirmative result in step 21. In this case, similar to the first exemplary embodiment, the CPU 101 determines the relationship between the orientation of the user's face and the position of the camera 12 from the image captured by the camera 12 (step 1). Subsequently, the CPU 101 determines an area of the display 11 that is viewable from the user (step 2).
In contrast, if there is no human face in an image captured by the camera 12, the CPU 101 obtains a negative result in step 21. In this case, the CPU 101 determines an area that is viewable from the user on the basis of the position of the contact sensor 13 detected to be in contact with the user (step 13). The subsequent steps are similar to those in the first and second exemplary embodiments.
In the third exemplary embodiment, even if there is no human face in an image captured by the camera 12, the time information item may be displayed near the center of an area that is highly likely to be viewable from a user. However, the method of determining a viewable area on the basis of the position of the contact sensor 13 that detects contact assumes that the user looks down at the portion of the terminal 1B that is detected to be in contact with the user. Thus, if the user actually looks at a portion of the terminal 1B that differs from this assumption, the displayed time is not always easily viewable from the user. Accordingly, in the third exemplary embodiment, when a user is captured in an image by the camera 12, which is provided on the body 10, the captured image is used so as to reliably display the time at a position where the time is easily viewable from the user.
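A non-limiting sketch of this camera-first decision flow, reusing the hypothetical helpers from the earlier sketches:

```python
# Sketch: prefer the camera when a face is captured (step 21), otherwise
# fall back to the contact sensors (step 13).
def determine_viewable_center(frame, contact_indices):
    result = face_offset_and_scale(frame)           # step 21: face present?
    if result is not None:
        dx, dy, scale = result
        return estimate_view_angle(dx, dy, scale)   # steps 1-2, camera-based
    if contact_indices:
        return contact_center_deg(contact_indices)  # step 13, sensor-based
    return None                                     # no cue: keep last layout
```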
The terminal 1 (see
The body 10 in the fourth exemplary embodiment may be used in, for example, a flat plate-like shape. Alternatively, the body 10 in the fourth exemplary embodiment may be used by being deformed into a C-shape or a J-shape when viewed from the side.
Note that the display 11 has flexibility so as to be deformable integrally with the body 10. Here, the display 11 is an example of a display device that is deformable.
In the fourth exemplary embodiment, an area that is viewable from a user is determined by using the contact sensors 13 in addition to the camera 12.
The terminal 1 (see
The terminal 1D that is used in the fifth exemplary embodiment is also used by being worn around a wrist.
The terminal 1D in the fifth exemplary embodiment includes a bar-shaped body 20 having a length that enables the body 20 to be wrapped around a wrist. In the fifth exemplary embodiment, the body 20 has a rectangular parallelepiped shape.
Two cameras 21 are arranged on a surface of the body 20, the surface being the front surface of the body 20 when the body 20 is wrapped around a user's wrist, and two projectors 22 are arranged on a side surface of the body 20, the side surface facing a user's arm when the body 20 is wrapped around the user's wrist.
Each of the cameras 21 is paired with one of the projectors 22. In the fifth exemplary embodiment, each pair of a camera 21 and a projector 22 is arranged at the same distance from an end of the body 20. The two cameras 21 are provided for the purpose of detecting the face of a user who wears the terminal 1D. The two projectors 22 are provided for the purpose of projecting information onto a user's arm.
One of the two cameras 21 corresponds to the projector 22 that projects an image onto the palm side of the user's arm when the body 20 is wrapped around the user's wrist, and the other camera 21 corresponds to the projector 22 that projects an image onto the back-of-hand side of the user's arm.
A plurality of infrared sensors 23 are arranged in a row below the projectors 22. The infrared sensors 23 detect a user operation that is performed on an image projected onto the user's arm. The area in which the infrared sensors 23 are arranged is set in accordance with the width of an image that is projected onto the user's arm.
The terminal 1D includes the CPU 101 that performs overall control of the device, the semiconductor memory 102 that stores programs and data, the communication module 103 that is used in communication with the outside, the six-axis sensor 104 that detects the movement and posture of a user wearing the terminal 1D, the projectors 22 that project information, the infrared sensors 23 that detect user operations, the cameras 21, the microphone 107, and the speaker 108.
The CPU 101 in the fifth exemplary embodiment sets the arrangement of information items that are projected by the projectors 22 through execution of a program. The CPU 101 is an example of a processor.
In the fifth exemplary embodiment, the CPU 101 (see
Once the position of the camera 21 capturing the user's face has been determined, the CPU 101 determines the projector 22 that is capable of projecting a display surface onto a portion of the user's arm that is viewable from the user (step 32). Since each of the cameras 21 is paired with one of the projectors 22, when the position of one of the cameras 21 is determined, the position of the corresponding projector 22 is also determined.
Then, the CPU 101 positions the time information item near the center of the display surface projected by the determined projector 22 (step 33).
Once the position of the time information item has been set, the CPU 101 arranges the other information items in the remaining region of a determined area in accordance with a predetermined rule (step 34).
After that, the CPU 101 causes the information items to be displayed in the set arrangement (step 5).
In the fifth exemplary embodiment, the display surface is projected by the projector 22 that is paired with the camera 21 capturing a user's face, and the time is positioned near the center of the projected display surface.
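A non-limiting sketch of this selection logic; the camera objects, the pairing table, and the face-detection helper reused from the first exemplary embodiment are assumptions of the sketch.

```python
# Sketch of steps 31-33: each camera is paired with one projector, so
# finding the camera that captures the user's face also selects the
# projector whose projected display surface the user can see.
def select_projector(cameras, camera_to_projector):
    for cam in cameras:
        frame = cam.capture()                        # hypothetical API
        if face_offset_and_scale(frame) is not None:  # step 31
            return camera_to_projector[cam.id]        # step 32
    return None  # no face captured by either camera

# The chosen projector then renders the time information item near the
# center of its projected display surface (step 33).
```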
Note that
Although the exemplary embodiments of the present disclosure have been described above, the technical scope of the present disclosure is not limited to the above-described exemplary embodiments. It is obvious from the description of the claims that other exemplary embodiments obtained by making various changes and improvements to the above-described exemplary embodiments are also included in the technical scope of the present disclosure.
For example, in the above-described exemplary embodiments, although an area that is viewable from a user is detected by using the camera 12 (see
In the above exemplary embodiments, although the terminal 1 (see
In addition, although each of the above exemplary embodiments describes a case in which the display surface of the terminal has an area extending approximately halfway around a part of a human body on which the terminal is worn, it is sufficient that the display 11 has a curved display surface whose viewability varies depending on the position from which a user looks at the display 11.
Note that, in the above-described exemplary embodiments, the term “processor” refers to hardware in a broad sense. Examples of the processor include general processors (e.g., CPU: Central Processing Unit) and dedicated processors (e.g., GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, and programmable logic device).
In the above-described exemplary embodiments, the term “processor” is broad enough to encompass one processor or plural processors in collaboration which are located physically apart from each other but may work cooperatively. The order of operations of the processor is not limited to one described in the exemplary embodiments above, and may be changed.
The foregoing description of the exemplary embodiments of the present disclosure has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical applications, thereby enabling others skilled in the art to understand the disclosure for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the following claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
2020-098493 | Jun. 5, 2020 | JP | national