Embodiments described herein relate generally to an electronic apparatus, a method and a storage medium.
Recently, electronic apparatuses called wearable devices, which are used while worn on the user's body, have been developed.
Wearable devices can take various forms; for example, a glasses-type wearable device worn on the head of the user is known. The glasses-type wearable device allows various types of information to be displayed on a display provided in a lens portion of the device.
However, if the glasses-type wearable device is used while, for example, the user is walking, the use is sometimes dangerous depending on the state or condition of the user.
Thus, display of the glasses-type wearable device is preferably controlled in accordance with the state or condition of the user wearing the glasses-type wearable device.
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
In general, according to one embodiment, an electronic apparatus is provided in which a user can see through at least a transparent part of a first display area when the electronic apparatus is worn on a body of the user. The electronic apparatus includes a camera configured to take an image of surroundings comprising a region which the user cannot see through at least the transparent part of the first display area when the electronic apparatus is worn on the body of the user, and circuitry configured to control display of the first display area by using the image of the surroundings.
First, a first embodiment will be described.
An electronic apparatus 10 shown in
The electronic apparatus body 11 is embedded, for example, in a frame portion of a glasses shape of the electronic apparatus 10 (hereinafter referred to as a frame portion of the electronic apparatus 10). It should be noted that the electronic apparatus body 11 may be attached to, for example, a side of the frame portion of the electronic apparatus 10.
The display 12 is supported by a lens portion of the glasses shape of the electronic apparatus 10 (hereinafter referred to as a lens portion of the electronic apparatus 10). When the electronic apparatus 10 is worn on the head of the user, the display 12 is arranged in a position visually identified by the user.
The camera 13 is mounted on a frame of the electronic apparatus 10 near the display 12 as shown in, for example,
The processor 11a is a processor configured to control an operation of each component in the electronic apparatus 10. The processor 11a executes various types of software loaded from the non-volatile memory 11b which is a storage device into the main memory 11c. The processor 11a includes at least one processing circuitry, for example, a CPU or an MPU.
The display 12 is a display for displaying various types of information. The information displayed on the display 12 may be kept in, for example, the electronic apparatus 10, or may be acquired from an external device of the electronic apparatus 10. If the information displayed on the display 12 is acquired from the external device, wireless or wired communication is executed between the electronic apparatus 10 and the external device through, for example, a communication device (not shown).
The camera 13 is an imaging device configured to image surroundings (take an image of surroundings) of the electronic apparatus 10. If the camera 13 is mounted in a position shown in
The touch sensor 14 is a sensor configured to detect a contact position of, for example, a finger of the user. The touch sensor 14 is provided in, for example, the frame portion of the electronic apparatus 10. For example, a touch panel can be used as the touch sensor 14.
In this embodiment, all or part of the image acquisition module 101, the state estimation module 103, the display controller 104 and the operation acceptance module 105 may be realized by causing the processor 11a to execute a program, that is, by software, may be realized by hardware such as an integrated circuit (IC), or may be realized as a combination of software and hardware. Further, in this embodiment, the storage 102 is stored in the non-volatile memory 11b.
Although the electronic apparatus 10 includes the storage 102 in
The image acquisition module 101 acquires an image (for example, still image) of a scene around the electronic apparatus 10 which is taken by the camera 13. It should be noted that the image acquired by the image acquisition module 101 includes, for example, various objects present around the electronic apparatus 10.
The storage 102 prestores an object pattern in which, for example, information concerning an object is defined.
The state estimation module 103 detects an object included in the image acquired by the image acquisition module 101 based on the object pattern stored in the storage 102. The state estimation module 103 estimates the state of the user wearing the electronic apparatus 10 based on the detected object.
The display controller 104 executes processing of displaying various types of information on the display 12. Even when the various types of information are displayed on the display 12, the display area in which the information is displayed retains a fixed degree of transparency (permeability). Further, the display controller 104 includes a function of controlling display of (the display area on) the display 12 (hereinafter referred to as an automatic display control function) based on the state of the user estimated by the state estimation module 103 (that is, an imaging result by the camera 13).
The operation acceptance module 105 includes a function of accepting an operation performed on the electronic apparatus 10 by the user. The operation accepted by the operation acceptance module 105 includes, for example, an operation on the above-described touch sensor 14.
Next, processing procedures of the electronic apparatus 10 according to this embodiment will be described with reference to the flowchart of
Here, in the electronic apparatus 10 according to this embodiment, predetermined information can be displayed on the display 12 in accordance with, for example, an operation by the user wearing the electronic apparatus 10 (block B1).
The information displayed on the display 12 includes, for example, various types of information such as a motion picture, a web page, a weather forecast and a map. Further, the display 12 is arranged in a position visually identified by the user if the electronic apparatus 10 is worn on the head of the user, as described above. Accordingly, if the user wears the electronic apparatus 10, the predetermined information is displayed (on the display 12) within the user's sight, and the user can visually identify the displayed information without, for example, holding the electronic apparatus 10 in a hand.
It should be noted that the display 12 is constituted of, for example, a special lens, and the various types of information are projected on the display 12 by a projector (not shown) provided in, for example, the frame portion of the electronic apparatus (glasses-type wearable device) 10. This allows the various types of information to be displayed on the display 12. Although the information is displayed on the display 12 using the projector in this description, another structure may be adopted as long as the information can be displayed on the display 12.
Moreover, although the display 12 is supported by the lens portion corresponding to each of both eyes in the glasses shape as shown in
If the predetermined information is displayed on the display 12 as described above, the image acquisition module 101 acquires an image of a scene around the electronic apparatus 10 taken by the camera 13 (for example, a scene in a sight direction of the user) (block B2). It should be noted that the image acquired by the image acquisition module 101 may be a still image or a moving image in this embodiment.
Next, the state estimation module 103 executes processing of detecting an object from the image acquired by the image acquisition module 101 (block B3).
In this case, the state estimation module 103 analyzes the image acquired by the image acquisition module 101, and applies the object pattern stored in the storage 102 to the analysis result.
Here, for example, information concerning an object arranged outside a house (on a street), an object arranged at home and a person (for example, a shape of the object) is defined as the object pattern stored in the storage 102. It should be noted that the objects arranged outside a house include, for example, a car, a building and various signs. Further, the objects arranged at home include, for example, furniture and home electrical appliances. By using such object patterns, the state estimation module 103 can detect, as an object (that is, an object arranged outside a house, an object arranged at home, a person, etc.), an area in the image acquired by the image acquisition module 101 which corresponds to a shape, etc., defined as an object pattern. It should be noted that the object patterns stored in the storage 102 can be updated as appropriate.
Next, the state estimation module 103 estimates the state of the user (the state around the user) based on the detection result of an object (block B4). Specifically, the state estimation module 103 estimates that the user is out if an object arranged outside a house is detected from the image acquired by the image acquisition module 101. Further, the state estimation module 103 estimates that the user is at home if an object arranged at home is detected from the image acquired by the image acquisition module 101. If the state estimation module 103 estimates that the user is out, the state estimation module 103 further detects persons (the number of persons) from the image acquired by the image acquisition module 101.
The display controller 104 determines whether the display on the display 12 needs to be controlled (changed) or not based on the state of the user estimated by the state estimation module 103 (block B5).
Here, if, for example, the user is out and a large number of people are present around the user (that is, the user is in a crowd), the user's view of the surroundings is not sufficiently secured while the predetermined information is displayed on the display 12, which may interfere with the user's walking. Thus, if the state estimation module 103 estimates that the user is out and a large number of people are detected by the state estimation module 103 (the number is larger than a preset value), the display controller 104 determines that the display on the display 12 needs to be controlled (restricted). If, for example, an object which may bring danger to the user is detected, it may be determined that the display on the display 12 needs to be controlled even if the user is not in a crowd.
On the other hand, if the state estimation module 103 estimates that the user is at home, the display controller 104 determines that the display on the display 12 need not be controlled.
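The flow of blocks B3 to B5 (object detection, state estimation and the control decision) may be organized, for example, as in the following sketch. Python and OpenCV template matching are used here purely for illustration; the matching method, the thresholds and all names (detect_objects, CROWD_THRESHOLD, the template file names, etc.) are assumptions and not part of the embodiment.

```python
import cv2  # OpenCV; stands in for the unspecified pattern-matching engine

# Hypothetical object patterns: template images tagged with a category,
# standing in for the shapes defined as object patterns in the storage 102.
OBJECT_PATTERNS = [
    ("outdoor", "car_template.png"),
    ("outdoor", "road_sign_template.png"),
    ("home", "sofa_template.png"),
    ("person", "person_template.png"),
]

MATCH_THRESHOLD = 0.8  # assumed similarity threshold
CROWD_THRESHOLD = 5    # assumed preset value for "a large number of people"

def detect_objects(image_gray):
    """Block B3: return the category of every object pattern found in the image."""
    found = []
    for category, path in OBJECT_PATTERNS:
        template = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        scores = cv2.matchTemplate(image_gray, template, cv2.TM_CCOEFF_NORMED)
        if (scores >= MATCH_THRESHOLD).any():
            found.append(category)  # a real person count would need a detector

    return found

def estimate_user_state(categories):
    """Block B4: 'out' if an outdoor object is seen, 'home' if a home object is."""
    if "outdoor" in categories:
        return "out", categories.count("person")
    if "home" in categories:
        return "home", 0
    return "unknown", 0

def needs_display_control(state, person_count):
    """Block B5: restrict the display when the user appears to be in a crowd."""
    return state == "out" and person_count > CROWD_THRESHOLD
```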
If it is determined that the display on the display 12 needs to be controlled (YES in block B5), the display controller 104 controls the display (state) on the display 12 by the automatic display control function (block B6). It should be noted that the display control may be performed on both of the displays 12, or may be performed on only one of the displays 12.
The electronic apparatus 10 according to the embodiment controls display of the display area by using the image of the surroundings comprising a region which the user cannot see through at least a transparent part of the display area when the electronic apparatus 10 is worn on the body of the user.
Processing of controlling the display on the display 12 by the display controller 104 (automatic display control function) will be hereinafter described.
Here, a case where information is displayed in the whole area (screen) of the display 12 as shown in
It should be noted that area 12a shown in
According to these display area patterns, the user's sight is secured in at least part of the direction passing through the transparent display area when the electronic apparatus 10 is worn on part of the user's body to be used.
It should be noted that the first to fifth display area patterns are kept in the display controller 104 in advance. Further, the display area patterns described above are examples, and other display area patterns may be kept.
If the user is in a crowd as described above, the display area of the display 12 as shown in
Further, instead of changing to one of the display area patterns kept in the display controller 104 in advance as described above, information may be displayed, for example, only in an area in which no person is detected. Moreover, even if the state estimation module 103 estimates, for example, that the user is at home, it can be estimated that the user is watching TV when a TV is detected from an image by the state estimation module 103. Thus, information can also be displayed only in an area in which the TV is not detected.
Here, if a display area pattern is changed in a state where information is displayed in the whole area of the display 12 as shown in
Specifically, if a plurality of information items are displayed in the whole area of the display 12, for example, a preference of the user is analyzed and a priority for each information item is determined in accordance with the analysis result. Information items determined to have high priority are thereby displayed on the display 12 (or in the display area of the display 12). It should be noted that the information necessary to analyze the preference of the user may be kept in, for example, the electronic apparatus 10, or may be acquired from an external device.
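For example, the priority-based narrowing of displayed items could look like the following sketch; the preference scores and all names are hypothetical, and how the preference is actually analyzed is left open by the embodiment.

```python
def select_items_for_reduced_area(items, preference_scores, capacity):
    """Keep only the highest-priority items that fit the reduced display area.

    items: identifiers of the information items currently displayed.
    preference_scores: assumed mapping of item -> score from preference analysis.
    capacity: number of items the changed display area pattern can show.
    """
    ranked = sorted(items, key=lambda item: preference_scores.get(item, 0.0),
                    reverse=True)
    return ranked[:capacity]

# Example: with room for one item, only the highest-scored item survives.
# select_items_for_reduced_area(["weather", "map", "news"],
#                               {"weather": 0.9, "map": 0.4, "news": 0.2}, 1)
```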
Further, although a display area pattern is changed in this description, control to change display content of the display 12 may be performed without changing the display area pattern.
Specifically, if the user is in a crowd (that is, it is necessary to pay attention to the surroundings), information calling attention, for example, “crowded,” may be displayed. Similarly, if a motion picture including a caption is displayed on the display 12 when the user is in a crowd, the caption may be automatically turned off.
Further, when the state of the user is estimated, for example, a matter to which attention should be paid around the user may be preferentially displayed by acquiring a present location of the user using, for example, the Global Positioning System (GPS). It should be noted that the matter to which attention should be paid around the user can be acquired from regional information of the present location of the user, etc.
Further, in a state of emergency such as an earthquake, information concerning the emergency (for example, emergency news report) may be displayed in preference to other information (for example, display of other information is turned off).
Moreover, when the information calling attention or the information concerning an emergency is displayed, the display form can be changed (for example, a color can be changed), or characters can be enlarged in consideration of, for example, human visual characteristics and the color of the surrounding scene.
Although a case where the display controller 104 changes a display area (pattern) or display content of information on the display 12 is mainly described, other control (processing) may be performed in the electronic apparatus 10 according to this embodiment, as long as the display on the display 12 is controlled (changed) in accordance with, for example, the state of the user estimated by the state estimation module 103. If information is displayed to be visually identified with, for example, both eyes (that is, on both of the displays 12), the display may be changed (controlled) to display the information to be visually identified with one eye (that is, on one of the displays 12). The same is true of each of the following embodiments.
Even if the display on the display 12 is changed (controlled) in accordance with the state of the user estimated by the state estimation module 103 as described above, the change (that is, display control) is sometimes unnecessary for the user. Specifically, for example, even if a large number of people are present around the user, the control of the display on the display 12 as described above is often unnecessary if the user is not walking. In such a case, the user can perform a predetermined operation (hereinafter referred to as a display switching operation) on the electronic apparatus 10 to switch the display on the display 12 (that is, return it to the state before the processing of block B6 is executed). It should be noted that the display switching operation performed on the electronic apparatus 10 by the user is accepted by the operation acceptance module 105.
Examples of display switching operations performed on the electronic apparatus 10 will be hereinafter described with reference to
Here, as described above, the touch sensor (for example, touch panel) 14 is provided in the frame portion of the electronic apparatus 10 in this embodiment. Thus, the contact (position) of a finger, etc., of the user with the frame portion, the moving direction of the contact position, etc., can be detected in the electronic apparatus 10. Accordingly, for example, each of the operations described below can be detected in the electronic apparatus 10.
In the following description, of the frame portion of the electronic apparatus 10, the portion supporting the lenses (the display 12) is referred to as the front (portion), and the portions other than the front portion, which include the ear hooks, are referred to as the temple (portions). Further, when the electronic apparatus 10 is worn, the temple portion located on the right side of the user is referred to as the right temple portion, and that located on the left side of the user is referred to as the left temple portion.
Although the first to eighth operations can be detected by the touch sensor 14 provided in the frame portion of the electronic apparatus 10, operations performed on the electronic apparatus 10 may also be detected by other sensors, etc.
In this embodiment, for example, at least one of the first to ninth operations is specified as a display switching operation.
It should be noted that the first to ninth operations are just examples and other operations may be specified as a display switching operation. As the other operations, for example, a nail of the user may be brought into contact with the frame portion of the electronic apparatus 10, or a finger may be alternately brought into contact with the right temple portion and the left temple portion of the electronic apparatus 10.
Moreover, an operation by eyes of the user wearing the electronic apparatus 10 may be performed as a display switching operation by attaching a sensor for detecting the eyes to, for example, (an inside of) the frame portion of the electronic apparatus 10. Although, for example, a camera configured to image eye movement of the user can be used as the sensor for detecting the eyes of the user, other sensors such as a sensor in which infrared rays are utilized may be used. In this case, an operation of, for example, shifting eyes to the right (or to the left) can be a display switching operation. Moreover, an operation by a blink of the user (by the number of blinks, etc.) can be a display switching operation.
Further, although (at least one of) the first to ninth operations are display switching operations in this description, the first to ninth operations may be performed as normal operations to the electronic apparatus 10.
Specifically, the first operation of stroking the temple portion of the electronic apparatus 10 from the front side to the ear hook side (in a first direction) with a finger, which is described in
Further, the second operation of tapping the tip of the right temple portion of the electronic apparatus 10, which is described in
Further, the third operation of touching the temple portion of the electronic apparatus 10 with two fingers at the same time, which is described in
Further, the fourth operation of picking the contact portion between the front portion and the temple portion of the electronic apparatus 10 from bottom up with a finger, which is described in
Further, the fifth operation of pinching the front portion of the electronic apparatus 10 with the forefinger and thumb once to grasp it, which is described in
Further, the sixth operation of stroking a portion located from just beside the exterior of the right lens frame portion of the electronic apparatus 10 to the lower right of the right lens frame portion from top down with a finger, which is described in
Further, the seventh operation of stroking a portion at the bottom of the exterior of the right lens frame portion of the electronic apparatus 10 from right to left with a finger, which is described in
Moreover, if the eighth operation of pinching a portion near the front portion of the right temple portion of the electronic apparatus 10 with the forefinger and thumb to grasp it, which is described in
Similarly, also in a case where the eighth operation of pinching a portion near the front portion of the right temple portion of the electronic apparatus 10 with the forefinger, middle finger and thumb to grasp it, operations of picking the portion with one of the forefinger, middle finger and thumb once, picking it twice, releasing it after touching it for approximately 0.2 to 1 second, keeping the one finger released, etc., can be an operation indicating, for example, “yes•no/forward•back/information display ON•OFF/upward•downward scroll/rightward•leftward scroll”.
Further, the ninth operation of tilting the electronic apparatus 10 to the right, which is described in
If the above-described operation by the user's eyes is performed, an operation of shifting the user's eyes, for example, to the right (that is, causing the user to slide a glance to the right) can be an operation indicating “yes/forward/information display ON”, and an operation of shifting the user's eyes to the left (that is, causing the user to slide a glance to the left) can be an operation indicating “no/back/information display OFF”. Moreover, an operation of causing the user to slowly blink (slowly close eyes) can be an operation indicating “yes/forward/information display ON”, and an operation of causing the user to quickly blink twice (quickly close eyes twice) can be an operation indicating “no/back/information display OFF”.
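The pairing of detected operations with commands may be realized, for example, as a simple dispatch table, as in the following sketch. The eye-operation entries follow the pairings described above; the gesture identifiers themselves, and the entries for the touch operations, are invented for illustration.

```python
# Hypothetical dispatch table from detected operations to commands.
GESTURE_COMMANDS = {
    # Eye operations (pairings as described above).
    "eyes_right": "yes/forward/information_display_on",
    "eyes_left": "no/back/information_display_off",
    "slow_blink": "yes/forward/information_display_on",
    "quick_double_blink": "no/back/information_display_off",
    # Touch operations (identifiers and pairings are assumptions).
    "stroke_right_temple_front_to_ear": "forward",
    "tap_right_temple_tip": "yes",
}

def dispatch(gesture, handlers):
    """Invoke the handler registered for the command paired with the gesture."""
    command = GESTURE_COMMANDS.get(gesture)
    if command in handlers:
        handlers[command]()
```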
Referring to
If it is determined that the display switching operation is accepted (YES in block B7), the display controller 104 controls the display on the display 12 in accordance with the operation (block B8). Specifically, the display controller 104 performs control to return (switch) the display state of the display 12 (display area and display content) to a state before the processing of block B6 is executed. Further, other display control may be performed in accordance with the display switching operation.
If the processing of block B8 is executed, the automatic display control function of the display controller 104 may be disabled for a certain period or turned off. If the automatic display control function is disabled for a certain period, the automatic display control function automatically becomes available again after the certain period passes. On the other hand, if the automatic display control function is turned off, the automatic display control function cannot be utilized until, for example, the user explicitly turns on the automatic display control function.
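The two ways of disabling the automatic display control function (a timed suspension that recovers by itself, and an explicit OFF that persists until the user turns the function on again) could be kept apart as in the following sketch; the class and method names are hypothetical.

```python
import time

class AutoDisplayControl:
    """Tracks whether the automatic display control function may run."""

    def __init__(self):
        self._off = False            # explicit OFF by the user
        self._suspended_until = 0.0  # timed suspension after block B8

    def suspend(self, seconds):
        """Disable for a certain period; re-enabled automatically afterwards."""
        self._suspended_until = time.monotonic() + seconds

    def turn_off(self):
        self._off = True             # stays off until turn_on() is called

    def turn_on(self):
        self._off = False

    def is_active(self):
        return (not self._off) and time.monotonic() >= self._suspended_until
```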
If it is determined that the display on the display 12 need not be controlled in block B5 (NO in block B5), the processing after block B6 is not executed, and the display state of the display 12 by the processing of block B1 is maintained.
Similarly, if it is determined that the display switching operation is not accepted in block B7 (NO in block B7), the processing of block B8 is not executed, and the display state of the display 12 by the processing of block B6 is maintained.
The processing shown in
After control of restricting the display on the display 12 is performed, control of removing the restriction may be performed. Specifically, a case where it is determined that the display on the display 12 need not be restricted (that is, a case where a state where the display on the display 12 needs to be restricted is solved) if the processing shown in
Further, even if, for example, the user is in a crowd (that is, the number of persons acquired by the state estimation module 103 is large), it is also possible not to restrict the display on the display 12 if it is determined that the persons around the user do not move much (for example, if they are sitting on chairs and waiting in a waiting room) by analyzing, for example, an image (here, for example, a moving image) taken by the camera 13.
Although the camera 13 is used to estimate the state of the user in
Further, a microphone configured to detect sound (voice) of surroundings may be used to estimate, for example, whether the user is in the crowd or not. In this case, it is possible to estimate, for example, that the user is in the crowd, because living sound, traffic noise, etc., can be recognized by analyzing, for example, ambient sound detected by the microphone (spectrum pattern of background sound).
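As one illustration of such sound-based estimation, the spectrum pattern of the ambient sound can be compared against stored background patterns; the following sketch assumes reference spectra (e.g., for street noise and for a quiet room) computed beforehand from windows of the same length.

```python
import numpy as np

def classify_ambient_sound(samples, reference_spectra):
    """Return the label of the stored background pattern closest to the input.

    samples: one window of microphone samples.
    reference_spectra: assumed dict of label -> normalized magnitude spectrum,
    computed from windows of the same length as `samples`.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    spectrum /= spectrum.sum() + 1e-12  # normalize so loudness is ignored
    # Smaller L1 distance means the ambient sound resembles that pattern more.
    return min(reference_spectra,
               key=lambda label: np.abs(spectrum - reference_spectra[label]).sum())
```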
Moreover, a photodiode may be used to estimate the state of the user. The camera 13 and another sensor are preferably used in combination because the state of the user is sometimes difficult to estimate in detail based on only the information from the photodiode.
Although the GPS antenna, the microphone and the photodiode are described as examples of other sensors, sensors other than them may be used. If the camera 13 and other sensors are used in combination, estimation accuracy of the state of the user can be improved.
It is sometimes difficult to keep the camera 13 working in view of the energy consumption of the electronic apparatus 10. Thus, the camera 13 may be started, for example, only when the state of the user cannot be estimated only based on information detected by sensors other than the camera 13. Further, the camera 13 may be started when change of the state around the user is detected using sensors other than the camera 13. The change of the state around the user can be detected when it is determined that the user moves by a distance greater than or equal to a preset value (threshold value) based on, for example, position information acquired by GPS. Further, the change of the state around the user can be detected based on change of brightness, a color, etc., acquired by the photodiode. Such a structure allows the energy consumption of the electronic apparatus 10 to be reduced.
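For example, the GPS-based trigger can be a simple displacement test, as in the following sketch; the threshold value and function names are assumptions.

```python
import math

MOVE_THRESHOLD_M = 30.0  # assumed preset value for detecting a state change

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in meters between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_start_camera(last_fix, current_fix):
    """Start the camera only when the user has moved far enough that the
    state around the user has probably changed."""
    return distance_m(*last_fix, *current_fix) >= MOVE_THRESHOLD_M
```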
Here, in the electronic apparatus 10 according to this embodiment, the user can turn off (that is, manually remove) the automatic display control function in advance by operating the electronic apparatus 10. Processing procedures of the electronic apparatus 10 when the automatic display control function is turned off will be described with reference to the flowchart of
First, the operation acceptance module 105 accepts an operation performed on the electronic apparatus 10 by the user (block B11).
Then, the operation acceptance module 105 determines whether or not the accepted operation is an operation for turning off the automatic display control function (hereinafter referred to as a function OFF operation) (block B12). It should be noted that the function OFF operation is specified in advance, and, for example, at least one of the first to ninth operations can be the function OFF operation.
If it is determined that the accepted operation is the function OFF operation (YES in block B12), the display controller 104 turns off the automatic display control function (block B13). If the automatic display control function is turned off in this manner, the processing after block B2 shown in
On the other hand, if it is determined that the accepted operation is not the function OFF operation, the processing of block B13 is not executed. In this case, the processing according to the operation accepted by, for example, the operation acceptance module 105 is executed in the electronic apparatus 10.
A case where the automatic display control function is turned off is described. However, for example, if an operation similar to the function OFF operation is accepted in a state where the automatic display control function is turned off, the automatic display control function can be turned on. It should be noted that the operation for turning on the automatic display control function may be an operation different from the function OFF operation.
As described above, in this embodiment, the display on the display 12 is controlled in accordance with the imaging result around the user by the camera 13 (imaging device), and the display area or display content of the display 12 can be changed (restricted) in accordance with the state of the user, for example, the user being in a crowd. This prevents the display of the information on the display 12 from disturbing the user's walking. Thus, the user can walk safely, and the safety of the user wearing the electronic apparatus 10, people around the user, etc., can be ensured. Moreover, since control of automatically returning the display on the display 12 to its original state can be performed in this embodiment once a state where the display on the display 12 needs to be restricted is resolved, display can be appropriately performed in accordance with the state of the user.
Further, in this embodiment, since operations on the frame portion, etc., of the electronic apparatus 10 can be performed as an operation of switching the display on the display 12, an operation of turning off the automatic display control function and other normal operations on the electronic apparatus 10, the operability of the electronic apparatus (glasses-type wearable device) 10 can be improved.
Moreover, in this embodiment, the state of the user can be suitably estimated using the camera 13 together with sensors such as the GPS antenna and the microphone.
Although the processing described in this embodiment is executed in the electronic apparatus 10 in this description, the electronic apparatus 10 may operate as a display device, and the above processing may be executed in an external device (for example, smartphone, tablet computer, personal computer or server device) communicably connected to the electronic apparatus 10. Further, although the electronic apparatus 10 according to this embodiment is mainly a glasses-type wearable device in this description, this embodiment can be applied to, for example, an electronic apparatus in which a display is arranged in a position visually identified by the user when worn on the user (that is, display needs to be controlled in accordance with the state of the user, etc.).
Next, a second embodiment will be described.
This embodiment is different from the first embodiment in that the state (action) of the user wearing the electronic apparatus 10 is estimated based on an amount of movement calculated from a moving image taken by the camera 13.
As shown in
In this embodiment, the storage 201 is stored in the non-volatile memory 11b. It should be noted that the storage 201 may be provided in, for example, an external device communicably connected to the electronic apparatus 10.
Further, in this embodiment, all or part of the state estimation module 202 and the display controller 203 may be realized by software, may be realized by hardware, or may be realized as a combination of the software and hardware.
The storage 201 prestores state estimation information in which, for example, the state of the user estimated from an amount of movement in a moving image is defined.
The state estimation module 202 estimates the state of the user wearing the electronic apparatus 10 based on the image acquired by the image acquisition module 101 and the state estimation information stored in the storage 201.
The display controller 203 includes a function (automatic display control function) of controlling the display (state) on the display 12 based on the state of the user estimated by the state estimation module 202 (that is, imaging result by the camera 13).
Next, processing procedures of the electronic apparatus 20 according to this embodiment will be described with reference to the flowchart of
First, processing of blocks B21 and B22 equivalent to the processing of blocks B1 and B2 shown in
Next, the state estimation module 202 calculates the amount of movement in the moving image acquired by the image acquisition module 101 from a plurality of frames constituting the moving image (block B23). Specifically, the state estimation module 202 calculates the amount of movement based on, for example, the position of a specific object between the frames constituting the moving image acquired by the image acquisition module 101. The amount of movement calculated by the state estimation module 202 allows, for example, a moving direction and a moving amount (moving speed) of the user to be obtained.
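As a concrete illustration of block B23, the dominant motion between two consecutive frames can be estimated with dense optical flow, as in the following sketch; the embodiment tracks the position of a specific object instead, so Farneback flow here is only a stand-in.

```python
import cv2
import numpy as np

def movement_between_frames(prev_gray, next_gray):
    """Estimate the dominant image motion between two grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # The median over all pixels is robust against small moving objects.
    dx = float(np.median(flow[..., 0]))
    dy = float(np.median(flow[..., 1]))
    return dx, dy  # moving direction and amount, in pixels per frame
```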
The state estimation module 202 estimates the state of the user based on the calculated amount of movement (moving direction and moving amount) and the state estimation information stored in the storage 201 (block B24).
Here, the state of the user estimated from, for example, each of a plurality of prepared amounts of movement (moving direction and moving amount) is defined in the state estimation information stored in the storage 201. It should be noted that the state of the user which can be estimated by the state estimation information includes, for example, a state where the user is on a moving vehicle, a state where the user is walking and a state where the user is running.
The use of such state estimation information allows the state estimation module 202 to specify the state of the user estimated from (an amount of movement equal to) the calculated amount of movement.
In the processing of block B24, the state where the user is on a moving vehicle is estimated if the amount of movement equivalent to, for example, that of movement at dozens of kilometers per hour in the sight direction of the user is calculated. Moreover, the state where the user is walking is estimated if the amount of movement equivalent to, for example, that of movement at approximately four to five kilometers per hour in the sight direction of the user is calculated. Further, the state where the user is running is estimated if the amount of movement equivalent to, for example, that of movement at approximately ten kilometers per hour in the sight direction of the user is calculated.
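In other words, once the amount of movement has been converted into an approximate forward speed (the calibration from image motion to km/h is assumed to be done elsewhere), the estimation of block B24 reduces to threshold comparisons such as the following; the boundary values paraphrase the figures above and are not exact.

```python
def estimate_state_from_speed(speed_kmh):
    """Map an estimated forward speed to one of the user states named above."""
    if speed_kmh >= 20.0:   # "dozens of kilometers per hour"
        return "on_vehicle"
    if speed_kmh >= 7.0:    # around ten kilometers per hour
        return "running"
    if speed_kmh >= 3.0:    # approximately four to five kilometers per hour
        return "walking"
    return "stationary"
```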
The state of the user estimated by the state estimation module 202 may be states other than those described above. Specifically, if an amount of movement (moving direction and moving amount) equivalent to that of movement in a vertical direction is calculated, for example, a state where the user is on a vehicle moving in a vertical direction such as an elevator or an escalator, or a state where the user is doing bending and stretching exercises can be estimated in accordance with the moving amount.
Since the state of the user is estimated in this embodiment based on the amount of movement calculated from the moving image taken by the camera 13, a moving image from which the moving direction and moving amount of the user can be obtained must be taken in order to estimate, for example, the state where the user is on a moving vehicle.
Further, although the state where the user is on a vehicle can be estimated from the amount of movement calculated from the moving image taken by the camera 13, the type of vehicle is difficult to estimate. In this case, the user can be caused to register the type of vehicle (for example, car or train). Moreover, the vehicle carrying the user may be specified by analyzing the moving image taken by the camera 13, based on the scene around the user included in the moving image.
The state estimation information stored in the storage 201 can be updated as appropriate.
Next, the display controller 203 determines whether the display on the display 12 needs to be controlled (changed) or not based on the state of the user estimated by the state estimation module 202 (block B25).
Here, if, for example, the user is on a vehicle (for example, the user is driving a car) in a state where information is displayed on the display 12, the user's view of the surroundings is not sufficiently secured, or the user cannot concentrate on driving, which may cause an accident, etc. Similarly, when the user is walking or running, the state where information is displayed on the display 12 may cause a collision, etc., with a person or an object around the user. Accordingly, if the state where the user is on a vehicle, the state where the user is walking or the state where the user is running is estimated by the state estimation module 202, the display controller 203 determines that the display on the display 12 needs to be controlled (restricted). On the other hand, if the user is on, for example, a train or a bus, an accident or a collision is not likely to be caused. Thus, even if, for example, the state where the user is on a vehicle is estimated by the state estimation module 202, the display controller 203 determines that the display on the display 12 need not be controlled if a train, a bus or the like is registered by the user as the type of vehicle.
If it is determined that the display on the display 12 needs to be controlled (YES in block B25), the display controller 203 controls the display on the display 12 by the automatic display control function (block B26). Since the processing of controlling the display on the display 12 by the display controller 203 is similar to that in the first embodiment, detailed description thereof will be omitted. That is, the display controller 203 performs control to, for example, change a display area (pattern) or display content of information on the display 12.
Here, the display area of the display 12 may be changed to, for example, any of the first to fifth display area patterns to secure the user's view of the surroundings; however, it may be changed to a different display area pattern in accordance with the state of the user estimated by the state estimation module 202. Specifically, if the state of the user estimated by the state estimation module 202 is a state where the user is on a vehicle, the user may, for example, be driving a car. In this case, the display area of the display 12 may be changed to, for example, the first display area pattern in which information is displayed only in an area lower than the center of the display 12, to secure a view which will not interfere with driving of a car. Further, the display area of the display 12 may be changed to the fifth display area pattern (that is, display of information is turned off) to further improve safety. On the other hand, if the state of the user estimated by the state estimation module 202 is a state where the user is walking or running, the display area of the display 12 may be changed to, for example, the first display area pattern in which information is displayed only in an area located above the center of the display 12, or the third display area pattern in which information is displayed in triangle areas located in the upper part of the display 12, to secure the view around the user's feet.
As described above, when the processing of block B26 is executed, the processing of blocks B27 and B28 equivalent to that of blocks B7 and B8 shown in
If it is determined that the display on the display 12 need not be controlled in block B25 (NO in block B25), the processing after block B26 is not executed, and the display state of the display 12 by the processing of block B21 is maintained.
Similarly, if it is determined that the display switching operation is not accepted in block B27 (NO in block B27), the processing of block B28 is not executed, and the display state of the display 12 by the processing of block B26 is maintained.
The processing shown in
After control of restricting the display on the display 12 is performed, control of removing the restriction may be performed, as described in the first embodiment.
Further, although the camera 13 is used to estimate the state of the user in
Further, to reduce energy consumption, the camera 13 may be started only when the state of the user cannot be estimated based solely on information detected by sensors other than the camera 13. Moreover, the camera 13 may be started when a change of the state of the user is detected. The change of the state of the user can be detected when it is determined that the user moves by a distance greater than or equal to a preset value (threshold value) based on, for example, position information acquired by GPS. Further, the change of the state of the user can be detected based on ambient sound detected by the microphone. Such a structure allows the energy consumption of the electronic apparatus 20 to be reduced.
It should be noted that the user can turn off (remove) the automatic display control function by operating the electronic apparatus 20 in the electronic apparatus 20 according to this embodiment as well as in the first embodiment. Since the processing procedures of the electronic apparatus 20 to turn off the automatic display control function are described in the first embodiment, detailed description thereof will be omitted.
As described above, the display on the display 12 is controlled in accordance with the imaging result around the user by the camera 13 (imaging device) in this embodiment. Since the display area or display content of the display 12 can be changed (restricted) in accordance with the state of the user (action), for example, the state where the user is on a vehicle, the state where the user is walking or the state where the user is running, the safety of the user wearing the electronic apparatus 20, people around the user, etc., can be ensured.
Next, a third embodiment will be described.
This embodiment is different from the first and second embodiments in that the state of the user (action) is estimated based on information concerning acceleration that acts on the electronic apparatus.
As shown in
As shown in
In this embodiment, all or part of the angular velocity acquisition module 301, the state estimation module 303 and the display controller 304 may be realized by software, may be realized by hardware, or may be realized as a combination of the software and hardware. Further, in this embodiment, the storage 302 is stored in the non-volatile memory 11b.
Although the electronic apparatus 30 includes the storage 302 in
The angular velocity acquisition module 301 acquires angular velocity information detected by the gyro sensor 15. The angular velocity information acquired by the angular velocity acquisition module 301 allows vibration (pattern) caused to the electronic apparatus 30 to be acquired (detected).
The storage 302 prestores the state estimation information in which, for example, the state of the user estimated from the vibration (pattern) caused to the electronic apparatus 30 is defined.
The state estimation module 303 estimates the state of the user wearing the electronic apparatus 30 based on the angular velocity information acquired by the angular velocity acquisition module 301 and the state estimation information stored in the storage 302.
The display controller 304 includes a function (automatic display control function) of controlling the display (state) on the display 12 based on the state of the user estimated by the state estimation module 303 (that is, information detected by the gyro sensor 15).
Next, processing procedures of the electronic apparatus 30 according to this embodiment will be described with reference to the flowchart of
First, processing of block B31 equivalent to the processing of block B1 shown in
Next, the angular velocity acquisition module 301 acquires angular velocity information detected by the gyro sensor 15 (block B32).
The state estimation module 303 acquires the pattern of a vibration (hereinafter referred to as a vibration pattern) caused to the electronic apparatus 30 by an external factor by analyzing an amount of motion based on the angular velocity information acquired by the angular velocity acquisition module 301 (block B33).
The state estimation module 303 estimates the state of the user based on the acquired vibration pattern and the state estimation information stored in the storage 302 (block B34).
Here, the state of the user estimated from, for example, each of a plurality of prepared vibration patterns is defined in the state estimation information stored in the storage 302. It should be noted that the state of the user which can be estimated by the state estimation information includes, for example, the state where the user is on a moving vehicle, the state where the user is walking and the state where the user is running.
The use of such state estimation information allows the state estimation module 303 to specify the state of the user estimated from (a vibration pattern equal to) the acquired vibration pattern.
In the processing of block B34, the state where the user is on a vehicle is estimated if a vibration pattern equivalent to, for example, that caused on a vehicle is acquired. Further, the state where the user is walking is estimated if a vibration pattern equivalent to, for example, that caused during walking is acquired. Moreover, the state where the user is running is estimated if a vibration pattern equivalent to, for example, that caused during running is acquired.
Moreover, different vibrations (shakes) can be detected using the gyro sensor 15 in accordance with, for example, the type of vehicle carrying the user. Thus, for example, a state where the user is in a car, a state where the user is on a train, a state where the user is on a bus, a state where the user is on a motorcycle and a state where the user is on a bicycle can be estimated as the state where the user is on a vehicle using the gyro sensor 15. In this case, it suffices that the storage 302 prestores the vibration pattern caused on each of the vehicles (the state estimation information in which the state of the user estimated from the vibration pattern is defined).
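One way to realize such vibration-pattern matching is to reduce each window of gyro readings to a coarse frequency signature and pick the nearest stored pattern, as in the following sketch; the signature format and the form assumed for the state estimation information in the storage 302 are illustrative only.

```python
import numpy as np

def vibration_signature(angular_velocity, bands=8):
    """Reduce a window of gyro readings to a normalized band-energy signature."""
    spectrum = np.abs(np.fft.rfft(angular_velocity - np.mean(angular_velocity)))
    energies = np.array([band.sum() for band in np.array_split(spectrum, bands)])
    return energies / (energies.sum() + 1e-12)

def estimate_state_from_vibration(signature, stored_signatures):
    """Return the label (e.g., 'car', 'train', 'walking') whose stored
    signature is closest to the acquired vibration pattern."""
    return min(stored_signatures,
               key=lambda label: np.abs(signature - stored_signatures[label]).sum())
```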
It should be noted that the state estimation module 303 can calculate angle information by integrating the angular velocity (information) detected by, for example, the gyro sensor 15, and thereby acquire (detect) a moving angle (direction). The state of the user can be estimated with higher accuracy using the moving angle calculated in this manner.
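The integration mentioned here is simply the time integral of the angular velocity; numerically it can be approximated, for example, with the trapezoidal rule, as below.

```python
def integrate_angle(angular_velocities, dt):
    """Accumulate gyro output into an angle (radians) relative to the start.

    angular_velocities: sampled angular velocity in rad/s.
    dt: sampling interval in seconds.
    """
    angle = 0.0
    for w0, w1 in zip(angular_velocities, angular_velocities[1:]):
        angle += 0.5 * (w0 + w1) * dt  # trapezoidal rule
    return angle
```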
Further, the state of the user estimated by the state estimation module 303 may be states other than those described above. Specifically, if the moving angle is acquired as described above, a vibration pattern caused by movement in a vertical direction can be acquired. Thus, for example, a state where the user is on a vehicle such as an elevator or an escalator, or a state where the user is doing bending and stretching exercises can also be estimated in accordance with the vibration pattern.
It should be noted that the user can be caused to register the type of vehicle (for example, car or train).
The state estimation information stored in the storage 302 can be updated as appropriate.
Next, the display controller 304 determines whether the display on the display 12 needs to be controlled (changed) or not based on the state of the user estimated by the state estimation module 303 (block B35).
Here, if, for example, the user is in a car, on a motorcycle or on a bicycle in a state where information is displayed on the display 12, the user's view of the surroundings is not sufficiently secured, or the user cannot concentrate on driving, which may cause an accident, etc. Similarly, when the user is walking or running, the state where information is displayed on the display 12 may cause a collision, etc., with a person or an object around the user. Accordingly, if the state where the user is in a car, the state where the user is on a motorcycle, the state where the user is on a bicycle, the state where the user is walking or the state where the user is running is estimated by the state estimation module 303, the display controller 304 determines that the display on the display 12 needs to be controlled (restricted). On the other hand, if, for example, the user is on a train or a bus, an accident or a collision is not likely to be caused. Thus, if the state where the user is on a train or a bus is estimated by the state estimation module 303, the display controller 304 determines that the display on the display 12 need not be controlled.
Even if, for example, the state where the user is in a car is estimated, the display on the display 12 need not be controlled (restricted) if the user is not the driver but a fellow passenger. Thus, it may be determined that the display on the display 12 needs to be controlled only when, for example, an image taken by the camera 13 is analyzed and it is determined that a steering wheel is at close range to the user (for example, approximately 10 to 50 cm) (that is, when the user wearing the electronic apparatus 30 is the driver). Moreover, it may be determined that the display on the display 12 need not be controlled if it is determined that the vehicle is stopped, based on the angular velocity information (vibration information) detected by the gyro sensor 15, the amount of movement calculated from the image (here, a moving image) taken by the camera 13 described in the second embodiment, or the like.
On the other hand, if, for example, the user is on a motorcycle or a bicycle, the user is highly likely to drive it. Thus, if the state where the user is on a motorcycle or a bicycle is estimated, it is determined that the display on the display 12 needs to be controlled.
Further, even if, for example, the user is walking or running, it may be determined that the display on the display 12 need not be controlled if the user is walking or running using an instrument such as a treadmill in a gym, etc. Whether the gym is utilized or not may be determined by analyzing an image taken by the camera 13, or by causing the user to register that the gym is utilized. Further, it may be determined based on a present location, etc., acquired by GPS.
If it is determined that the display on the display 12 needs to be controlled (YES in block B35), the display controller 304 controls the display on the display 12 by the automatic display control function (block B36). Since the processing of controlling the display on the display 12 by the display controller 304 is similar to those in the first and second embodiments, detailed description thereof will be omitted. That is, the display controller 304 performs control to, for example, change a display area (pattern) or display content of information on the display 12.
Here, the display area of the display 12 may be changed to, for example, any of the first to fifth display area patterns to secure the user's view of the surroundings; however, it may be changed to a different display area pattern in accordance with the state of the user estimated by the state estimation module 303. Specifically, if the state of the user estimated by the state estimation module 303 is a state where the user is in a car, on a motorcycle or on a bicycle, the display area of the display 12 may be changed to, for example, the first display area pattern in which information is displayed only in an area lower than the center of the display 12, or the fifth display area pattern (that is, display of information is turned off), as described in the second embodiment. On the other hand, if the state of the user estimated by the state estimation module 303 is a state where the user is walking or running, the display area of the display 12 may be changed to, for example, the first display area pattern in which information is displayed only in an area located above the center of the display 12, or the third display area pattern in which information is displayed in triangle areas located in the upper part of the display 12, as described in the second embodiment.
When the processing of block B36 is executed as described above, processing of blocks B37 and B38 equivalent to the processing of blocks B7 and B8 shown in
If it is determined that the display on the display 12 need not be controlled in block B35 (NO in block B35), the processing after block B36 is not executed, and the display state of the display 12 by the processing of block B31 is maintained.
Similarly, if it is determined that the display switching operation is not accepted in block B37 (NO in block B37), the processing of block B38 is not executed, and the display state of the display 12 by the processing of block B36 is maintained.
The processing shown in
After control of restricting the display on the display 12 is performed, control of removing the restriction may be performed, as described in the first embodiment.
Further, although the camera 13 and the gyro sensor 15 are used to estimate the state of the user in this embodiment, other sensors such as a GPS antenna and a microphone may also be used, as described in the second embodiment. If the camera 13, the gyro sensor 15 and other sensors are used in combination, the estimation accuracy of the state of the user can be improved. On the other hand, the state of the user may be estimated using only the gyro sensor 15 without the camera 13.
The user can turn off (remove) the automatic display control function by operating the electronic apparatus 30 in the electronic apparatus 30 according to this embodiment as well as in the first and second embodiments. Since the processing procedures of the electronic apparatus 30 to turn off the automatic display control function are described in the first embodiment, detailed description thereof will be omitted.
As described above, the state of the user is estimated based on the angular velocity information detected by the gyro sensor 15 (detector) (information concerning acceleration), and the display on the display 12 is controlled in accordance with the estimated state in this embodiment. Since the display area or display content of the display 12 can be changed (restricted) in accordance with the state of the user (action), for example, the state where the user is on a vehicle (a car, a motorcycle, a bicycle or the like), the state where the user is walking or the state where the user is running, the safety of the user wearing the electronic apparatus 30, people around the user, etc., can be ensured.
Moreover, since whether, for example, the user is a driver or a fellow passenger can also be estimated by estimating the state of the user in accordance with the angular velocity information detected by the gyro sensor 15 and the imaging result by the camera 13 in this embodiment, the display on the display 12 can be controlled only when necessary (for example, when the user is a driver).
Next, a fourth embodiment will be described.
This embodiment is different from the first to third embodiments in that the state of the user (physical condition) is estimated based on information concerning a biological body of the user wearing the electronic apparatus.
As shown in
In this embodiment, all or part of the biological information acquisition module 401, the state estimation module 403 and the display controller 404 may be realized by software, may be realized by hardware, or may be realized as a combination of the software and hardware. Further, in this embodiment, the storage 402 is stored in the non-volatile memory 11b.
Although the electronic apparatus 40 includes the storage 402 in
The biological information acquisition module 401 acquires biological information detected by the biological sensor 16.
The storage 402 prestores the state estimation information in which, for example, the state of the user estimated from (a pattern of) the biological information is defined.
The state estimation module 403 estimates the state of the user wearing the electronic apparatus 40 based on the biological information acquired by the biological information acquisition module 401 and the state estimation information stored in the storage 402.
The display controller 404 includes a function (automatic display control function) of controlling the display (state) on the display 12 based on the state of the user estimated by the state estimation module 403 (that is, information detected by the biological sensor 16).
Next, processing procedures of the electronic apparatus 40 according to this embodiment will be described with reference to the flowchart of
First, processing of block B41 equivalent to the processing of block B1 shown in
Next, the biological information acquisition module 401 acquires the biological information detected by the biological sensor 16 (block B42). It should be noted that the biological information acquired by the biological information acquisition module 401 (that is, the biological information detected by the biological sensor 16) includes (information of) the body motion measured by the acceleration sensor, the skin temperature measured by the thermometer, the cardiac potential measured by the electrocardiographic sensor, etc., which are mounted on the biological sensor 16. Further, the acceleration sensor can measure, for example, the acceleration due to gravity. Thus, the body motion included in the biological information includes, for example, a body position of the user (that is, a direction of the body) specified in accordance with the direction of the acceleration due to gravity measured by the acceleration sensor.
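To illustrate how the direction of the acceleration due to gravity can yield such a body position (direction of the body), here is a minimal sketch; the device-fixed axis convention and the 45-degree threshold are assumptions made for the example.

```python
import math

def body_position_from_gravity(ax, ay, az):
    """Infer a coarse body position from the gravity vector measured by
    the acceleration sensor. Axes are assumed device-fixed, with +y
    pointing upward along the user's head when the user is upright."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    # Angle between the measured gravity vector and the device's y axis:
    # near 0 degrees suggests an upright posture, near 90 degrees
    # suggests the user is lying down.
    tilt = math.degrees(math.acos(min(1.0, abs(ay) / norm)))
    return "upright" if tilt < 45.0 else "lying"

print(body_position_from_gravity(0.0, 9.8, 0.0))  # -> upright
print(body_position_from_gravity(0.0, 0.0, 9.8))  # -> lying
```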
The state estimation module 403 analyzes a health condition of the user from the biological information acquired by the biological information acquisition module 401, and estimates the state of the user based on the analysis result and the state estimation information stored in the storage 402 (block B43).
Here, the state of the user estimated from, for example, each of (patterns of) a plurality of prepared biological information items is defined in the state estimation information stored in the storage 402. It should be noted that the state of the user which can be estimated by the state estimation information includes a state where a convulsion is caused, a state where a fever is caused, a state where arrhythmia is caused, a state where the user is sleeping, etc.
The use of such state estimation information allows the state estimation module 403 to specify the state of the user estimated from a pattern of (biological information equal to) the acquired biological information.
In the processing of block B43, the state where a convulsion is caused is estimated if the biological information equivalent to the pattern of the biological information when, for example, the convulsion is caused (for example, body motion different from that in normal times) is acquired. Further, the state where a fever is caused is estimated if the biological information equivalent to the pattern of the biological information when, for example, the fever is caused (for example, skin temperature higher than a preset value) is acquired. Moreover, the state where arrhythmia is caused is estimated if the biological information equivalent to the pattern of the biological information when, for example, the arrhythmia is caused (for example, cardiac potential different from that in normal times) is acquired. Further, the state where the user is sleeping is estimated if the biological information equivalent to the pattern of the biological information when, for example, the user is sleeping (for example, body motion and direction of body during sleeping) is acquired.
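As one way such patterns might be encoded, the following rule-based sketch mirrors the four estimates of block B43; the dictionary keys, thresholds and rule ordering are all illustrative assumptions, not the actual stored state estimation information.

```python
FEVER_THRESHOLD_C = 37.5  # assumed "preset value" for the fever pattern

def estimate_state(bio, baseline):
    """Rule-based sketch of the estimation in block B43. `bio` holds the
    acquired biological information and `baseline` the normal-times
    values; both use assumed keys and units."""
    if bio["body_motion"] > 3.0 * baseline["body_motion"]:
        return "convulsion"    # body motion much larger than in normal times
    if bio["skin_temp"] > FEVER_THRESHOLD_C:
        return "fever"         # skin temperature higher than a preset value
    if abs(bio["cardiac_potential"] - baseline["cardiac_potential"]) > 0.3:
        return "arrhythmia"    # cardiac potential different from normal times
    if bio["lying"] and bio["body_motion"] < 0.2 * baseline["body_motion"]:
        return "sleeping"      # low body motion and a lying body position
    return "normal"

baseline = {"body_motion": 1.0, "cardiac_potential": 1.0}
sample = {"body_motion": 0.1, "skin_temp": 36.5,
          "cardiac_potential": 1.0, "lying": True}
print(estimate_state(sample, baseline))  # -> sleeping
```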
It should be noted that the state of the user estimated by the state estimation module 403 may be states other than those described above. Specifically, a state where the health condition of the user is abnormal (that is, indisposed), etc., can be estimated by comprehensively considering (body motion, skin temperature, cardiac potential, etc., of) the biological information detected by the biological sensor 16. On the other hand, a state where the health condition of the user is normal can be estimated by comprehensively considering the biological information detected by the biological sensor 16.
The state estimation information stored in the storage 402 can be updated as appropriate.
Next, the display controller 404 determines whether the display on the display 12 needs to be controlled (changed) or not based on the state of the user estimated by the state estimation module 403 (block B44).
Here, if, for example, the user suffers from the convulsion, fever, arrhythmia, etc., (that is, the user is indisposed) in a state where the predetermined information is displayed on the display 12, the user cannot fully take a rest due to, for example, viewing stress, which may cause deterioration of the health condition of the user. Further, if the user is sleeping, information need not be displayed on the display 12. Thus, if the state where the convulsion, fever or arrhythmia is caused, or the state where the user is sleeping is estimated by the state estimation module 403, the display controller 404 determines that the display on the display 12 needs to be controlled. On the other hand, if other states (for example, a state where the health condition of the user is normal) are estimated by the state estimation module 403, the display controller 404 determines that the display on the display 12 need not be controlled.
If it is determined that the display on the display 12 needs to be controlled (YES in block B44), the display controller 404 controls the display on the display 12 by the automatic display control function (block B45).
In this embodiment, the display controller 404 performs control of changing the display area pattern of the information on the display 12 to the fifth display area pattern (that is, turning the display of information off) to, for example, reduce the viewing stress. Control of changing it to another display area pattern may be performed.
Although the control of turning off the display of the information on the display 12 has been described, control of changing the display content of the display 12 may be performed in accordance with the state of the user estimated by the state estimation module 403. Specifically, if the state where the convulsion, fever or arrhythmia is caused is estimated by the state estimation module 403, control of stopping the reproduction of a motion picture (for example, a picture containing strenuous movement) may be performed.
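Putting blocks B44 and B45 together, a sketch of the determination and the resulting control might look as follows; `turn_off` and `stop_reproduction` are hypothetical display methods used only for illustration.

```python
RESTRICTED_STATES = {"convulsion", "fever", "arrhythmia", "sleeping"}

def control_display(display, state, playing_motion_picture=False):
    # Block B44: display control is needed only for the states above;
    # for other states (e.g. a normal health condition) the current
    # display state is kept.
    if state not in RESTRICTED_STATES:
        return False
    # Block B45: optionally stop the reproduction of a motion picture
    # when the convulsion, fever or arrhythmia state is estimated.
    if playing_motion_picture and state in ("convulsion", "fever", "arrhythmia"):
        display.stop_reproduction()
    display.turn_off()  # fifth display area pattern (display of information off)
    return True

class _StubDisplay:
    def stop_reproduction(self): print("motion picture stopped")
    def turn_off(self): print("display turned off")

control_display(_StubDisplay(), "fever", playing_motion_picture=True)
```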
When the processing of block B45 is executed as described above, processing of blocks B46 and B47 equivalent to the processing of the blocks B7 and B8 shown in
If it is determined that the display on the display 12 need not be controlled in block B44 (NO in block B44), the processing after block B45 is not executed, and the display state of the display 12 by the processing of block B41 is maintained.
Similarly, if it is determined that the display switching operation is not accepted in block B46 (NO in block B46), the processing of block B47 is not executed, and the display state of the display 12 by the processing of block B45 is maintained.
The processing shown in
After control of restricting the display on the display 12 is performed, control of removing the restriction may be performed, as described in the first embodiment.
Further, although the biological sensor 16 is used to estimate the state of the user in
It should be noted that the user can turn off (remove) the automatic display control function by operating the electronic apparatus 40 in the electronic apparatus 40 according to this embodiment as well as in the first to third embodiments. Since the processing procedures of the electronic apparatus 40 to turn off the automatic display control function are described in the first embodiment, detailed description thereof will be omitted.
As described above, the display on the display 12 is controlled in accordance with the biological information detected by the biological sensor 16 (detector) (information concerning the biological body of the user) in this embodiment. Since the display area or display content of the display 12 can be changed (restricted) in accordance with the state (health condition) of the user, for example, the state where the convulsion, fever or arrhythmia is caused (that is, the user is indisposed) or the state where the user is sleeping, the display control of the display 12 can be performed in consideration of the health condition of the user wearing the electronic apparatus 40. That is, this embodiment allows the viewing stress to be reduced when the user is in a bad physical condition.
It should be noted that the electronic apparatus 40 according to this embodiment can also be realized in combination with the first to third embodiments. That is, the electronic apparatus 40 may include both the automatic display control function of the first to third embodiments in which the camera 13, the gyro sensor 15, etc., are used and the automatic display control function of this embodiment in which the biological sensor 16 is used. This allows the display control of the display 12 suitable for the state or condition of the user to be performed.
At least one of the above embodiments allows the display control of the display 12 matching the state of the user wearing the electronic apparatus to be performed.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
This application claims the benefit of U.S. Provisional Application No. 62/077,113, filed Nov. 7, 2014, the entire contents of which are incorporated herein by reference.