The disclosures of Japanese Patent Application No. 2014-182587, filed on Sep. 8, 2014, are incorporated herein by reference.
The technique disclosed here relates to a hand-held electronic apparatus capable of performing predetermined information processing.
Hitherto, there has been a portable game apparatus including a display and a camera provided behind the display. Among such conventional game apparatuses, there is an apparatus in which a captured image can be obtained through a shutter operation of a user, a characteristic portion of the obtained image can be analyzed, and a virtual character can be synthesized with the captured image on the basis of a result of the analysis and displayed on the display.
However, in the above conventional portable apparatus, a result of information processing based on an image captured by the camera is displayed on the display, and thus there is room for improvement in performing more effective output with respect to the user as a result of a process corresponding to an input performed by the user.
Therefore, an object of the exemplary embodiments is to provide an electronic apparatus capable of performing more effective output with respect to a user as a result of a process corresponding to an input performed by the user.
In the exemplary embodiments, in order to attain the object described above, the following configuration examples are given.
A hand-held electronic apparatus according to an embodiment includes a camera, a vibrator, an acquirer, a detector, and a vibration controller. The acquirer is configured to acquire an input image captured by the camera. The detector is configured to detect a gesture made by a user, on the basis of the input image acquired by the acquirer. The vibration controller is configured to provide vibration to the user by using the vibrator in accordance with a result of detection of the gesture by the detector.
According to the above, the electronic apparatus is able to detect a gesture made by the user, on the basis of an image captured by the camera, and cause the vibrator to vibrate in accordance with a result of detection of the gesture. Thus, the user is allowed to receive feedback with respect to the gesture input through vibration.
In another configuration, the detector may detect a gesture made with a hand of the user, and vibration may be provided by the vibration controller to a hand of the user different from the hand with which the gesture is made.
According to the above, the user is allowed to perform a gesture input with one hand and receive feedback with respect to the gesture input with the other hand.
In another configuration, the camera may be capable of capturing an image in a side surface direction of the hand-held electronic apparatus.
According to the above, the user is allowed to perform a gesture input from the side surface direction.
In another configuration, the camera may be provided at a side surface of the hand-held electronic apparatus.
In another configuration, a held portion to be held by the user may be provided in the hand-held electronic apparatus at least at a side opposite to a portion at an imaging direction side of the camera.
According to the above, since the camera is provided at the side opposite to the held portion, the user is allowed to hold the held portion with one hand and make a gesture with the other hand.
In another configuration, an input section configured to accept an input performed by the user may be provided to the held portion.
According to the above, the user is allowed to perform an input with respect to the input section while holding the held portion.
In another configuration, the input section may be operated with a finger capable of being moved when the held portion is held.
According to the above, for example, if the user can move their index finger when the held portion is held with their thumb, the user is allowed to operate the input section by using the index finger.
In another configuration, the input section may be at least one push button.
In another configuration, the hand-held electronic apparatus may further include a display at a front surface thereof.
In another configuration, the hand-held electronic apparatus may further include an information processor configured to perform predetermined information processing in accordance with the gesture detected by the detector. The vibration controller provides vibration corresponding to a result of the predetermined information processing.
According to the above, the predetermined information processing is performed in accordance with the gesture made by the user. As the predetermined information processing, for example, game processing may be performed, or a process of evaluating the gesture made by the user may be performed.
In another configuration, the vibration controller may provide vibration after a predetermined time period elapses from the detection of the gesture by the detector.
According to the above, it is possible to provide feedback to the user by means of vibration after the user makes the gesture and the predetermined time period elapses.
In another configuration, the hand-held electronic apparatus may be a hand-held game apparatus which is held by the user with both hands and used. In a state where the hand-held electronic apparatus is held with one hand, a gesture made with the other hand is detected by the detector. Vibration is provided to the one hand by the vibration controller.
According to the above, it is possible to provide a novel hand-held game apparatus which provides feedback by means of vibration.
In another configuration, the hand-held electronic apparatus may have a horizontally long shape, and the camera may be provided at a short side of the hand-held electronic apparatus.
According to the above, since the camera is provided at the short side of the horizontally long hand-held electronic apparatus, the user is allowed to perform a gesture input from the lateral direction of the electronic apparatus.
In another configuration, the camera may be a camera capable of receiving infrared light.
According to the above, it is possible to detect a specific object on the basis of an image captured by the infrared camera. By using the infrared camera, it is possible to reduce the influence of the external environment as compared to a normal camera which captures an image of visible light, and to obtain an image suitable for detecting the specific object.
In another configuration, the hand-held electronic apparatus may further include a sound controller configured to cause a sound to be outputted in accordance with the result of the detection by the detector.
According to the above, it is possible to output a sound in addition to an image in accordance with the result of the detection.
A hand-held electronic apparatus according to one embodiment includes a housing, a camera, a vibrator, an acquirer, a detector, and a vibration controller. The camera is capable of capturing an image in a side surface direction of the housing. The acquirer is configured to acquire an input image captured by the camera. The detector is configured to detect a specific object on the basis of the input image acquired by the acquirer. The vibration controller is configured to provide vibration to a user by using the vibrator in accordance with a result of detection by the detector.
A hand-held electronic apparatus according to one embodiment includes a distance measuring sensor, a vibrator, an acquirer, a detector, and a vibration controller. The acquirer is configured to acquire information from the distance measuring sensor. The detector is configured to detect a movement made by a user, on the basis of the information acquired by the acquirer. The vibration controller is configured to provide vibration to the user by using the vibrator on the basis of a result of detection by the detector.
In another configuration, the detector may detect a distance to an object, and the vibration controller may provide vibration corresponding to the distance.
According to the above, for example, it is possible to provide vibration corresponding to the distance between the object and the electronic apparatus.
According to the present embodiment, it is possible to detect a gesture made by the user with the camera and provide vibration to the user in accordance with a result of detection.
These and other objects, features, aspects and advantages of the exemplary embodiments will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
Hereinafter, a portable electronic apparatus according to an exemplary embodiment will be described. The portable electronic apparatus is a hand-held information processing apparatus which can be held with hands and operated by a user, and may be, for example, a game apparatus, or may be any apparatus such as a mobile phone (smartphone, etc.), a tablet terminal, a camera, a watch-type terminal, or the like.
As shown in
As the display 2, for example, a liquid crystal display device, an organic EL display device, or the like is used. In addition, any display device may be used. The screen of the display 2 is provided so as to be exposed on a front surface (T5 surface) of the housing 10. The touch panel 3 is provided on the screen of the display 2 and detects a position, on the screen, which is touched by the user. As the touch panel 3, one capable of detecting a single point or one capable of detecting multiple points is used, and any touch panel such as an electrostatic capacitance type, a resistive film type, or the like may be used.
The input buttons 6A to 6D accept an input (pressing) performed by the user. Each of the input buttons 6A to 6D is provided at a position which a finger of the user reaches when the user holds both ends of the portable electronic apparatus 1. Specifically, each of the input buttons 6A and 6C is located at a position which a finger of the right hand of the user reaches when the user holds the portable electronic apparatus 1 with their right hand: the input button 6A is provided at a position which the thumb of the right hand reaches, and the input button 6C is provided at a position which the index finger or the middle finger of the right hand reaches. In addition, each of the input buttons 6B and 6D is located at a position which a finger of the left hand of the user reaches when the user holds the portable electronic apparatus 1 with their left hand: the input button 6B is located at a position which the thumb of the left hand reaches, and the input button 6D is located at a position which the index finger or the middle finger of the left hand reaches. As shown in
The infrared camera 4 includes a lens and a sensor which senses light (infrared light, specifically, near-infrared light). The sensor of the infrared camera 4 is an image sensor in which elements that sense infrared light are arranged in rows and columns, and each element of the image sensor receives infrared light and converts the infrared light into an electric signal, thereby outputting a two-dimensional infrared image.
Light (e.g., infrared light) emitted from a light source provided in the distance measuring sensor 5 is reflected on an object. The distance measuring sensor 5 measures the distance to the object by its light receiving element receiving the reflected light. As the distance measuring sensor 5, any type of sensor such as a triangulation type sensor or a TOF (Time Of Flight) type sensor may be used. As the light source of the distance measuring sensor 5, an LED, a laser diode, or the like which emits infrared light in a specific direction is used.
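As a rough sketch of the TOF (Time Of Flight) principle mentioned above — the function name and example values here are illustrative, not part of the apparatus — the round-trip time of the reflected infrared light maps to a one-way distance as follows:

```python
# Speed of light in meters per second.
SPEED_OF_LIGHT = 299_792_458.0

def tof_distance(round_trip_time_s: float) -> float:
    """Convert the round-trip time of reflected light, as measured
    by a TOF type sensor, into the one-way distance (in meters) to
    the reflecting object."""
    if round_trip_time_s < 0:
        raise ValueError("round-trip time must be non-negative")
    # The light travels to the object and back, so halve the path.
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a round trip of 2 nanoseconds corresponds to ~0.3 m.
print(round(tof_distance(2e-9), 3))
```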
The irradiation section 7 emits infrared light at a predetermined time interval (e.g., a 1/60 sec interval). The irradiation section 7 emits infrared light in synchronization with timing at which the infrared camera 4 captures an image. The irradiation section 7 emits infrared light to a predetermined range in a right side surface direction of the portable electronic apparatus 1. The infrared light emitted by the irradiation section 7 is reflected on an object, and the reflected infrared light is received by the infrared camera 4, whereby an image of the infrared light is obtained. The irradiation section 7 may be used both for capturing an infrared image by the infrared camera 4 and for measuring a distance by the distance measuring sensor 5. That is, using the infrared light from the irradiation section 7, an image may be captured by the infrared camera 4 and also a distance may be measured by the distance measuring sensor 5.
The projector 8 includes a light source which emits visible light, and projects a character, an image, or the like onto a projection surface (a screen, a hand of the user, etc.) by using light from the light source.
The infrared camera 4, the distance measuring sensor 5, the irradiation section 7, and the projector 8 are provided at a side surface (e.g., a right side surface: T1 surface) of the housing 10. Specifically, the imaging direction (optical axis) of the infrared camera 4 is directed in a direction perpendicular to the right side surface. The detection direction of the distance measuring sensor 5 and a direction in which the projector 8 emits light are also similarly directions perpendicular to the right side surface. That is, when the user holds the portable electronic apparatus 1 with their left hand, the infrared camera 4 captures an image of a space in the right side surface direction of the portable electronic apparatus 1, and the distance measuring sensor 5 measures the distance to an object present in the space in the right side surface direction of the portable electronic apparatus 1. In addition, the projector 8 projects an image or the like by emitting visible light in the same direction as those of the infrared camera 4 and the distance measuring sensor 5.
An outer camera 9 is provided at a back surface (T6 surface) of the portable electronic apparatus 1 (
The control section 14 is connected to the respective sections such as the display 2, the touch panel 3, the infrared camera 4, the distance measuring sensor 5, the input button 6, the irradiation section 7, the projector 8, the vibrator 11, the microphone 12, the speaker 13, the communication section 15, the attitude detection section 16, the GPS receiver 17, and the geomagnetic sensor 18, and controls the respective sections.
Specifically, the control section 14 includes a CPU, a memory, and the like, and performs a predetermined process on the basis of a predetermined program (e.g., application programs for performing game processing, image processing, and various calculations) stored in a storage unit (e.g., a nonvolatile memory, a hard disk, etc.) which is provided in the portable electronic apparatus 1 and not shown. For example, the control section 14 acquires an image from the infrared camera 4 and analyzes the image; calculates the distance to an object on the basis of a signal from the distance measuring sensor 5; and performs a process corresponding to an input signal from the touch panel 3 or the input button 6. The control section 14 generates an image based on a result of a predetermined process, and outputs the image to the display 2. A program for performing the predetermined process may be downloaded from the outside via the communication section 15.
The vibrator 11 operates on the basis of an instruction from the control section 14, to vibrate the entire portable electronic apparatus 1. The vibrator 11 is provided at a predetermined position (e.g., at a center portion within the housing 10 or a position shifted left or right therefrom) from which vibration is easily transmitted to the hands of the user.
The microphone 12 and the speaker 13 are used for inputting and outputting sound. The communication section 15 is used for performing communication with another apparatus by a predetermined communication method (e.g., a wireless LAN, etc.). The attitude detection section 16 is, for example, an acceleration sensor or an angular velocity sensor, and detects the attitude of the portable electronic apparatus 1.
The GPS receiver 17 receives a signal from a GPS (Global Positioning System) satellite, and the portable electronic apparatus 1 can calculate the position of the portable electronic apparatus 1 on the basis of the received signal. For example, when a predetermined operation (e.g., a gesture input using the infrared camera 4 described later, a button input, or a motion of shaking the portable electronic apparatus 1) is performed at a specific position, the portable electronic apparatus 1 may display an object associated with the specific position. For example, in the case where a game is played with the portable electronic apparatus 1, when the portable electronic apparatus 1 is present at a specific position, an object associated with the specific position may be caused to appear in the game.
The geomagnetic sensor 18 is a sensor capable of detecting the direction and the magnitude of magnetism. For example, the portable electronic apparatus 1 determines whether the portable electronic apparatus 1 is directed in a specific direction, on the basis of a detection result of the geomagnetic sensor 18. When a predetermined operation (the above-described gesture input, etc.) is performed in the specific direction, the portable electronic apparatus 1 may display an object. For example, when a game is played with the portable electronic apparatus 1, an object corresponding to the specific direction may be caused to appear in the game. In addition, the portable electronic apparatus 1 may use a combination of GPS information obtained by using the GPS receiver 17 and direction information obtained by using the geomagnetic sensor 18. For example, when the portable electronic apparatus 1 is present at a specific position and directed in a specific direction, the portable electronic apparatus 1 may display an object corresponding to the specific position and the specific direction, or may cause the object to appear in a game.
Next, an input with respect to the portable electronic apparatus 1 will be described. In the present embodiment, the user can perform a gesture input with respect to the portable electronic apparatus 1 by using their right hand in a state of holding the portable electronic apparatus 1 with their left hand.
As shown in
Specifically, when the image shown in
Here, examples of gestures to be identified include body gestures and hand gestures using a part or the entirety of the body such as the hands and the face of the user, and the portable electronic apparatus 1 may recognize, as a gesture input, a state where a hand or the like remains still, or may recognize, as a gesture input, a series of motions using a hand. In addition, the portable electronic apparatus 1 may recognize a gesture input performed in a state where the user holds an object. In this case, the portable electronic apparatus 1 may recognize, as a gesture input, a state where only the object held by the user remains still or is moved, or may recognize, as a gesture input, a state where both the hand of the user and the object remain still or are moved.
The portable electronic apparatus 1 of the present embodiment calculates the distance between the portable electronic apparatus 1 and an object by using the distance measuring sensor 5 as described later, and performs a process on the basis of the calculated distance. For example, on the basis of information from the distance measuring sensor 5, the portable electronic apparatus 1 detects whether an object is present in the right side surface direction of the portable electronic apparatus 1, or detects the distance between the portable electronic apparatus 1 and an object, thereby enabling a movement of the object in the right side surface direction to be detected. For example, when the distance detected by the distance measuring sensor 5 has changed within a predetermined time period, the portable electronic apparatus 1 can recognize that the user is swinging their hand right and left in the right side surface direction of the portable electronic apparatus 1. In addition, when an object has been alternately detected and not detected by the distance measuring sensor 5 within a predetermined time period, the portable electronic apparatus 1 can recognize that the user is swinging their hand up and down. Then, the portable electronic apparatus 1 performs a predetermined process in accordance with the movement of the detected object, and displays a result of the process on the display 2.
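The two movement patterns described above can be sketched as follows; the sampling representation, thresholds, and function name are assumptions for illustration, not part of the disclosed apparatus:

```python
def detect_swing(samples):
    """Classify a hand movement from a short window of distance
    samples taken by the distance measuring sensor.

    samples: list of distances in cm, with None meaning that no
    object was detected at that sampling instant.

    Returns "left-right" when the measured distance varies widely
    while the object stays detected, "up-down" when the object
    alternates between detected and not detected, and None
    otherwise.  (The 5 cm threshold is an illustrative assumption.)
    """
    detected = [s for s in samples if s is not None]
    if not detected:
        return None
    # The object disappeared and reappeared: hand swung up and down.
    if len(detected) < len(samples):
        return "up-down"
    # The distance changed substantially: hand swung left and right.
    if max(detected) - min(detected) > 5.0:
        return "left-right"
    return None

print(detect_swing([10.0, 22.0, 9.0, 20.5]))   # distance varies widely
print(detect_swing([10.0, None, 11.0, None]))  # object intermittently lost
```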
As described above, in the present embodiment, by detecting a specific object in the right side surface direction of the portable electronic apparatus 1 using the infrared camera 4 and/or the distance measuring sensor 5, it is possible to identify various gesture inputs with respect to the portable electronic apparatus 1 and display, on the display 2, a result of a process corresponding to each input.
Here, each of the infrared camera 4 and the distance measuring sensor 5 has a recognizable range where an object can be recognized.
As shown in
As shown in
Next, a process based on a gesture input detected by using the infrared camera 4 and the distance measuring sensor 5 will be described.
Furthermore, when, from the state shown in
When the input image shown in
Next, when the input image shown in
Next, when the input image shown in
Furthermore, when the input image shown in
As shown in
When a character string “1+1=?” is displayed as shown in
As described above, the portable electronic apparatus 1 identifies a gesture input performed by the user, on the basis of an input image from the infrared camera 4, and determines whether a question has been correctly answered. Then, the portable electronic apparatus 1 displays a result of the determination on the display 2. For example, the portable electronic apparatus 1 sequentially displays a plurality of questions within a predetermined time period, and the user answers the displayed questions through gestures using their right hand. Then, for example, a score is displayed on the display 2 on the basis of the number of questions correctly answered within the predetermined time period.
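The question-and-answer determination described above can be sketched as follows, assuming (purely for illustration) that the gesture recognition yields a count of extended fingers and that questions take the form "a+b=?":

```python
def judge_answer(question: str, extended_fingers: int) -> bool:
    """Evaluate an arithmetic question such as "1+1=?" against the
    number of extended fingers identified from the user's gesture.
    The question format and the finger-count input are assumptions
    for this sketch."""
    expression = question.rstrip("?").rstrip("=")
    # Evaluate the arithmetic expression, e.g. "1+1" -> 2.
    # (eval is acceptable here only because the question strings
    # come from the program itself, not from the user.)
    correct = eval(expression)
    return extended_fingers == correct

print(judge_answer("1+1=?", 2))  # correct answer
print(judge_answer("1+1=?", 3))  # wrong answer
```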
Next, an example of a process using the distance measuring sensor 5 will be described.
As described above, the portable electronic apparatus 1 calculates the distance between the portable electronic apparatus 1 and an object present in the right side surface direction of the portable electronic apparatus 1 by using the distance measuring sensor 5, and displays an image corresponding to the calculated distance, on the display 2. In the process using the distance measuring sensor 5, a predetermined process is performed on the basis of whether an object is present and/or the distance to the object. In addition, as described above, a movement of an object in the right side surface direction of the portable electronic apparatus 1 is detected on the basis of detection of the object or a change in the distance to the object within a predetermined time period, and a predetermined process is performed in accordance with the detection of the movement.
Next, an example of output using the projector 8 will be described.
As shown in the lower part of
For example, the portable electronic apparatus 1 sequentially displays, on the display 2, gestures which the user is caused to make. When the user makes the same gesture as the displayed gesture within a predetermined time period, it is determined as success; and when the user makes a gesture different from any displayed gesture, it is determined as failure. Then, the portable electronic apparatus 1 outputs an image from the projector 8 in accordance with a result of the determination, thereby projecting an image corresponding to the result of the determination, onto the hand of the user.
It should be noted that the hand of the user may be tracked by capturing an image of the hand of the user with the infrared camera 4, and the projector 8 may be controlled such that an image is projected onto the hand of the user. The projector 8 is capable of projecting an image to a predetermined range. The portable electronic apparatus 1 acquires an image from the infrared camera 4 at a predetermined time interval, recognizes the position of the hand of the user on the basis of the image, and sets a position onto which an image is projected. For example, in a state where the image is projected on the hand of the user as shown in the lower part of
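One simple way to recognize the position of the hand from the infrared image, sketched below, is to take the centroid of bright pixels (the hand, lit by the irradiation section, appears bright). The pixel representation and brightness threshold are illustrative assumptions:

```python
def hand_position(ir_image, threshold=128):
    """Estimate the position of the user's hand in a two-dimensional
    infrared image as the centroid of pixels brighter than a
    threshold.  ir_image is a list of rows of pixel intensities.
    Returns (row, column) or None when no hand is detected."""
    total = row_sum = col_sum = 0
    for r, row in enumerate(ir_image):
        for c, value in enumerate(row):
            if value > threshold:
                total += 1
                row_sum += r
                col_sum += c
    if total == 0:
        return None
    return (row_sum / total, col_sum / total)

# A bright 2x2 region centered in a 4x4 image.
image = [
    [0,   0,   0,   0],
    [0, 200, 200,   0],
    [0, 200, 200,   0],
    [0,   0,   0,   0],
]
print(hand_position(image))
```

The projection position can then be updated each frame from the returned centroid, so that the projected image follows the hand.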
The user may be notified of a range where a gesture input can be performed, by emitting light in the right side surface direction of the portable electronic apparatus 1 using the projector 8. For example, when an image of the entire hand of the user can be captured by the infrared camera 4, blue light may be emitted; and when an image of only a part of the hand of the user can be captured by the infrared camera 4, red light may be emitted. That is, by emitting light from the projector 8 in a range which is the same as or similar to the imaging range (angle of view) of the infrared camera 4, the user can be caused to recognize the range where a gesture input can be performed, thereby guiding the user. The portable electronic apparatus 1 may include a light source (e.g., an LED, a halogen lamp, a fluorescent lamp, an EL lamp, etc.) which emits predetermined light instead of the projector 8 capable of projecting an image, and the user may be notified of the range where a gesture input can be performed by emitting light from the light source.
As described above, in the present embodiment, not only an input using the infrared camera 4 or the distance measuring sensor 5 but also output by light using the projector 8 are enabled to be performed from the side surface direction of the portable electronic apparatus 1.
As described above, in the present embodiment, the infrared camera 4 and the distance measuring sensor 5 are provided at the right side surface of the portable electronic apparatus 1, the user performs various gesture inputs with their right hand while holding the portable electronic apparatus 1 with their left hand, and a predetermined process corresponding to each gesture input is performed. Since the infrared camera 4 and the distance measuring sensor 5 are provided at the right side surface, while holding the portable electronic apparatus 1 with one hand and viewing the display 2, the user is allowed to perform a gesture input with the other hand without an uncomfortable feeling.
For example, in the case where the infrared camera 4 and the distance measuring sensor 5 for a gesture input are provided at the T5 surface (front surface) at which the screen of the display 2 is provided, when the user performs a gesture input, their hand overlaps the screen of the display 2, thereby decreasing the visibility of the display 2. In addition, in the case where the infrared camera 4 and the distance measuring sensor 5 for a gesture input are provided at the T6 surface (back surface) which is opposite to the screen of the display 2, the user performs a gesture input behind the screen, so that the user cannot see their right hand and has difficulty in performing a gesture input.
However, in the portable electronic apparatus 1 of the present embodiment, the infrared camera 4 and the distance measuring sensor 5 for a gesture input are provided at the side surface, beside the display 2. Thus, even when the user performs a gesture input with their right hand, the screen and the right hand do not overlap each other, so that the user can perform the gesture input without decreasing the visibility of the screen. In addition, since the infrared camera and the like are provided at the side surface, the user is allowed, while holding the portable electronic apparatus 1 with one hand, to perform a gesture input with the other hand, as if the portable electronic apparatus 1 were sandwiched between both hands in the right-left direction, and to confirm output (image display) corresponding to the input. Thus, it is possible to provide the user with a novel feeling of operating and using the portable electronic apparatus, which is not provided in the conventional art.
The user performs a gesture input from the right hand direction while viewing the screen of the display 2, and is allowed to view an image displayed on the display 2 as a result of the gesture input. That is, at the same time when the user views the screen of the display 2, the right hand with which a gesture input is performed comes into the user's field of vision, and the user is allowed to simultaneously view an input using their right hand and output from the display 2. Thus, the user is allowed to more naturally perform a gesture input and confirm output corresponding to the gesture input.
In the portable electronic apparatus 1 of the present embodiment, the infrared camera 4 is used instead of a normal camera which captures an image (RGB image) of visible light, and thus the portable electronic apparatus 1 is robust against the external environment. That is, in the case where a normal camera is used, depending on the brightness of the external environment (intensity of visible light), it may be impossible to obtain an input image suitable for image recognition due to an excessively large or small amount of light. In addition, when a human hand is recognized with a normal camera, the hand is recognized on the basis of its shape and skin color, and it may be impossible to recognize the hand depending on the color of light in the external environment. That is, depending on the external environment, it may be impossible to obtain an image in which a specific object (e.g., the user's hand) is detectable. In the case where the infrared camera 4 is used, however, it is possible to obtain an input image which is not influenced by the external environment and is suitable for image recognition, by receiving the infrared light from the irradiation section 7 (infrared light that is emitted from the irradiation section 7 and reflected on an object) while blocking visible light with a filter. In particular, in an indoor space in which sunlight is blocked, the amount of ambient infrared light is small, and irradiating a specific object with the infrared light from the irradiation section 7 enables a clear image including the specific object to be obtained without influence of the external environment. In addition, also in an outdoor space, for example, at night, irradiating the specific object with the infrared light from the irradiation section 7 enables an image required for image recognition to be obtained.
Next, an example of a process performed in the portable electronic apparatus 1 will be described.
As shown in
Next, the control section 14 performs predetermined information processing (step S12). Here, as the predetermined information processing, any of the following is conceivable: a process using the input image from the infrared camera 4; a process based on the detection information from the distance measuring sensor 5; and a process using both the infrared camera 4 and the distance measuring sensor 5. The predetermined information processing in step S12 will be described later with reference to
After the processing in step S12, the control section 14 outputs an image based on a result of the predetermined information processing in step S12 to the display 2 and/or the projector 8 (step S13). Then, the control section 14 determines whether to end the process shown in
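The acquire/process/output cycle of steps S11 to S13 can be sketched as a loop body; the callables below are illustrative stand-ins for the hardware-dependent stages, not part of the disclosed apparatus:

```python
def run_frame(acquire, process, output):
    """One iteration of the main loop: acquire the input image and
    detection information (step S11), perform the predetermined
    information processing (step S12), and output an image based on
    the result (step S13).  The three callables stand in for the
    infrared camera / distance measuring sensor, the control
    section, and the display / projector, respectively."""
    data = acquire()          # step S11
    result = process(data)    # step S12
    output(result)            # step S13
    return result

# Illustrative stand-ins for the hardware-dependent stages.
frames = []
result = run_frame(
    acquire=lambda: {"ir_image": "frame-0", "distance_cm": 12.0},
    process=lambda d: f"object at {d['distance_cm']} cm",
    output=frames.append,
)
print(frames[0])
```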
Next, the predetermined information processing in step S12 in
As shown in
Next, the control section 14 determines whether a user's hand has been detected in the image recognition processing (step S21). If a user's hand is detected (step S21: YES), the control section 14 identifies the type of a gesture (step S22). Specifically, on the basis of the shape of the detected user's hand, the control section 14 identifies the type of the gesture corresponding to the shape. Subsequently, the control section 14 performs a process corresponding to a result of the identification of the gesture (step S23). For example, the control section 14 selects an image corresponding to the identified type of the gesture, or performs a determination as to right or wrong in a predetermined game (as to whether the input is correct) in accordance with the identified type of the gesture.
The content of the process in step S23 is different depending on the type of the program (application) to be executed. For example, any game program may be executed as the program. In addition, a program for performing an operation (e.g., enlarging the screen, reducing the screen, scrolling the screen, activating or ending a predetermined application, setting the portable electronic apparatus 1, etc.) with respect to the portable electronic apparatus 1, may be executed. For example, when a program corresponding to the process shown in
Next, the control section 14 generates an output image in accordance with a result of the process in step S23 (step S24). The output image generated here is outputted to the display 2 and/or the projector 8 in step S13.
On the other hand, if the control section 14 determines in step S21 that no specific object has been detected (step S21: NO), the control section 14 ends the processing shown in
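Purely as an illustration, the flow of steps S20 through S24 can be sketched as follows. All names here (`detect_hand`, `GESTURE_BY_SHAPE`, the dictionary-based input image, and the gesture labels) are hypothetical stand-ins and are not part of the embodiment:

```python
# Hypothetical mapping from a detected hand shape to a gesture type (step S22).
GESTURE_BY_SHAPE = {"open_palm": "paper", "fist": "rock", "two_fingers": "scissors"}

def detect_hand(input_image):
    # Placeholder for the image recognition processing of step S20:
    # assume it returns a hand-shape label, or None if no hand is found.
    return input_image.get("hand_shape")

def process_gesture_input(input_image):
    hand_shape = detect_hand(input_image)       # step S20
    if hand_shape is None:                      # step S21: NO -> end processing
        return None
    gesture = GESTURE_BY_SHAPE.get(hand_shape)  # step S22: identify gesture type
    # Step S23: application-specific process corresponding to the gesture.
    # Step S24: generate the output image for step S13.
    return {"gesture": gesture, "output_image": f"image_for_{gesture}"}
```

For example, `process_gesture_input({"hand_shape": "fist"})` would yield the gesture `"rock"` together with a corresponding output image label, while an image with no detectable hand yields `None`.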
As shown in
If the calculation of the distance has been accomplished (step S31: YES), the control section 14 performs a process corresponding to the distance (step S32).
The content of the process in step S32 is different depending on the type of the program (application) to be executed. For example, any game program may be executed as the program. In addition, a program for performing an operation with respect to the portable electronic apparatus 1 may be executed. For example, when a program corresponding to the process shown in
Then, the control section 14 generates an output image in accordance with a result of the process in step S32 (step S33). The output image generated here is outputted to the display 2 and/or the projector 8 in step S13.
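The distance-based flow of steps S30 through S33 can likewise be sketched, purely illustratively; the function names, the dictionary-based detection information, and the distance threshold are all hypothetical:

```python
def calculate_distance(detection_info):
    # Step S30: placeholder for calculating a distance from the
    # detection information of the distance measuring sensor;
    # returns None when the calculation cannot be accomplished.
    return detection_info.get("distance")

def process_distance_input(detection_info):
    distance = calculate_distance(detection_info)
    if distance is None:                         # step S31: NO -> end processing
        return None
    # Step S32: application-specific process corresponding to the distance
    # (here a toy near/far classification), and step S33: output image.
    label = "near" if distance < 10.0 else "far"
    return {"distance": distance, "output_image": f"image_{label}"}
```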
Next, the predetermined information processing using both the infrared camera 4 and the distance measuring sensor 5 will be described.
As shown in
If the gesture has been normally identified (step S40: YES), the control section 14 performs a process corresponding to a result of the identification (step S23).
On the other hand, if the gesture has not been normally identified (step S40: NO), or if no user's hand has been detected (step S21: NO), the control section 14 calculates a distance on the basis of the detection information from the distance measuring sensor 5 (step S30). Then, the control section 14 determines whether the calculation of the distance has been accomplished similarly to
If the process in step S23 is performed, or if the process in step S32 is performed, the control section 14 generates an output image in accordance with a result of the process in step S23 or the process in step S32 (step S41). That is, if the process in step S23 has been performed, an output image corresponding to the process in step S23 is generated; and if the process in step S32 has been performed, an output image corresponding to the process in step S32 is generated. The generated output image is outputted to the display 2 and/or the projector 8 in step S13.
As described above, in the processing shown in
As described above, either the infrared camera 4 or the distance measuring sensor 5 is selected in accordance with the detection state. Specifically, when a detection result of one of the infrared camera 4 and the distance measuring sensor 5 is not suitable for a predetermined process, the same process is performed by using a detection result of the other, whereby the infrared camera 4 and the distance measuring sensor 5 complement each other. Thus, it is possible to identify a gesture input performed by the user more reliably, and to improve the operability.
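This complementary use, in which the infrared camera 4 is tried first and the distance measuring sensor 5 serves as a fallback, can be sketched as follows. The function names and the dictionary-based inputs are hypothetical illustrations only:

```python
def identify_with_camera(frame):
    # Hypothetical: returns a gesture label identified from the infrared
    # camera image, or None when no hand is detected or the gesture
    # cannot be identified (steps S21/S40: NO).
    return frame.get("gesture")

def movement_from_sensor(detection_info):
    # Hypothetical: derives object movement from the distance measuring
    # sensor's detection information, or None when unavailable.
    return detection_info.get("movement")

def detect_input(frame, detection_info):
    # The camera is used preferentially; the sensor complements it.
    gesture = identify_with_camera(frame)
    if gesture is not None:
        return ("camera", gesture)
    movement = movement_from_sensor(detection_info)
    if movement is not None:
        return ("sensor", movement)
    return None  # neither detector produced a usable result
```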
When both the infrared camera 4 and the distance measuring sensor 5 are used, a predetermined process may be performed on the basis of detection results of both the infrared camera 4 and the distance measuring sensor 5. That is, both the infrared camera 4 and the distance measuring sensor 5 may be selected and used for a predetermined process. For example, when a specific object is present in the recognizable range of the infrared camera 4, a predetermined process may be performed on the basis of an image of the specific object detected with the infrared camera 4 and the distance to the specific object calculated with the distance measuring sensor 5.
The processes shown in the flowcharts of the above-described embodiment are merely examples; some of the processes may be omitted, processes other than those described above may be added, and the processes may be performed in any order.
As described above, in the portable electronic apparatus 1 of the present embodiment, the infrared camera 4 and the distance measuring sensor 5 are provided at the side surface (right side surface) when the display 2 is viewed from the front surface. Thus, it is possible to perform a gesture input with the hand of the user by using the infrared camera 4 and the distance measuring sensor 5, and to output an output image corresponding to the gesture input to the display 2 and the projector 8. When the infrared camera 4 is used, it is possible to identify various gestures at a position away from the portable electronic apparatus 1 by a predetermined distance. When the distance measuring sensor 5 is used, it is possible to determine whether an object is present even near the portable electronic apparatus 1, and to perform a predetermined process on the basis of the distance between the object and the portable electronic apparatus 1. It is also possible to perform a predetermined process by using both the infrared camera 4 and the distance measuring sensor 5. Then, an image obtained as a result of the predetermined process can be displayed on the display 2 or projected by using the projector 8, or the portable electronic apparatus 1 can be vibrated in accordance with the result of the predetermined process.
In the present embodiment, since the infrared camera 4 and the distance measuring sensor 5 are used, it is possible to recognize various gesture inputs at low cost while reducing power consumption. That is, the irradiation section 7 instantaneously emits infrared light at a predetermined time interval in synchronization with the timing at which an image is captured by the infrared camera 4. For that reason, a clear image can be captured even when light is emitted only for a short time period, and thus it is possible to reduce the power consumption. In addition, when the distance measuring sensor 5 is used, it is not necessary to emit light over a wide range, and the absolute distance to the object to be detected can be calculated merely by emitting light in a limited specific direction. Thus, it is possible to reduce the power consumption. For example, when a distance image sensor capable of measuring a distance for each pixel of a two-dimensional image is used, the irradiation section needs to emit light over a wide range, and the electronic apparatus (or the control section) needs to calculate a distance for each pixel, so that the processing load and the power consumption increase. Moreover, a distance image sensor is more expensive than the distance measuring sensor. In contrast, when the infrared camera 4 and the distance measuring sensor 5 are used, it is possible to identify various gesture inputs while reducing both the cost and the power consumption.
In addition to the above-described process, the following processes may be performed by using the portable electronic apparatus 1.
In another application example, a predetermined image may be displayed by using the outer camera 9 in combination with an input performed via the infrared camera 4 and/or the distance measuring sensor 5. Specifically, while an image captured by the outer camera 9 is displayed on the display 2 in real time, a gesture input performed by the user may be detected by using the infrared camera 4 and/or the distance measuring sensor 5, and an image corresponding to the gesture input may be displayed so as to be superimposed on the image captured by the outer camera 9.
As shown in
For example, when an image including the marker shown in
As described above, a process using the outer camera 9 and the infrared camera 4 and/or the distance measuring sensor 5 may be performed, and an image obtained by combining the real world and the virtual space may be displayed on the display 2.
In another application example, a game may be performed in which an image projected by the projector 8 and an image displayed on the display 2 are linked to each other.
As shown in (A) of
In another application example, the absolute distance to an object (unspecified object) may be calculated by using the distance measuring sensor 5, and on the basis of the calculated distance, an image may be outputted to the display 2 or an image may be projected by using the projector 8. For example, in the application example shown in
In another application example, on the basis of an image captured by the infrared camera 4, a distance (relative distance) to the specific object may be calculated, and the vibrator 11 may be vibrated in accordance with the calculated distance. For example, the vibrator 11 may be vibrated when the distance to the object is shorter or longer than a predetermined threshold. In addition, the absolute distance to the specific object may be calculated by using the distance measuring sensor 5, and the vibrator 11 may be vibrated in accordance with the calculated distance. Furthermore, movement of the specific object (e.g., movement of the right hand or of another object) may be detected by using the infrared camera 4 and/or the distance measuring sensor 5, and the vibrator 11 may be vibrated in accordance with the movement of the specific object. As a specific example, in accordance with the user moving their right hand as if playing a guitar, a sound may be outputted and the portable electronic apparatus 1 may be vibrated. At that time, an image of guitar strings may be displayed on the display 2, and the sound may be changed when the strings displayed on the display 2 are pressed with a finger of the left hand. In such an application example, as the user moves their right hand faster, a louder sound may be outputted and the portable electronic apparatus 1 may be vibrated more strongly.
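The mapping from hand speed to vibration strength described above can be sketched, purely illustratively, as a simple linear scaling; the function name, the speed units, and the cap values are hypothetical assumptions:

```python
def vibration_strength(hand_speed, max_speed=2.0, max_strength=1.0):
    # The faster the right hand moves, the stronger the vibration
    # (and, analogously, the louder the sound), capped at a maximum.
    # hand_speed and max_speed share arbitrary, consistent units.
    ratio = min(hand_speed / max_speed, 1.0)
    return max_strength * ratio
```

The same scaling could drive the sound volume, so that a faster hand movement both vibrates the apparatus more strongly and outputs a louder sound.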
Although a plurality of examples of processes using the infrared camera 4 and/or the distance measuring sensor 5 have been described above, any other process may be performed. For example, a game may be executed in which, in accordance with a content instructed on the display 2 or through a voice, the user is caused to make a gesture at predetermined timing. In such a game, when the predetermined gesture is made at the predetermined timing, points are added; and when the timing at which the user makes a gesture deviates from the predetermined timing, or a gesture different from the predetermined gesture is made, no points are added.
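The scoring rule of such a timing game can be sketched as follows; the function name, the timing window, and the point value are illustrative assumptions, not part of the embodiment:

```python
def score_gesture(made_gesture, made_time, expected_gesture, expected_time,
                  window=0.5):
    # Points are added only when the expected gesture is made within
    # the timing window (in seconds) around the instructed time.
    if made_gesture != expected_gesture:
        return 0  # a different gesture: no points
    if abs(made_time - expected_time) > window:
        return 0  # correct gesture, but timing deviated: no points
    return 100    # correct gesture at the predetermined timing
```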
As described above, in the portable electronic apparatus 1 of the present embodiment, a predetermined process is performed in accordance with the type of the application to be executed, and the infrared camera 4 and the distance measuring sensor 5 are used for the predetermined process. For example, in a certain application, an image from the infrared camera 4 may be used for a predetermined process. In another application, a distance calculated by the distance measuring sensor 5 may be used for a predetermined process. In still another application, both the infrared camera 4 and the distance measuring sensor 5 may be used for a predetermined process. For example, in the case where an application is executed in which a specific object (a predetermined object) is detected and a process based on the detection result is performed, the infrared camera 4 is used. On the other hand, in the case where an application is executed in which an unspecified object is detected and a process is performed based on the distance to the unspecified object, the distance measuring sensor 5 is used.
In a certain application, either the infrared camera 4 or the distance measuring sensor 5 is selected in accordance with the detection state, and the selected one is used for a predetermined process. For example, when a specific object cannot be detected in an image from the infrared camera 4 or the type of a gesture cannot be identified, the distance measuring sensor 5 is selected instead of the infrared camera 4. Movement of the object is then detected on the basis of information from the selected distance measuring sensor 5, and predetermined information processing is performed in accordance with the detected movement. That is, of the infrared camera 4 and the distance measuring sensor 5, the infrared camera 4 is preferentially used for detecting an object. Then, on the basis of a result of the predetermined information processing, an image is displayed on the display 2 or projected with the projector 8. Alternatively, of the infrared camera 4 and the distance measuring sensor 5, the distance measuring sensor 5 may be preferentially used for detecting an object.
In another application example, when the user plays a game while holding the portable electronic apparatus 1 with both hands, if the user releases their right hand, the portable electronic apparatus 1 may display a menu screen on the display 2 instead of (or in addition to) a game image. The user is allowed to, for example, end the game being currently executed, store a progress status of the game, execute another game, or change settings of the portable electronic apparatus 1, through an operation with the menu screen. Specifically, the portable electronic apparatus 1 detects that the user has released their right hand, by using the infrared camera 4 or the distance measuring sensor 5. Then, when the portable electronic apparatus 1 detects that the user has released their right hand, the portable electronic apparatus 1 pauses the game processing being currently executed, and displays the menu screen.
In the above-described portable electronic apparatus 1, the infrared camera 4, the distance measuring sensor 5, and the like are disposed at the right side surface. However, when the user performs a gesture input with their left hand, the portable electronic apparatus 1 having the attitude shown in
The above-described embodiment is merely an example, and, for example, the following modifications may be made.
For example, the portable electronic apparatus 1 may detect various objects other than the user's hand used for the above-described gesture inputs, perform predetermined processes in accordance with detection results of the objects, and output results of the predetermined processes to the display 2 or the projector 8. For example, an input with respect to the portable electronic apparatus may be performed by capturing, with the infrared camera 4, an image of a specific object present in the side surface direction. When the user performs an operation with respect to the specific object, the position or attitude of the specific object changes; when an image of the specific object is captured by the infrared camera 4, the operation performed by the user with respect to the specific object can be detected. Then, a result of a process corresponding to this operation is outputted to the display 2 or the like. In this case, the portable electronic apparatus 1 may detect both the user's hand and an object held by the user, or may detect only the object.
The shape of the portable electronic apparatus 1 may be any shape. For example, the portable electronic apparatus 1 (the display 2) may not be horizontally long, may be vertically long, or may have a square shape.
As described above, in the portable electronic apparatus of the present embodiment, even when the screen is vertically long or horizontally long, the infrared camera 4 and the like are provided in the side surface direction (right-left direction) when the screen is viewed from the front surface. Here, the phrase “the screen is viewed from the front surface” means that the screen is viewed such that a character string, an image, or the like displayed on the screen of the display 2 can be seen from an ordinary direction. For example, in
As used herein, the “side surface” may include a bottom side surface facing downward and an upper side surface facing upward when the screen is viewed from the front surface as shown in
The portable electronic apparatus of the present embodiment may detect the attitude of the portable electronic apparatus by using the attitude detection section 16, and may rotate a character, an image, or the like displayed on the screen, in accordance with the detected attitude.
As shown in
That is, depending on the manner in which the portable electronic apparatus is held, the infrared camera 4, the distance measuring sensor 5, and the like may be located downward of the screen or upward of the screen. When the portable electronic apparatus is held in a certain attitude by the user, the infrared camera 4, the distance measuring sensor 5, and the like are located at a surface facing in the right-left direction relative to the screen when the user views the screen from its front surface (i.e., characters, an image, or the like displayed on the screen can be viewed from the ordinary direction). In such an attitude, the user can perform a gesture input with respect to the portable electronic apparatus by making a gesture with their right hand (or left hand) from the rightward direction (or leftward direction) relative to the screen as described above. In addition, when the portable electronic apparatus is held in another attitude by the user, the infrared camera 4, the distance measuring sensor 5, and the like are located at a surface facing in the up-down direction relative to the screen when the user views the screen from its front surface. In such an attitude, the user can perform a gesture input with respect to the portable electronic apparatus by making a gesture with their right hand (or left hand) from the upward direction (or downward direction) relative to the screen as described above.
For example, a configuration in which the infrared camera 4, the distance measuring sensor 5, and the projector 8 are provided at not only the side surface but also the upper side surface or the bottom side surface is also included in the scope of the present embodiment. In addition, the infrared camera 4, the distance measuring sensor 5, and the like may be provided at each of the left side surface and the right side surface. In this case, for example, which of the left side or the right side of the portable electronic apparatus is held by the user may be determined on the basis of the distance measured by each distance measuring sensor. For example, the portable electronic apparatus may determine that the side surface (held portion) corresponding to the shorter measured distance is held by the user.
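The held-side determination described above, in which the side surface whose sensor reports the shorter distance is judged to be the held portion, can be sketched as a one-line comparison; the function name and the unit of distance are hypothetical:

```python
def held_side(left_distance, right_distance):
    # The distance measuring sensor on the held side reports a short
    # distance (the holding hand is right next to it), so the side
    # with the shorter measured distance is judged to be held.
    return "left" if left_distance < right_distance else "right"
```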
The portable electronic apparatus may have any configuration as long as the user is allowed to perform a gesture input with their right hand (or left hand) from the side surface direction relative to the screen of the display while holding the portable electronic apparatus with their left hand (or right hand) and viewing the screen from the front surface as described above.
In the above-described embodiment, the infrared camera 4, the distance measuring sensor 5, and the like are directed in a direction perpendicular to the side surface (parallel to the screen). However, in another configuration, the infrared camera 4, the distance measuring sensor 5, and the like may be provided so as to be tilted at a predetermined angle relative to the side surface.
In the above-described embodiment, the infrared camera 4 is used. However, instead of the infrared camera 4, a normal camera which captures an image of visible light (a camera which obtains an RGB image) may be used, and a gesture made by the user may be recognized by using the normal camera. In addition, instead of the infrared camera 4, a camera capable of capturing both an RGB image and an infrared image may be used.
In the above-described embodiment, the projector 8 is provided. However, any component may be provided as long as the component notifies the user of a result of the predetermined information processing (S13) corresponding to a gesture input performed by the user, by means of light. For example, instead of or in addition to the projector 8, a light source (e.g., an LED, a halogen lamp, a fluorescent lamp, an EL lamp, etc.) which emits predetermined light may be provided to the portable electronic apparatus 1.
In the above-described embodiment, an image and a distance are obtained separately by using the infrared camera 4 and the distance measuring sensor 5. However, in another embodiment, for example, a TOF (Time Of Flight) type distance image sensor may be provided, and an image and a distance for each pixel may be obtained by using this distance image sensor. The portable electronic apparatus detects a specific object on the basis of the obtained image and distance, and outputs an image to the display 2 or the projector 8 in accordance with a result of the detection.
In the above-described embodiment, the example where the portable electronic apparatus is held with one hand and a gesture input is performed with the other hand with respect to the portable electronic apparatus, has been described. In another embodiment, the portable electronic apparatus may be one that is to be fixed to an arm (body) of the user, such as a watch-type apparatus including a screen.
The shape of the portable electronic apparatus may be any shape, and may be, for example, a plate-like elliptical shape. For example, the electronic apparatus may be foldable.
In the above-described embodiment, the portable apparatus has been described, but a stationary apparatus may be used in another embodiment.
While certain example systems, methods, devices and apparatuses have been described herein, it is to be understood that the appended claims are not to be limited to the systems, methods, devices and apparatuses disclosed, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 2014-182587 | Sep 2014 | JP | national |