This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2014-046504, filed on Mar. 10, 2014, the entire contents of which are incorporated herein by reference.
The present technology herein relates to an information processing apparatus, a handheld information processing apparatus, an information processing system and an information processing method utilizing a linear image sensor, an image sensor, an optical sensor or the like.
In recent years, information processing apparatuses which can be carried by a user, such as mobile phones, smartphones, tablet terminal devices and portable game devices, have become widespread. In such an information processing apparatus, a touch panel or the like is mounted to enhance operability, since a large input device such as a keyboard cannot easily be mounted thereto.
According to an aspect of the embodiment, an information processing apparatus includes a housing having at least one surface provided with a display, a linear image sensor located, facing outside the housing, at a side part or a corner part of the housing when the surface provided with the display is viewed as the front surface, and an information processing unit performing information processing based on an image obtained by the linear image sensor.
The object and advantages of the present technology herein will be realized and attained by means of the elements and combinations particularly pointed out in the claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the technology herein.
The above and further objects and features of the present technology herein will more fully be apparent from the following detailed description with reference to the accompanying drawings.
In the game apparatus 1 according to the present example embodiment, a linear image sensor 5 is disposed at each of four side parts (side surfaces) of the housing 2 so as to face outward. That is, the linear image sensor 5 is located at each of the side parts of the housing 2 when the surface provided with the display 3 is viewed as the front surface. Moreover, the linear image sensor 5 is oriented in the direction along the surface provided with the display 3. Though only two linear image sensors 5 are illustrated, one linear image sensor 5 is provided at each of the four side surfaces of the housing 2.
Each linear image sensor 5 obtains an image by taking an image of a side of the housing 2. The image to be obtained by each linear image sensor 5 is a linear or rectangular image including one to several pixels on the short side and several hundred pixels to several tens of thousands of pixels on the long side. In the case where the game apparatus 1 is placed on, for example, a horizontal desk, each linear image sensor 5 takes an image in the direction along the desk surface from a side surface of the housing 2 outward, to obtain a linear or rectangular image.
In the present example embodiment, the linear image sensor 5 takes an image by an imaging method in which objects in a foreground to a background, i.e. from short distance to long distance, are all in focus, which is a so-called deep focus. Therefore, though not illustrated, an optical member or optical members such as a lens and/or a diaphragm is/are provided to enable deep focus imaging.
In the present example embodiment, the linear image sensor 5 receives infrared light to take an image. Thus, though not illustrated, the linear image sensor 5 is provided with an optical filter or the like which transmits infrared light and blocks light other than infrared light. The optical filter may be located, for example, on a surface of a CCD or CMOS sensor, or on the lens for the deep focus imaging described above.
The game apparatus 1 according to the present example embodiment uses the linear image sensors 5 located at the four sides of the housing 2 to provide new functions not offered by an existing game apparatus. For example, the game apparatus 1 can determine an operation performed by the user at a peripheral region of the housing 2 based on an image obtained by the linear image sensor 5, and accept such an operation as an operation for a game. The game apparatus 1 can determine an operation such as making contact with or approaching a side surface of the housing 2 based on an image obtained by the linear image sensor 5, to utilize the linear image sensor 5 in place of a touch panel. The game apparatus 1 can perform information processing such as a game based on both the operation of the user determined by the linear image sensor 5 and the operation of the user sensed by the operation unit 4, the touch panel or the like. For example, the game apparatus 1 is able to detect an object such as a figure placed in a peripheral region of the housing 2 based on an image obtained by the linear image sensor 5 and to reflect the type or the like of the detected object in the game processing.
When, for example, the user is holding the housing 2 while using the game apparatus 1, the game apparatus 1 detects the position at which the user is holding the housing 2. Based on the result of detection, the game apparatus 1 may determine, for example, the mode of use or the way the user holds the game apparatus 1, and reflect it in the game processing or other information processing. The game apparatus 1 may detect a pulse of the user based on the infrared light reflected by a hand or the like of the user who is holding the housing 2. The game apparatus 1 determines a relative position, orientation and the like with respect to a different game apparatus 1 having a similar configuration, based on an image obtained by receiving, at its own linear image sensor 5, infrared light emitted from the infrared light source 6 included in the different game apparatus 1. The game apparatus 1 may reflect the result of determination in the communication processing with the different game apparatus 1.
The display 3 is configured with a liquid-crystal panel or the like, which displays an image supplied from the processing unit 10. The operation unit 4 is formed by appropriately combining, for example, a cross key, push buttons and the like. The operation unit 4 notifies the processing unit 10 of the details of operation performed by the user, such as pressing down or releasing of a button, for example. The storage unit 12 is configured using a non-volatile semiconductor memory, a hard disk or the like. The storage unit 12 can store a program such as a game program 91 as well as various kinds of data. The recording medium loading unit 13 is configured to load or remove a recording medium 9 of a card type, cassette type, disk type or the like thereto or therefrom. The processing unit 10 can read out the game program 91 and various kinds of data from the recording medium 9 loaded to the recording medium loading unit 13.
The wireless communication unit 14 transmits/receives data to/from a server apparatus, a different game apparatus 1 or the like via a network such as a mobile telephone network or a wireless LAN (Local Area Network). For example, the game apparatus 1 can download the game program 91 or the like through communication with the server apparatus at the wireless communication unit 14, and store it in the storage unit 12. The acceleration sensor 15 senses an acceleration applied to the game apparatus 1 and notifies the processing unit 10 thereof. The angular velocity sensor 16 senses an angular velocity of the game apparatus 1 and notifies the processing unit 10 thereof. Accordingly, the processing unit 10 can determine, for example, the orientation of the housing 2, based on the gravitational acceleration and/or angular velocity applied to the game apparatus 1.
In the game apparatus 1 according to the present example embodiment, the processing unit 10 executes the game program 91 to implement an operation detection unit 21, a figure detection unit 22, a use mode determination unit 23, a pulse detection unit 24, a different apparatus detection unit 25 and the like as software function blocks. These function blocks implement the functions described above using the linear image sensor 5. While these function blocks are implemented by the game program 91 in the present example embodiment, they may also be implemented by, for example, an operating system or an application program other than a game. Furthermore, it is not necessary for all these function blocks to be implemented by one game program 91; for example, some of them may be implemented by the processing unit 10 executing a program other than the game program 91.
The operation detection unit 21 of the processing unit 10 performs processing of detecting an operation of the user performed at the side of the housing 2 of the game apparatus 1, based on the image obtained by the linear image sensor 5. The operation detection unit 21 performs processing of detecting an operation such as making contact with or approaching a side surface of the housing 2, based on the image obtained by the linear image sensor 5. The figure detection unit 22 performs processing of detecting a figure for a game placed at the side of the housing 2, based on the image obtained by the linear image sensor 5. The figure detection unit 22 detects the type, position, orientation and the like of a figure. The use mode determination unit 23 performs processing of determining a mode of use (also referred to as “use mode”) of the game apparatus 1, based on the image obtained by the linear image sensor 5, the acceleration sensed by the acceleration sensor 15 and the angular velocity sensed by the angular velocity sensor 16. The use mode determination unit 23 determines whether the housing 2 of the game apparatus 1 is used vertically or horizontally, and which part of the housing 2 the user is holding during use. The pulse detection unit 24 performs processing of detecting the user's pulse based on the image taken by the linear image sensor 5 receiving reflection light obtained when the hand of the user holding the housing 2 reflects the infrared light from the infrared light source 6. The different apparatus detection unit 25 performs processing of determining the position, orientation and the like of a different game apparatus 1, based on the image taken by the linear image sensor 5 receiving infrared light emitted from the infrared light source 6 of the different game apparatus 1.
<Space Pointer Function>
The game apparatus 1 according to the present example embodiment obtains an image by the linear image sensor 5 located on each of the side surfaces of the housing 2 taking an image of the corresponding side part. This allows the game apparatus 1 to use the peripheral region of the housing 2 as an acceptance region for user operations. The user of the game apparatus 1 places the game apparatus 1, for example, on a flat desk or the like. The user may perform an operation of indicating (pointing at) a position in the peripheral region of the housing 2 of the game apparatus 1 with a finger or the like to perform, for example, an operation for a game. In the present example embodiment, this function of the game apparatus 1 is referred to as a space pointer function.
It is not always necessary for the game apparatus 1 to be placed on a desk or the like to implement the space pointer function. For example, it is also possible for the user to hold the game apparatus 1 with one hand and to perform an operation in the peripheral region with the other hand. The space pointer function of the game apparatus 1 may be implemented in any arbitrary location in a space.
In the game apparatus 1, the linear image sensors 5 receive reflection light of the infrared light emitted from the infrared light source 6 to take an image. A linear image obtained by the linear image sensors 5 taking an image is supplied to the processing unit 10. In the case where no object such as a user's finger is present in the operation acceptance regions 51, the linear image sensors 5 do not receive reflection light, or only receive weak reflection light from an object outside the operation acceptance regions 51. The image obtained by the linear image sensors 5 here is an image with pixels each having a small pixel value. It is to be noted that, in the present example, the image output by the linear image sensors 5 has larger pixel values as the light reception intensity for the infrared light is increased and smaller pixel values as the light reception intensity is decreased. Thus, in the case where the pixel value for the image obtained by the linear image sensor 5 does not exceed a predetermined value, the operation detection unit 21 of the processing unit 10 determines that an object such as a user's finger is not present in the operation acceptance region 51.
On the other hand, in the case where an object such as a user's finger is present in the operation acceptance region 51, the operation detection unit 21 detects the presence of an object in the linear image obtained by the linear image sensor 5, since the pixel value of a pixel corresponding to the position where the object is present exceeds a predetermined value. The reflection light from an object has higher intensity as the distance from a side surface of the housing 2 to the object becomes closer. The operation detection unit 21 can calculate the distance from the housing 2 to an object such as a finger in accordance with the pixel value for the image obtained by the linear image sensor 5. The operation detection unit 21 may obtain the position of an object in the operation acceptance region 51 in accordance with the position of a pixel with the pixel value exceeding a predetermined value in a linear image, and the magnitude of the pixel value of the pixel. It is to be noted that the position of the object may be obtained as coordinates in the vertical and horizontal axes.
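For illustration only, the detection described above may be expressed by the following Python sketch; the threshold, the pixel pitch and the inverse intensity-to-distance model are assumptions introduced here and do not represent the actual calibration of the linear image sensor 5.

```python
# Illustrative sketch only: detect an object and estimate its coordinates
# from one linear image, assumed here to be a list of pixel values (0-255).
# The threshold and the intensity-to-distance conversion are stand-in
# assumptions, not the calibration of an actual sensor.

DETECTION_THRESHOLD = 40     # assumed "predetermined value"
PIXEL_PITCH_MM = 0.5         # assumed spacing of pixels along the side surface
K_DISTANCE = 1000.0          # assumed calibration constant

def detect_object(linear_image):
    """Return (position_along_side_mm, distance_from_side_mm) or None."""
    if not linear_image:
        return None
    peak_index = max(range(len(linear_image)), key=lambda i: linear_image[i])
    peak_value = linear_image[peak_index]
    if peak_value <= DETECTION_THRESHOLD:
        return None                        # no object in the acceptance region
    # The pixel position gives one axis; the reflection intensity gives the
    # other, since a closer object reflects the infrared light more strongly.
    position = peak_index * PIXEL_PITCH_MM
    distance = K_DISTANCE / peak_value     # simple inverse-intensity model
    return position, distance
```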
The linear image sensor 5 periodically and repeatedly takes images and periodically sends the obtained images to the processing unit 10. The operation detection unit 21 of the processing unit 10 compares multiple images sent from the linear image sensor 5 in time series with each other, to detect the presence/absence of operation, a change in the operating position and the like. For example, in the case where an object is not present in the previous image and an object is present in the current image, the operation detection unit 21 can detect that a new operation is performed by the user. This allows the operation detection unit 21 to detect a pointing operation or the like performed by the user. For example, in the case where the position of an object based on the current image is changed from the position of the object based on the previous image, the operation detection unit 21 can detect that the position of operation by the user is changed. This allows the operation detection unit 21 to detect a sliding operation or the like performed by the user. The sliding operation is, for example, an operation of moving a pointed position. It is to be noted that the operation detection unit 21 may also store the previous detection result, not the previous image, for comparison of detection results.
The operation detection unit 21 either stores an image obtained by the linear image sensor 5 when the image is obtained at step S1, or stores the coordinates when the coordinates are calculated at step S5. The operation detection unit 21 compares the image or coordinates stored the previous time with the current image or coordinates (step S6). From the comparison result, the operation detection unit 21 determines whether an object detected based on the current image was not detected in the previous image (step S7). In other words, the operation detection unit 21 determines whether or not the object in the current image is detected for the first time. If the object is detected for the first time (S7: YES), the operation detection unit 21 determines that the operation performed with the object is a pointing operation for the coordinates calculated at step S5 (step S8), and terminates the processing.
If detection of the object is not the first time (S7: NO), the operation detection unit 21 compares the coordinates based on the previous image with the coordinates based on the current image, and determines whether or not the coordinates have changed (step S9). If the coordinates are changed (S9: YES), the operation detection unit 21 determines that the current operation is a sliding operation (step S10), and terminates the processing. If no change occurs in the coordinates (S9: NO), the operation detection unit 21 determines that the pointing operation continues without a change (step S11), and terminates the processing. The operation detection unit 21 repeatedly performs the processing illustrated in the flowchart.
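For illustration only, the comparison at steps S6 to S11 may be sketched as follows; the coordinate representation and the operation labels are assumptions introduced here.

```python
# Illustrative sketch only: classify the current frame against the
# previously stored detection result.  Coordinates are (x, y) tuples,
# or None when no object was detected.

def classify_operation(previous_coords, current_coords):
    if current_coords is None:
        return None                    # no object detected this time
    if previous_coords is None:
        return "pointing"              # detected for the first time (S7: YES, S8)
    if current_coords != previous_coords:
        return "sliding"               # the pointed position moved (S9: YES, S10)
    return "pointing_continued"        # same position as before (S9: NO, S11)
```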
As described above, the game apparatus 1 detects operations using the linear image sensors 5. Thus, the game apparatus 1 may be provided with, in addition to the touch panel 11 on the display 3, the operation acceptance regions 51 in which the user can perform a pointing operation and the like. In the case where an operation is performed with the touch panel 11, the display 3 is partially hidden by a finger or hand of the user, which degrades the visibility of the display 3 during the operation. On the other hand, an operation in the operation acceptance regions 51 located at the four sides of the housing 2 can be performed without degrading the visibility of the display 3. With the linear image sensors 5 respectively provided at the four side surfaces of the housing 2, the operation acceptance regions 51 may be provided at the four sides of the housing 2. When, for example, multiple users play a game as opponents or cooperators using one game apparatus 1, such a mode of use can be realized that each of the users performs an operation using one of the operation acceptance regions 51.
While, in the present example embodiment, the game apparatus 1 is configured to calculate the distance to an object in the operation acceptance region 51 based on the intensity of the reflection light of the infrared light received by the linear image sensor 5, the present technology herein is not limited thereto. For example, the game apparatus 1 may also measure a distance by the Time of Flight method. Though not described in detail, with the Time of Flight method, the distance to an object can be measured based on the time taken for the infrared light from the infrared light source 6 to be reflected by an object and reach the linear image sensor 5.
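For illustration only, the standard Time of Flight relation may be sketched as follows; since the measured time covers the round trip from the light source to the object and back to the sensor, the one-way distance is half the product of the speed of light and the measured time.

```python
# Illustrative sketch only: the standard Time of Flight distance relation.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s):
    # The measured time covers the path out to the object and back,
    # so the one-way distance is half of (speed of light x time).
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0
```

For example, a measured round trip of one nanosecond corresponds to a one-way distance of approximately 0.15 m.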
The game apparatus 1 according to the present example embodiment may, using the space pointer function described above, include an operation unit for accepting a pushing operation or sliding operation in the operation acceptance region 51, in addition to the cross key and push buttons at the operation unit 4.
In the case where, for example, the user selects to use the additional interface function on a menu screen, setting screen or the like in a game, the game apparatus 1 displays on the display 3 a message or the like urging the user to prepare the additional interface sheet 52 and place it at a predetermined position. Here, the game apparatus 1 displays on the display 3 instructions on what kind of picture pattern the additional interface sheet 52 is to have thereon and how the additional interface sheet 52 is to be located with respect to the game apparatus 1. In the additional interface sheet 52, a mark or the like for positional alignment with the game apparatus 1 may also be printed.
In the illustrated example, the additional interface sheet 52 is placed at a position on the lower side of the housing 2 with respect to the game apparatus 1 placed on a desk or the like. In the illustrated additional interface sheet 52, three buttons of X, Y and Z are drawn as an additional interface. The additional interface is for accepting a pushing operation or touching operation of the user for each of the buttons X, Y and Z.
The game apparatus 1 stores information such as the coordinates and ranges of the three buttons drawn on the additional interface sheet 52. The operation detection unit 21 of the game apparatus 1 detects the operation of the user on the additional interface sheet 52 by a method similar to that of the space pointer function described above. In the case where a pointing operation on the additional interface sheet 52 is detected, the operation detection unit 21 compares the coordinates at which the pointing operation is performed with the coordinates and ranges of the three buttons drawn on the additional interface sheet 52. If it is determined that the coordinates at which the pointing operation is performed are in the range of any of the three buttons, the operation detection unit 21 accepts the pushing operation for that button. The operation detection unit 21 reflects the accepted operation in the processing for a game or the like in which the additional interface sheet 52 is used.
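For illustration only, the comparison between the pointed coordinates and the stored button ranges may be sketched as follows; the coordinate convention and the rectangular range values are assumptions introduced here, while the button names follow the X, Y and Z example in the text.

```python
# Illustrative sketch only: hit test against the stored button layout of
# the additional interface sheet 52.  The ranges below are placeholder
# values, not an actual sheet layout.

BUTTON_RANGES = {
    "X": (10, 5, 30, 25),    # (x_min, y_min, x_max, y_max), illustrative values
    "Y": (40, 5, 60, 25),
    "Z": (70, 5, 90, 25),
}

def hit_test(x, y):
    """Return the name of the button containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in BUTTON_RANGES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```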
If the placement of the additional interface sheet 52 is completed (S22: YES), the operation detection unit 21 performs the operation detection processing illustrated in the flowchart of the space pointer function described above.
As described above, the game apparatus 1 accepts an operation using the additional interface sheet 52. Accordingly, the game apparatus 1 can appropriately add an interface suitable for the operation of a game, which enables enhancement in operability. The additional interface sheet 52 may be realized by, for example, a piece of paper on which a picture pattern is printed, which can easily be realized at low cost.
The game apparatus 1 according to the present example embodiment uses the linear image sensor 5 provided on a side surface of the housing 2 to accept an operation of approaching or making contact with a side surface of the housing 2. In the present example embodiment, a side surface touching operation includes a case where a finger or the like of a user actually makes contact with a side surface of the housing 2, and also a case where a finger or the like is placed close to a side surface within a certain distance therefrom without actually being in contact with the side surface.
The operation detection unit 21 of the game apparatus 1 can detect a touching operation on a side surface of the housing 2 by a method similar to that of the space pointer function described above. In the case where, for example, an operation is detected in the processing procedure of the space pointer function at coordinates within a predetermined distance from a side surface of the housing 2, the operation detection unit 21 can determine that a side surface touching operation is performed.
The type of processing performed by the game apparatus 1 when the operation detection unit 21 detects a side surface touching operation depends on a game program 91 to be executed by the processing unit 10. The side surface touching operation may appropriately be utilized in accordance with the content of the game program 91, as in the touching operation through the touch panel 11. In the illustrated example, a case is shown where the user performs a sliding operation by sliding a finger in the vertical direction along the right side surface of the housing 2. For such a sliding operation at the side surface, the game apparatus 1 can, for example, increase or decrease the volume of music, voice or the like output from a speaker during a game. For example, the game apparatus 1 can increase or decrease the brightness of the display 3.
The side surface touching operation is not limited to the sliding operation as illustrated. For example, the game apparatus 1 may be configured to perform processing such as recovering from a sleep mode or unlocking the game apparatus 1 when a touching operation is performed on a predetermined portion on the side surface of the housing 2. For example, the game apparatus 1 may be configured to decide a moving direction, attacking direction or the like of a game character in accordance with the portion of the side surface of the housing 2 on which the touching operation is performed.
Thus, the game apparatus 1 has a configuration in which a touching operation by the user on a side surface of the housing 2 is detected using the linear image sensor 5. This allows the game apparatus 1 to detect the touching operation on a side surface of the housing 2 without providing, on the side surface of the housing 2, a component similar to the electrostatic capacitance type touch panel 11 provided on the surface of the display 3.
The game apparatus 1 according to the present example embodiment implements a game also using a specific figure associated with a specific game program 91.
The figure is placed in the operation acceptance region 51 at the periphery of the game apparatus 1, and includes a base 61 on which barcodes are provided.
The figure detection unit 22 of the processing unit 10 in the game apparatus 1 performs processing of detecting the type, position and orientation of a figure placed in a peripheral region of the housing 2, based on an image obtained by the linear image sensor 5.
The figure detection unit 22 detects the type of the figure based on the type determination barcode 64 provided on the base 61.
The figure detection unit 22 detects the orientation of the figure based on the orientation detection barcodes 63 provided on the base 61.
In the illustrated example, the base 61 is provided with four orientation detection barcodes 63 and four type determination barcodes 64. The barcodes on the base 61 are so configured that the images of at least one orientation detection barcode 63 and one type determination barcode 64 may be taken by the linear image sensor 5 even when the figure is placed in any direction.
Each of the four orientation detection barcodes 63 has a different pattern. The figure detection unit 22 of the game apparatus 1 detects the orientation of the figure in accordance with which of the orientation detection barcodes 63 is obtained from the image taken by the linear image sensor 5.
The four type determination barcodes 64 all have the same pattern. The figure detection unit 22 can obtain at least one type determination barcode 64 from the image taken by the linear image sensor 5. There may be a case, however, where the type determination barcode 64 is imaged divided into a former half and a latter half, with an orientation detection barcode 63 interposed in between. In such a case, the figure detection unit 22 may obtain one type determination barcode 64 by combining the divided former half and latter half.
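For illustration only, the handling of the barcodes on the base 61 may be sketched as follows; the decoded string representation and the pattern table are assumptions introduced here.

```python
# Illustrative sketch only: interpret the barcodes read from one linear
# image.  Barcodes are assumed already decoded into strings, listed in
# the order they appear along the base 61; the pattern table is a
# stand-in assumption.

ORIENTATION_PATTERNS = {"0110": 0, "0101": 90, "1001": 180, "1010": 270}

def detect_orientation(barcodes):
    """Identify which orientation detection barcode 63 is visible,
    returning the corresponding orientation in degrees."""
    for code in barcodes:
        if code in ORIENTATION_PATTERNS:
            return ORIENTATION_PATTERNS[code]
    return None

def recover_type_barcode(fragments):
    """Rejoin a type determination barcode 64 that was imaged divided:
    the image then shows the latter half first and the former half last,
    with an orientation barcode in between."""
    if len(fragments) == 1:
        return fragments[0]
    latter_half, former_half = fragments
    return former_half + latter_half
```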
The information on the type, position, orientation and the like of the figure detected by the figure detection unit 22 is reflected in the game processing performed by the processing unit 10.
The figure detecting function of the game apparatus 1 can also be combined with different kinds of board games or the like. For example, the game apparatus 1 is placed at a predetermined position on a game board in a board game, and different kinds of figures used as pieces of the board game are placed in the peripheral region of the housing 2 so that the game apparatus 1 can detect them.
Subsequently, the figure detection unit 22 performs processing of extracting a barcode from an object determined to be present in the operation acceptance region 51 (step S34). From the result of the processing, the figure detection unit 22 determines whether or not a barcode can be extracted from the object in the operation acceptance region 51 (step S35). If the barcode cannot be extracted (S35: NO), the figure detection unit 22 terminates the processing. If the barcode can be extracted (S35: YES), the figure detection unit 22 determines that the object is a specific figure, and determines the type, position, orientation and the like of the figure based on the extracted barcode.
Accordingly, the game apparatus 1 detects the figure placed in the operation acceptance region 51 using the linear image sensor 5, and can reflect the type, position, orientation and the like of the detected figure in the game processing.
Though the object arranged at the periphery of the game apparatus 1 is described as the figure in the present example, the object to be detected is not limited thereto.
As described above, the game apparatus 1 can accept an operation using a button and the like drawn on the additional interface sheet 52 through the additional interface function. The game apparatus 1 according to the present example embodiment can also accept an operation by a stereoscopic additional operation device using the linear image sensor 5.
The rotating dial 65 is an additional operation device which allows the user to perform a rotating operation in either the clockwise or counterclockwise direction. The game apparatus 1 detects a rotating direction, a rotating amount and the like of the rotating dial 65 to reflect them in the game processing. The rotating dial 65 has a substantially columnar shape. The rotating dial 65 is so configured that the relative position of its circumferential surface changes with respect to the game apparatus 1 in accordance with the rotating operation by the user. For example, the rotating dial 65 may be so configured that its upper and circumferential surfaces rotate with respect to its immobile lower surface. For example, the rotating dial 65 may be an integrally-molded component configured to rotate as a whole in response to the rotating operation performed by the user. On the circumferential surface of the rotating dial 65, a barcode is printed with an infrared reflection coating material. The barcode of the rotating dial 65 is barcoded identification information indicating the type, orientation and the like of the additional operation device, as in the barcode of the figure described above.
If any object is detected in the operation acceptance region 51 by the linear image sensor 5, the operation detection unit 21 of the game apparatus 1 performs processing of extracting a barcode from the object. If a barcode can be extracted from the object, the operation detection unit 21 converts the extracted barcode into digital identification information or the like. The operation detection unit 21 is able to determine that the object is the rotating dial 65 based on the converted identification information. The operation detection unit 21 can detect an angle or the like of the rotating dial 65 based on the information indicating the orientation included in the barcode obtained by the linear image sensor 5. The processing procedures are substantially the same as those of the figure detecting function described above, and will thus not be described in detail.
The game apparatus 1 periodically and repeatedly takes images by the linear image sensor 5. The operation detection unit 21 detects a change, displacement or the like of the additional operation device present in the operation acceptance region 51, based on multiple images obtained in time series from the linear image sensor 5. The operation detection unit 21 detects, for example, rotation of the rotating dial 65. If rotation of the rotating dial 65 is detected, the operation detection unit 21 further detects the amount of displacement, i.e. the amount of rotation, of the angle of the rotating dial 65. The processing unit 10 of the game apparatus 1 can perform the game processing or other information processing based on the amount of rotation of the rotating dial 65 detected by the operation detection unit 21.
The operation detection unit 21 compares the angle detected and stored based on the previously obtained image with the angle detected based on the image obtained this time (step S45). The operation detection unit 21 determines whether or not a change occurs between these angles (step S46). If a change occurs (S46: YES), the operation detection unit 21 calculates the amount of rotation of the rotating dial 65 from the difference between the angles (step S47), and terminates the processing. If no change occurs in the angles (S46: NO), the operation detection unit 21 terminates the processing without calculating the amount of rotation. The amount of rotation of the rotating dial 65 detected by the operation detection unit 21 is used in the game processing or the like performed by the processing unit 10.
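For illustration only, steps S45 to S47 may be sketched as follows over decoded dial angles in degrees; the wrap-around (shorter-arc) handling is an assumption introduced here about how successive frames are interpreted.

```python
# Illustrative sketch only: derive the amount of rotation from the
# previously stored dial angle and the angle decoded from the current
# image.  Angles are in degrees.

def rotation_amount(previous_angle, current_angle):
    if previous_angle == current_angle:
        return 0                              # S46: NO - no rotation detected
    delta = (current_angle - previous_angle) % 360
    # Signed result: the shorter arc between the two angles, so that
    # clockwise and counterclockwise rotations can be told apart.
    return delta - 360 if delta > 180 else delta
```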
As described above, the game apparatus 1 detects an additional operation device such as the rotating dial 65 placed in the operation acceptance region 51, using the linear image sensor 5. The game apparatus 1 detects the displacement of the additional operation device by the linear image sensor 5 to reflect the displacement in the game processing. Accordingly, the use of an additional operation device allows the game apparatus 1 to accept a complicated operation which is difficult to realize with the planar additional interface sheet 52. The additional operation device can realize such a complicated operation with a simple configuration, e.g., a barcode indicated on a portion to be displaced in accordance with the operation. It is not necessary for an additional operation device to be equipped with an electronic mechanism for detecting an operation, a function of communicating with the game apparatus 1, or the like. Thus, the additional operation device can be provided at low cost.
Though the description was made by taking the rotating dial 65 as an example for the additional operation device, the additional operation device is not limited thereto. For example, the additional operation device may also be a slide bar which linearly shifts its position in accordance with a sliding operation by the user. For example, the additional operation device may also have such a configuration that a rotating operation by a steering wheel is converted into a linear displacement operation by a rack and pinion mechanism while a barcode or the like is attached to a linearly-displaced portion. Various configurations other than the ones described above may also be employed for the additional operation device.
The game apparatus 1 according to the present example embodiment has a function of determining which one of a horizontal posture and a vertical posture the housing 2 has when the user holds the housing 2 during use. In the present example embodiment, this function of the game apparatus 1 is referred to as a use mode determining function.
The game apparatus 1 according to the present example embodiment changes the orientation of an image displayed on the display 3 in accordance with how the game apparatus 1 is used by the user, i.e. whether it is held horizontally or vertically. As illustrated, regardless of whether the game apparatus 1 is used horizontally or vertically, the menu screen on the display 3 is displayed with the side of the display 3 farther from the user being the top and the side closer to the user being the bottom, while items for selection are arranged in the vertical direction.
Whether the game apparatus 1 is used horizontally or vertically is determined by the use mode determination unit 23 of the processing unit 10. The use mode determination unit 23 determines how the game apparatus 1 is used based on the gravitational acceleration sensed by the acceleration sensor 15 and the image obtained by the linear image sensor 5. By determining the direction of the gravitational acceleration sensed by the acceleration sensor 15, the use mode determination unit 23 can determine in which direction the housing 2 of the game apparatus 1 is inclined. In place of or in addition to the gravitational acceleration sensed by the acceleration sensor 15, a configuration of determining the direction using the angular velocity sensed by the angular velocity sensor 16 may also be adopted. The determination using the acceleration sensor 15 or the angular velocity sensor 16 is an existing technique, and will thus not be described in detail.
The determination of the mode of use by the acceleration sensor 15 may be less accurate when, for example, the game apparatus 1 is used with the housing 2 maintained in a substantially horizontal orientation. The determination may also be degraded in accuracy when the acceleration sensor 15 senses an acceleration other than the gravitational acceleration, e.g., when the user is using the game apparatus 1 while moving. The game apparatus 1 according to the present example embodiment thus determines the position at which the user is holding the housing 2, based on the image obtained by the linear image sensor 5. The game apparatus 1 determines how the game apparatus 1 is used based on the held position. While the game apparatus 1 uses both the determination by the acceleration sensor 15 and the determination by the linear image sensor 5, the result of either one of the determinations may be prioritized. The game apparatus 1 may be so configured that the user can set which determination result is prioritized. In the present example embodiment, the game apparatus 1 prioritizes the result of determination by the linear image sensor 5. The game apparatus 1 makes a determination by the acceleration sensor 15 in the case where the mode of use cannot be determined based on the image obtained by the linear image sensor 5.
Furthermore, if approximately one fourth to one third of the entire linear image obtained by the linear image sensor 5 is covered, the use mode determination unit 23 determines that the corresponding side surface of the housing 2 is held by a hand of the user. If it is determined that two opposed side surfaces out of the four side surfaces of the housing 2 are held by hands of the user, the use mode determination unit 23 determines that the user holds the housing 2 with both the right and left hands while using the apparatus. The use mode determination unit 23 determines which end in the longitudinal direction of each of the two side surfaces the position held by the user is closer to. Accordingly, the use mode determination unit 23 determines that the user is holding the housing 2 with the end closer to the held positions located at the bottom.
When the user is using the game apparatus 1, the use mode determination unit 23 can thus determine the vertical orientation of the housing 2, and the processing unit 10 can determine the orientation of the image displayed on the display 3 based on the determination result. It is also possible for the user to hold the housing 2 in a manner other than that described here.
If two opposed side surfaces of the housing 2 are covered (S54: YES), the use mode determination unit 23 performs processing of determining the position at which the user is holding the housing 2, in accordance with which end of each of the side surfaces in the longitudinal direction the covered portion is closer to (step S55). The use mode determination unit 23 determines the mode of use of the game apparatus 1, i.e. whether the game apparatus 1 is used horizontally or vertically, based on the determined held position (step S57), and terminates the processing.
If it is determined that a side surface of the housing 2 is not covered based on the image obtained by the linear image sensor 5 (S52: NO), that a covered portion on a side surface of the housing 2 has a length not exceeding a predetermined value (S53: NO), or that two opposed side surfaces of the housing 2 are not covered (S54: NO), the use mode determination unit 23 determines the vertical orientation based on the gravitational acceleration sensed by the acceleration sensor 15 (step S56). Based on the result of determination by the acceleration sensor 15, the use mode determination unit 23 determines the mode of use for the game apparatus 1 (step S57), and terminates the processing.
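For illustration only, the determination at steps S52 to S57 may be sketched as follows; the thresholds, side names, end labels and the mapping from the held pair of sides to the mode of use are assumptions introduced here, and determine_by_acceleration stands in for the acceleration sensor fallback of step S56.

```python
# Illustrative sketch only: each side surface is summarized as
# (covered_fraction, closer_end), where closer_end names the
# longitudinal end the covered portion is nearest.

HOLD_MIN, HOLD_MAX = 1.0 / 4.0, 1.0 / 3.0   # "one fourth to one third" covered

def determine_use_mode(sides, determine_by_acceleration):
    """sides: e.g. {"left": (0.30, "near"), "right": (0.28, "near"), ...}"""
    held = {name: end for name, (fraction, end) in sides.items()
            if HOLD_MIN <= fraction <= HOLD_MAX}     # S52/S53: side held by a hand
    # S54: check whether two opposed side surfaces are held; the mapping of
    # each pair to "vertical"/"horizontal" is an assumed convention.
    for (a, b), mode in ((("left", "right"), "vertical"),
                         (("top", "bottom"), "horizontal")):
        if a in held and b in held:
            # S55/S57: the end nearer to both grips is treated as the bottom.
            return mode, held[a]
    return determine_by_acceleration()               # S56: gravity-based fallback
```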
As described above, the game apparatus 1 determines which side surface the user is holding based on the image obtained by the linear image sensor 5 located on each of the four side surfaces of the housing 2, to determine the mode of use for the game apparatus 1. In accordance with the determined mode of use, the game apparatus 1 performs processing of changing the orientation of the image displayed on the display 3. Thus, even in a situation where the acceleration sensor 15 cannot accurately determine the mode of use for the game apparatus 1, the game apparatus 1 can determine the mode of use based on the image obtained by the linear image sensor 5. By the determination additionally using the acceleration sensor 15, the game apparatus 1 can more accurately determine the mode of use.
In the present example, the game apparatus 1 determines its mode of use by determining four conditions, i.e., whether a side surface of the housing 2 is covered, the length of the covered portion, whether two opposed side surfaces of the housing 2 are covered, and which end of each side surface the covered portion is closer to. These conditions are, however, mere examples. Another condition may further be added to the four conditions. Only three or fewer of the four conditions may be used in the determination. Some of the four conditions may be combined with another condition in the determination. The menu screen displayed on the display 3 in the present example is also a mere example, and any image may be displayed on the display 3.
Though the present example described the processing of changing the orientation of the screen displayed on the display 3 in accordance with the mode of use determined based on which side surface the user is holding, the processing performed by the game apparatus 1 is not limited thereto. The game apparatus 1 may perform various kinds of processing other than the above in accordance with the result of determination of the mode of use. In addition to the processing of changing the orientation of display in accordance with the determination result, processing in consideration of the position of the side surface where the user is holding may also be performed. For example, the game apparatus 1 can perform processing such as displaying an icon, a game character or the like on the display 3 near the position held by the user.
The game apparatus 1 according to the present example embodiment includes a function of detecting a pulse of the user based on an image obtained by the linear image sensor 5 while the user is holding the housing 2. The game apparatus 1 emits infrared light to the outside of the housing 2 by the infrared light source 6, and the infrared light is reflected by a hand of the user who is holding the housing 2 and received by the linear image sensor 5. The intensity of the reflection light of the infrared light received by the linear image sensor 5 changes in accordance with the blood flow in the blood vessels of the user. The intensity of the reflection light appears as pixel values in the image obtained by the linear image sensor 5.
The pulse detection unit 24 of the processing unit 10 in the game apparatus 1 periodically or continuously obtains images by the linear image sensor 5. The pulse detection unit 24 obtains the pixel values from a plurality of images obtained for a predetermined period of time, to determine a change in the reflection intensity of the infrared light. The reflection intensity of the infrared light is repeatedly increased and decreased in accordance with a change in the blood flow, i.e. the pulse, of the user. Accordingly, the pulse detection unit 24 can detect the pulse of the user by calculating the cycle of changes from the pixel values obtained from multiple images.
The pulse detected by the pulse detection unit 24 can be utilized in, for example, an application for managing the user's health. The game apparatus 1 can reflect the detected pulse in the game processing by, for example, changing the facial expression of a character in a game in accordance with the pulse detected by the pulse detection unit 24.
The pulse detection unit 24 of the processing unit 10 in the game apparatus 1 obtains an image taken by the linear image sensor 5 (step S61). The pulse detection unit 24 obtains the intensity of the reflection light of the infrared light by a user's hand, based on the pixel value of the obtained image (step S62). The pulse detection unit 24 stores the obtained intensity in the storage unit 12 or the like (step S63). The pulse detection unit 24 determines whether or not the intensity for a predetermined time corresponding to, for example, several seconds to several tens of seconds is obtained (step S64). If the intensity for a predetermined time is not obtained (S64: NO), the pulse detection unit 24 returns the processing to step S61 to repeatedly obtain the intensity of the reflection light of the infrared light based on the image obtained by the linear image sensor 5.
If the intensity of the reflection light for a predetermined time is obtained (S64: YES), the pulse detection unit 24 reads out multiple intensities stored in the storage unit 12. The pulse detection unit 24 calculates the cycle of changes in the intensities in time series by, for example, detecting a peak value (step S65). The pulse detection unit 24 stores the calculated cycle in the storage unit 12 as a result of pulse detection (step S66), and terminates the processing.
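For illustration only, steps S61 to S66 may be sketched as follows over a stored series of reflection intensities; the local-maximum peak criterion and the sampling rate parameter are assumptions introduced here.

```python
# Illustrative sketch only: estimate the pulse from several seconds of
# intensity samples by detecting peaks and averaging the interval
# between them.

def estimate_pulse_bpm(intensities, samples_per_second):
    """intensities: one reflection intensity value per obtained image."""
    peaks = [i for i in range(1, len(intensities) - 1)
             if intensities[i - 1] < intensities[i] >= intensities[i + 1]]
    if len(peaks) < 2:
        return None                        # too little data to find a cycle
    intervals = [b - a for a, b in zip(peaks, peaks[1:])]
    cycle_s = (sum(intervals) / len(intervals)) / samples_per_second
    return 60.0 / cycle_s                  # cycles per minute = pulse rate
```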
As described above, the game apparatus 1 receives, by the linear image sensor 5, reflection light of the infrared light emitted from the infrared light source 6, the reflection being caused by a user's hand or the like, and detects the pulse of the user based on a change in the intensity of the reflection light. Since the user often holds the housing 2 of the game apparatus 1 when playing a game, the game apparatus 1 can easily detect the pulse based on the image obtained by the linear image sensor 5 located on each side surface of the housing 2. It is easy for the game apparatus 1 to reflect the detected pulse in game processing or the like. It is to be noted that the detected pulse of the user may be used for, not limited to the health management of the user or a change in the facial expression of a game character, but also for other various kinds of processing.
The game apparatus 1 according to the present example embodiment includes a function of detecting the position of a different game apparatus 1 having a similar configuration. The game apparatus 1 detects the position of a different game apparatus 1 based on an image obtained by receiving, with the linear image sensor 5, infrared light emitted from the infrared light source 6 of the different game apparatus 1. As the housing 2 of the game apparatus 1 is provided with the linear image sensors 5 at its four side surfaces respectively, the game apparatus 1 can detect the position of the different game apparatus 1 in accordance with which one of the linear image sensors 5 located on the respective side surfaces receives the infrared light emitted from the different game apparatus 1. The game apparatus 1 can detect the orientation of the different game apparatus 1 based on the intensity of the infrared light received by the linear image sensor 5. In order to detect the position as described above, the game apparatus 1 wirelessly communicates with the different game apparatus 1 and adjusts the timing for the infrared light sources 6 to emit light.
In the case where the two game apparatuses 1A and 1B perform the processing of detecting positions, first, wireless communication is performed between the game apparatuses 1A and 1B to decide the order and timing of emitting infrared light from the infrared light sources 6. If it is decided here that the game apparatus 1A first emits light from the infrared light source 6 and that the light is emitted at time t0, the game apparatus 1A makes the infrared light source 6 located on the side surface 2a emit light at the time t0 for a predetermined period of time. Then, the game apparatus 1A sequentially makes the infrared light sources 6 on the side surfaces 2b, 2c and 2d each emit light independently for a predetermined period of time. The game apparatus 1B, which does not emit light from its infrared light sources 6, receives and takes an image of the infrared light from the game apparatus 1A using all of its linear image sensors 5.
After the game apparatus 1A finishes emitting light from the infrared light sources 6, the game apparatus 1B emits light from its infrared light sources 6. As in the case of the game apparatus 1A, the game apparatus 1B makes the respective infrared light sources 6 emit light in the order of the side surfaces 2a, 2b, 2c and 2d starting from time t1, each for a predetermined period of time. When the game apparatus 1B makes the infrared light source 6 on the side surface 2a emit light, the game apparatus 1A receives the infrared light by the linear image sensor 5 on the side surface 2c. The game apparatus 1A determines the position, distance, orientation and the like of the game apparatus 1B based on the image obtained by the linear image sensor 5.
When both the game apparatuses 1A and 1B have finished emitting light from the infrared light sources 6, receiving light by the linear image sensors 5, and determining the position, distance, orientation and the like of the other apparatus, the processing of detecting the positions of the game apparatuses 1A and 1B is terminated. Each of the game apparatuses 1A and 1B may also transmit the result of its own determination to the other apparatus through wireless communication. Each of the game apparatuses 1A and 1B can compare the result of its own positional detection processing with the result of the other apparatus, to confirm whether there is an error in the processing results.
Though the case where the positional detection is performed by two game apparatuses 1 has been described in the example above, the processing may also be performed in a similar procedure in the case where three or more game apparatuses 1 perform positional detection. For example, three game apparatuses 1 wirelessly communicate with one another to decide the order and timing of making the infrared light sources 6 emit light, and each game apparatus 1 makes the infrared light source 6 emit light in accordance with the decided order and timing of light emission. The remaining two game apparatuses 1 not making the infrared light sources 6 emit light receive light by the linear image sensors 5, to determine the position, distance, orientation and the like of the game apparatus 1 making the infrared light source 6 emit light. Here, the results of detection by the linear image sensors 5 may be exchanged through wireless communication between the two game apparatuses 1 not making the infrared light sources 6 emit light. By sharing the detection results among multiple game apparatuses 1, the position, distance, orientation and the like of the game apparatus 1 making the infrared light source 6 emit light can more accurately be determined.
If it is not the turn of the apparatus itself to emit light from the infrared light source 6 (S72: NO), the different apparatus detection unit 25 takes and obtains an image by the linear image sensor 5 (step S74). Based on the obtained image, the different apparatus detection unit 25 determines which linear image sensor 5 receives the infrared light from the different game apparatus 1, to determine the position of the different game apparatus 1 (step S75). The different apparatus detection unit 25 determines the distance to the different game apparatus 1 based on the pixel values of the obtained image (step S76). The different apparatus detection unit 25 determines the inclination of the different game apparatus 1 based on the distribution of the pixel values in the obtained image (step S77). The different apparatus detection unit 25 transmits the determination results of steps S75 to S77 to the different game apparatus 1 through the wireless communication unit 14 (step S78), and proceeds to step S79.
After the processing of step S73 or S78, the different apparatus detection unit 25 determines whether or not the light emission from the infrared light source 6 of the apparatus itself and the reception of the infrared light from the different game apparatus 1 by the linear image sensor 5 are both completed (step S79). If the light emission and light reception are not both completed (S79: NO), the different apparatus detection unit 25 returns the processing to step S72. If both the light emission and light reception are completed (S79: YES), the different apparatus detection unit 25 terminates the processing.
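For illustration only, one pass of the loop at steps S72 to S79 may be sketched as follows; emit_side, capture_all_sides, analyze_images and send_results are hypothetical helpers standing in for the hardware and wireless communication calls, and the side labels follow the 2a to 2d naming in the text.

```python
# Illustrative sketch only: one turn of the time-multiplexed protocol.

SIDES = ("2a", "2b", "2c", "2d")

def detection_turn(my_turn_to_emit, emit_side, capture_all_sides,
                   analyze_images, send_results):
    if my_turn_to_emit:                       # S72: YES -> S73
        for side in SIDES:                    # light each side surface in turn
            emit_side(side)
    else:                                     # S72: NO -> S74
        images = capture_all_sides()          # one image per linear image sensor 5
        # S75: the sensor that saw the light gives the direction,
        # S76: the pixel values give the distance,
        # S77: their distribution gives the inclination.
        position, distance, inclination = analyze_images(images)
        send_results(position, distance, inclination)   # S78
```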
The game apparatus 1 utilizes the infrared light source 6 and linear image sensor 5 to cooperate with a different game apparatus 1 to detect the position of the different game apparatus 1 and also to allow the different game apparatus 1 to detect the position of the game apparatus 1 itself. Accordingly, it is possible to easily implement a function of, for example, displaying one image on multiple game apparatuses 1 by displaying different parts of a common image respectively on the displays 3 of the game apparatuses 1, which is a so-called multi-display function. For example, in the case where multiple users utilize game apparatuses 1 respectively and play a game for competing or cooperating through wireless communication or the like, the position of each game apparatus 1 can be reflected in the game. It is, for example, possible to perform processing of associating the position of a character operated by each game apparatus 1 with the actual position of the game apparatus 1. It is to be noted that the result of the detection for the position of the different game apparatus 1 by the game apparatus 1 may also be utilized in various kinds of information processing, not limited to the processing described above.
While the positional detection is performed in the state where multiple game apparatuses 1 are placed on a desk or the like in the present example embodiment, it is not limited thereto. It is also possible to detect positions of multiple game apparatuses 1 not placed on one same plane by appropriately setting, for example, the light emission range of the infrared light from the infrared light source 6 and the light reception range of the linear image sensor 5.
The game apparatus 1 according to the present example embodiment can implement the functions described above by providing the linear image sensors 5 on the side surfaces of the housing 2.
Having these functions, the game apparatus 1 according to the present example embodiment can attain an operability not provided by a conventional game apparatus. Though the game apparatus 1 is configured to include all of the functions described above in the present example embodiment, it is not limited thereto. The game apparatus 1 may also be configured to include only some of the functions described above. Though all four side surfaces of the housing 2 are provided with the linear image sensors 5 respectively, the present technology herein is not limited thereto. The number of linear image sensors 5 mounted may appropriately be increased or decreased in accordance with the functions to be implemented. For example, in the case where the housing of the game apparatus 1 is formed of a first housing and a second housing which can be folded together by a hinge mechanism or the like, as in a notebook personal computer, the linear image sensor 5 may be located at either one of or both of the first and second housings. In the case where the game apparatus is of a stationary type, the linear image sensor 5 may be located at a controller connected by wire or wirelessly, not at the main body of the game apparatus. For example, the linear image sensor 5 may be provided facing outward at a corner part of the housing 2 of the game apparatus 1. For example, one linear image sensor 5 may be provided across multiple side surfaces of the housing 2.
While the game apparatus 1 is configured to realize the functions described above using the linear image sensors 5, the present technology herein is not limited thereto. The game apparatus 1 may also be configured to have, for example, multiple light receiving elements such as photodiodes arranged linearly on a side surface of the housing 2 in place of the linear image sensor 5. For example, a camera or an image sensor capable of taking a wide-angle image may be provided on a side surface of the housing 2. Though the linear image sensor 5 has been described as receiving infrared light, it is not limited thereto. In the case of not using the pulse detecting function, the linear image sensor 5 may be configured to receive visible light or the like, not limited to infrared light.
While the game apparatus 1 has been described as an example in the present example embodiment, the present technology herein is not limited thereto. It is also possible to apply a similar technique to various information processing apparatuses such as, for example, a general-purpose computer, a tablet terminal device, a smartphone and a mobile phone. Though the operation detection unit 21 to the different apparatus detection unit 25 are provided as software function blocks implemented by the processing unit 10 of the game apparatus 1 executing the game program 91, the present technology herein is not limited to this configuration. A part of the functions of the operation detection unit 21 to the different apparatus detection unit 25 may be provided as, for example, a function of an OS (Operating System). A part of the functions of the operation detection unit 21 to the different apparatus detection unit 25 may be provided as a hardware function block.
It is to be understood that elements in singular form preceded by the article "a" or "an" do not exclude more than one such element when used in the present specification.
The present technique is configured to provide a linear image sensor, an image sensor, an optical sensor or the like on a side surface of a housing, which is used for sensing outside the housing, to perform information processing based on the result of sensing. This allows a region outside the housing of an information processing apparatus to be used for an operation, thereby realizing various kinds of operation acceptance processing.