1. Field of the Invention
The present invention relates to an electronic device.
2. Description of Related Art
Currently, there are mobile devices having a function of outputting a sound, a function of vibrating a housing, and a function of detecting the movement of the housing. In addition, techniques for generating a collision sound when virtual objects collide in a virtual space have been disclosed (for example, refer to Japanese Unexamined Patent Application, First Publication No. 2007-164291).
However, there is a problem in that, when a user moves the housing of such a mobile device by shaking it or the like, it has not been practically possible to make the user experience a sense of reality as if something, such as a real ball, is contained inside the housing, even if a collision sound or vibration is generated by moving a virtual object in a virtual space corresponding to the contour of the housing.
According to an aspect of the present invention, there is provided a technique for making a user experience a sense of reality as if something, such as a real ball, is contained inside the housing of a mobile device when the user moves the housing by, for example, shaking it.
An electronic device according to an aspect of the present invention is an electronic device including: a detection unit configured to detect movement of a housing; a vibration unit configured to vibrate the housing; a determination unit configured to calculate movement of a virtual container and movement of a virtual object, and detect a relative distance between an inner wall of the virtual container and the virtual object, the virtual container being moved according to the movement of the housing, the virtual object being moved in the virtual container according to the movement of the virtual container; and an output information generation unit configured to output vibration information used to vibrate the vibration unit to the vibration unit when the determination unit determines that the relative distance between the inner wall of the virtual container and the virtual object satisfies a predetermined distance relationship.
An electronic device according to another aspect of the present invention is an electronic device including: a detection unit configured to detect movement of a housing; a sound output unit configured to output a sound; a determination unit configured to calculate movement of a virtual container and movement of virtual objects, and detect a relative distance between the virtual objects, the virtual container being moved according to the movement of the housing, the virtual objects being moved in the virtual container according to the movement of the virtual container; and an output information generation unit configured to output sound information used to output the sound to the sound output unit when the determination unit determines that the relative distance between the virtual objects satisfies a predetermined distance relationship.
According to the aspects of the present invention, when a user moves the housing of the electronic device by shaking it or the like, it is possible for the user to experience a sense of reality as if something, such as a real ball, is contained inside the housing.
Hereinafter, embodiments of the present invention will be described with reference to the diagrams.
As shown in
The vibration unit 31 is a vibration motor, for example, and vibrates a housing of the electronic device 1 (hereinafter, simply referred to as a housing). Specifically, the vibration unit 31 vibrates the housing according to vibration information output from the output information generation unit 22.
The sound output unit 32 is a speaker, for example, and outputs a sound to the outside of the housing. Specifically, the sound output unit 32 outputs a sound to the outside of the housing according to sound information output from the output information generation unit 22.
The display unit 33 is an LCD monitor, for example, and displays various kinds of information. Specifically, the display unit 33 displays image information output from the output information generation unit 22, for example.
The detection unit 20 detects the movement of the housing. For example, the detection unit 20 is an acceleration sensor, and detects the acceleration applied to the housing and detects the movement of the housing on the basis of the detected acceleration.
The collision determination unit 21 is a physics engine that calculates the movement of a virtual container that moves according to the movement of the housing and the movement of a virtual object that moves around in the virtual container according to the movement of the virtual container on the basis of the movement of the housing detected by the detection unit 20. Specifically, the collision determination unit 21 calculates the movement of the virtual container and the virtual object on the basis of the movement information of the housing detected by the detection unit 20 and the information stored in the physical information storage unit 10.
In addition, the collision determination unit 21 determines the presence or absence of a collision of the virtual object to an inner wall of the virtual container and the presence or absence of a collision between the virtual objects by calculating the movement of the virtual container and the virtual object. In addition, details of the information stored in the physical information storage unit 10 and details of the calculation of the movement of the virtual container and the virtual object and the determination regarding the presence or absence of a collision of the virtual object to the inner wall of the virtual container and the presence or absence of a collision between the virtual objects by the collision determination unit 21 will be described later.
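For illustration only, the following Python sketch shows one way such a calculation and collision determination could be carried out for a two-dimensional box-shaped virtual container. The class names, the box model, the time step, and all numerical values are assumptions made for this sketch and are not taken from the embodiment or the drawings.

```python
# Illustrative sketch only: one step of a collision determination similar in
# spirit to what the collision determination unit 21 is described as doing.
# The 2-D box model, class names, and numeric values are assumptions.
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    pos: list            # position relative to the virtual container, [x, y]
    vel: list            # velocity, [vx, vy]
    radius: float = 0.05

@dataclass
class VirtualContainer:
    half_size: list = field(default_factory=lambda: [0.5, 0.5])  # box half-extents

def step(container, objects, housing_accel, dt=1.0 / 60):
    """Advance one frame; return detected wall collisions and object collisions."""
    wall_hits, pair_hits = [], []
    for i, obj in enumerate(objects):
        # The detected housing acceleration drives the object relative to the
        # container (the container itself simply follows the housing).
        obj.vel = [v - a * dt for v, a in zip(obj.vel, housing_accel)]
        obj.pos = [p + v * dt for p, v in zip(obj.pos, obj.vel)]
        # Collision of the virtual object to an inner wall of the container.
        for axis in range(2):
            limit = container.half_size[axis] - obj.radius
            if abs(obj.pos[axis]) >= limit:
                wall_hits.append((i, abs(obj.vel[axis])))  # (object index, collision speed)
                obj.pos[axis] = max(-limit, min(limit, obj.pos[axis]))
                obj.vel[axis] = -obj.vel[axis]             # simple reflection
    # Collision between the virtual objects.
    for i in range(len(objects)):
        for j in range(i + 1, len(objects)):
            dx = [a - b for a, b in zip(objects[i].pos, objects[j].pos)]
            dist = (dx[0] ** 2 + dx[1] ** 2) ** 0.5
            if dist <= objects[i].radius + objects[j].radius:
                rel = [a - b for a, b in zip(objects[i].vel, objects[j].vel)]
                pair_hits.append((i, j, (rel[0] ** 2 + rel[1] ** 2) ** 0.5))
    return wall_hits, pair_hits

# Example: two balls approaching each other inside a stationary housing.
balls = [VirtualObject([0.05, 0.0], [0.8, 0.0]), VirtualObject([0.14, 0.0], [-0.8, 0.0])]
print(step(VirtualContainer(), balls, housing_accel=(0.0, 0.0)))
```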
The output information generation unit 22 generates output information (vibration information output to the vibration unit 31, sound information output to the sound output unit 32, and image information output to the display unit 33). For example, the output information generation unit 22 generates vibration information output to the vibration unit 31 when the collision determination unit 21 determines that there has been a collision of the virtual object to the inner wall of the virtual container. In addition, the output information generation unit 22 generates sound information output to the sound output unit 32 when the collision determination unit 21 determines that there has been a collision between the virtual objects. In addition, while the collision determination unit 21 determines the presence or absence of a collision described above, the output information generation unit 22 generates image information output to the display unit 33 regardless of the presence or absence of a collision. For example, the output information generation unit 22 generates image information showing the movement of the virtual object that moves around in the virtual container.
Specifically, when outputting the vibration and sound corresponding to the collision speed (speed of collision of the virtual object to the inner wall of the virtual container, speed of collision between the virtual objects), the output information selection section 23 of the output information generation unit 22 generates vibration information and sound information on the basis of the movement information of the virtual container and the virtual object (speed of collision of the virtual object to the inner wall of the virtual container, speed of collision between the virtual objects) calculated by the collision determination unit 21, the determination information regarding the presence or absence of a collision (information regarding the determination of the collision of the virtual object to the inner wall of the virtual container, information regarding the determination of the collision between the virtual objects) determined by the collision determination unit 21, the information (setting information shown in
In addition, the output information selection section 23 generates image information on the basis of the movement information of the virtual object calculated by the collision determination unit 21 and the information stored in the physical information storage unit 10.
In addition, details of the information stored in the output information storage section 11 and details related to the selection of the output information by the output information selection section 23 will be described later.
Subsequently, the information stored in the physical information storage unit 10 will be described.
As described above, the user can freely select the setting value of each attribute of the virtual container because the selection information of the virtual container is stored in advance. For example, selectable values of each attribute of the virtual container may be displayed on the display unit 33 so that the user can select one of the selectable values for each attribute as a setting value through an operation receiving unit (not shown). In addition, for an attribute that is not selected, the initial value may be set as the setting value.
In addition, as shown in
In addition, the number of virtual objects is determined by the operation of the user. For example, one virtual object may be added whenever the user touches an add button (not shown) disposed at the display unit 33. Specifically, since the initial number of virtual objects is zero in the case of the example shown in
As described above, the user can freely select the setting value of each attribute of the virtual object because the selection information of the virtual object is stored in advance. For example, selectable values of each attribute of the virtual object may be displayed on the display unit 33 so that the user can select one of the selectable values for each attribute as a setting value through an operation receiving unit (not shown). In addition, for an attribute that is not selected, the initial value may be set as the setting value.
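For illustration only, the following Python sketch shows one possible representation of the selection information and of how setting values are determined from the user's selections and the initial values. The candidate values, and the attribute names other than (PA1), (PA2), (PB1), and (PB2), are assumptions made for this sketch.

```python
# Illustrative sketch only: selection information for the virtual container
# (attributes PA1, PA2) and the virtual object (attributes PB1, PB2), with the
# first candidate treated as the initial value. All values are assumptions.
SELECTION_INFO = {
    "virtual_container": {
        "PA1_shape_size": ["small box", "large box", "sphere"],
        "PA2_material":   ["plastic", "metal", "wood"],
    },
    "virtual_object": {
        "PB1_shape_size": ["small ball", "large ball"],
        "PB2_material":   ["rubber", "metal", "glass"],
    },
}

def build_setting_info(user_choices):
    """Use the user's selection where given; fall back to the initial value."""
    setting = {}
    for target, attrs in SELECTION_INFO.items():
        setting[target] = {}
        for attr, candidates in attrs.items():
            chosen = user_choices.get(target, {}).get(attr)
            setting[target][attr] = chosen if chosen in candidates else candidates[0]
    return setting

# Example: the user only picks the object's material; everything else defaults.
print(build_setting_info({"virtual_object": {"PB2_material": "metal"}}))
```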
In addition, as shown in
The virtual container setting information is a selection result from the selection information of the virtual container shown in
The virtual object setting information is a selection result from the selection information of the virtual object shown in
Then, the calculation of the movement of the virtual container and the virtual object and the determination regarding the presence or absence of a collision of the virtual object to the inner wall of the virtual container and the presence or absence of a collision between the virtual objects by the collision determination unit 21 will be described. The collision determination unit 21 reads and stores the setting information (virtual container setting information and virtual object setting information), which is stored in the physical information storage unit 10, in advance. In addition, the collision determination unit 21 stores the number of virtual objects determined by the operation of the user.
The collision determination unit 21 acquires the movement information of the housing detected by the detection unit 20. The collision determination unit 21 that has acquired the movement information of the housing detected by the detection unit 20 calculates the movement of the virtual container on the basis of the movement information of the housing (that is, generates the movement information of the virtual container). For example, the collision determination unit 21 may determine the movement of the housing to be the movement of the virtual container.
On the basis of the movement information of the virtual container, the stored setting information (the shape and size of the virtual container (setting value of the attribute (PA1)), the shape and size of the virtual object (setting value of the attribute (PB1)), the material of the virtual container (setting value of the attribute (PA2)), the material of the virtual object (setting value of the attribute (PB2)), and the like), and the stored information regarding the number of virtual objects, the collision determination unit 21 that has calculated the movement of the virtual container determines the collisions (the collision of the virtual object to the inner wall of the virtual container and the collision between the virtual objects) and calculates the movement of each virtual object, that is, generates the movement information of the virtual object, including the moving direction and moving speed of the virtual object after a collision. In addition, for example, the materials of the virtual container and the virtual object are used in the above-described calculation as the repulsive force and the friction coefficient.
In addition, it has been described that the collision determination unit 21 determines the presence or absence of a collision of the virtual object to the inner wall of the virtual container and the presence or absence of a collision between the virtual objects by calculating the movement of the virtual container and the virtual object. More precisely, as described above, the collision determination unit 21 makes these determinations in the process of calculating the movement of the virtual container and the virtual object.
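For illustration only, the following Python sketch shows one way the material-dependent quantities mentioned above could enter the calculation of the moving direction and moving speed after a collision. Treating the repulsive force as a coefficient of restitution and the friction as a tangential damping factor, as well as the material table, the formula, and all numerical values, are assumptions of this sketch.

```python
# Illustrative sketch only: the materials are modeled as a coefficient of
# restitution e (for the repulsive force) and a tangential damping factor mu
# (for the friction). The material table, values, and formula are assumptions.
MATERIAL_PARAMS = {                  # (container material, object material) -> (e, mu)
    ("plastic", "rubber"): (0.85, 0.10),
    ("metal",   "rubber"): (0.80, 0.08),
    ("metal",   "metal"):  (0.60, 0.20),
}

def bounce_off_wall(vel, wall_normal, container_material, object_material):
    """Return the velocity of the virtual object after it hits an inner wall.

    vel         : (vx, vy) velocity just before the collision
    wall_normal : unit normal of the wall, pointing into the container
    """
    e, mu = MATERIAL_PARAMS.get((container_material, object_material), (0.8, 0.1))
    vn = vel[0] * wall_normal[0] + vel[1] * wall_normal[1]     # normal component
    normal = (vn * wall_normal[0], vn * wall_normal[1])
    tangent = (vel[0] - normal[0], vel[1] - normal[1])
    # Reverse and scale the normal part by e; damp the tangential part by mu.
    return (-e * normal[0] + (1 - mu) * tangent[0],
            -e * normal[1] + (1 - mu) * tangent[1])

# A ball moving down and to the right hits the floor (normal pointing up).
print(bounce_off_wall((0.3, -1.0), (0.0, 1.0), "plastic", "rubber"))  # roughly (0.27, 0.85)
```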
Subsequently, the information stored in the output information storage section 11 will be described.
In the electronic device 1, in order to realize an output (vibration, sound) corresponding to setting information (for example, materials of the virtual container and the virtual object), the output information storage section 11 stores output information at the time of collision of the virtual object to the inner wall of the virtual container and output information at the time of collision between the virtual objects for each combination of the setting information. In addition, in the electronic device 1, vibration and sound are output according to the collision speed (speed of collision of the virtual object to the inner wall of the virtual container, speed of collision between the virtual objects). Therefore, the output information storage section 11 stores output information at the time of collision of the virtual object to the inner wall of the virtual container and output information at the time of collision between the virtual objects for each collision speed.
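For illustration only, the following Python sketch shows one possible organization of such output information, keyed by a combination of material settings and a collision-speed band. The keys, speed bands, waveform parameters, and file names are assumptions made for this sketch.

```python
# Illustrative sketch only: a possible layout for the output information
# storage section 11. The keys, speed bands, waveform parameters, and sound
# file names are assumptions.
WALL_COLLISION_VIBRATION = {
    # (container material, object material, speed band) -> vibration information
    ("plastic", "rubber", "slow"): {"waveform": "sine", "freq_hz": 10, "ms": 60},
    ("plastic", "rubber", "fast"): {"waveform": "sine", "freq_hz": 10, "ms": 120},
    ("metal",   "metal",  "slow"): {"waveform": "pulse", "freq_hz": 40, "ms": 30},
    ("metal",   "metal",  "fast"): {"waveform": "pulse", "freq_hz": 40, "ms": 80},
}

OBJECT_COLLISION_SOUND = {
    # (object material, speed band) -> sound information
    ("rubber", "slow"): "rubber_tap_quiet.wav",
    ("rubber", "fast"): "rubber_tap_loud.wav",
    ("metal",  "slow"): "metal_clink_quiet.wav",
    ("metal",  "fast"): "metal_clink_loud.wav",
}

def speed_band(speed, threshold=0.5):
    """Quantize a collision speed into the bands used as storage keys."""
    return "fast" if speed >= threshold else "slow"
```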
In addition, when the material of the virtual container is plastic or metal and the material of the virtual object is an elastic body made of rubber, the vibration information may be set so as to vibrate the housing by generating a sine wave of about 10 Hz, for example.
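As a hypothetical example of such vibration information, the following sketch generates drive samples of a 10 Hz sine wave; the sample rate, amplitude, and duration are assumptions.

```python
# Hypothetical example of vibration information generated as a sine wave of
# about 10 Hz; the sample rate, amplitude, and duration are assumptions.
import math

def sine_vibration(freq_hz=10.0, duration_s=0.2, sample_rate=1000, amplitude=1.0):
    """Return motor drive samples for a sine-wave vibration pattern."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

samples = sine_vibration()                  # 200 samples of a 10 Hz sine wave
print(len(samples), round(samples[25], 3))  # sample at the first peak
```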
That is, the output information at the time of collision of the virtual object to the inner wall of the virtual container shown in
That is, the output information at the time of collision between the virtual objects shown in
In addition, in the electronic device 1, when vibration and sound are not output according to the collision speed, the output information storage section 11 may store output information at the time of collision of the virtual object to the inner wall of the virtual container and output information at the time of collision between the virtual objects for each combination of setting information, as shown in
Subsequently, the selection of output information by the output information selection section 23 will be described. The output information selection section 23 acquires the information (setting information shown in
(When Vibration and Sound are Output According to Collision Speed)
The output information selection section 23 acquires the movement information of the virtual container and the virtual object (speed of collision of the virtual object to the inner wall of the virtual container, speed of collision between the virtual objects) and determination information regarding the presence or absence of a collision (collision determination information regarding the collision of the virtual object to the inner wall of the virtual container, collision determination information regarding the collision between the virtual objects) from the collision determination unit 21.
When the timing at which the virtual object has collided with the inner wall of the virtual container is acquired from the collision determination information regarding the collision of the virtual object to the inner wall of the virtual container, the output information selection section 23 acquires the collision speed of the collision from the movement information of the virtual container and the virtual object and selects, from the output information storage section 11, vibration information corresponding to the collision speed and the setting information stored in advance (the shape and size of the virtual container (setting value of the attribute (PA1)), the shape and size of the virtual object (setting value of the attribute (PB1)), the material of the virtual container (setting value of the attribute (PA2)), the material of the virtual object (setting value of the attribute (PB2)), and the like).
In addition, when the timing at which the virtual objects have collided with each other is acquired from the collision determination information regarding the collision between the virtual objects, the output information selection section 23 acquires the collision speed of the collision from the movement information of the virtual container and the virtual object and selects, from the output information storage section 11, sound information corresponding to the collision speed and the setting information stored in advance.
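For illustration only, the following Python sketch shows one way this speed-dependent selection could be expressed, assuming storage tables organized as in the earlier sketch for the output information storage section 11; the function name, parameter names, and keys are assumptions.

```python
# Illustrative sketch only: selecting output information according to the
# collision type, the stored setting information, and the collision speed.
# The storage tables are assumed to be organized as in the earlier sketch.
def select_output(collision_type, setting, collision_speed,
                  wall_table, object_table, band_of):
    """collision_type is 'wall' or 'objects'; setting holds the stored material choices."""
    band = band_of(collision_speed)
    if collision_type == "wall":
        key = (setting["container_material"], setting["object_material"], band)
        return ("vibration", wall_table[key])    # passed on to the vibration unit 31
    if collision_type == "objects":
        key = (setting["object_material"], band)
        return ("sound", object_table[key])      # passed on to the sound output unit 32
    return None
```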
(When Vibration and Sound are not Output According to Collision Speed)
In addition, the output information selection section 23 acquires determination information regarding the presence or absence of a collision (collision determination information regarding the collision of the virtual object to the inner wall of the virtual container, collision determination information regarding the collision between the virtual objects) from the collision determination unit 21.
When the timing at which the virtual object has collided with the inner wall of the virtual container is acquired from the collision determination information regarding the collision of the virtual object to the inner wall of the virtual container, the output information selection section 23 selects, from the output information storage section 11, vibration information corresponding to the setting information stored in advance (shape and size of the virtual container (setting value of the attribute (PA1)), shape and size of the virtual object (setting value of the attribute (PB1)), material of the virtual container (setting value of the attribute (PA2)), material of the virtual object (setting value of the attribute (PB2)), and the like).
In addition, when the timing at which the virtual objects have collided with each other is acquired from the collision determination information regarding the collision between the virtual objects, the output information selection section 23 selects, from the output information storage section 11, sound information corresponding to the setting information stored in advance.
In
The collision determination unit 21 determines the presence or absence of a collision of the virtual object to the inner wall of the virtual container (step S20). When the collision determination unit 21 determines that there has been no collision of the virtual object to the inner wall of the virtual container in step S20 (step S20: No), the process proceeds to step S30 skipping the following steps S22 and S24.
On the other hand, when the collision determination unit 21 determines that there has been a collision of the virtual object to the inner wall of the virtual container in step S20 (step S20: Yes), the collision determination unit 21 outputs, to the output information selection section 23, determination information indicating that there has been a collision of the virtual object to the inner wall of the virtual container and the speed of the collision of the virtual object to the inner wall of the virtual container. The output information selection section 23, which has acquired the determination information and the collision speed, selects, from the output information storage section 11, vibration information corresponding to the collision speed and the stored setting information (step S22). The output information selection section 23 that has selected the vibration information outputs the vibration information to the vibration unit 31, and the vibration unit 31 vibrates the housing according to the vibration information (step S24).
Subsequent to step S20 (No) or step S24, the collision determination unit 21 determines the presence or absence of a collision between the virtual objects (step S30). When the collision determination unit 21 determines that there has been no collision between the virtual objects in step S30 (step S30: No), the process proceeds to step S40 skipping the following steps S32 and S36.
On the other hand, when the collision determination unit 21 determines that there has been a collision between the virtual objects in step S30 (step S30: Yes), the collision determination unit 21 outputs, to the output information selection section 23, determination information indicating that there has been a collision between the virtual objects and the speed of the collision between the virtual objects. The output information selection section 23, which has acquired the determination information and the collision speed, selects, from the output information storage section 11, sound information corresponding to the collision speed and the stored setting information (step S32). The output information selection section 23 that has selected the sound information outputs the sound information to the sound output unit 32, and the sound output unit 32 outputs a sound (collision sound) according to the sound information (step S36).
Subsequent to step S30 (No) or step S36, it is determined whether or not the virtual collision mode has ended (step S40). When it is determined that the virtual collision mode has not ended (step S40: No), the process returns to step S10. On the other hand, when it is determined that the virtual collision mode has ended (step S40: Yes), the flowchart shown in
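For illustration only, the following Python sketch renders the flow of steps S10 to S40 as a loop. The helper objects and method names below are assumptions made for this sketch and are not taken from the embodiment.

```python
# Illustrative sketch only: the flow of steps S10 to S40 written as a loop.
# The helper objects and method names are assumptions.
import time

def virtual_collision_mode(device, frame_s=1.0 / 60):
    while not device.mode_ended():                           # step S40
        motion = device.detection_unit.read_motion()         # step S10: movement of the housing
        wall_hits, pair_hits = device.collision_unit.step(motion)
        if wall_hits:                                        # step S20: Yes
            info = device.selector.pick_vibration(wall_hits)     # step S22
            device.vibration_unit.vibrate(info)                  # step S24
        if pair_hits:                                        # step S30: Yes
            info = device.selector.pick_sound(pair_hits)         # step S32
            device.sound_unit.play(info)                         # step S36
        time.sleep(frame_s)
```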
As described above, the electronic device 1 outputs a collision sound when virtual objects collide with each other, while the electronic device 1 simply vibrates the housing without outputting a collision sound when the virtual object collides with the virtual container M. Therefore, when the user moves the housing by shaking or the like (refer to part (a) of
In addition, as shown in part (c) of
In addition, in the above embodiment, an aspect has been described in which the housing is made to vibrate without outputting a collision sound when a virtual object collides with a virtual container. However, it is also possible to output a collision sound at a timing delayed from the vibration timing while vibrating the housing. In this aspect as well, it is possible for the user to experience a sense of reality as if something, such as a real ball, is contained inside the housing, as in the aspect in which the housing is made to vibrate without outputting a collision sound.
Hereinafter, an aspect will be described in which a collision sound is output at a timing delayed from the vibration timing of the housing while vibrating the housing when a virtual object collides with a virtual container.
In the case of the above aspect, the electronic device 1 further includes the timing control information storage unit 19 shown by the dotted line in
In addition, in the case of this aspect, the output information storage section 11 stores sound information and vibration information as output information at the time of collision of the virtual object to the inner wall of the virtual container for each piece of setting information (and, when vibration and sound are output according to the collision speed, for each collision speed).
In addition, in the case of the above aspect, when the collision determination unit 21 determines that there has been a collision of the virtual object to the inner wall of the virtual container, the output information selection section 23 selects vibration information and sound information with reference to the output information storage section 11, and outputs the vibration information to the vibration unit 31 and outputs the sound information to the sound output unit 32 at a timing delayed from the output timing of the vibration information to the vibration unit 31 by the delay time based on the timing control information.
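For illustration only, the following Python sketch shows one way the delayed output could be implemented, assuming the delay time has already been read from the timing control information storage unit 19; the use of a timer thread and the 30 ms default value are assumptions.

```python
# Illustrative sketch only: vibrate immediately and output the collision sound
# after the delay time read from the timing control information storage unit 19.
# The timer thread and the 30 ms default are assumptions.
import threading

def output_wall_collision(vibration_unit, sound_unit, vibration_info, sound_info,
                          delay_s=0.03):
    vibration_unit.vibrate(vibration_info)    # vibrate the housing first
    # Output the collision sound at a timing delayed by the stored delay time.
    threading.Timer(delay_s, sound_unit.play, args=(sound_info,)).start()
```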
As described above, it is possible to realize the aspect in which a collision sound is output at a timing delayed from the vibration timing of the housing while vibrating the housing when a virtual object collides with a virtual container.
The flowchart shown in
When the collision determination unit 21 determines that there has been a collision of the virtual object to the inner wall of the virtual container in step S120 (step S120: Yes), the collision determination unit 21 outputs, to the output information selection section 23, determination information indicating that there has been a collision of the virtual object to the inner wall of the virtual container and the speed of the collision of the virtual object to the inner wall of the virtual container. The output information selection section 23 that has acquired such information selects, from the output information storage section 11, vibration information and sound information corresponding to the speed of the collision of the virtual object to the inner wall of the virtual container and the stored setting information (step S122).
Subsequent to step S122, the output information selection section 23 outputs the vibration information to the vibration unit 31, and the vibration unit 31 vibrates the housing according to the vibration information (step S124). In addition, the output information selection section 23 outputs the sound information to the sound output unit 32 at a timing delayed from the output timing of the vibration information to the vibration unit 31 by the delay time stored in the timing control information storage unit 19, and the sound output unit 32 outputs a sound (collision sound) according to the sound information (step S126). The following is the same as
In addition, the vibration unit 31 may be a vibration motor that vibrates each of a plurality of portions of the housing. In other words, the vibration unit 31 may be formed by a plurality of vibration motors that vibrate respective portions of the housing. When the vibration unit 31 is formed by the plurality of vibration motors described above, the output information selection section 23 may generate vibration information to vibrate a portion corresponding to the collision point when the collision determination unit 21 determines that there has been a collision of the virtual object to the inner wall of the virtual container. In this case, since the vibration unit 31 vibrates the portion corresponding to the collision point, the user can experience an even greater sense of reality.
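For illustration only, the following Python sketch shows one way the motor corresponding to the collision point could be chosen when the vibration unit 31 is formed by a plurality of vibration motors; the motor layout and coordinates are assumptions.

```python
# Illustrative sketch only: choosing the vibration motor nearest to the point
# at which the virtual object collided with the inner wall. The motor layout
# and coordinates are assumptions.
MOTOR_POSITIONS = {                      # motor id -> (x, y) position on the housing
    "top_left": (-0.3, 0.6), "top_right": (0.3, 0.6),
    "bottom_left": (-0.3, -0.6), "bottom_right": (0.3, -0.6),
}

def motor_for_collision(collision_point):
    """Return the id of the motor closest to the collision point."""
    def dist2(pos):
        return (pos[0] - collision_point[0]) ** 2 + (pos[1] - collision_point[1]) ** 2
    return min(MOTOR_POSITIONS, key=lambda m: dist2(MOTOR_POSITIONS[m]))

print(motor_for_collision((0.25, -0.5)))   # -> 'bottom_right'
```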
In addition, the above-described various kinds of processing related to each process of the electronic device 1 according to the embodiment of the present invention may be performed by recording a program for performing each process of the electronic device 1 according to the embodiment of the present invention on a computer-readable recording medium, reading the program recorded on the recording medium into a computer system, and executing the program. In addition, the “computer system” referred to herein may include an OS or hardware, such as peripheral devices. In addition, the “computer system” may also include a homepage providing environment (or display environment) if a WWW system is used. In addition, the “computer-readable recording medium” refers to writable nonvolatile memories such as a flexible disk, a magneto-optical disc, a ROM, and a flash memory, portable media such as a CD-ROM, and a storage unit such as a hard disk built into a computer system.
In addition, the “computer-readable recording medium” also includes a medium that stores a program for a predetermined period of time, such as a volatile memory (for example, a dynamic random access memory (DRAM)) in a computer system serving as a server or a client when a program is transmitted through a network, such as the Internet, or a communication line, such as a telephone line. In addition, the program may be transmitted from a computer system, which has a storage unit or the like that stores the program, to another computer system through a transmission medium or through a transmission wave in the transmission medium. Here, the “transmission medium” to transmit a program refers to a medium having a function of transmitting information, such as a network (communication network) including the Internet or a communication line including a telephone line. In addition, the above-described program may be provided to realize some of the functions described above. In addition, the program may be a so-called differential file (differential program) that can realize the above-described functions in combination with a program already recorded in a computer system.
While the embodiments of the invention have been described in detail with reference to the drawings, the specific configuration is not limited to the above-described embodiments, and designs and the like within a scope that does not depart from the subject matter of the invention are also included.
In an embodiment of the present invention, an electronic device includes: a vibration unit that vibrates a housing on the basis of vibration information; a sound output unit that outputs a sound on the basis of sound information; a detection unit that detects the movement of the housing; a collision determination unit that determines the presence or absence of a collision of a virtual object to an inner wall of a virtual container and the presence or absence of a collision between virtual objects by calculating the movement of the virtual container and the movement of the virtual object on the basis of the movement of the housing detected by the detection unit, the virtual container being moved according to the movement of the housing, the virtual object being moved around in the virtual container according to the movement of the virtual container; and an output information generation unit that generates the vibration information output to the vibration unit when the collision determination unit determines that there has been a collision of the virtual object to the inner wall of the virtual container and generates the sound information output to the sound output unit when the collision determination unit determines that there has been a collision between the virtual objects.
In the embodiment described above, it is possible to adopt a configuration in which the output information generation unit generates the sound information in addition to the vibration information when the collision determination unit determines that there has been a collision of the virtual object to the inner wall of the virtual container and outputs the sound information to the sound output unit at a timing delayed from the timing at which the vibration information is output to the vibration unit.
In addition, in the embodiment described above, it is possible to adopt a configuration in which the vibration unit is capable of vibrating each of a plurality of portions of the housing and the output information generation unit generates the vibration information in order to vibrate a portion corresponding to the collision point when the collision determination unit determines that there has been a collision of the virtual object to the inner wall of the virtual container.
In addition, in the embodiment described above, it is possible to adopt a configuration in which a display unit, which displays the movement of the virtual object that moves around in the virtual container, is further provided.
Number | Date | Country | Kind |
---|---|---|---|
2011-174144 | Aug 2011 | JP | national |
This is a Continuation Application of International Application No. PCT/JP2012/070194, filed Aug. 8, 2012, which claims priority to Japanese Patent Application No. 2011-174144, filed on Aug. 9, 2011. The contents of the aforementioned applications are incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2012/070194 | Aug 2012 | US
Child | 14173081 | | US