The present invention relates to a control system, a control method, and a program.
There are, for example, games such as racing games in which images of objects such as cars and obstacles are output and a user manipulates his or her own object by looking at the images thereof. The presence or absence of interaction such as collision between an object manipulated by the user and another object or an obstacle is virtually detected by a program, and a detection result thereof is reflected in an image or sound output.
PTL 1 discloses travel of a self-propelled device manipulated by the user on a mat.
[PTL 1]
The present inventor and others have created a game in which a mobile apparatus including a drive mechanism such as a motor is moved on the basis of user manipulation and another game in which a mobile apparatus moved by a program is provided in addition to an apparatus manipulated by a user for competition. In a case where a real apparatus is moved, it is necessary to take into consideration actual physical phenomena. Physical phenomena include, for example, replacement of the apparatus moved by the program and the apparatus manipulated by the user at different locations, tipping-over of these apparatuses, collision of these apparatuses with an obstacle or another object moved by a program, and other phenomena that occur due to external causes as well as physical movement of the mobile apparatuses. Because of difficulty involved in accurately controlling the mobile apparatuses by controlling the drive mechanism alone, it is not easy to detect a physical positional relation between the mobile apparatus moved by the program and the apparatus manipulated by the user. It has been difficult to properly control the games because of these physical phenomena.
The present invention has been devised in light of the foregoing, and it is an object of the present invention to provide a technology that allows physical phenomena to be addressed in a case where an actual object is moved by a user manipulation or by other means.
In order to solve the above problem, a control system according to the present invention includes a mobile apparatus being an apparatus that moves on a sheet where images indicating coordinates are arranged and having a camera for photographing part of the sheet, manipulation acquisition means adapted to acquire a manipulation of the user, travel control means adapted to perform control in such a manner that the mobile apparatus travels according to the manipulation of the user, position detection means adapted to detect a position of the mobile apparatus on the basis of an image photographed by the camera included in the mobile apparatus, determination means adapted to determine, on the basis of the position detection by the position detection means, whether or not the mobile apparatus has moved in a manner estimated on the basis of the manipulation of the user, and execution means adapted to perform a predetermined procedure in a case where it is determined that the mobile apparatus does not move in the estimated manner.
Also, a control method according to the present invention includes a step of acquiring a manipulation of the user, a step of performing control in such a manner that a mobile apparatus having a camera for photographing part of a sheet where images indicating coordinates are arranged travels on the sheet according to the manipulation of the user, a step of detecting a position of the mobile apparatus on the basis of an image photographed by the camera included in the mobile apparatus, a step of determining, on the basis of the position detection of the mobile apparatus, whether or not the mobile apparatus has moved in a manner estimated on the basis of the manipulation of the user, and a step of performing a predetermined procedure in a case where it is determined that the mobile apparatus does not move in the estimated manner.
Also, a program according to the present invention causes a computer to function as manipulation acquisition means adapted to acquire a manipulation of the user, travel control means adapted to perform control in such a manner that a mobile apparatus having a camera for photographing part of a sheet where images indicating coordinates are arranged travels on the sheet according to the manipulation of the user, position detection control means adapted to control detection of a position of the mobile apparatus, the detection being based on an image photographed by the camera included in the mobile apparatus, determination means adapted to determine, on the basis of the position detection by the position detection control means, whether or not the mobile apparatus has moved in a manner estimated on the basis of the manipulation of the user, and execution means adapted to perform a predetermined procedure in a case where it is determined that the mobile apparatus does not move in the estimated manner.
In an embodiment of the present invention, the determination means may determine, on the basis of the position detected by the position detection means, whether or not the mobile apparatus has moved in a manner estimated on the basis of the manipulation of the user.
In an embodiment of the present invention, the mobile apparatus may further include a sensor adapted to detect whether or not the mobile apparatus has collided with another object, the determination means may determine, on the basis of output of the sensor, whether or not the mobile apparatus has collided with the another object, and the execution means may perform a predetermined procedure in a case where it is determined that the mobile apparatus does not move in the estimated manner and that the mobile apparatus has collided with the another object.
In an embodiment of the present invention, the execution means may perform control in such a manner that the mobile apparatus is rotated and that an orientation of the mobile apparatus falls, after rotation of the mobile apparatus, within a predetermined directional range on the sheet, in a case where it is determined that the mobile apparatus does not move in the estimated manner and that the mobile apparatus has collided with the another object.
In an embodiment of the present invention, the control system may further include another mobile apparatus having a camera for photographing part of the sheet. The position detection means may detect a position of the another mobile apparatus on the basis of an image photographed by the camera included in the another mobile apparatus.
In an embodiment of the present invention, the determination means may determine, on the basis of the position of the mobile apparatus manipulated by the user and the position of the another mobile apparatus, whether or not the mobile apparatus and the another mobile apparatus are in proximity to each other, the execution means may perform a first procedure in a case where it is determined that the mobile apparatus does not move in the estimated manner, that the mobile apparatus has collided with the another object, and that the mobile apparatus and the another mobile apparatus are in proximity to each other, and the execution means may perform a second procedure different from the first procedure in a case where it is determined that the mobile apparatus does not move in the estimated manner, that the mobile apparatus has collided with the another object, and that the mobile apparatus and the another mobile apparatus are not in proximity to each other.
In an embodiment of the present invention, the determination means may determine, on the basis of detection of another position of the another mobile apparatus by the position detection means, whether or not the mobile apparatus has moved in the manner estimated on the basis of the manipulation of the user, and the execution means may move the another mobile apparatus on the basis of proximity between the position of the mobile apparatus manipulated by the user and the position of the another mobile apparatus, in a case where it is determined that the another mobile apparatus has moved in the estimated manner.
In an embodiment of the present invention, the determination means may determine whether or not the position of the mobile apparatus has been detected by the position detection means, the execution means may output a message to instruct the user to arrange the mobile apparatus on the sheet and may calculate a return range on the sheet on the basis of a last position of the mobile apparatus detected by the position detection means, in a case where the position of the mobile apparatus is not detected by the position detection means, and the execution means may output an error message in a case where the position of the mobile apparatus detected by the position detection means is not located within the return range after the instruction message has been output.
In an embodiment of the present invention, a plurality of return ranges may be printed on the sheet, and the execution means may select, on the basis of the last position of the mobile apparatus detected by the position detection means, a return range from among the plurality of return ranges and output an instruction message indicating the selected return range.
Also, another control system according to the present invention includes a first apparatus and a second apparatus each being an apparatus that travels on a sheet where images indicating coordinates are arranged and each having a camera for photographing part of the sheet, manipulation acquisition means adapted to acquire a manipulation of the user, first travel control means adapted to perform control in such a manner that the first apparatus travels according to the manipulation of the user, position detection means adapted to detect a position of the first apparatus on the basis of an image photographed by the camera included in the first apparatus and detect a position of the second apparatus on the basis of an image photographed by the camera included in the second apparatus, and second travel control means adapted to decide a destination of the second apparatus on the basis of the position of the first apparatus and the position of the second apparatus, the positions being detected by the position detection means, and control travel of the second apparatus on the basis of the decided destination.
In an embodiment of the present invention, the second apparatus may further include a sensor adapted to detect collision with another object, and the second travel control means may control the travel of the second apparatus further on the basis of a signal of the sensor.
According to the present invention, it is possible to address physical phenomena in a case where an actual object is moved by a user manipulation or by other means.
A description will be given below of an embodiment of the present invention on the basis of drawings. Of components that appear, those having the same function will be denoted by the same reference sign, and the description thereof will be omitted. In the embodiment of the present invention, a mobile device that travels according to a user manipulation travels on a sheet.
The processor 11 operates according to a program stored in the storage section 12 and controls the communication section 13, the input/output section 14, and the like. The processor 21 operates according to a program stored in the storage section 22 and controls the communication section 23, the camera 24, the motors 25, and the like. Although stored and provided in a computer-readable storage medium such as a flash memory in the cartridge 18, the above programs may be provided via a network such as the Internet.
The storage section 12 includes a dynamic random access memory (DRAM) and a non-volatile memory incorporated in the device control apparatus 10, a non-volatile memory in the cartridge 18, and the like. The storage section 22 includes a DRAM, a non-volatile memory, and the like. The storage sections 12 and 22 store the above programs. Also, the storage sections 12 and 22 store information and computation results input from the processors 11 and 21, the communication sections 13 and 23, and the like.
Each of the communication sections 13 and 23 includes integrated circuitry, an antenna, and the like for communicating with other equipment. The communication sections 13 and 23 have a function to communicate with each other, for example, according to Bluetooth (registered trademark) protocols. The communication sections 13 and 23 input, under control of the processors 11 and 21, information received from other apparatuses to the processors 11 and 21 and the storage sections 12 and 22 and send information to other apparatuses. It should be noted that the communication section 13 may have a function to communicate with other apparatuses via a network such as a local area network (LAN).
The input/output section 14 includes circuitry for acquiring information from input devices such as the controller 17 and circuitry for controlling output devices such as a sound output device and an image display device. The input/output section 14 acquires an input signal from the input device and inputs, to the processor 11 and the storage section 12, information obtained by converting the input signal. Also, the input/output section 14 causes a speaker to output a sound and the display device to output an image under control of the processor 11 or the like.
The motors 25 are what are called servomotors whose direction, amount of rotation, and rotational speed are controlled by the processor 21. A wheel 254 is assigned to each of the two motors 25, and the motors 25 drive the assigned wheels 254.
The camera 24 is arranged to photograph an area below the cart 20 and photographs a pattern printed on a sheet 31 (refer to
The acceleration sensor 26 measures an acceleration exerted on the cart 20. The acceleration sensor 26 outputs a measured acceleration value. It should be noted that the acceleration sensor 26 may be integral with a gyrosensor.
In the example illustrated in
A detailed description will be given of the pattern printed on the sheet 31 or the like. Unit patterns of a given size (e.g., 0.2 mm square) are arranged in a matrix shape on the sheet 31. Each of the unit patterns is an image obtained by coding the coordinates of the position where that pattern is arranged. Of a coordinate space that can be represented by the coded coordinates, a region corresponding to the size of the sheet 31 is assigned to the sheet 31.
In the control system according to the present embodiment, the unit pattern printed on the sheet 31 or the like is photographed by the camera 24 of the cart 20, and the cart 20 or the device control apparatus 10 acquires the coordinates by decoding the unit pattern. This allows the position of the cart 20 on the sheet 31 or the like to be recognized. Also, the cart 20 or the device control apparatus 10 also calculates an orientation of the cart 20 by detecting the orientation of the unit pattern in the image photographed by the camera 24.
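The conversion from a decoded unit pattern to a pose on the sheet can be sketched as follows. This is illustrative only: the actual coding scheme of the unit patterns is not disclosed here, and the function name, arguments, and the 5-units-per-mm scale (derived from the 0.2 mm unit-pattern size mentioned above) are assumptions.

```python
def pose_from_pattern(decoded_xy, pattern_angle_deg, units_per_mm=5):
    """Convert a decoded unit-pattern coordinate and the pattern's apparent
    rotation in the camera image into a pose (x mm, y mm, heading) on the
    sheet. Assumes 0.2 mm square unit patterns, i.e., 5 units per mm."""
    x_mm = decoded_xy[0] / units_per_mm
    y_mm = decoded_xy[1] / units_per_mm
    # The rotation of the unit pattern in the photographed image gives the
    # orientation of the cart on the sheet; normalize to [0, 360).
    heading_deg = pattern_angle_deg % 360.0
    return (x_mm, y_mm, heading_deg)
```

For example, a pattern decoded at unit coordinates (50, 100) and photographed rotated by −30 degrees would map to the sheet position (10 mm, 20 mm) with a heading of 330 degrees.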
This control system can recognize the position of the cart 20 on the sheet 31 or the like with high accuracy by using the patterns printed on the sheet 31 or the like without using any other device such as a stereo camera.
A description will be given below of an operation of this control system.
The manipulation acquisition section 51 acquires a user manipulation from the controller 17 via the input/output section 14. The acquired user manipulation is, for example, a tilt of the controller, whether or not a button has been pressed, and a jog dial position. The manipulation acquisition section 51 acquires these manipulations, for example, as a steering manipulation, an acceleration manipulation, and a braking manipulation of the cart.
The travel control section 52 performs control in such a manner that the manipulated cart 20c travels according to the user manipulation. The manipulated cart 20c is any one of the carts 20, and the travel control section 52 changes the orientation of travel of the manipulated cart 20c according to the user manipulation corresponding to the steering manipulation of the user and increases and decreases a speed of travel of the manipulated cart 20c according to the user manipulations corresponding to the acceleration manipulation and the braking manipulation.
The position detection section 53 recognizes, from the image photographed by the camera 24 of the cart 20, the pattern obtained by coding the coordinates. The position detection section 53 detects the coordinates (position) where the cart 20 is located and the orientation thereof from the coordinates indicated by the pattern. Also, the processor 11 included in the device control apparatus 10 performs control, by executing an application program for realizing some of the functions of the position detection section 53, in such a manner that the coordinates (position) and the orientation are detected on the basis of the photographed image, and in a case where the detection is successful, the processor 11 acquires the detected coordinates (position) and orientation and stores them in the storage section 12. It should be noted that the detection of the position and orientation on the basis of the image may be performed by the cart 20. Alternatively, the detection may be performed as a result of execution of firmware stored in the storage section 12 by the processor included in the device control apparatus 10.
The motion determination section 54 determines, on the basis of the position detection by the position detection section 53, whether or not the cart 20 has moved in a manner estimated from control performed by the travel control section 52. In the case of the manipulated cart 20c, this is equivalent to the motion determination section 54 determining whether or not the manipulated cart 20c has moved in the manner estimated on the basis of the user manipulation. More specifically, the motion determination section 54 determines, on the basis of the position detected by the position detection section 53, whether or not the cart 20 has moved in the manner estimated from control performed by the travel control section 52, and further, the motion determination section 54 determines whether or not the position of the cart 20 has been detected by the position detection section 53.
The motion processing section 55 performs predetermined procedures in a case where it is determined that the cart 20 does not move in the estimated manner.
A more detailed description will be given below of the processes performed by this control system.
First, the position detection section 53 detects the current coordinates (position) and orientation of the own cart on the basis of the image photographed by the camera 24 (step S101). Also, the position detection section 53 acquires the detected position and orientation in a case where the above detection is successful.
Then, the motion determination section 54 determines whether or not the position of the own cart has been detected on the basis of the image in the detection performed above (step S102). In a case where the position of the own cart cannot be detected on the basis of the image (N in step S102), the own cart has been removed by hand, has gone off the course, or has toppled over. Accordingly, the motion processing section 55 performs a return process for bringing the own cart back onto the sheet 31 (desirably into the travel-permitted region 35) (step S103).
Here, the return process will be described in detail.
When the return region is identified, the motion processing section 55 outputs a message sound including information indicating the identified return region (step S203). The information indicating the identified return region may be, for example, the area code 37 printed in the partial region identified as the return region. It should be noted that the message may not include the information indicating the return region.
Then, the motion processing section 55 waits until the position detection section 53 detects the coordinates from the image photographed by the camera 24 of the own cart (step S204). When the position detection section 53 detects the coordinates, the motion processing section 55 determines whether the detected coordinates are located within the identified return region (step S205). In a case where the detected coordinates are located within the identified return region (Y in step S205), the process is terminated assuming that the cart has been successfully brought back, after which the processes illustrated in
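The return process above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the rectangle representation of a return region, the nearest-center selection rule, and all names are assumptions made here for clarity.

```python
def select_return_region(last_pos, regions):
    """Pick the return region whose center is closest to the cart's last
    detected position. Each region is an axis-aligned rectangle
    (x_min, y_min, x_max, y_max) in sheet coordinates."""
    def center(r):
        x0, y0, x1, y1 = r
        return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

    def dist_sq(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    return min(regions, key=lambda r: dist_sq(last_pos, center(r)))


def within_region(pos, region):
    """Check whether detected coordinates lie inside the identified
    return region (the test performed in step S205)."""
    x0, y0, x1, y1 = region
    return x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1
```

With two regions (0, 0, 10, 10) and (20, 0, 30, 10) and a last detected position of (22, 5), the second region would be selected, and a cart replaced at (25, 5) would pass the check while one at (5, 5) would not.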
Due to output of the information indicating the return region as a message, the user can readily resume the race by arranging the cart 20 in a correct region.
A description will be given below of the processes in step S102 and subsequent steps illustrated in
In a case where the detected coordinates are located within the estimated coordinate range (Y in step S105), the movement of the own cart has not been disturbed by an external cause. Accordingly, the travel control section 52 performs a normal travel control process (step S106). The normal travel control process will be described later.
In a case where the detected coordinates are located outside the estimated coordinate range (N in step S105), the motion determination section 54 further performs the following processes to analyze external causes. First, the motion determination section 54 acquires output (acceleration vector) of the acceleration sensor 26 incorporated in the own cart (step S107). Then, the motion determination section 54 determines whether or not the output of the acceleration sensor 26 indicates the occurrence of collision of the own cart with another object, on the basis of whether or not a magnitude of the acceleration vector acquired from the acceleration sensor 26 is greater than a given threshold (step S108). It should be noted that whether the collision has occurred may be determined on the basis of the magnitudes of components of the acceleration vector in the directions other than the vertical direction.
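The threshold test of step S108 can be sketched as follows. The function name, the threshold value, and the axis convention (z vertical) are assumptions for illustration; the variant that ignores the vertical component corresponds to the note above.

```python
import math

def collision_detected(accel, threshold, ignore_vertical=False):
    """Return True if the acceleration vector (ax, ay, az) indicates a
    collision, i.e., its magnitude exceeds the given threshold.
    If ignore_vertical is True, only the horizontal components are used,
    as in the variant described in the text."""
    ax, ay, az = accel
    if ignore_vertical:
        magnitude = math.hypot(ax, ay)  # exclude the vertical (z) component
    else:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return magnitude > threshold
```

A lateral impact of (3, 4, 0) exceeds a threshold of 4.5, while steady gravity (0, 0, 9.8) does not trigger the horizontal-only variant.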
In a case where the output of the acceleration sensor 26 does not indicate the occurrence of collision with the other object (N in step S108), the travel control section 52 performs the normal travel control process (step S106). Meanwhile, in a case where the output of the acceleration sensor 26 indicates the occurrence of collision with the other object (Y in step S108), the motion determination section 54 further determines whether or not the collision occurred with another cart (step S109). Whether or not the collision occurred between the own cart and the other cart 20 may be determined only on the basis of whether or not the own cart and the other cart 20 are in proximity (whether or not the distance therebetween is smaller than a distance threshold) or further on the basis of whether a movement vector of the other cart 20 is oriented in the direction of approaching the own cart.
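The cart-to-cart collision determination of step S109 can be sketched as follows, combining the proximity check with the optional approach-direction check. Names and the use of a dot product for "oriented in the direction of approaching" are assumptions.

```python
def collided_with_other_cart(own_pos, other_pos, other_move_vec,
                             dist_threshold):
    """Determine whether a detected collision involved the other cart:
    the carts must be in proximity, and the other cart's movement vector
    must point toward the own cart (positive dot product)."""
    dx = own_pos[0] - other_pos[0]
    dy = own_pos[1] - other_pos[1]
    # Proximity check: distance between carts below the threshold.
    if (dx * dx + dy * dy) ** 0.5 >= dist_threshold:
        return False
    # Approach check: movement vector of the other cart oriented toward us.
    return other_move_vec[0] * dx + other_move_vec[1] * dy > 0
```

A cart at (1, 0) moving with vector (−1, 0) toward an own cart at the origin satisfies both conditions; the same cart moving away does not.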
In a case where it is determined that the collision has occurred with the other cart 20 (Y in step S109), the motion processing section 55 performs a first collision process (step S110), and in a case where it is determined that the collision has not occurred with the other cart 20 (N in step S109), the motion processing section 55 performs a second collision process (step S111). The first collision process and the second collision process will be described in detail later.
It should be noted that the motion determination section 54 may determine whether the own cart has moved in the estimated manner in a way different from that in the processes in steps S104 and S105. For example, the motion determination section 54 may calculate an estimated movement vector on the basis of most recent control over the movement of the own cart performed by the travel control section 52, calculate a real movement vector from the current coordinates and the coordinates acquired during the previous process, and further determine whether or not a difference between the estimated movement vector and the real movement vector falls within a permissible range. Also, the motion determination section 54 may estimate the coordinates where the own cart is located in the case of the absence of abnormality, on the basis of the coordinates acquired during the last process and most recent control over the movement of the own cart performed by the travel control section 52, and the motion determination section 54 may determine whether or not the difference between the estimated coordinates and the detected current coordinates falls within the permissible range.
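The movement-vector variant described above can be sketched as follows. This is an illustrative reading, not the disclosed implementation; the function name and Euclidean tolerance are assumptions.

```python
def moved_as_estimated(prev_pos, cur_pos, estimated_vec, tolerance):
    """Compare the real movement vector (from the previous detected
    coordinates to the current ones) against the movement vector estimated
    from the most recent travel control, and check whether the difference
    falls within the permissible range."""
    real_vec = (cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1])
    dx = real_vec[0] - estimated_vec[0]
    dy = real_vec[1] - estimated_vec[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance
```

For a cart expected to move by (10, 0) that actually moved from (0, 0) to (9.5, 0.5), the difference is about 0.71, so a tolerance of 1 passes and a tolerance of 0.5 fails, triggering the external-cause analysis.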
A description will be given next of the normal travel control process. The normal travel control process is different between the manipulated cart 20c that travels by a user manipulation and the controlled cart 20d controlled by the program.
When the marker 42 is selected, the travel control section 52 determines whether or not the distance between the own cart and the other cart 20 (e.g., manipulated cart 20c) is equal to or smaller than a control threshold (step S353). In a case where the distance is greater than the control threshold (N in step S353), the selected marker is set as the target point (step S354).
Meanwhile, in a case where the distance is equal to or smaller than the control threshold (Y in step S353), the travel control section 52 determines whether or not the other cart 20 is located posteriorly in the course (step S356). Whether or not the other cart 20 is located posteriorly in the course may be determined, for example, by determining whether or not an absolute value of the angle formed between a vector extending from the marker 42 closest to the own cart to the marker ahead thereof and a vector extending from the own cart to the other cart 20 is larger than a given value (e.g., a constant larger than 90 degrees but smaller than 180 degrees).
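The angle test of step S356 can be sketched as follows. The 120-degree default and the function names are assumptions; the text only requires a constant between 90 and 180 degrees.

```python
import math

def other_cart_is_behind(nearest_marker, marker_ahead, own_pos, other_pos,
                         angle_threshold_deg=120.0):
    """Return True if the other cart is located posteriorly in the course:
    the angle between the course direction (nearest marker -> marker ahead)
    and the vector from the own cart to the other cart exceeds the
    threshold."""
    course = (marker_ahead[0] - nearest_marker[0],
              marker_ahead[1] - nearest_marker[1])
    to_other = (other_pos[0] - own_pos[0], other_pos[1] - own_pos[1])
    dot = course[0] * to_other[0] + course[1] * to_other[1]
    norm = math.hypot(*course) * math.hypot(*to_other)
    # Clamp to avoid domain errors from floating-point rounding.
    cos_angle = max(-1.0, min(1.0, dot / norm))
    return abs(math.degrees(math.acos(cos_angle))) > angle_threshold_deg
```

With the course pointing along (1, 0), another cart directly behind the own cart (angle 180 degrees) is judged posterior, while one directly ahead (angle 0) is not.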
In a case where the other cart 20 is located posteriorly in the course (Y in step S356), the travel control section 52 decides a target point 44 in such a manner as to obstruct the travel of the other cart 20 (step S357).
Also, in a case where the other cart 20 is not located posteriorly in the course (N in step S356), the travel control section 52 decides the target point 44 in such a manner that the own cart avoids the other cart 20 (step S359).
It should be noted that, in step S357, the travel control section 52 may also decide the target point 44 in such a manner that the own cart avoids the other cart 20. The operations in steps S357 and S359 may be changed as features of the controlled cart 20d by a user instruction.
When the target point 44 is set or decided, the travel control section 52 controls the motors of the own cart in such a manner that the own cart heads toward the target point 44 (step S360).
As described above, even in a case where the real cart 20 is caused to travel instead of a virtual cart output as an image being controlled, acquiring the coordinates detected through photographing of the sheet 31 for the own cart (controlled cart 20d) and the other cart 20 (manipulated cart 20c) and controlling the movement of the controlled cart 20d on the basis of those coordinates make it possible to readily detect the positional relation between the plurality of carts 20 and perform complex control according to that positional relation.
A description will be given next of the first collision process.
Then, the motion processing section 55 estimates the orientation of the own cart after the first spinning motion (step S402). The motion processing section 55 may store a variation in the orientation caused by the spinning motion in the storage section 12 in advance and estimate the orientation of the own cart by adding the variation to the current orientation.
Then, in a case where the estimated orientation falls within the directional range Dr (Y in step S403), the motion processing section 55 performs the first spinning motion (step S404). It should be noted that, in this case, the cart 20 is highly likely not to face the user as a result of the first spinning motion.
Meanwhile, in a case where the estimated orientation falls outside the directional range Dr (N in step S403), the motion processing section 55 performs the second spinning motion that brings the orientation within the directional range Dr after the motion (step S405). Here, the first spinning motion and the second spinning motion differ in amount of rotation. The difference in amount of rotation between the first spinning motion and the second spinning motion is (360 degrees minus the directional range Dr) or more.
Although the orientation after the spinning motion is estimated in steps S402 and S403, this determination may be made in a different way. For example, this determination may be made by storing in advance, in the storage section 12, the determination directional range obtained by adding the variation caused by the spinning motion to the directional range Dr and determining whether or not the current orientation falls within the determination directional range.
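The orientation check of steps S402 and S403 can be sketched as follows. Representing the directional range Dr as a [min, max] interval of headings (possibly wrapping around 0 degrees) and the function name are assumptions made for illustration.

```python
def needs_second_spin(current_deg, spin_delta_deg, dr_min, dr_max):
    """Estimate the orientation after the first spinning motion by adding
    its known rotation amount to the current orientation, and return True
    if that estimate falls outside the directional range Dr (in which case
    the second spinning motion should be performed instead)."""
    estimated = (current_deg + spin_delta_deg) % 360.0
    if dr_min <= dr_max:
        inside = dr_min <= estimated <= dr_max
    else:
        # The directional range wraps around 0 degrees (e.g., 315..45).
        inside = estimated >= dr_min or estimated <= dr_max
    return not inside
```

For a directional range of 315 to 45 degrees, a cart at 10 degrees whose first spin rotates it by 350 degrees ends at 0 degrees, inside the range, so the first spin suffices; a cart at 100 degrees spun by 100 degrees ends at 200 degrees, outside the range, so the second spin is needed.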
It should be noted that the motion processing section 55 may perform control in such a manner that a third spinning motion and a fourth spinning motion are performed instead of the first spinning motion and the second spinning motion further in a case where the relation between the orientation of the collision and the direction of travel satisfies a given condition.
When the first spinning motion or the second spinning motion is performed, the motion processing section 55 determines whether the post-motion position falls within the travel-permitted region 35 (step S406). In a case where the position falls outside the travel-permitted region (N in step S406), the motion processing section 55 moves the own cart to a location within the travel-permitted region 35 (step S407).
The second collision process differs from the first collision process in spinning motion and output sound. There is only a slight difference in the process itself. Accordingly, the description of a processing procedure will be omitted.
As has been described up to this point, it becomes possible to determine whether some kind of event has occurred on the cart 20 due to an external physical cause, on the basis of the detection of the coordinates by the camera 24 of the cart 20 and on the basis of the movement of the cart estimated from control over the motors of the cart and the like performed up to this point, and take an action commensurate with the event. Further, it is possible to take a more elaborate action by detecting the collision by the acceleration sensor and more properly control the game in which the physical cart is caused to travel.
It should be noted that the sheet 31 may be at least partially divided into a lattice as in a maze.
Number | Date | Country | Kind |
---|---|---|---|
2019-107857 | Jun 2019 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2020/022167 | 6/4/2020 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2020/250809 | 12/17/2020 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
20150196839 | Ehrman | Jul 2015 | A1 |
Number | Date | Country |
---|---|---|
H1071276 | Mar 1998 | JP |
H11244515 | Sep 1999 | JP |
2017161770 | Sep 2017 | JP |
2018025467 | Feb 2018 | WO |
Entry |
---|
English machine translation of WIPO publication WO/2018/025467 by Nakayama, et al. |
International Search Report and Written Opinion dated Aug. 11, 2020, from PCT/JP2020/022167, 11 sheets. |
toio, Internet Archive Wayback Machine, Internet Archive, Jun. 1, 2017, URL: https://web.archive.org/web/20170601051305/https://www.sony.co.jp/SonyInfo/News/Press/201706/17-058/, [retrieved Jul. 31, 2020], pp. 1-5, (Sony Corp.), non-official translation ("Toy platform 'toio', which will let children's creativity expand the fun of playing with toys, to be released Dec. 2017; pre-order starts today."), 5 sheets. |
toio, Business Insider Japan, Jun. 2, 2017, URL: https://www.businessinsider.jp/post-34081, [retrieved Jul. 31, 2020], pp. 1-9, non-official translation (ITO, Yu, "Same day reservation is sold out! Sony's innovative technology in the new toy 'toio'"), 9 sheets. |
Number | Date | Country | |
---|---|---|---|
20220241680 A1 | Aug 2022 | US |