Priority is claimed on Japanese Patent Application No. 2022-009875, filed Jan. 26, 2022, the content of which is incorporated herein by reference.
The present invention relates to a mobile object management device, a mobile object management method, and a storage medium.
In the related art, when a parade of mobile objects is held at an amusement park or the like, technology is known in which an instruction controller remotely controls lights provided on the mobile objects and light emitters owned by spectators watching the parade (for example, Japanese Unexamined Patent Application, First Publication No. 2004-39415).
However, the related art does not take into account a mechanism that allows a user not only to watch an event such as a parade but also to participate in the event. Thus, there is a possibility that a performance effect of the event is not sufficiently achieved.
An aspect of the present invention has been made in consideration of such circumstances and an objective thereof is to provide a mobile object management device, a mobile object management method, and a storage medium capable of further improving a performance effect of an event.
A mobile object management device, a mobile object management method, and a storage medium according to the present invention adopt the following configurations.
(1): According to an aspect of the present invention, there is provided a mobile object management device for managing a ridable mobile object that a user is allowed to get on and which moves inside of a prescribed area, the mobile object management device including: an acquirer configured to acquire location information of the ridable mobile object; a manager configured to manage the ridable mobile object and a terminal device of the user on the ridable mobile object in association with each other; and an event operation instructor configured to cause the ridable mobile object to execute a prescribed operation corresponding to an event via the terminal device of the user on the basis of the location information and information about the event that is executed inside of the prescribed area, wherein the manager manages whether or not to permit the user to participate in the event on the basis of a state in which the user uses the ridable mobile object.
(2): In the above-described aspect (1), the manager decides on an event in which the user is able to participate on the basis of information including at least one of the number of uses and usage time of the ridable mobile object of the user.
(3): In the above-described aspect (1), the manager sets a possible participation level of the user for the event on the basis of a state in which the user uses the ridable mobile object and manages an operation capable of being executed by the ridable mobile object of the user in accordance with the set possible participation level.
(4): In the above-described aspect (3), the manager manages a performance operation of the event in which the user is able to participate on the basis of the possible participation level.
(5): In the above-described aspect (3), the manager acquires an event in which the user is able to participate on the basis of the possible participation level and notifies the user of information for asking the user about whether or not to participate in the acquired event.
(6): In the above-described aspect (1), the manager restricts participation in an event in which the user was able to participate in the past on the basis of at least one of elapsed time after the user previously participated in the event and elapsed time after the user previously rode the ridable mobile object.
(7): In the above-described aspect (1), the manager transmits, to a terminal device of the user, information for giving a lecture to the user about an operation of the ridable mobile object before the user participates in the event.
(8): In the above-described aspect (1), the manager provides the ridable mobile object to the user or gives incentives to a service provider that plans the event on the basis of the state in which the user uses the ridable mobile object.
(9): In the above-described aspect (1), the event operation instructor adjusts content of the prescribed operation on the basis of information about the user on the ridable mobile object or a surrounding environment of the ridable mobile object.
(10): In the above-described aspect (1), the event operation instructor adjusts content of the prescribed operation on the basis of setting content from the user on the ridable mobile object.
(11): According to an aspect of the present invention, there is provided a mobile object management method including: acquiring, by a computer of a mobile object management device for managing a ridable mobile object that a user is allowed to get on and which moves inside of a prescribed area, location information of the ridable mobile object; managing, by the computer, the ridable mobile object and a terminal device of the user on the ridable mobile object in association with each other; causing, by the computer, the ridable mobile object to execute a prescribed operation corresponding to an event via the terminal device of the user on the basis of the location information and information about the event that is executed inside of the prescribed area; and managing, by the computer, whether or not to permit the user to participate in the event on the basis of a state in which the user uses the ridable mobile object.
(12): According to an aspect of the present invention, there is provided a computer-readable non-transitory storage medium storing a program for causing a computer of a mobile object management device for managing a ridable mobile object that a user is allowed to get on and which moves inside of a prescribed area to: acquire location information of the ridable mobile object; manage the ridable mobile object and a terminal device of the user on the ridable mobile object in association with each other; cause the ridable mobile object to execute a prescribed operation corresponding to an event via the terminal device of the user on the basis of the location information and information about the event that is executed inside of the prescribed area; and manage whether or not to permit the user to participate in the event on the basis of a state in which the user uses the ridable mobile object.
According to the above-described aspects (1) to (12), it is possible to further improve a performance effect of an event.
Hereinafter, embodiments of a mobile object management device, a mobile object management method, and a storage medium of the present invention will be described with reference to the drawings. In the following description, a mobile object management system including a ridable mobile object that a user is allowed to get on and which moves within a prescribed area and a mobile object management server that manages the ridable mobile object will be described. The prescribed area is, for example, an area of a facility having a prescribed size such as a theme park, a leisure park, an amusement park, a zoo, an aquarium, or a shopping mall. The prescribed area may be an area within a range designated by location information such as latitude and longitude.
The mobile object management server 100 manages the user U using the ridable mobile object 300 and controls an operation of the ridable mobile object 300. The mobile object management server 100 manages the ridable mobile object 300 and the terminal device 200 of the user U in association with each other. The terminal device 200 is, for example, a portable terminal that the user U can carry when getting on the ridable mobile object 300, and is specifically a smartphone or a tablet terminal. The terminal device 200 may be a wearable terminal worn by the user U. The terminal device 200 is a terminal device owned by the user U.

The ridable mobile object 300 is a mobile object that moves within the prescribed area with the user U on board. The ridable mobile object 300 is, for example, a device provided (rented) by the service provider side of the mobile object management system 1 so that the user U can move within the prescribed area. For example, the ridable mobile object 300 is a vehicle, a micromobility vehicle, a robot, or the like that can move while the user U is sitting on a seat of the ridable mobile object 300 or standing on its steps. The ridable mobile object 300 moves within the prescribed area or executes a prescribed operation, in a state in which the user U is on board, on the basis of an operation instruction based on a manipulation by the user U or an operation instruction from the mobile object management server 100.

The prescribed operation includes, for example, an operation (e.g., movement, rotation, or the like) in accordance with music output in association with the execution of an event executed in the prescribed area or in accordance with an operation of a physical object associated with the event. The prescribed operation may include an operation of outputting a sound from an audio output provided in the ridable mobile object 300 and an operation of causing a light emitter provided in the ridable mobile object 300 to emit light. The ridable mobile object 300 may guide the user U so that a prescribed operation is executed according to a manipulation of the user U, may output information (for example, a voice) for prompting the user U to perform a manipulation, or may perform the prescribed operation regardless of a manipulation of the user U.

Events include, for example, a parade that marches along a prescribed route within the prescribed area at a prescribed time and a show (for example, an event such as a play or a concert) that is held at a specific place within the prescribed area at a prescribed time. The events may also include, for example, an event (a group event) generated by the gathering of a prescribed number of ridable mobile objects 300 within a specific range inside the prescribed area. Physical objects related to the event include, for example, people participating in the event (mascot characters, musical instrument performers, dancers, and various types of cast members such as puppets), mobile objects (parade cars and drones), and the like.

For example, the user U can use the ridable mobile object 300 within the prescribed area by performing a registration process or the like on the mobile object management server 100 via the terminal device 200. Hereinafter, details of the mobile object management server 100, the terminal device 200, and the ridable mobile object 300 will be described. Hereinafter, the prescribed area will be described as a theme park.
The mobile object management server 100 shown in
The storage 180 may be implemented by the various types of storage devices described above, a solid-state drive (SSD), an electrically erasable programmable read-only memory (EEPROM), a read-only memory (ROM), a random-access memory (RAM), or the like. For example, user information 181, event information 182, usage history information 183, operation information 184, programs, and various types of other information are stored in the storage 180. The storage 180 may store map information of the theme park. Details of the user information 181, the event information 182, the usage history information 183, and the operation information 184 will be described below.
The communicator 110 communicates with the terminal device 200 and other external devices via the network NW.
The registrant 120 registers information about the user U using the mobile object management system 1. Specifically, the registrant 120 receives information about the user U from the terminal device 200 and stores the received information in the user information 181 of the storage 180.
For example, when a user registration request has been received from the terminal device 200, the registrant 120 generates an image for inputting various types of information included in the user information 181, causes the terminal device 200 that has received the request to display the generated image, acquires user information input from the terminal device 200, and registers the acquired user information in the user information 181.
The registrant 120 may authenticate the user U who uses the service of the mobile object management system 1 on the basis of the registered user information 181. In this case, the registrant 120 authenticates the user U, for example, at a timing when a service usage request has been received from the terminal device 200. For example, when the usage request has been received, the registrant 120 generates an authentication image for inputting authentication information such as a user ID and a password, causes the requesting terminal device 200 to display the generated image, and determines whether or not to permit use of the service according to whether authentication information matching the authentication information input using the displayed image is stored in the user information 181. For example, the registrant 120 permits the use of the service when the user information 181 includes authentication information matching the input authentication information, and rejects the use of the service or performs a process of new registration when matching authentication information is not included.
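By way of illustration only, the credential check described above can be sketched as follows in Python; the field names (user_id, password_hash) and the hashing scheme are assumptions introduced for the sketch and are not prescribed by this description.

```python
import hashlib

# Hypothetical in-memory stand-in for the user information 181.
USER_INFO_181 = {
    "U001": {"password_hash": hashlib.sha256(b"secret").hexdigest()},
}

def authenticate(user_id: str, password: str) -> bool:
    """Return True if the input credentials match the registered user information."""
    record = USER_INFO_181.get(user_id)
    if record is None:
        # No matching registration: reject (or branch into new registration).
        return False
    return record["password_hash"] == hashlib.sha256(password.encode()).hexdigest()

print(authenticate("U001", "secret"))   # True -> permit use of the service
print(authenticate("U001", "wrong"))    # False -> reject or prompt new registration
```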
The acquirer 130 acquires information about the ridable mobile object 300 that the user U is on. For example, when the terminal device 200 communicates with the ridable mobile object 300 in a near-field communication scheme such as Bluetooth, the acquirer 130 acquires, from the terminal device 200, identification information (for example, a mobile object ID) of the ridable mobile object 300 with which communication is in progress, identification information (for example, a terminal ID) of the terminal device 200, and a user ID. Subsequently, the acquirer 130 refers to the user IDs in the user information 181, stores the terminal ID in the terminal information associated with the matching user ID, and stores the mobile object ID in the ridable mobile object information. By iterating the above-described process at prescribed timings (for example, at prescribed intervals), the mobile object management server 100 can manage a situation in which the ridable mobile object 300 is used.
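The following is a minimal sketch, under the assumption of a simple in-memory record with hypothetical field names (terminal_id, mobile_object_id), of how the reported identifiers could be stored under the matching user ID; the actual data layout of the user information 181 is not limited to this.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserRecord:
    # Hypothetical subset of the user information 181.
    user_id: str
    terminal_id: Optional[str] = None       # terminal information
    mobile_object_id: Optional[str] = None  # ridable mobile object information

USER_INFO_181 = {"U001": UserRecord("U001")}

def on_pairing_report(user_id: str, terminal_id: str, mobile_object_id: str) -> None:
    """Store the reported terminal ID and mobile object ID under the matching user ID."""
    record = USER_INFO_181.get(user_id)
    if record is None:
        return  # unknown user: ignore (or trigger registration)
    record.terminal_id = terminal_id
    record.mobile_object_id = mobile_object_id

# A report received while the terminal and the mobile object communicate over Bluetooth.
on_pairing_report("U001", terminal_id="T123", mobile_object_id="M045")
print(USER_INFO_181["U001"])
```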
The acquirer 130 acquires information about events implemented inside of the theme park. For example, the acquirer 130 acquires the event information 182 stored in the storage 180 in advance as the information about the events.
For example, the acquirer 130 acquires location information of the terminal device 200 from the terminal device 200 of the user U on the ridable mobile object 300 (in other words, the terminal device 200 communicating with the ridable mobile object 300 in the near-field wireless communication scheme) and uses the acquired location information as the location information of the ridable mobile object 300. The acquirer 130 iteratively acquires the location information at prescribed intervals while the terminal device 200 and the ridable mobile object 300 are communicating.
The manager 140 manages the entire mobile object management process in the mobile object management system 1.
For example, the user manager 141 manages the ridable mobile object 300 and the terminal device 200 of the user U on the ridable mobile object 300 in association with each other on the basis of the user information 181. The user manager 141 manages a situation in which the ridable mobile object 300 is used for each user U (for example, which ridable mobile object 300 the user U is currently on or the like) on the basis of the user information 181. The user manager 141 manages the location of the ridable mobile object 300 inside of the theme park on the basis of the information acquired by the acquirer 130. When prescribed users (for example, parents, children, and friends) are associated with each other in the user information 181, the user manager 141 may manage mutual location information of the terminal device 200 and the ridable mobile object 300 and the like.
The event manager 142 manages an event execution schedule on the basis of the event information 182 stored in the storage 180. The event manager 142 acquires information about an event to be held within a prescribed period of time from a current point in time with reference to the event information 182, and transmits the acquired information about the content of the event and the route or location where the event is executed to the terminal device 200, thereby notifying the user U of the information. The event manager 142 determines the content of the event notification provided to the user U in accordance with a possible participation level of the user U that is determined from a usage state of the ridable mobile object 300 by the user U. The usage state includes, for example, at least one of the number of uses and the usage time of the ridable mobile object 300 by the user U. The usage state may also include information about a proficiency level of the user U for the ridable mobile object 300 in addition to the number of uses and the usage time. The proficiency level is an index value quantitatively expressing how familiar the user U is with using (manipulating) the ridable mobile object 300.

The participation manager 143 sets the proficiency level for each user on the basis of, for example, the number of rotations of the ridable mobile object 300 in a prescribed period of time under the user's manipulation, a stopping distance (in a sudden stop) from a prescribed speed, and whether or not predetermined running (for example, straight running or figure-8 running) is possible. The proficiency level may also be set on the basis of, for example, a degree of wobble of the ridable mobile object 300 under the user's manipulation and a frequency of input manipulations at the time of turning (e.g., the number of corrections made when the ridable mobile object 300 turns more or less than expected as the body is tilted to change the traveling direction at a certain curvature). The degree of wobble and the frequency of input manipulations at the time of turning are acquired, for example, on the basis of a detection result of an attitude angle sensor provided in the ridable mobile object 300.
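The description above does not fix a concrete formula for the proficiency level; the following sketch merely illustrates, with hypothetical weights and caps, how the named factors (rotations, stopping distance, predetermined running, wobble, and turning corrections) could be combined into a single index value.

```python
def proficiency_level(rotations_per_min: float,
                      stop_distance_m: float,
                      figure8_ok: bool,
                      wobble_deg: float,
                      turn_corrections: int) -> float:
    """Illustrative index: more rotations and a passed figure-8 raise the score;
    a longer stopping distance, larger wobble, and more turning corrections lower it.
    All weights and caps are hypothetical."""
    score = 0.0
    score += min(rotations_per_min, 30.0) / 30.0   # agility, capped
    score += 1.0 if figure8_ok else 0.0            # predetermined running passed
    score -= min(stop_distance_m, 3.0) / 3.0       # sudden-stop distance
    score -= min(wobble_deg, 15.0) / 15.0          # attitude-angle wobble
    score -= min(turn_corrections, 10) / 10.0      # input corrections when turning
    return max(0.0, round(score, 2))

print(proficiency_level(25, 0.8, True, 4.0, 2))
```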
The participation manager 143 manages whether or not the user U can participate in the event, for example, on the basis of a state in which the user uses the ridable mobile object 300. For example, the participation manager 143 performs a management process of preventing the user U who does not satisfy the possible participation conditions of the event from participating in the event, performs a management process of preventing the user U from participating in an event exceeding the possible participation level of the user U, or restricts participation in an event in which the user was able to participate in the past on the basis of elapsed time after the user previously participated in the event and/or elapsed time after the user previously rode the ridable mobile object 300. For example, the participation manager 143 performs a management process so that guidance for the event, an inquiry about participation in the event, or the like is not provided to the user U who is below the possible participation level set under the possible participation conditions of the event.
The participation manager 143 manages the user U that participates in each event on the basis of information stored in the event information 182 and manages participation content and the like for each user U (for example, participation locations and performance operations in the parade). For example, on the basis of the number of people whose participation is possible for each possible participation level included in the event information 182, the participation manager 143 adjusts the number of participating users to be less than or equal to the number of people whose participation is possible, using a lottery, priority, or the like, when the number of users who wish to participate exceeds that number. After the user U participates in the event, the participation manager 143 updates the number of uses and the usage time of the ridable mobile object 300 of the user U, or performs a process of updating the possible participation level corresponding to the number of uses and the usage time. The participation manager 143 stores the above-described information as the usage history information 183 in the storage 180.
The participation manager 143 increases the possible participation level, for example, as the number of uses or the usage time increases. The participation manager 143 may increase the possible participation level in a level-up process when the number of uses is greater than or equal to a first threshold value and the usage time is greater than or equal to a second threshold value. In this case, the participation manager 143 increases the first threshold value or the second threshold value as the level increases. The participation manager 143 determines a performance operation that can be executed by the ridable mobile object 300 of the user U in accordance with the set possible participation level, stores the performance operation in the usage history information 183, and manages the performance operation. For example, as the possible participation level increases, the participation manager 143 increases the number of types of executable performance operations or enables a performance operation having a higher rotational speed and a higher moving speed. For example, because the event information 182 includes a possible participation level among the possible participation conditions for each event, the number of events in which the user U can participate and the number of performance operations capable of being executed by the ridable mobile object 300 increase as the possible participation level of the user U increases. Thus, because the user U can participate in a desired event and execute a desired performance operation by increasing the level, the motivation for participation in the event is further improved.
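As an illustrative sketch of the level-up process with growing thresholds, assuming hypothetical starting threshold values and a doubling rule that this description does not prescribe:

```python
def possible_participation_level(num_uses: int, usage_hours: float) -> int:
    """Raise the level while both counters clear thresholds that grow with the level."""
    level = 1
    first_threshold, second_threshold = 5, 3.0   # hypothetical starting thresholds
    while num_uses >= first_threshold and usage_hours >= second_threshold:
        level += 1
        first_threshold *= 2      # each level-up makes the next one harder
        second_threshold *= 2
    return level

print(possible_participation_level(num_uses=12, usage_hours=7.5))  # -> 3
```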
The participation manager 143 may restrict participation in an event in which the user U was able to participate in the past on the basis of at least one of elapsed time after the user U previously participated in the event (hereinafter referred to as first elapsed time) and elapsed time after the user U previously rode the ridable mobile object 300 (hereinafter referred to as second elapsed time). In this case, for example, the participation manager 143 derives the first elapsed time and the second elapsed time from the usage time (the previous usage end time) included in the usage history information 183 and decreases the possible participation level of the user U as the first elapsed time and/or the second elapsed time increases, thereby applying a restriction so that participation in an event in which participation was possible in the past becomes impossible. The participation manager 143 may apply a restriction so that participation in an event having an event ID "E001" in which participation was possible in the past is disabled when the first elapsed time is greater than or equal to a prescribed time, or may apply a restriction so that participation in events having event IDs "E002" and "E003" in which participation was possible in the past is disabled when the second elapsed time is greater than or equal to the prescribed time. The participation manager 143 may also make an adjustment so that the number of uses or the usage time required for increasing the level increases (or is lengthened) by increasing the above-described first or second threshold value and making an increase in the level more difficult as the first elapsed time and/or the second elapsed time increases. Thus, when there is a gap (an interval) in the participation of the user U in events or in the riding of the ridable mobile object 300 by the user U, it is possible to further improve the safety of the user U by limiting the number of events in which participation is possible until the user U gets used to riding the ridable mobile object 300 again (until the user U regains his or her senses). Because the user U willingly participates in events or gets on the ridable mobile object 300 so as not to be subject to the above-described restrictions, the utilization rate of the ridable mobile object 300 can be improved.
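A minimal sketch of the elapsed-time restriction, assuming hypothetical thresholds for the first elapsed time and the second elapsed time and a one-step level decrease per exceeded threshold, could look as follows.

```python
from datetime import datetime, timedelta

def effective_level(base_level: int,
                    last_participation: datetime,
                    last_ride: datetime,
                    now: datetime) -> int:
    """Lower the possible participation level by one step per exceeded threshold.
    The 90-day and 30-day thresholds are hypothetical."""
    level = base_level
    first_elapsed = now - last_participation   # since last event participation
    second_elapsed = now - last_ride           # since last ride
    if first_elapsed >= timedelta(days=90):
        level -= 1
    if second_elapsed >= timedelta(days=30):
        level -= 1
    return max(1, level)

now = datetime(2022, 6, 1)
print(effective_level(3, datetime(2022, 1, 10), datetime(2022, 5, 20), now))  # -> 2
```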
The participation manager 143 may update the proficiency level of the user U on the basis of manipulation content of the ridable mobile object 300 included in the usage history information 183, or may set the possible participation level on the basis of the proficiency level in addition to the number of uses and the usage time described above. The participation manager 143 may transmit information about the possible participation level of each user U, information for raising the level, information for preventing the level from being lowered, and the like to the terminal device 200 of the user U, thereby notifying the user U of the information.
The incentive manager 144 manages incentives or the like given to the service provider side, such as a provider of the ridable mobile object 300 rented to the user U or a planner who plans an event. The incentives in this case are, for example, equivalent to a service usage fee. The service usage fee may be collected from an admission fee of the theme park or may be collected via the terminal device 200 of the user U when he or she uses the ridable mobile object 300 or participates in the event. The incentive manager 144 may also manage incentives given to the user U who participated in the event. Examples of the incentives in this case include preferential use of a preferred ridable mobile object 300, a discount on the rental fee for the ridable mobile object 300, and the granting of award points.
The operation selector 150 selects content of a prescribed operation to be executed by the ridable mobile object 300 on the basis of location information of the ridable mobile object 300 located inside of the theme park and information about an event to be executed inside of the theme park. For example, the operation selector 150 selects an operation to be executed by the ridable mobile object 300 for each user U on the basis of a distance between a point where an event is executed (including a location on a route) and the ridable mobile object 300, event execution time, a possible participation level of the user U who participates in the event, participation content designated by the user U, and the like with reference to the user information 181 and the event information 182 stored in the storage 180.
For example, when an event is being executed and a distance between the event execution point and the ridable mobile object 300 that the user U participating in the event is on is within a prescribed distance, the operation selector 150 selects a prescribed operation corresponding to a performance of the event to be executed by the ridable mobile object 300. On the basis of a distance between a physical object related to the event and the ridable mobile object 300, the operation selector 150 may select a prescribed operation to be executed by the ridable mobile object 300 when that distance is within a prescribed distance. The operation selector 150 may select an operation for each of the ridable mobile objects 300 located inside of a specific range smaller than the theme park area in accordance with the number of ridable mobile objects 300 located inside of the specific range. The specific range includes, for example, a predetermined zone inside of the theme park such as an adventure area or a park area, and a range within a prescribed distance centered on a physical object related to the event. The operation selector 150 may select a prescribed operation to be executed on the basis of a condition obtained by combining a plurality of the above-described conditions for selecting an operation.
When specifically selecting the content of a prescribed operation, the operation selector 150 takes the event ID of a target event satisfying the condition from the event information 182, refers to the event IDs of the operation information 184, and decides on specific content of the prescribed operation to be executed by the target ridable mobile object 300 on the basis of the information associated with the matching event ID.
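The selection logic can be sketched as follows; the event record fields (start, end, location, trigger_distance_m, operations, min_level) and the planar distance approximation are assumptions introduced only for this illustration.

```python
import math
from datetime import datetime

def within(p1, p2, meters: float) -> bool:
    # Rough planar distance check between (lat, lon) pairs; adequate inside a theme park.
    dx = (p1[0] - p2[0]) * 111_000
    dy = (p1[1] - p2[1]) * 111_000 * math.cos(math.radians(p1[0]))
    return math.hypot(dx, dy) <= meters

def select_operation(event, mobile_location, user_level: int, now: datetime):
    """Return an operation if the event is running, the mobile object is close enough,
    and the user's possible participation level is sufficient; otherwise None."""
    if not (event["start"] <= now <= event["end"]):
        return None
    if not within(event["location"], mobile_location, event["trigger_distance_m"]):
        return None
    # Pick the richest operation the user's level allows.
    allowed = [op for op in event["operations"] if op["min_level"] <= user_level]
    return max(allowed, key=lambda op: op["min_level"], default=None)

event = {
    "start": datetime(2022, 7, 1, 19, 0), "end": datetime(2022, 7, 1, 20, 0),
    "location": (35.6329, 139.8804), "trigger_distance_m": 50,
    "operations": [{"id": "spin_slow", "min_level": 1}, {"id": "spin_fast", "min_level": 3}],
}
print(select_operation(event, (35.6331, 139.8805), user_level=3,
                       now=datetime(2022, 7, 1, 19, 30)))
```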
The adjustment information is, for example, information for adjusting a part of the operation set in the operation content in accordance with information about the user U, a surrounding environment, and the like. The information about the user U is, for example, information (e.g., age or gender) obtained from the user information 181. The information about the surrounding environment is, for example, information acquired from the event information 182 (for example, location/route information or execution time). In the example of
The event operation instructor 160 generates an operation instruction for the event for the target ridable mobile object 300 on the basis of the operation content decided on (selected) by the operation selector 150. For example, the event operation instructor 160 generates an operation instruction for causing a ridable mobile object 300 that the user U participating in the event is on and that is located within a prescribed distance from the execution point at the time when the event is executed to execute a prescribed event operation. The event operation instructor 160 may adjust the content of an operation of the event (including a degree of the operation) on the basis of the adjustment information, and may adjust the content of an operation of the event on the basis of setting content (adjustment information) of the user U acquired from the terminal device 200.
The event operation instructor 160 acquires terminal information of the terminal device 200 of the user U on the target ridable mobile object 300 on the basis of the terminal information of the user information 181 and transmits the generated or adjusted operation instruction to the terminal device 200 on the basis of the acquired terminal information. The event operation instructor 160 may transmit map information of an area (a theme park) and the like to the terminal device 200 in addition to (or in place of) the operation instruction.
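As a rough sketch of the instruction relayed to the terminal device 200, assuming a hypothetical JSON payload format that this description does not define:

```python
import json

def build_operation_instruction(mobile_object_id: str, operation_id: str,
                                adjustments: dict) -> str:
    """Serialize an event operation instruction to be relayed to the terminal device,
    which forwards it to the ridable mobile object over near-field communication."""
    payload = {
        "mobile_object_id": mobile_object_id,
        "operation_id": operation_id,   # e.g. a rotation, light, or sound pattern
        "adjustments": adjustments,     # e.g. a reduced rotational speed for a child
    }
    return json.dumps(payload)

print(build_operation_instruction("M045", "spin_slow", {"turn_speed_scale": 0.5}))
```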
Next, a configuration of the terminal device 200 will be described.
The terminal-side storage 270 may be implemented by the above-described various types of storage devices, an EEPROM, a ROM, a RAM, or the like. The terminal-side storage 270 stores, for example, a mobile object management application 272, a program, and various other types of information. The terminal-side storage 270 may store user information such as a terminal ID and a user ID or may store map information acquired from the mobile object management server 100 or the like.
The terminal-side communicator 210 communicates with the mobile object management server 100, the ridable mobile object 300, and other external devices using, for example, the network NW. The terminal-side communicator 210 may perform wireless communication on the basis of, for example, Wi-Fi, Bluetooth, dedicated short range communication (DSRC), or other communication standards or may have a near-field communication function of performing near-field communication (NFC) with the ridable mobile object 300.
The input 220 receives an input of the user U by, for example, a manipulation on various types of keys and buttons. The input 220 may include a motion sensor that detects a motion of the terminal device 200 and may receive an input of the user U on the basis of a motion of the terminal device body detected by the motion sensor (for example, a motion in which the user U shakes or turns the terminal device 200). The input 220 may also include an audio input such as a microphone; a voice of the user U and sounds around the terminal device 200 are input via the audio input, and the input of the user U may be accepted by analyzing the input sound.

The output 230 outputs information to the user U. The output 230 is, for example, a display or a speaker (an audio output). The display is, for example, a liquid crystal display (LCD), an organic electro-luminescence (EL) display, or the like. The input 220 may be integrally configured with the display as a touch panel. The display displays various types of information according to the embodiment under the control of the output controller 260. For example, the speaker outputs a prescribed sound (a voice, music, an alarm sound, a sound effect, or the like) under the control of the output controller 260.
The location information acquirer 240 acquires location information of the terminal device 200 by, for example, a built-in Global Positioning System (GPS) device (not shown). The location information includes, for example, latitude and longitude.
The application executor 250 is implemented by executing the mobile object management application 272 stored in the terminal-side storage 270. For example, the mobile object management application 272 is downloaded from an external device via the network NW and installed on the terminal device 200. The mobile object management application 272 is an application program for controlling the output controller 260 so that an image provided by the mobile object management server 100 is output on the display for the user U or a voice corresponding to information provided by the mobile object management server 100 is output from the speaker.
The application executor 250 transmits information input by the input 220, information stored in the terminal-side storage 270, or the like to the mobile object management server 100 or the ridable mobile object 300 via the terminal-side communicator 210. The information input by the input 220 includes, for example, information about registration and authentication of the user U, adjustment information of an operation by the user U when the ridable mobile object 300 operates in response to an event, and the like. The application executor 250 transmits information obtained from the mobile object management server 100, location information of the terminal device 200, map information, and the like to the ridable mobile object 300 that the user U is on or transmits information obtained from the ridable mobile object 300 to the mobile object management server 100 together with a user ID or location information.
The output controller 260 controls the content and display mode of an image to be displayed on the display of the output 230 and the content and output mode of a voice to be output by the speaker according to control of the application executor 250.
Next, the ridable mobile object 300 will be described.
The small-diameter wheel 312B is a wheel that can rotate around an axis orthogonal to a straight line in a radial direction in the central cross-section in the width direction of the large-diameter wheel 312A. The omnidirectional moving wheel 312 includes a plurality of small-diameter wheels 312B. The plurality of small-diameter wheels 312B are disposed at substantially equal intervals along a circumferential direction of the large-diameter wheel 312A. The plurality of small-diameter wheels 312B are partially or wholly rotated at the same time by, for example, the second motor MT2.
The turning wheel 312C is a wheel that can rotate around the y-axis. The turning wheel 312C has a smaller diameter than the large-diameter wheel 312A. The turning wheel 312C is rotated by the third motor MT3. The omnidirectional moving wheel 312 moves the ridable mobile object 300 by rotating at least one of the large-diameter wheel 312A, the small-diameter wheels 312B, and the turning wheel 312C. An operation of the omnidirectional moving wheel 312 will be described below.
Referring back to
The ridable mobile object 300 may include a light emitter 316 such as a lamp, a speaker 317 that outputs a voice, or the like. The light emitter 316 can perform lighting or flashing in one or more prescribed colors. The speaker 317 outputs a prescribed sound (a voice, music, a warning sound, a sound effect, or the like). It is only necessary for the light emitter 316 and the speaker 317 to be attached to any one or more locations on the ridable mobile object 300, and the attachment locations thereof are not limited to the attachment locations shown in
Subsequently, details of the operation of the omnidirectional moving wheel 312 of the ridable mobile object 300 will be described.
The large-diameter wheel 312A is a wheel that mainly implements straight-ahead movement in the forward-rearward direction. The small-diameter wheels 312B are wheels that mainly implement lateral movement on the spot by rotating about an axis along the rotation direction (the circumferential direction) of the large-diameter wheel 312A. On the other hand, the turning wheel 312C, which is the rear wheel, has a smaller diameter than the large-diameter wheel 312A and is a wheel that mainly implements turning movement by rotating on a rotation axis orthogonal to the rotation axis of the large-diameter wheel 312A.
The omnidirectional moving wheel 312 includes the motors MT1 to MT3 that can independently control the rotations of the large-diameter wheel 312A, the small-diameter wheels 312B, and the turning wheel 312C described above. According to this configuration, in addition to forward-rearward movement, the omnidirectional moving wheel 312 can implement movements in various directions such as a right-lateral direction and a diagonal direction, and can also implement agile movements such as curving and turning on the spot by using a lateral movement speed difference between the front and rear wheels.
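The concrete mapping from a commanded motion to the motors MT1 to MT3 is not detailed here; the following sketch only illustrates, with hypothetical proportional gains, how a forward speed, a lateral speed, and a yaw rate might be distributed to the three motors.

```python
def motor_commands(forward: float, lateral: float, yaw: float) -> dict:
    """Distribute a commanded motion to the three motors (illustrative mapping only).
    MT1 drives the large-diameter wheel (straight-ahead movement),
    MT2 the small-diameter wheels (lateral movement),
    MT3 the turning wheel (turning movement).
    Gains are hypothetical; a turn also uses a lateral speed difference
    between the front and rear wheels."""
    K1, K2, K3 = 1.0, 1.0, 0.5
    return {
        "MT1": K1 * forward,
        "MT2": K2 * lateral + 0.2 * yaw,   # front-wheel lateral component of a turn
        "MT3": K3 * yaw,                   # rear turning wheel
    }

print(motor_commands(forward=0.0, lateral=0.0, yaw=1.0))   # turning on the spot
print(motor_commands(forward=1.0, lateral=0.0, yaw=0.5))   # turning while traveling
```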
Here, the forward direction of the ridable mobile object 300 is the positive direction of the y-axis (a +y-axis direction) in
As shown in an operation example M2 (left-right movement) of
As shown in an operation example M3 (turning on the spot) of
As shown in an operation example M4 (turning and traveling) of
A method of implementing the omnidirectional moving wheel 312 is not limited to the method of
Next, a functional configuration of the ridable mobile object 300 will be described.
The communication device 320 performs wireless communication on the basis of, for example, Wi-Fi, Bluetooth, DSRC, and other communication standards. The communication device 320 receives an electrical signal transmitted by the terminal device 200 and outputs the electrical signal to the control device 350. The communication device 320 transmits an electrical signal output by the control device 350 to the terminal device 200. In place of or in addition to the communication device 320, a near-field communication function of executing near-field communication (NFC) with the terminal device 200 may be provided.
The sensor 340 includes, for example, a sitting sensor 341, a surroundings sensor 342, an acceleration sensor 343, and an angular velocity sensor 344. The sitting sensor 341 detects a sitting state of whether or not the user U (a rider) is sitting on the seat 313. The sitting sensor 341 outputs a sitting signal indicating the sitting state of the user U to the control device 350.
The surroundings sensor 342 is a sensor that detects a physical object in the vicinity of the ridable mobile object 300. The surroundings sensor 342 detects, for example, a distance between the detected physical object and the ridable mobile object 300. The surroundings sensor 342 outputs a nearby physical object signal related to the detected physical object and the distance between the detected physical object and the ridable mobile object 300 to the control device 350. The surroundings sensor 342 may be, for example, an ultrasonic sensor using ultrasonic waves as a medium, an optical sensor using light as a medium, or an image sensor that captures an image of the surroundings of the ridable mobile object 300.
The acceleration sensor 343 is attached to any location on one or both of the mobile object body 310 and the seat 313. The acceleration sensor 343 detects the acceleration acting on the attachment location and outputs the acceleration to the control device 350. The angular velocity sensor 344 (a gyro sensor) is also attached to any location on one or both of the mobile object body 310 and the seat 313. The angular velocity sensor 344 detects an angular velocity acting on the attachment location and outputs the angular velocity to the control device 350. In addition to the above-described sensors, the sensor 340 may include an attitude angle sensor that detects the attitude angle (inclination) of the ridable mobile object 300.
The control device 350 controls an operation of the ridable mobile object 300 on the basis of information obtained from the communication device 320 and the sensor 340 and the like. The control device 350 includes, for example, an authentication processor 360, an instruction generator 370, a motor controller 380, and an output controller 390. The authentication processor 360 includes, for example, an authenticator 361 and a canceller 362. The instruction generator 370 includes, for example, a determiner 371, a detector 372, a generator 373, a center-of-gravity estimator 374, and a balance controller 375.
These components are implemented, for example, by a hardware processor such as a CPU executing a program (software). Some or all of the above components may be implemented by hardware (including a circuit; circuitry) such as an LSI circuit, an ASIC, an FPGA, or a GPU or may be implemented by software and hardware in cooperation. The program may be prestored in a storage device (a storage device including a non-transitory storage medium) such as an HDD or a flash memory in the ridable mobile object 300 or may be stored in a removable storage medium (a non-transitory storage medium) such as a DVD or a CD-ROM and installed in the storage device when the storage medium is mounted in a drive device. The storage device may store a mobile object ID assigned to its own vehicle, a terminal ID obtained from the terminal device 200, location information, map information, operation instructions, and the like.
The authenticator 361 authenticates the user U who will get on (or is on) the ridable mobile object 300. The authenticator 361 performs near-field communication with a terminal device 200 located nearby (within a prescribed distance) using Bluetooth or the like, acquires information about the terminal device 200 with which communication is first established or which is nearest (for example, a terminal ID or a user ID), and sets the usage authority for the user U who has that terminal device 200 on the basis of the acquired information. The authenticator 361 may perform the above-described authentication, for example, when the sitting sensor 341 determines that the user U is sitting on the seat 313. In a state in which the usage authority is set, the terminal device 200 and the ridable mobile object 300 remain in communication. The authenticator 361 does not communicate with other terminal devices 200 while the usage authority is set for one user U (while the usage authority is not canceled); that is, the usage authority is not set for a plurality of users at the same time.
The canceller 362 measures elapsed time after the user U leaves the ridable mobile object 300. The canceller 362 cancels the authority of the user U to use the ridable mobile object 300 by determining that a cancellation condition is satisfied when a prescribed time has elapsed after the user U left the ridable mobile object 300. The prescribed time may be uniform or may fluctuate according to specific conditions. The specific conditions may be, for example, a stop location of the ridable mobile object 300, a time period, the number of people who visit a specific area with the user U, human relationships such as family members and friends, and the like. The cancellation condition may be any other condition. For example, the user U may perform a manipulation indicating his or her intention to cancel the usage authority, and a condition that the canceller 362 has acquired a signal corresponding to the manipulation may be used as the cancellation condition. The canceller 362 may compare an operation of the ridable mobile object 300 based on the user's manipulation after the authentication of the authenticator 361 (hereinafter referred to as an actual operation) with an operation associated with the possible participation level of the user U (hereinafter referred to as a level operation), and, when the actual operation has not reached the level operation, may cancel the authority of the user U to use the ridable mobile object 300 or may add a usage restriction so that some manipulations cannot be executed. Thereby, it is possible to further improve the safety of the user U by preventing another person from using the terminal device 200 while impersonating the user U due to the transfer or theft of the terminal device 200, or by limiting the use by the user U when there is a gap in riding of the ridable mobile object 300.
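The setting and cancellation of the usage authority can be sketched as follows; the ten-minute cancellation threshold and the class interface are assumptions for the illustration, since the prescribed time may be uniform or may fluctuate as described above.

```python
from datetime import datetime, timedelta
from typing import Optional

class UsageAuthority:
    """Tracks which user holds the usage authority and cancels it after the
    prescribed time has elapsed since the user left the seat (threshold hypothetical)."""

    def __init__(self, cancel_after: timedelta = timedelta(minutes=10)):
        self.cancel_after = cancel_after
        self.holder: Optional[str] = None
        self.left_at: Optional[datetime] = None

    def authenticate(self, user_id: str) -> bool:
        if self.holder is not None:          # authority is never set for two users at once
            return False
        self.holder, self.left_at = user_id, None
        return True

    def on_left_seat(self, now: datetime) -> None:
        self.left_at = now                   # start measuring the elapsed time

    def maybe_cancel(self, now: datetime) -> None:
        if self.holder and self.left_at and now - self.left_at >= self.cancel_after:
            self.holder, self.left_at = None, None   # cancellation condition satisfied

auth = UsageAuthority()
auth.authenticate("U001")
auth.on_left_seat(datetime(2022, 7, 1, 10, 0))
auth.maybe_cancel(datetime(2022, 7, 1, 10, 15))
print(auth.holder)   # None: the usage authority was canceled
```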
For example, the instruction generator 370 generates instructions for operation control and output control for the ridable mobile object 300. The determiner 371 determines whether or not the user U is sitting on the basis of a sitting signal output by the sitting sensor 341. When the determiner 371 determines that the user U is sitting on the seat 313 according to the sitting signal and then determines that the user U is not sitting on the seat 313, the determiner 371 may determine that the user U has left the ridable mobile object 300.
The detector 372 detects the manipulation content of the user U for the ridable mobile object 300 and the information about the event acquired from the terminal device 200 (an event operation instruction). The detector 372 may also detect a surrounding situation of the ridable mobile object 300 detected by the surroundings sensor 342. The surrounding situation is, for example, the behavior of other ridable mobile objects 300 located nearby, characters and vehicles that are parading, and the like.
The generator 373 generates an event operation instruction for the ridable mobile object 300. For example, the generator 373 generates an event operation instruction corresponding to an event such as a parade or a show that is performed nearby on the basis of the event operation instruction generated by the mobile object management server 100 acquired via the terminal device 200. The event operation instruction to be generated is, for example, an instruction for driving the omnidirectional moving wheel 312 by the motor controller 380, causing the light emitter 316 to perform lighting or flashing in a prescribed color by the output controller 390, or outputting a prescribed sound from the speaker (the audio output), for example, according to an operation instruction from the mobile object management server 100. The generator 373 may generate an operation instruction for moving the ridable mobile object 300 so that the ridable mobile object 300 does not come into contact with a nearby physical object obtained from the surroundings sensor 342. The generator 373 outputs control information based on the generated operation instruction (including an event operation instruction) to the motor controller 380 and the output controller 390.
The center-of-gravity estimator 374 and the balance controller 375 mainly function when the user U is on the ridable mobile object 300. The center-of-gravity estimator 374 estimates the center of gravity of a physical object including the user U on the ridable mobile object 300, the mobile object body 310, and the seat 313 on the basis of outputs of the acceleration sensor 343 and the angular velocity sensor 344.
The balance controller 375 generates control information (an operation instruction) in a direction in which a location of the center of gravity estimated by the center-of-gravity estimator 374 is returned to a reference location (a center-of-gravity location in a stationary state). For example, when the location of the center of gravity is biased to the right backward from the reference location, the balance controller 375 generates information indicating acceleration toward the right rear as control information. When a manipulation (an action instruction) by the user U is accelerated forward movement and the location of the center of gravity is behind the reference location, the balance controller 375 may suppress acceleration so that the location of the center of gravity is not biased further back by the accelerated forward movement or may start the accelerated forward movement after rearward movement is performed once and the location of the center of gravity moves forward through guidance. The instruction generator 370 outputs control information (an operation instruction) generated by the balance controller 375 to the motor controller 380.
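As a minimal sketch of the balance control described above, assuming a simple proportional law with a hypothetical gain (the actual control law is not specified here):

```python
def balance_command(cog_offset: tuple, gain: float = 0.8) -> tuple:
    """Return an acceleration command (ax, ay) directed so that the estimated
    center-of-gravity location returns toward the reference location.
    cog_offset is the (x, y) displacement from the reference; the gain is hypothetical."""
    dx, dy = cog_offset
    # Accelerate toward the side the center of gravity is biased to, so the support
    # point moves back under it (e.g. biased right-rearward -> accelerate right-rearward).
    return (gain * dx, gain * dy)

print(balance_command((0.05, -0.08)))   # center of gravity biased right and rearward
```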
The motor controller 380 individually controls each motor attached to the omnidirectional moving wheel 312 on the basis of the control information output by the instruction generator 370. For example, the motor controller 380 may execute different control processes when the user U is on (sitting on) the ridable mobile object 300 and when the user U is not.
When the user U is on the ridable mobile object 300, the above-described control can cause the ridable mobile object 300 to move in a desired direction when the user U on the ridable mobile object 300 moves the center of gravity in the desired direction according to a change in his or her own posture. That is, the ridable mobile object 300 recognizes the center-of-gravity movement of the user U as a maneuvering manipulation on the ridable mobile object 300 and performs a movement operation corresponding to the maneuvering manipulation.
The output controller 390 causes the light emitter 316 to perform lighting or flashing in a prescribed color or causes the speaker 317 to output a prescribed sound (a voice, music, a warning sound, a sound effect, or the like) on the basis of the control information output by the instruction generator 370.
A function to be executed by the ridable mobile object 300 is executed by electric power supplied from a battery (not shown) mounted therein. The battery may be charged by a charging device provided outside of the ridable mobile object 300 or may be detachable so that it can be replaced with another battery. The battery can also be charged with electricity regenerated by a motor of the omnidirectional moving wheel 312.
Next, a process executed by the mobile object management system 1 will be described.
In the example of
The mobile object management server 100 performs a user management process by receiving the information from the terminal device 200, referring to the user IDs of the user information 181, and storing the terminal ID and the mobile object ID in the user information 181 in association with the user ID matching the received user ID (step S108). The mobile object management server 100 manages the location information acquired by the terminal device 200 as the location information of the ridable mobile object 300 (step S110). The processing of steps S104 to S110 may be iteratively executed at prescribed intervals while the terminal device 200 and the ridable mobile object 300 are connected by near-field communication.
Subsequently, the mobile object management server 100 manages an event execution schedule with reference to the event information 182 and acquires information about an event that is held within a prescribed time from a present point in time (for example, content, location/route information, execution time, or information about a possible participation condition of the user U associated with the event) (step S112). Subsequently, the mobile object management server 100 transmits the information about the event to the terminal device 200 (step S114). In the processing of step S114, the mobile object management server 100 may acquire a possible participation level for each user U with reference to the usage history information 183 and transmit information about an event to the user U whose possible participation level satisfies a possible participation condition.
The terminal device 200 causes the output 230 (the display) to display the information about the event received from the mobile object management server 100 (step S116).
When the mobile object management server 100 receives information indicating that the user will participate in the event from the user U of the terminal device 200, the mobile object management server 100 manages the participation of the user U in the event (step S120). The mobile object management server 100 selects the operation of the ridable mobile object 300 that the user U gets on in accordance with the event in which the user U participates or the possible participation level (step S122). The mobile object management server 100 may ask the user U about participation content if a plurality of different pieces of participation content are included in a selected event in which the user U will participate.
In the switch display area AR22, icons IC21 to IC23 for selecting any one of pieces of possible participation content set for each participation event are displayed. In the example of
The mobile object management server 100 may ask the user U about whether or not to execute a performance operation corresponding to the user's current possible participation level for the user U who participates in the event.
When the icon IC31 has been selected, for example, a performance operation corresponding to the possible participation level of the user U or the level selected by the user U from the image IM20 shown in
The mobile object management server 100 may refer to the user information 181 and make an adjustment such as making the rotational speed at the time of turning lower when the user U on board is a child (for example, 12 years old or younger) than when the user U is an adult, or canceling the settings made by the user U. Thereby, a safer operation can be executed for each user U. Therefore, even if the user U is a child, a parent can allow the child to use the ridable mobile object 300 or participate in the event safely.
Referring back to
The terminal device 200 transmits an operation instruction transmitted from the mobile object management server 100 to the ridable mobile object 300 (step S128). The ridable mobile object 300 executes an operation mode and an output mode on the basis of the event operation instruction obtained from the terminal device 200 (step S130). Execution results are transmitted to the terminal device 200 (step S132). The terminal device 200 transmits the execution results transmitted from the ridable mobile object 300 to the mobile object management server 100 (step S134).
The mobile object management server 100 updates a usage state such as the number of uses and the usage time of the ridable mobile object 300 included in the usage history information 183 for each user (step S136). Subsequently, the mobile object management server 100 rents the ridable mobile object 300 to the user or provides incentives corresponding to a rental fee, an event participation fee, or the like to the service provider who planned the event (step S138). The mobile object management server 100 may also provide incentives to a user who participated in the event. Thereby, the process of this sequence ends. According to the above-described process, the mobile object management server 100 can manage the participation of the user U in the event and can provide an event with a higher performance effect.
Hereinafter, specific examples of services provided by the mobile object management system 1 will be described.
The manager 140 of the mobile object management server 100 manages which physical object passes through which point at what time and the operation mode and an output mode to be executed by each of the objects OB1 to OB3 on the basis of the event information 182. For example, the manager 140 performs a performance for rotation in a prescribed rotation direction when the object OB1 shown in
In the example of
When there is a ridable mobile object 300 that a user who is not participating in the parade is on within a prescribed distance from the objects OB1 to OB3, the mobile object management server 100 may cause the ridable mobile object 300 to perform a performance operation corresponding to the motion of each of the objects OB1 to OB3 even if the user is not participating in the parade. In the example of
When a fixed camera CAM1 for photographing an event participant is provided at a prescribed location (a point P13), the mobile object management server 100 may control the performance operation of the ridable mobile object 300 of the event participant in accordance with the location of the camera CAM1.
When a first user (for example, a child) who participates in the parade and a second user (for example, a parent) who photographs the first user with a camera are managed in association with each other, the mobile object management server 100 may perform an operation control process in which the ridable mobile object 300 of the first user is stopped in front of the ridable mobile object 300 of the second user, or the ridable mobile object 300 of the second user tracks the ridable mobile object 300 of the first user, on the basis of the location information of the ridable mobile objects 300 that the first and second users are on.
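For illustration only, a minimal sketch of deciding between stopping and tracking may be expressed as follows; tracking_command and the 3 m gap are hypothetical.

    import math

    def tracking_command(first_xy: tuple[float, float],
                         second_xy: tuple[float, float],
                         follow_gap_m: float = 3.0) -> str:
        """Decide whether the second user's mobile object stops or keeps tracking the first user's."""
        gap = math.dist(first_xy, second_xy)
        return "stop" if gap <= follow_gap_m else "move_toward_first_user"

    print(tracking_command((10.0, 0.0), (2.0, 0.0)))  # move_toward_first_user
    print(tracking_command((10.0, 0.0), (8.0, 0.0)))  # stop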
The mobile object management server 100 may provide information about a participation result to the user U who participated in the event after the event ends.
The manager 140 generates the image IM40 including the above-described information and transmits the generated image IM40 to the terminal device 200 of the target user U via the network NW. The terminal device 200 receives the image IM40 and causes the output 230 to display it. Alternatively, the mobile object management server 100 may transmit information for generating the image IM40 to the terminal device 200 of the target user so that the image IM40 is generated on the terminal device 200 side and output from the output 230. By providing the image IM40 to the user U, it is possible to allow the user U to clearly ascertain a participation state (a usage state of the ridable mobile object 300) and to further improve the motivation to participate (the motivation to use).
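For illustration only, a minimal sketch of assembling the information used for an image such as IM40 may be expressed as follows; build_participation_summary and its field names are hypothetical, and the payload could be rendered on either the server side or the terminal device 200 side.

    import json

    def build_participation_summary(user_id: str, events_joined: int,
                                    number_of_uses: int, usage_time_min: float) -> str:
        """Assemble a participation/usage summary as a JSON payload for rendering."""
        return json.dumps({
            "user_id": user_id,
            "events_joined": events_joined,
            "number_of_uses": number_of_uses,
            "usage_time_min": usage_time_min,
        })

    print(build_participation_summary("U1", events_joined=2, number_of_uses=5, usage_time_min=130.0))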
For example, in a service provided by the mobile object management system 1 of the embodiment, a lecture about a performance operation of the ridable mobile object 300 may be given in advance to a user U scheduled to participate in an event, and the user U may be permitted to participate in the event after the lecture.
The instructor U10 gives a lecture to the users U11 to U13, who gathered in the area AAA at the meeting time, about the operation of the ridable mobile object 300 (for example, rotation, forward movement, rearward movement, or the like) and the operations of the users U11 to U13 (for example, waving, looking up, clapping, holding hands with other users nearby, and the like) to be executed during the event. In this case, the participation manager 143 generates, via the event operation instructor 160, information about a performance operation at the time of event execution, transmits the generated information to the terminal device 200-10 of the instructor U10, and causes the ridable mobile object 300-10 of the instructor U10 to execute a sample operation. The participation manager 143 also transmits information about the performance operations of the ridable mobile objects 300-11 to 300-13 to the terminal devices 200-11 to 200-13 of the users U11 to U13 and causes the ridable mobile objects 300-11 to 300-13 to execute the performance operations in the event.
The participation manager 143 may be configured to be able to notify the terminal devices 200-11 to 200-13 of the users U11 to U13 of a moving image of a past execution of the event scheduled for participation and to cause the output 230 to display the moving image. Thereby, each of the users U11 to U13 can more accurately ascertain what type of motion they will perform by watching the moving image.
After the lecture from the instructor U10 is completed, the users U11 to U13 are asked again, via the terminal devices 200-11 to 200-13, whether or not to participate in the event, and the participation manager 143 permits a user to participate in the event when information indicating that the user will participate has been received. In place of asking the terminal devices 200-11 to 200-13 of the users, the terminal device 200-10 of the instructor U10 may be asked whether or not the users U11 to U13 may participate in the event, and event participation may be permitted for a user determined by the instructor U10 to be able to participate. Thereby, users can participate in the event more safely.
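For illustration only, a minimal sketch of this post-lecture confirmation may be expressed as follows; permit_after_lecture is hypothetical, and the rule that the instructor's judgment replaces the user's answer is one reading of the alternative described above.

    def permit_after_lecture(user_confirms: bool, instructor_approves: bool | None = None) -> bool:
        """Permit event participation only after the post-lecture confirmation; if the
        instructor's judgment is used instead, it replaces the user's answer."""
        if instructor_approves is not None:
            return instructor_approves
        return user_confirms

    print(permit_after_lecture(user_confirms=True))                              # True
    print(permit_after_lecture(user_confirms=True, instructor_approves=False))   # False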
By gathering users who plan to participate in the event in the same place and giving lectures, the burden on the instructor can be reduced and a sense of camaraderie can be strengthened because users can compare operations and teach each other. Therefore, a plurality of users can perform a performance in a group more enjoyably.
In the embodiment, instead of deciding on a level at which the user U can participate in the event only in accordance with a usage state of the ridable mobile object 300 of the user U, the participation manager 143 may make an adjustment for temporarily increasing the possible participation level of the user U, for example, when specific conditions related to a date and time are satisfied, such as a period including an event for the user U such as a birthday, or a certain event such as Christmas. Thereby, because the user can experience a special performance on a special day, the performance effect can be further improved.
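For illustration only, a minimal sketch of such a temporary level increase may be expressed as follows; effective_participation_level and the boost of one level are assumptions.

    import datetime

    def effective_participation_level(base_level: int, today: datetime.date,
                                      birthday: datetime.date | None = None,
                                      boost: int = 1) -> int:
        """Temporarily raise the possible participation level on special dates
        such as the user's birthday or Christmas."""
        is_birthday = (birthday is not None
                       and today.month == birthday.month and today.day == birthday.day)
        is_christmas = (today.month == 12 and today.day == 25)
        return base_level + boost if (is_birthday or is_christmas) else base_level

    print(effective_participation_level(2, datetime.date(2022, 12, 25)))  # 3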
Although the ridable mobile object 300 executes operations and outputs in accordance with the execution of the event in the above-described embodiment, operations and outputs of the terminal device 200 may also be controlled in accordance with the execution of the event. In this case, for example, the mobile object management server 100 generates an event operation instruction for operating a vibration function provided inside the terminal device 200 in accordance with the event, outputting a prescribed sound from a speaker, or causing a display or the like to emit light, and transmits the generated event operation instruction to the terminal device 200. In this way, it is possible to further improve the performance effect on the user U by operating various devices in accordance with the event.
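For illustration only, a minimal sketch of an event operation instruction directed at the terminal device 200 may be expressed as follows; terminal_event_instruction and the field layout are hypothetical and do not describe an actual interface of the embodiment.

    def terminal_event_instruction(vibrate: bool, sound_id: str | None, emit_light: bool) -> dict:
        """Build an instruction for terminal-side effects: vibration, speaker output, display light."""
        return {
            "vibration": {"enabled": vibrate, "pattern_ms": [200, 100, 200] if vibrate else []},
            "sound": {"play": sound_id is not None, "sound_id": sound_id},
            "display": {"emit_light": emit_light, "color": "#FFD700" if emit_light else None},
        }

    print(terminal_event_instruction(vibrate=True, sound_id="fanfare_01", emit_light=True))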
In the embodiment, when the ridable mobile object 300 includes a location acquirer, the mobile object management server 100 may use the location information of the ridable mobile object 300 instead of the location information of the terminal device 200 and may communicate directly with the ridable mobile object 300 without involving the terminal device 200.
According to the above-described embodiment, the mobile object management server (mobile object management device) 100 for managing the ridable mobile object 300 that the user U gets on and which moves inside of a prescribed area includes the acquirer 130 configured to acquire location information of the ridable mobile object 300; the manager 140 configured to manage the ridable mobile object 300 and the terminal device 200 of the user U on the ridable mobile object 300 in association with each other; and the event operation instructor 160 configured to cause the ridable mobile object 300 to execute a prescribed operation corresponding to an event via the terminal device 200 of the user U on the basis of the location information and information about the event that is executed inside of the prescribed area, wherein the manager 140 manages whether or not to permit participation in the event of the user U on the basis of a state in which the user U uses the ridable mobile object 300, whereby it is possible to implement an event-participation-type performance and to further improve the performance effect of the event.
According to the embodiment, by allowing the user to participate in the event, it is possible to further entertain users who like events such as parades as well as fans of dancers and characters participating in the event. According to the embodiment, by changing the operations capable of being executed by the ridable mobile object 300 in accordance with the user's possible participation level, the user can experience various performance operations, for example, a performance in which only the light emitter 316 of the ridable mobile object 300 initially emits light in accordance with a sound, and operations performed like those of a character.
The embodiment described above can be represented as follows.
A mobile object management device including:
a storage medium storing instructions capable of being read by a computer of the mobile object management device for managing a ridable mobile object that a user is allowed to get on and which moves inside of a prescribed area; and
a processor connected to the storage medium,
wherein the processor executes the instructions capable of being read by the computer to:
acquire location information of the ridable mobile object;
manage the ridable mobile object and a terminal device of the user on the ridable mobile object in association with each other;
cause the ridable mobile object to execute a prescribed operation corresponding to an event via the terminal device of the user on the basis of the location information and information about the event that is executed inside of the prescribed area; and
manage whether or not to permit participation in the event of the user on the basis of a state in which the user uses the ridable mobile object.
While preferred embodiments of the invention have been described and illustrated above, it should be understood that these are exemplary of the invention and are not to be considered as limiting. Additions, omissions, substitutions, and other modifications can be made without departing from the spirit or scope of the present invention. Accordingly, the invention is not to be considered as being limited by the foregoing description, and is only limited by the scope of the appended claims.
Foreign Application Priority Data: No. 2022-009875, Jan. 2022, JP (national).