Priority is claimed on Japanese Patent Application No. 2018-223502, filed Nov. 29, 2018, the content of which is incorporated herein by reference.
The present invention relates to an information providing device, a vehicle control system, an information providing method, and a storage medium.
A technique has been known in which a process related to an order is executed on a customer's terminal device so that a customer who is on board a vehicle can place an order at a drive-through more simply (for example, Japanese Unexamined Patent Application, First Publication No. 2012-027731).
However, in the technique of the related art, it was impossible for a user to confirm that an ordered commercial product was being appropriately cooked.
An aspect of the present invention was contrived in view of such circumstances, and one object thereof is to provide an information providing device, a vehicle control system, an information providing method, and a storage medium that make it possible for a user to confirm that an ordered commercial product is being appropriately cooked.
An information providing device, a vehicle control system, an information providing method, and a storage medium according to this invention have the following configurations adopted therein.
(1) According to an aspect of this invention, there is provided an information providing device including: an image acquirer that acquires a cooking image obtained by capturing, using an imaging device, an image of a process of cooking food ordered by a customer outside a store; and a provider that provides the cooking image acquired by the image acquirer to a terminal device which is used by the customer.
(2) In the aspect of the above (1), the provider provides information relating to foodstuffs of the ordered food to the terminal device.
(3) In the aspect of the above (1), the image acquirer further acquires an order process image obtained by capturing, using an imaging device, an image of a process from when the food is ordered until cooking is started, and the provider further provides the order process image to the terminal device.
(4) In the aspect of the above (1), the cooking image includes an image representing identification information for identifying the customer who has ordered the food.
(5) In the aspect of the above (1), the information providing device further includes an acceptor that accepts a request for start of a dialogue with a cook of the food or a producer of foodstuffs used in the food from the customer who has ordered the food, and the provider starts a dialogue through the terminal device in a case where the cook or the producer responding to the request for start of a dialogue accepted by the acceptor is able to respond thereto.
(6) In the aspect of the above (5), the dialogue includes an inquiry from the customer for a cook of the food or the producer of foodstuffs used in the food, and the cooking image includes an image representing a reply of the cook or the producer to the inquiry.
(7) In the aspect of the above (1), the information providing device further includes: an order deriver that derives a cooking order on the basis of order information in which the food and identification information for identifying the customer who has ordered the food are associated with each other and information indicating a time of arrival of the customer at the store; and a presentator that presents the cooking order derived by the order deriver to a cook.
(8) In the aspect of the above (1), the information providing device further includes a time deriver that derives a recommended time of arrival at the store for each customer on the basis of order information in which the food and identification information for identifying the customer who has ordered the food are associated with each other and information indicating a cooking time of the food, and the provider provides the recommended arrival time derived by the time deriver to the terminal device or a vehicle control device of the corresponding customer.
(9) According to an aspect of this invention, there is provided a vehicle control system including: the information providing device according to the aspect of the above (8); and a vehicle control device including a recognizer that recognizes an object including another vehicle which is present in the vicinity of a host vehicle and a driving controller that generates a target trajectory of the host vehicle on the basis of a state of the object recognized by the recognizer and controls one or both of speed and steering of the host vehicle on the basis of the generated target trajectory, wherein the vehicle control device controls the host vehicle so as to arrive at the store at the recommended arrival time provided from the information providing device or through the terminal device.
(10) According to an aspect of this invention, there is provided an information providing method including causing a computer to: acquire a cooking image obtained by capturing an image of a process of cooking food ordered by a customer outside a store using an imaging device; and provide the acquired cooking image to a terminal device which is used by the customer.
(11) According to an aspect of this invention, there is provided a storage medium having a program stored therein, the program causing a computer to: acquire a cooking image obtained by capturing an image of a process of cooking food ordered by a customer outside a store using an imaging device; and provide the acquired cooking image to a terminal device which is used by the customer.
According to (1) to (11), it is possible for a user to confirm that an ordered commercial product is being appropriately cooked.
According to (2), it is possible for a user to confirm that an ordered commercial product is being cooked using appropriate foodstuffs. As a result, it is possible to enhance a user's sense of security with respect to a commercial product.
According to (3), it is possible for a user to confirm that an ordered commercial product is being cooked in an appropriate order. As a result, it is possible for a user to confirm that the user's order is not neglected or unduly postponed.
According to (6), it is possible to realize a dialogue between a cook and a user, and to eliminate a user's doubt about a commercial product.
According to (8), it is possible to provide a user with a commercial product at an appropriate time.
Hereinafter, an embodiment of an information providing device, a vehicle control system, an information providing method, and a storage medium of the present invention will be described with reference to the accompanying drawings.
The terminal device 20 is, for example, a terminal device which is used by a user, and is realized by a portable communication terminal device such as a smartphone, a portable personal computer such as a tablet-type computer (tablet PC), or the like. Hereinafter, the terminal device 20 is assumed to include a touch panel capable of both data input and data display. In the present embodiment, a user moves to the store SF while on board a vehicle V, and orders food before arriving at the store SF. A user orders food of the store SF from an order processing device DD through data communication using the terminal device 20. The order processing device DD is a device that accepts an order from a user and notifies a corresponding store SF of the accepted order. A user may order food through, for example, communication with a person employed by a store SF. A user is an example of a “customer of a store SF.”
The vehicle system 3 includes, for example, a camera, a radar device, a viewfinder, a human machine interface (HMI), a navigation device, a map positioning unit (MPU), a driving operator, an autonomous driving control device, a traveling drive force output device, a brake device, and a steering device. These devices or instruments are connected to each other through a multiplex communication line such as a controller area network (CAN) communication line, a serial communication line, a wireless communication network, or the like. The configuration of the vehicle system 3 is merely an example, and portions of the configuration may be omitted, or other configurations may be further added.
The camera is a digital camera using a solid-state imaging element such as, for example, a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS). The camera, for example, periodically and repeatedly captures images of the vicinity of the vehicle V. The radar device radiates radio waves such as millimeter waves to the vicinity of the vehicle V, and detects radio waves (reflected waves) reflected from an object to detect at least the position (distance and orientation) of the object. The viewfinder is a light detection and ranging (LIDAR) viewfinder. The viewfinder irradiates the vicinity of the vehicle V with light, and measures the scattered light. The viewfinder detects the distance to an object on the basis of the time from light emission to light reception.
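The time-of-flight distance measurement described above amounts to halving the round-trip travel time of light; a minimal sketch (with illustrative constant and function names, not taken from the embodiment) follows.

```python
# Minimal sketch of the time-of-flight distance calculation described above.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def lidar_distance_m(time_emission_s: float, time_reception_s: float) -> float:
    """Distance to the reflecting object, assuming light travels out and back."""
    round_trip_s = time_reception_s - time_emission_s
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

# Example: a 1 microsecond round trip corresponds to roughly 150 m.
print(lidar_distance_m(0.0, 1e-6))  # ~149.9 m
```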
The navigation device includes, for example, a global navigation satellite system (GNSS) receiver and a route determiner. The navigation device holds first map information in a storage device such as a hard disk drive (HDD) or a flash memory. The GNSS receiver specifies the position of the vehicle V on the basis of a signal received from a GNSS satellite. The route determiner refers to the first map information to determine, for example, a route (hereinafter referred to as a route on a map) from the position (or any input position) of the vehicle V specified by the GNSS receiver to a destination (the store SF in this example) input by an occupant using a navigation HMI. The route on a map is output to the MPU.
The MPU includes, for example, a recommended lane determiner. The recommended lane determiner divides the route on a map provided from the navigation device into a plurality of blocks (for example, divides the route on a map every 100 [m] in the traveling direction of the vehicle V), and determines a recommended lane for each block with reference to second map information. The recommended lane determiner determines which lane from the left to travel in. The second map information is map information having higher accuracy than the first map information. The second map information includes, for example, information on the center of a lane, the boundary of a lane, the type of lane, and the like.
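A minimal sketch of this block-wise determination, under an assumed route and lane model, is given below; the data model and the middle-lane default are illustrative choices, not part of the embodiment.

```python
# Illustrative sketch of block-wise recommended-lane determination.
from dataclasses import dataclass
from typing import Optional

BLOCK_LENGTH_M = 100.0  # the route on a map is divided, e.g., every 100 m

@dataclass
class RouteBlock:
    start_m: float              # distance from the start of the route
    lane_count: int             # number of lanes available in this block
    branch_lane: Optional[int]  # lane (0 = leftmost) required for an upcoming branch, if any

def split_into_blocks(route_length_m: float, lane_count: int) -> list:
    blocks, position = [], 0.0
    while position < route_length_m:
        blocks.append(RouteBlock(start_m=position, lane_count=lane_count, branch_lane=None))
        position += BLOCK_LENGTH_M
    return blocks

def recommended_lane(block: RouteBlock) -> int:
    """Pick which lane from the left to travel in for one block."""
    if block.branch_lane is not None:
        return block.branch_lane      # move toward a branch when one is coming up
    return block.lane_count // 2      # otherwise stay in a middle lane

blocks = split_into_blocks(route_length_m=350.0, lane_count=3)
blocks[-1].branch_lane = 0            # e.g. a left turn into the store SF
print([recommended_lane(b) for b in blocks])  # [1, 1, 1, 0]
```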
The MPU may change a recommended lane on the basis of a recommended arrival time which is provided from the information providing device 10. The details of the recommended arrival time which is provided from the information providing device 10 will be described later.
The driving operator includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, a variant steering wheel, a joystick, and other operators. A sensor that detects the amount of operation or the presence or absence of operation is installed on the driving operator, and the detection results are output to the autonomous driving control device, or some or all of the traveling drive force output device, the brake device and the steering device.
The autonomous driving control device includes, for example, a first controller, a second controller, and a storage. The first controller and the second controller are each realized by a processor such as, for example, a central processing unit (CPU) executing a program (software). Some or all of these components may be realized by hardware (a circuit unit; including circuitry) such as a large scale integration (LSI), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a graphics processing unit (GPU), or may be realized by software and hardware in cooperation. The program may be stored in the storage of the autonomous driving control device in advance, or may be stored in a detachable storage medium such as a DVD or a CD-ROM and installed in the storage when the storage medium is mounted in a drive device.
The first controller includes, for example, a recognizer and a behavior plan generator. The recognizer recognizes the surrounding situation of the vehicle V on the basis of information input from the camera, the radar device, and the viewfinder.
The recognizer recognizes, for example, a lane in which the vehicle V is traveling (traveling lane). The behavior plan generator generates a target trajectory along which the vehicle V will travel in the future so that, in principle, the vehicle travels in the recommended lane determined by the recommended lane determiner and, further, autonomous driving corresponding to the surrounding situation of the vehicle V is executed. The target trajectory includes, for example, a speed element.
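As a minimal illustration, a target trajectory carrying a speed element might be represented as a sequence of points such as the following; the field names are assumptions made for illustration and are not part of the embodiment.

```python
# Minimal, assumed representation of target trajectory points that carry a speed element.
from dataclasses import dataclass

@dataclass
class TrajectoryPoint:
    x_m: float        # longitudinal position along the planned path [m]
    y_m: float        # lateral offset from the lane center [m]
    speed_mps: float  # speed element attached to this point [m/s]

# A short straight segment at roughly 30 km/h (about 8.3 m/s).
target_trajectory = [TrajectoryPoint(x_m=i * 1.0, y_m=0.0, speed_mps=8.3) for i in range(5)]
print(len(target_trajectory))  # 5 trajectory points
```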
The second controller acquires, for example, information of the target trajectory generated by the behavior plan generator, and controls the traveling drive force output device or the brake device, and the steering device. The behavior plan generator and the second controller are an example of a “driving controller.”
The traveling drive force output device outputs a traveling drive force (torque) for the vehicle V to travel to a driving wheel. The traveling drive force output device controls an internal-combustion engine, an electric motor, a transmission, and the like in accordance with, for example, information which is input from the second controller or information which is input from the driving operator. The brake device, for example, causes a brake torque according to a braking operation to be output to each wheel.
The steering device drives the electric motor in accordance with the information which is input from the second controller or the information which is input from the driving operator, and changes the direction of a turning wheel.
The information providing device 10 is included in, for example, a store SF. An imaging device 40 and a display device 50 are connected to the information providing device 10. The imaging device 40 is installed in various places in the store SF. Whenever an order is accepted, the imaging device 40 newly captures an image of the process from when food is ordered by a user until cooking is started, or an image of the process of cooking the ordered food, and supplies the generated image to the information providing device 10. In the following description, an image obtained by capturing the process from ordering until cooking is started is referred to as an "order process image," an image obtained by capturing the process of cooking food is referred to as a "cooking image," and in a case where the "order process image" and the "cooking image" need not be distinguished from each other, these images are referred to as a "process image." Subjects preferably captured in a cooking image include a kitchen, a cook, foodstuffs, seasonings, cookware, food which is being cooked, and the like. The display device 50 displays various images under the control of the information providing device 10, and presents various types of information to a cook or to a person employed by the store, such as a receptionist.
The information providing device 10 may be provided in places other than a store SF. In this case, the information providing device 10 mutually communicates with the imaging device 40 and the display device 50 through a WAN, a LAN, the Internet, or the like, and functions as a cloud server that transmits and receives various types of data.
The information providing device 10 includes a controller 100 and a storage 120. The controller 100 realizes each functional unit of a first acquirer 101, a second acquirer 102, a provider 103, an acceptor 104, a third acquirer 105, a first deriver 106, a second deriver 107, a presentator 108, and a fourth acquirer 109, for example, by a hardware processor such as a CPU executing a program (software) stored in the storage 120.
Some or all of these components may be realized by hardware (including a circuit unit) such as an LSI, an ASIC, an FPGA, or a GPU, or may be realized by software and hardware in cooperation.
The storage 120 may be realized by a storage device (a storage device including a non-transitory storage medium) such as, for example, an HDD or a flash memory, may be realized by a detachable storage medium (a non-transitory storage medium) such as a DVD or a CD-ROM, or may be a storage medium which is mounted in a drive device. Some or all of the storage 120 may be realized by an external device accessible from the information providing device 10, such as a NAS or an external storage server. The storage 120 stores, for example, order information 121, ingredient information 122, cooking time information 123, and arrival time information 124. The details of the various types of information will be described later.
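As a purely illustrative sketch, the records held in the storage 120 might be modeled as follows; all field names are assumptions and are not taken from the embodiment.

```python
# Hypothetical data model for the records held in the storage 120.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class OrderRecord:                       # order information 121
    order_id: str
    user_id: str
    order_time: datetime
    menu_items: dict                     # menu item -> quantity

@dataclass
class IngredientRecord:                  # ingredient information 122
    menu_item: str
    foodstuffs: list                     # foodstuffs used to cook the menu item

@dataclass
class CookingTimeRecord:                 # cooking time information 123
    menu_item: str
    cooking_minutes: int

@dataclass
class ArrivalTimeRecord:                 # arrival time information 124
    order_id: str
    user_id: str
    arrival_time: datetime

@dataclass
class Storage120:
    order_information: list = field(default_factory=list)
    ingredient_information: list = field(default_factory=list)
    cooking_time_information: list = field(default_factory=list)
    arrival_time_information: list = field(default_factory=list)
```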
The first acquirer 101 acquires the order information 121 indicating an order from a user. The second acquirer 102 acquires a process image from the imaging device 40. The provider 103 provides the process image acquired by the second acquirer 102 or various types of information to the terminal device 20. The second acquirer 102 is an example of an “image acquirer.”
The acceptor 104 accepts an inquiry for a cook or a store SF from a user. The third acquirer 105 acquires a reply to the inquiry accepted by the acceptor 104. The provider 103 provides the reply acquired by the third acquirer 105 to the terminal device 20.
The first deriver 106 derives a time when it is desirable for the user to arrive at the store SF (hereinafter referred to as a recommended arrival time). The provider 103 provides the recommended arrival time derived by the first deriver 106 to the terminal device 20.
The second deriver 107 derives a cooking order for a cook to cook food ordered by a user on the basis of an arrival time when the user arrives at a store SF. The presentator 108 displays the cooking order derived by the second deriver 107 on the display device 50, and presents the displayed cooking order to the cook. The fourth acquirer 109 acquires an image captured by the user. The presentator 108 displays the image captured by the user which is acquired by the fourth acquirer 109 on the display device 50, and presents the displayed image to a person employed by a store. Hereinafter, the details of each functional unit will be described.
The first acquirer 101 acquires an order record indicating the user's order content, and stores the acquired order record as the order information 121 in the storage 120.
The user ID is, for example, a user's name or registration information registered when an order service using the order processing device DD is used. The first acquirer 101 acquires an order record from the order processing device DD using, for example, a cellular network, a Wi-Fi network, Bluetooth, a WAN, a LAN, the Internet, or the like, and stores the acquired order record in the storage 120 as part of the order information 121. The first acquirer 101 may also acquire an order record by having a person employed by the store who has accepted an order in the store SF directly input the order content to an input unit (not shown) connected to the information providing device 10, and store the acquired order record in the storage 120 as part of the order information 121.
The second acquirer 102 acquires a process image from the imaging device 40 in response to the first acquirer 101 acquiring an order record.
While acquiring a process image for each user, the second acquirer 102 causes the display device 50 to display an image representing an instruction to a person employed by the store or to a cook. For example, the second acquirer 102 causes the display device 50 installed at a position that can be visually recognized by a person employed by the store to display a message image of "Please transfer Ms. A's order to a cook," and acquires an order process image captured by the imaging device 40a during the display as an order process image of Ms. A. Similarly, the second acquirer 102 causes the display device 50 installed at a position that can be visually recognized by a cook to display a message image of "Please cook fried rice and Sichuan-style bean curd that Ms. A ordered," and acquires cooking images captured by the imaging devices 40b and 40c during the display as cooking images of Ms. A.
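Under assumed device interfaces (the display and imaging classes below are stand-ins for illustration, not devices of the embodiment), the association of captured frames with the ordering user could be sketched as follows.

```python
# Hypothetical sketch: frames captured while an instruction message is shown on the
# display device 50 are tagged with the ordering user's ID.
from datetime import datetime

class DisplayStub:
    def show(self, message: str) -> None:
        print(f"[display device 50] {message}")
    def clear(self) -> None:
        print("[display device 50] (cleared)")

class ImagingStub:
    def capture(self) -> bytes:
        return b"<frame>"

def collect_process_images(user_id: str, message: str, display, camera, frame_count: int) -> list:
    """Show the instruction, capture frames, and tag each frame with the user ID."""
    display.show(message)
    frames = [(user_id, datetime.now(), camera.capture()) for _ in range(frame_count)]
    display.clear()
    return frames

frames = collect_process_images(
    "Ms. A",
    "Please cook fried rice and Sichuan-style bean curd that Ms. A ordered",
    DisplayStub(), ImagingStub(), frame_count=3)
print(len(frames))  # 3 frames, all tagged as cooking images of Ms. A
```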
The provider 103 provides the terminal device 20 with a process image acquired by the second acquirer 102. For example, the provider 103 provides the terminal device 20 with an image acquired by the second acquirer 102 immediately (for example, in real time).
The provider 103 may provide the terminal device 20 with a process image acquired by the second acquirer 102 that includes an image representing various types of information. Hereinafter, a case where the provider 103 provides the terminal device 20 with a process image that includes information relating to foodstuffs of the food will be described. The provider 103 refers to the ingredient information 122 to specify the foodstuffs used in cooking a menu item included in an order record.
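Using the hypothetical data shapes assumed earlier, the lookup of foodstuffs for ordered menu items in the ingredient information 122 might look like this; the dictionary contents are illustrative only.

```python
# Illustrative lookup of foodstuffs for each ordered menu item.
ingredient_information = {
    "fried rice": ["rice", "egg", "green onion"],
    "Sichuan-style bean curd": ["bean curd", "ground pork", "chili bean paste"],
}

def foodstuffs_for_order(menu_items: list) -> dict:
    return {item: ingredient_information.get(item, []) for item in menu_items}

print(foodstuffs_for_order(["fried rice", "Sichuan-style bean curd"]))
```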
The acceptor 104 accepts an inquiry from a user who has ordered food. An inquiry made by a user concerns, for example, the production area of a foodstuff, the presence or absence of additives in a seasoning used by a cook, or the like.
A user makes various inquiries for a cook or a store SF using, for example, an application which is executed in the terminal device 20.
The acceptor 104 may acquire the inquiry information IQ using methods other than the inquiry application. For example, the acceptor 104 may acquire the inquiry information IQ by a person employed by a store SF directly inputting inquiry content transmitted by a user through communication with the person employed by a store to an input unit (not shown) connected to the information providing device 10. The acceptor 104 may acquire the inquiry information IQ on the basis of a text message transmitted to the information providing device 10 (or, a store SF) by a user.
The third acquirer 105 acquires a cook's reply. For example, the third acquirer 105 causes the display device 50 installed at a position that can be visually recognized by a cook to display the inquiry content indicated in the inquiry information IQ. A microphone (not shown) capable of detecting the cook's speech is installed in the vicinity of the cook, and the microphone collects the cook's speech as a reply to the inquiry content displayed on the display device 50. The acceptor 104 converts the cook's speech content collected by the microphone into text through voice recognition, and acquires the result as the cook's reply. The provider 103 provides the terminal device 20 with an image (hereinafter referred to as a second image IM2) processed so that an image representing the content of the inquiry indicated by the inquiry information IQ and the cook's reply acquired by the third acquirer 105 is included in a process image.
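A sketch of this flow is shown below, assuming a placeholder speech_to_text() function in place of the actual voice-recognition engine and using Pillow as one possible way to compose the annotated second image IM2; neither choice is prescribed by the embodiment.

```python
# Hypothetical sketch: convert the cook's spoken reply to text and overlay the
# inquiry and reply on a process image (the "second image IM2" described above).
from PIL import Image, ImageDraw

def speech_to_text(audio: bytes) -> str:
    """Placeholder: delegate to whichever voice-recognition engine is actually used."""
    raise NotImplementedError

def compose_second_image(process_image: Image.Image, inquiry: str, reply: str) -> Image.Image:
    annotated = process_image.copy()
    draw = ImageDraw.Draw(annotated)
    draw.text((10, 10), f"Q: {inquiry}", fill="white")
    draw.text((10, 30), f"A: {reply}", fill="white")
    return annotated

# Usage sketch (microphone_audio and cooking_frame are assumed inputs):
# reply_text = speech_to_text(microphone_audio)
# im2 = compose_second_image(cooking_frame, "Where is the rice from?", reply_text)
```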
The third acquirer 105 may acquire the cook's reply using methods other than the acquisition of speech through a microphone. For example, the third acquirer 105 may acquire a reply by having a person employed by the store convey the content of the inquiry information IQ to a cook and directly input the reply obtained from the cook to an input unit (not shown) connected to the information providing device 10.
The cook's reply may be collected by microphones included in the imaging devices 40b and 40c, and be included in a process image. The cook does not have to reply through speech. For example, in a case where the content of an inquiry is "want to know the raw ingredient of a seasoning used in cooking" or the like, the cook may reply to the inquiry by bringing information indicating the raw ingredient of the seasoning (for example, a label attached to the container of the seasoning) close to the imaging devices 40b and 40c. In this case, the second acquirer 102 is an example of the "third acquirer."
The third acquirer 105 may acquire a reply from a producer of foodstuffs used in cooking the food. For example, the producer has a terminal device TM capable of communicating with the information providing device 10; the third acquirer 105 transmits the inquiry information IQ to the terminal device TM, and the terminal device TM causes its display to display the inquiry content indicated in the received inquiry information IQ. The terminal device TM includes a microphone capable of detecting the producer's speech, and the microphone collects the producer's speech as a reply to the inquiry displayed on the display of the terminal device TM. The terminal device TM transmits the collected voice information to the information providing device 10. The acceptor 104 converts the producer's speech content into text through voice recognition on the basis of the voice information received from the terminal device TM, and acquires the result as the producer's reply. The provider 103 provides the terminal device 20 with an image processed so that an image representing the content of the inquiry indicated by the inquiry information IQ and the producer's reply acquired by the third acquirer 105 is included in a process image. The third acquirer 105 may acquire the producer's replies in advance, such as a reply related to the foodstuffs used in cooking the food, and provide the user with an appropriate reply from among the previously acquired replies in a case where the user makes an inquiry for the producer. The acceptor 104 and the third acquirer 105 may perform parallel processing so that a dialogue with a cook or a producer of foodstuffs is carried out not by exchanging messages one at a time but in real time, using a moving image as the image.
The first deriver 106 derives a recommended arrival time for each user on the basis of the order information 121 and the cooking time information 123.
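One simple way this derivation might work, under assumed data shapes, is to take the order time plus the total cooking time of the ordered items; a real implementation could also account for parallel cooking or a margin for queued orders.

```python
# Minimal sketch of deriving a recommended arrival time from the order information 121
# and the cooking time information 123.
from datetime import datetime, timedelta

def recommended_arrival_time(order_time: datetime, menu_items: dict, cooking_minutes: dict) -> datetime:
    # Assume items for one order are cooked one after another.
    total_minutes = sum(cooking_minutes[item] * qty for item, qty in menu_items.items())
    return order_time + timedelta(minutes=total_minutes)

order_time = datetime(2018, 11, 29, 12, 0)
print(recommended_arrival_time(order_time,
                               {"fried rice": 1, "Sichuan-style bean curd": 1},
                               {"fried rice": 10, "Sichuan-style bean curd": 15}))
# 2018-11-29 12:25:00
```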
By confirming such an image, a user can move to a store SF in accordance with the time at which cooking of the food is completed while confirming that the ordered food is being appropriately cooked. As a result, it is possible to provide the user with food at an appropriate time.
In the above, a case where the provider 103 generates various images and provides the images to the terminal device 20 has been described, but there is no limitation thereto. For example, a function of processing the message images IMm1 to IMm3 so that these images are included in a process image may be included in the terminal device 20. In this case, the provider 103 provides the terminal device 20 with pieces of information indicating the messages MS1 to MS7 and the process images. The terminal device 20 generates, for example, the first image IM1 to the third image IM3 on the basis of the pieces of information indicating the messages MS1 to MS7 provided from the information providing device 10 and the process images, and causes the display to display the generated images.
The terminal device 20 may provide the recommended arrival time to the vehicle system 3. The vehicle system 3 controls the vehicle V so as to arrive at a store SF at the recommended arrival time provided from the terminal device 20. In this case, the terminal device 20 and the vehicle system 3 communicate with each other through a Wi-Fi network, Bluetooth (registered trademark), a universal serial bus (USB (registered trademark)) cable, or the like, and transmit and receive information indicating the recommended arrival time. The information providing device 10 may also provide the recommended arrival time directly to the vehicle system 3. In this case, information in which a user ID is associated with an address of a communication device of the vehicle system 3 mounted in the vehicle V boarded by the user identified by that user ID is stored in the storage 120 in advance, and the information providing device 10 transmits the recommended arrival time to the vehicle system 3 on the basis of this information.
For example, the MPU included in the vehicle system 3 redetermines a recommended route so as to arrive at a store SF at the recommended arrival time, on the basis of the recommended arrival time provided by the provider 103. The behavior plan generator included in the vehicle system 3 generates a target trajectory so as to travel along the recommended route redetermined by the MPU.
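A hypothetical sketch of such a redetermination is shown below: among candidate routes, the vehicle side prefers the one whose estimated arrival is closest to, but not after, the recommended arrival time. The candidate data and the selection rule are assumptions for illustration only.

```python
# Illustrative route selection against a recommended arrival time.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class CandidateRoute:
    name: str
    travel_minutes: int

def pick_route(candidates: list, now: datetime, recommended_arrival: datetime) -> CandidateRoute:
    """Prefer the route that arrives closest to, but not after, the recommended time."""
    def lateness(route: CandidateRoute):
        eta = now + timedelta(minutes=route.travel_minutes)
        late = eta > recommended_arrival
        return (late, abs((recommended_arrival - eta).total_seconds()))
    return min(candidates, key=lateness)

routes = [CandidateRoute("highway", 12), CandidateRoute("surface streets", 22)]
print(pick_route(routes, datetime(2018, 11, 29, 12, 0),
                 datetime(2018, 11, 29, 12, 25)).name)  # surface streets
```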
The second deriver 107 derives the cooking order of food which is cooked by a cook on the basis of the order information 121 and the arrival time information 124.
In this case, the first acquirer 101 may acquire the order record indicating the order content, store the order ID, the user ID, the order time, the menu items, and the number of items included in the order record in the storage 120 as the order information 121, and store the order ID, the user ID, and the arrival time in the storage 120 as the arrival time information 124. The first acquirer 101 may instead store the order record as the order information 121, and the order information 121 may include the arrival time information 124.
The second deriver 107 generates cooking order information OF on the basis of the arrival time information 124.
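One plausible rule, sketched below under assumed data shapes, is to sort orders by their latest possible cooking start time (arrival time minus cooking time) so that each order is ready when its customer arrives; the embodiment does not prescribe this particular rule.

```python
# Hypothetical sketch of generating the cooking order information OF from the
# arrival time information 124.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PendingOrder:
    order_id: str
    user_id: str
    arrival_time: datetime
    cooking_minutes: int

def cooking_order(pending: list) -> list:
    """Return order IDs sorted by when cooking must start to be ready on arrival."""
    def latest_start(o: PendingOrder) -> datetime:
        return o.arrival_time - timedelta(minutes=o.cooking_minutes)
    return [o.order_id for o in sorted(pending, key=latest_start)]

pending = [
    PendingOrder("0001", "A", datetime(2018, 11, 29, 12, 30), 25),
    PendingOrder("0002", "B", datetime(2018, 11, 29, 12, 20), 10),
]
print(cooking_order(pending))  # ['0001', '0002'] - A's food takes longer, so start it first
```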
The presentator 108 causes the display device 50 installed at a position that can be visually recognized by a cook to display information indicating the content of the cooking order information OF generated by the second deriver 107. Thereby, the presentator 108 causes a cook to confirm the cooking order information OF, and thus can prompt the cook to cook food in an appropriate order.
The fourth acquirer 109 acquires an image captured by a user (hereinafter referred to as a customer image) from the user. For example, the user's face is shown in the customer image. For example, the fourth acquirer 109 acquires a customer image registered by the user when an order service using the order processing device DD is used. The customer image may be included as one element of an order record. In this case, the user transmits the customer image to the order processing device DD or the information providing device 10 with every order.
The presentator 108 causes the display device 50 installed at a position that can be visually recognized by a person employed by a store (particularly, a person employed by a store who is a receptionist) to display the customer image acquired by the fourth acquirer 109.
As described above, the information providing device 10 of the present embodiment includes the second acquirer 102 that acquires a cooking image obtained by capturing, using the imaging device 40, an image of a process of cooking food ordered by a customer (user) outside a store, and the provider 103 that provides the terminal device 20 used by the customer with the cooking image acquired by the second acquirer 102, and thereby makes it possible for the user to confirm that an ordered commercial product (food in this example) is being appropriately cooked.
In the above, the processing of the information providing device 10 in a case where a user visits a store SF has been described, but there is no limitation thereto. For example, in a case where food is delivered to a delivery place desired by a user, the information providing device 10 may perform a process of presenting an image of the cooking process or information relating to foodstuffs, in addition to the expected delivery time of which the user's terminal device is notified at the time of the order.
While preferred embodiments of the invention have been described and illustrated above, it should be understood that these are exemplary of the invention and are not to be considered as limiting. Additions, omissions, substitutions, and other modifications can be made without departing from the spirit or scope of the present invention. Accordingly, the invention is not to be considered as being limited by the foregoing description, and is only limited by the scope of the appended claims.