Embodiments of the disclosure relate to a serving robot and a method of controlling the serving robot.
Recently, automation technologies using robots are being applied in various fields. For instance, products are manufactured by robots in factories. Moreover, food items ordered by customers are cooked in restaurants by robots and are served or delivered to the customers by robots.
A robot may receive input of an order for food by a touch method or a voice method from a user, and cook the food according to a set recipe. Also, a robot may identify a user's location and the surrounding environment, and serve the ordered food to the user's location in consideration of the identified surrounding environment.
According to an aspect of the disclosure, there is provided a serving robot including: a main body; a loading part configured to load an object; an arm including a first end rotatably connected to the main body, and a second end rotatably connected to the loading part; a driver part; and at least one processor configured to: control the driver part to maintain horizontality of the loading part at a time when the arm is rotated in a first direction or a second direction opposite to the first direction.
The serving robot may further include: a sensor configured to detect acceleration or deceleration of the main body, wherein the object is a food item, and wherein the at least one processor is configured to: based on the food item being loaded on the loading part and the serving robot moving, obtain, from the sensor, first information based on the detected acceleration or deceleration of the main body, and based on the first information, control the driver part to rotate the arm in the first direction or in the second direction to offset an effect of the acceleration or deceleration of the main body on the loading part.
The at least one processor may be configured to: based on the serving robot accelerating in the first direction, control the driver part to rotate the arm in the second direction from an initial location, by a first angle centered around an axis on the first end of the arm connected to the main body based on the first information of the main body, and control the driver part to rotate the loading part so that the loading part maintains horizontality based on the first angle.
The at least one processor may be configured to: based on the serving robot maintaining a constant velocity after the acceleration, control the driver part to rotate the arm in the first direction centered around the axis on the first end of the arm to move the arm to the initial location, and control the driver part to rotate the loading part so that the loading part maintains horizontality based on the rotation of the arm in the first direction.
The at least one processor may be configured to: based on the serving robot decelerating in the first direction, control the driver part to rotate the arm in the first direction from an initial location, by a second angle centered around an axis on the first end of the arm connected to the main body based on the first information of the main body, and control the driver part to rotate the loading part so that the loading part maintains horizontality based on the second angle.
The at least one processor may be configured to: based on the serving robot maintaining a constant velocity after the deceleration, control the driver part to rotate the arm in the second direction centered around the axis on the first end of the arm to move the arm to the initial location, and control the driver part to rotate the loading part so that the loading part maintains horizontality based on the rotation of the arm in the second direction.
The at least one processor may be configured to: obtain, from the sensor, tilt information of the main body, and control the driver part to rotate at least one of the arm or the loading part based on the tilt information of the main body to maintain horizontality of the loading part.
The loading part may include a first surface including a tray, and a second surface including a display, and wherein the at least one processor may be configured to: control the driver part to rotate the loading part so that the tray or the display faces in an upward direction, and based on the display facing in the upward direction, control the display to display information related to the food item.
The serving robot may further include a camera, wherein the at least one processor may be configured to: control the camera to capture an image, and based on identifying a customer from the captured image while a food item is not loaded on the loading part, control the driver part to rotate the loading part so that the display is facing in the upward direction.
The at least one processor may be configured to identify a food item selected by the customer.
The at least one processor may be configured to: control the driver part to rotate the loading part so that the tray is facing in the upward direction for loading the food item, and based on the identified customer and the food item selected by the customer, control the driver part to rotate the arm and the loading part for serving the food item to the customer.
According to another aspect of the disclosure, there is provided a control method of a serving robot, the method including: moving the serving robot based on an object being loaded on a loading part of the serving robot; rotating an arm connected to a main body of the serving robot, the arm including a first end rotatably connected to the main body, and a second end rotatably connected to the loading part; and maintaining horizontality of the loading part at a time when the arm is rotated in a first direction or a second direction opposite to the first direction.
Hereinafter, various embodiments will be described in more detail with reference to the accompanying drawings. The embodiments described in this specification may be modified in various ways. Also, specific embodiments may be illustrated in the drawings, and described in detail in the detailed description. However, specific embodiments disclosed in the accompanying drawings are just for making the various embodiments easily understood. Accordingly, the technical idea of the disclosure is not restricted by the specific embodiments disclosed in the accompanying drawings, and the embodiments should be understood as including all equivalents or alternatives included in the idea and the technical scope of the disclosure.
Also, in the disclosure, terms including ordinal numbers such as ‘the first’ and ‘the second’ may be used to describe various components, but these components are not limited by the aforementioned terms. The aforementioned terms are used only for the purpose of distinguishing one component from another component.
In addition, in this specification, terms such as “include” and “have” should be construed as designating that there are such characteristics, numbers, steps, operations, elements, components, or a combination thereof described in the specification, but not as excluding in advance the existence or possibility of adding one or more of other characteristics, numbers, steps, operations, elements, components, or a combination thereof. Further, the description in the disclosure that an element is “coupled with/to” or “connected to” another element should be interpreted to mean that the one element may be directly coupled with/to or connected to the another element, but still another element may exist between the elements. In contrast, the description that one element is “directly coupled” or “directly connected” to another element can be interpreted to mean that still another element does not exist between the one element and the another element.
Meanwhile, “a module” or “a part” for the elements used in this specification performs at least one function or operation. Also, “a module” or “a part” may perform a function or an operation by hardware, software, or a combination of hardware and software. Also, a plurality of “modules” or a plurality of “parts” except “a module” or “a part” that needs to be implemented in specific hardware or is performed in at least one processor may be integrated into at least one module. Further, singular expressions include plural expressions, unless defined obviously differently in the context.
Also, in the description of the disclosure, the order of each step should be understood in a nonrestrictive way, unless a preceding step should necessarily be performed prior to a subsequent step in a logical and temporal sense. That is, excluding an exceptional case as above, even if a process described as a subsequent step is performed prior to a process described as a preceding step, there would be no influence on the essence of the disclosure, and the scope of the disclosure should also be defined regardless of the orders of steps. Further, the description “A or B” in this specification is defined to include not only a case wherein one of A or B is selectively referred to, but also a case wherein both of A and B are included. In addition, the term “include” in this specification includes a case wherein elements other than elements listed as being included are further included.
Further, in this specification, only essential elements necessary for describing the disclosure are described, and elements not related to the essence of the disclosure are not mentioned. Also, the descriptions of the disclosure should not be interpreted to have an exclusive meaning of including only the elements mentioned, but to have a non-exclusive meaning of also including other elements.
In addition, in describing the disclosure, in case it is determined that detailed explanation of related known functions or features may unnecessarily confuse the gist of the disclosure, the detailed explanation will be abridged or omitted. Meanwhile, each embodiment of the disclosure may be independently implemented or operated, but it may also be implemented or operated in combination with another embodiment.
Referring to
The main body 1 may include various components configured to perform various operations of the serving robot 100, such as driving the serving robot 100. For example, the main body 1 may include components such as a processor, a communication interface, a sensor, a camera, a microphone, a speaker, a memory, etc. However, the disclosure is not limited thereto, and as such, according to another embodiment, some of these components may be omitted from the main body 1 and/or other components may be further included in the main body 1.
According to an embodiment, the arm 10 may be connected to the main body 1. For example, one end (e.g., a first end) of the arm 10 may be rotatably connected to the main body 1, and the other end (e.g., a second end) of the arm 10 may be rotatably connected to the loading part 20. The longitudinal direction of the arm 10 may be toward the upper direction with respect to the bottom surface of the main body 1. For example, the longitudinal direction of the arm 10 may be arranged perpendicularly to the bottom surface of the main body 1. The arm 10 may rotate forward or rotate backward based on a connection axis connected to the main body 1 according to a movement of the serving robot 100. As an example, the arm 10 may rotate within a range of 30 degrees forward to 30 degrees backward based on the vertical axis with respect to the bottom surface of the main body 1. Meanwhile, the aforementioned rotation angles are merely an example, and as such, according to another embodiment, rotation angles are not limited thereto. Accordingly, the arm 10 may rotate in a range different from 30 degrees forward or backward based on the vertical axis with respect to the bottom surface of the main body 1. However, according to another embodiment, the arm 10 may not rotate more than 90 degrees forward or more than 90 degrees backward. In
The loading part 20 may be rotatably connected to the arm 10. For example, the center area of the side surface of the loading part 20 may be connected to the arm 10. In case food is loaded on the loading part 20, the loading part 20 may rotate to maintain horizontality according to a rotation of the arm 10. The loading part 20 may rotate forward or rotate backward based on the connection axis connected to the arm 10. As an example, the loading part 20 may rotate 360 degrees forward or backward. Although
According to an embodiment, one surface (e.g., a first surface) of the loading part 20 may include a tray, and the other surface (e.g., a second surface) may include a display. For example, if the serving robot 100 identifies a user or a customer while food is not loaded, the serving robot 100 may rotate the loading part 20 so that the display is facing toward the upper direction. In this manner, the customer or the user may be able to view a menu displayed on the display. The display facing toward the upper direction may mean that the display surface is facing in the upper direction of the vertical axis of the bottom surface of the main body 1 or that the display surface is facing in an upward direction with respect to the bottom surface of the main body 1. When the display is facing toward the upper direction, the serving robot 100 may display information related to food on the display. Also, the serving robot 100 may rotate the loading part 20 so that the tray surface is facing toward the upper direction for loading food and serving the food. When food is loaded on the loading part 20, the serving robot 100 may maintain the loading part 20 to be horizontal for stable delivery of food.
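The orientation decision described above can be sketched as follows. This is only an illustration of the described behavior, not the claimed implementation, and the function and type names are hypothetical:

```python
from enum import Enum


class Surface(Enum):
    TRAY = "tray"
    DISPLAY = "display"


def surface_to_face_up(food_loaded: bool, customer_identified: bool) -> Surface:
    """Decide which surface of the loading part should face upward.

    Mirrors the behavior described above: with no food loaded and a
    customer in view, the display faces up so the customer can view the
    menu; otherwise the tray faces up, ready for loading and serving.
    """
    if not food_loaded and customer_identified:
        return Surface.DISPLAY
    return Surface.TRAY
```

Under this sketch, the tray remains the default surface whenever food is loaded, regardless of whether a customer is in view.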
The wheel 30 may be located in the lower end part of the main body 1. The wheel 30 may move the serving robot 100. In
Hereinafter, the components included in the serving robot 100 will be described in detail.
Referring to
The processor 110 may control each component of the serving robot 100. The serving robot 100 may include a processor 110. However, the disclosure is not limited thereto, and as such, the serving robot 100 may include a plurality of processors 110. The processor 110 may control an overall operation of the serving robot 100. The processor 110 may include one or more cores (e.g., a homogeneous multi-core or a heterogeneous multi-core). The processor 110 may be or include, for example, at least one of a central processing unit (CPU), an image signal processing unit (ISP), a digital signal processing unit (DSP), a graphics processing unit (GPU), a vision processing unit (VPU), and a neural processing unit (NPU). The processor 110 may execute various application programs loaded onto a memory (e.g., on-chip memory).
The processor 110 may control the driving part 120 so that the serving robot 100 moves or operates. The driving part 120 may be a driver. The processor 110 may control the driving part 120 to rotate the arm 10 forward or backward according to acceleration and deceleration of the serving robot 100. For example, the processor 110 may control the driving part 120 so that the loading part 20 maintains horizontality according to a forward rotation or a backward rotation of the arm 10. Also, the processor 110 may control the driving part 120 to move the serving robot 100.
According to an embodiment, the processor 110 may control the driving part 120 to rotate the arm 10 so that the loading part 20 is accelerated and decelerated to offset the acceleration and deceleration of the main body 1. For example, when the serving robot 100 is accelerated, the processor 110 may control the driving part 120 to rotate the arm 10 backward centered around an axis on one end of the arm 10 connected to the main body 1 (the connection axis connected to the main body) by a first angle corresponding to the acceleration information of the main body 1. Then, the processor 110 may control the driving part 120 to rotate the loading part 20 so that the loading part 20 maintains horizontality based on the first angle. The first angle may be a predetermined angle.
If the serving robot 100 maintains a constant velocity after it was accelerated, the processor 110 may control the driving part 120 to rotate the arm 10 forward centered around the axis on one end of the arm 10 to move the arm 10 to the initial location based on the information on the change of the velocity of the main body 1. Accordingly, the processor 110 may control the driving part 120 to rotate the loading part 20 so that the loading part 20 maintains horizontality based on the forward rotation of the arm 10.
If the serving robot 100 is decelerated, the processor 110 may control the driving part 120 to rotate the arm 10 forward centered around the axis on one end of the arm 10 connected to the main body 1 by a second angle corresponding to the deceleration information of the main body 1. Accordingly, the processor 110 may control the driving part 120 to rotate the loading part 20 so that the loading part 20 maintains horizontality based on the second angle. The second angle may be a predetermined angle.
If the serving robot 100 maintains a constant velocity after it was decelerated, the processor 110 may control the driving part 120 to rotate the arm 10 backward centered around the axis on one end of the arm 10 to move the arm 10 to the initial location based on the information on the change of the velocity of the main body 1. Accordingly, the processor 110 may control the driving part 120 to rotate the loading part 20 so that the loading part 20 maintains horizontality based on the backward rotation of the arm 10.
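As a rough illustration of the compensation described above (a minimal sketch, not the claimed implementation; the function name, sign convention, and the arctangent model are assumptions introduced here), the first and second angles can be derived from the measured acceleration so that the apparent gravity seen by the loading part stays aligned with the arm:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def compensation_angles(a_horizontal: float) -> tuple[float, float]:
    """Return (arm_angle, loading_part_angle) in degrees for a measured
    horizontal acceleration of the main body.

    A positive acceleration (speeding up) rotates the arm backward
    (negative angle); a negative acceleration (deceleration) rotates it
    forward (positive angle). The loading part is counter-rotated by the
    same magnitude so that its surface stays horizontal.
    """
    # Tilt at which the combined vector of gravity and the inertial
    # pseudo-force lies along the arm, offsetting the acceleration.
    theta = math.degrees(math.atan2(a_horizontal, G))
    arm_angle = -theta           # backward for acceleration
    loading_part_angle = theta   # counter-rotation keeps the tray level
    return arm_angle, loading_part_angle
```

Under this sketch, an acceleration of 2 m/s² yields an arm rotation of roughly −11.5 degrees (backward), with the loading part counter-rotated by the same magnitude; at constant velocity the input is 0 and both angles return to the initial location.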
The serving robot 100 may move on a slope. In case the serving robot 100 moves on a slope, the processor 110 may control the driving part 120 to rotate the arm 10 and/or the loading part 20 so that the loading part 20 maintains horizontality based on the tilt of the main body 1 according to the tilt of the slope.
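Because the slope correction may be applied at the arm joint, at the loading-part joint, or split between both, one possible allocation can be sketched as follows. The split ratio and all names are assumptions for illustration; the disclosure leaves the allocation open:

```python
def slope_compensation(body_pitch_deg: float,
                       arm_share: float = 0.5) -> tuple[float, float]:
    """Split the counter-rotation that keeps the loading part level on a
    slope between the arm joint and the loading-part joint.

    body_pitch_deg: tilt of the main body from the tilt sensor
                    (positive = front raised, i.e., moving uphill).
    arm_share:      fraction of the correction applied at the arm joint;
                    the remainder is applied at the loading-part joint.
    Returns (arm_rotation, loading_part_rotation) in degrees; the two
    rotations always sum to the full counter-rotation of the pitch.
    """
    total = -body_pitch_deg               # full counter-rotation needed
    arm_rot = total * arm_share
    loading_rot = total - arm_rot         # remainder at the tray joint
    return arm_rot, loading_rot
```

For example, on a 10-degree uphill slope with an even split, each joint contributes −5 degrees, so the loading part stays level.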
The driving part 120 may rotate the arm 10 and/or the loading part 20. According to an embodiment, the driving part 120 may move the serving robot 100.
Referring to
For example, the first motor 121 may rotate the arm 10 forward or backward by control by the processor 110. The second motor 122 may rotate the loading part 20 forward or backward to maintain the horizontality of the loading part 20 by control by the processor 110. According to an embodiment, the second motor 122 may rotate the loading part 20 so that the tray surface or the display surface of the loading part 20 is facing toward the upper direction. The third motor 123 may drive the wheel 30 so that the serving robot 100 moves by control by the processor 110. According to an embodiment, the driving part 120 may further include connecting portions, such as gears, which connect the first motor 121 to the arm 10, the second motor 122 to the loading part 20, and the third motor 123 to the wheel 30.
Referring to
The input interface 130 may receive input of a control instruction from a user. For example, the input interface 130 may receive input of instructions from a user. The input instructions may include, but are not limited to, power on/off instructions, setting instructions (e.g., setting values for setting operational features of the serving robot 100), menu selection instructions, etc. The input interface 130 may include a keyboard, a button, a keypad, a touch pad, and a touch screen. The input interface 130 may also be referred to as an input device, an input part, an input module, etc.
The communication interface 135 may perform communication with an external device. For example, the communication interface 135 may perform communication with an external device by at least one communication method among communication methods of Wi-Fi, Wi-Fi Direct, Bluetooth, Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), and Long Term Evolution (LTE). The communication interface 135 may receive a driving map, menu information, serving related information, etc. from the external device, and transmit information on the menu selected by the user, information on the state of the serving robot 100, etc. to the external device. The communication interface 135 may also be referred to as a communication device, a communication part, a communication module, a transceiver, etc.
The sensor 140 may detect acceleration and deceleration information of the main body 1, and acceleration and deceleration information of the loading part 20 according to a movement of the serving robot 100. For example, the sensor 140 may include a first sensor to detect acceleration and deceleration of the main body 1, and a second sensor to detect acceleration and deceleration of the loading part 20. Moreover, the sensor 140 may detect tilt information of the bottom surface or tilt information of the serving robot 100. For example, the sensor 140 may include a third sensor to detect tilt of the bottom surface or tilt of the serving robot 100. The detected information may be transmitted to the processor 110. The processor 110 may control the rotation of the arm 10 and/or the loading part 20 based on the information transmitted from the sensor 140. According to an embodiment, the sensor 140 may detect the surrounding environment. The processor 110 may perform a control operation including a movement of the serving robot 100 based on the detected information.
For example, in case the sensor 140 includes an infrared sensor, an ultrasonic sensor, a Time-of-Flight (ToF) sensor, a LiDAR sensor, a laser sensor, etc., the sensor 140 may transmit data regarding a transmitted signal and a received signal to the processor 110, and the processor 110 may identify the state of the road surface, whether there are bumps on the bottom, whether there is an obstacle, etc. based on the transmission data and the reception data. According to an embodiment, in case the sensor 140 includes a motion recognition sensor, a heat detection sensor, etc., the sensor 140 may transmit the detected information to the processor 110, and the processor 110 may identify whether there is an obstacle, etc. based on the transmitted information. For example, the sensor 140 may include an angle sensor, an acceleration sensor, a gravity sensor, a gyro sensor, a geomagnetic sensor, a direction sensor, an infrared sensor, an ultrasonic sensor, a Time-of-Flight (ToF) sensor, a LiDAR sensor, a laser sensor, a motion recognition sensor, a heat detection sensor, an image sensor, a tracking sensor, a proximity sensor, an illumination sensor, a voltmeter, an ammeter, a barometer, a hygrometer, a thermometer, a touch sensor, etc.
The camera 145 may capture the surrounding environment of the serving robot 100. The captured image may be a still image (e.g., photograph) or a moving image (e.g., a video or multiple frames). The processor 110 may acquire information on the surrounding environment based on the image captured at the camera 145. For example, the processor 110 may acquire information related to the user and information on an obstacle based on the captured image. The processor 110 may perform a control operation based on the acquired information. As an example, the serving robot 100 may include various kinds of cameras performing different functions. According to an embodiment, the serving robot 100 may include a plurality of cameras of the same kind. For example, the camera 145 may include a CCD sensor and a CMOS sensor. Also, the camera 145 may include an RGB camera, a depth camera, a wide angle camera, a telephoto camera, etc.
The microphone 150 may receive input of a user's voice. For example, the microphone 150 may receive input of a voice ordering food from a user. The processor 110 may recognize the food ordered by the user based on the input voice of the user, and perform a control operation related to serving of the ordered food. For example, the serving robot 100 may include at least one microphone 150. The microphone 150 may include a general microphone, a surround microphone, a directional microphone, etc.
The display 155 may output data processed at the processor 110 as an image. The display 155 may display information, and output a screen corresponding to a recognized instruction of the user. As an example, the display 155 may be arranged on the loading part 20. That is, one surface of the loading part 20 may include a tray, and the other surface may include the display 155. If the display surface of the loading part 20 is facing toward the upper direction, the display 155 may display information related to food (e.g., a menu) by control by the processor 110. The display 155 may be implemented as a liquid crystal display (LCD), organic light emitting diodes (OLED), a flexible display, a touch screen, etc. In case the display 155 is implemented as a touch screen, the serving robot 100 may receive input of a control instruction through the touch screen.
The speaker 160 may output a sound signal. For example, the speaker 160 may output information on an input instruction of a user, information related to food ordered by a user, serving information, information related to warning, information related to the state of the serving robot 100, information related to an operation, etc. as a voice or a notification sound.
The memory 165 may store data for performing the functions of the serving robot 100, an algorithm, etc., and may store programs, instructions, etc. driven at the serving robot 100. According to an embodiment, the memory 165 may store data related to food, data related to serving, acceleration and deceleration data of the serving robot 100, data related to driving including a driving map, etc. The algorithm or the data stored in the memory 165 may be loaded on the processor 110 by control by the processor 110, and perform functions related to serving. For example, the memory 165 may be implemented as types such as a ROM, a RAM, an HDD, an SSD, a memory card, etc. According to an embodiment, the memory 165 may be an on-chip memory, which may store various application programs that are executable by the processor 110. The on-chip memory may operate as a cache memory adjacent to the processor 110. The on-chip memory may store a command, an address, and data to be processed by the processor 110 or may store a processing result of the processor 110. The on-chip memory may be, for example, a storage medium or a working memory including a latch, a register, a static random access memory (SRAM), a dynamic random access memory (DRAM), a thyristor random access memory (TRAM), a tightly coupled memory (TCM), etc.
The serving robot 100 may include all of the aforementioned components, or include some components. So far, the configuration of the serving robot 100 was described. Hereinafter, a control operation of the serving robot 100 according to moving of the serving robot 100 will be described.
Referring to
Referring to
If the serving robot 100 in a halt state moves, acceleration is applied from the rear side to the liquid object 71 loaded on the serving robot 100. According to an embodiment, if the serving robot 100 moving in a constant velocity accelerates, acceleration is applied from the rear side to the liquid object 71 loaded on the serving robot 100. By the law of inertia, the liquid object 71 in a halt state (or, a state in a constant velocity) may exhibit a property of trying to maintain its previous state. However, as the liquid object 71 has fluidity, it may exhibit the state illustrated in
According to an embodiment, the serving robot 100 may maintain a constant velocity again after it accelerates to a specific velocity. The feature that the accelerating serving robot 100 maintains a constant velocity means that the acceleration is 0, and also means that (−) acceleration is applied to the serving robot 100 until the acceleration becomes 0. According to an embodiment, the serving robot 100 may decelerate according to the surrounding situation, and (−) acceleration is applied to the serving robot 100. In this case, the liquid object 71 loaded on the serving robot 100 may exhibit a state opposite to the state illustrated in
According to an embodiment, the liquid object 71 may vibrate according to a change of the velocity of the serving robot 100. That is, the water level of the liquid object 71 may become higher alternatingly in the left side direction and the right side direction of the container. As described above, if the water surface in one direction of the liquid object 71 becomes higher or vibrates, the liquid object 71 may overflow. Accordingly, a serving robot 100 that can stably serve the liquid object 71 even if the serving robot 100 moves (or, the velocity changes) is needed.
According to an embodiment, a liquid object 71 that maintains a stable state even if acceleration is applied to the serving robot 100 is illustrated. The serving robot 100 according to the disclosure may detect acceleration (or, a change of the velocity) of the serving robot 100, and rotate the arm 10 and/or the loading part 20 based on the detected acceleration information. Rotation of the arm 10 and/or the loading part 20 applies acceleration in an opposite direction of a size similar to the acceleration applied to the serving robot 100 to the liquid object 71, and accordingly, the liquid object 71 can maintain the stable state as illustrated in
According to another embodiment, as illustrated in
As illustrated in
Referring to
The serving robot 100 may detect acceleration information of the main body 1 and acceleration information of the loading part 20. The serving robot 100 may rotate the arm 10 and the loading part 20 so that the acceleration information of the loading part 20 is offset based on the detected acceleration information of the main body 1. According to an embodiment, the serving robot 100 may store acceleration information in the memory 165. The serving robot 100 may rotate the arm 10 and the loading part 20 based on the stored acceleration information. For example, if acceleration of ‘a’ is applied to the main body 1 according to a movement or acceleration of the serving robot 100, the serving robot 100 may rotate the arm 10 backward so that the acceleration of the loading part 20 becomes 0 (or, becomes closer to 0). The serving robot 100 may rotate the arm 10 backward by a predetermined angle centered around the axis on one end of the arm 10 connected to the main body 1 based on the detected acceleration information of the main body 1.
As illustrated in
If the serving robot 100 is accelerated to a specific velocity, it may maintain a constant velocity. The feature that the accelerating serving robot 100 maintains a constant velocity means that acceleration of ‘-a’ is applied to the serving robot 100 until the velocity becomes a constant velocity. If the serving robot 100 maintains a constant velocity after it is accelerated, the serving robot 100 may control the arm 10 and the loading part 20 based on the detected information on the change of the velocity of the main body 1. That is, the serving robot 100 may rotate the arm 10 forward centered around the axis on one end of the arm 10, and move the arm 10 to the initial location. Accordingly, the serving robot 100 may rotate the loading part 20 so that the loading part 20 maintains horizontality based on the forward rotation of the arm 10.
Accordingly, even if the serving robot 100 moves or accelerates, the acceleration applied to the food loaded on the loading part 20 becomes 0 (or, becomes closer to 0), and thus the serving robot 100 can stably serve the food.
Referring to
The serving robot 100 may detect deceleration information of the main body 1 and deceleration information of the loading part 20. The serving robot 100 may rotate the arm 10 and the loading part 20 so that the deceleration information of the loading part 20 is offset based on the detected deceleration information of the main body 1. According to an embodiment, the serving robot 100 may store deceleration information in the memory 165. The serving robot 100 may rotate the arm 10 and the loading part 20 based on the stored deceleration information. For example, if acceleration of ‘a’ is applied to the main body 1 according to deceleration of the serving robot 100, the serving robot 100 may rotate the arm 10 forward so that the acceleration of the loading part 20 becomes 0 (or, becomes closer to 0). The serving robot 100 may rotate the arm 10 forward by a predetermined angle centered around the axis on one end of the arm 10 connected to the main body 1 based on the detected deceleration information of the main body 1.
When the arm 10 rotates forward by the predetermined angle, the loading part 20 may also be tilted by the predetermined angle. In this case, the food loaded on the loading part 20 may be in an unstable state. Accordingly, if the arm 10 rotates forward by the predetermined angle, the serving robot 100 may maintain the horizontality of the loading part 20 by rotating the loading part 20.
If the serving robot 100 is decelerated to a specific velocity, it may maintain a constant velocity (including a case wherein the velocity is 0). The feature that the decelerating serving robot 100 maintains a constant velocity means that acceleration of ‘−a’ is applied to the serving robot 100 until the velocity becomes a constant velocity. If the serving robot 100 maintains a constant velocity after it is decelerated, the serving robot 100 may control the arm 10 and the loading part 20 based on the detected information on the change of the velocity of the main body 1. That is, the serving robot 100 may rotate the arm 10 backward centered around the axis on one end of the arm 10, and move the arm 10 to the initial location. Accordingly, the serving robot 100 may rotate the loading part 20 so that the loading part 20 maintains horizontality based on the backward rotation of the arm 10.
Accordingly, even if the serving robot 100 decelerates, the serving robot 100 can stably serve the food.
For example, depending on various scenarios, the serving robot 100 may repeat acceleration and deceleration while serving food. In case the serving robot 100 repeats acceleration and deceleration, liquid food loaded on the loading part 20 may vibrate as illustrated in
In
When the serving robot 100 moves on a slope, the serving robot 100 may detect the tilt of the main body 1. Then, the serving robot 100 may maintain the loading part 20 to be horizontal by rotating the arm 10 and/or the loading part 20 based on the detected tilt of the main body 1.
As illustrated in
As illustrated in
Accordingly, the serving robot 100 can stably serve food regardless of the angle of the slope on which the serving robot 100 moves. For example, the serving robot 100 is able to take the angle of the slope into consideration and adjust the rotation of the loading part 20, the rotation of the arm 10, or the rotation of both the loading part 20 and the arm 10 based on the angle of the slope.
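One way such an adjustment could be distributed between the loading part and the arm is sketched below. The split policy (absorb the tilt with the loading part up to a limit, then use the arm for the remainder), the limit value, and the function name are assumptions for illustration only.

```python
def slope_compensation(slope_deg, max_tray_deg=15.0):
    """Keep the loading part horizontal on a slope of slope_deg degrees.

    Illustrative policy (an assumption, not the disclosed algorithm):
    the loading part absorbs the tilt while within its assumed range
    of +/- max_tray_deg, and the arm handles any remainder, so that
    the two rotations always sum to the slope angle.
    """
    tray_deg = max(-max_tray_deg, min(max_tray_deg, slope_deg))
    arm_deg = slope_deg - tray_deg  # remainder compensated by the arm
    return arm_deg, tray_deg
```

For a 10-degree slope the loading part alone compensates; for a 20-degree slope the loading part rotates 15 degrees and the arm contributes the remaining 5 degrees.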
Referring to
For example, the serving robot 100 may capture an image regarding the surrounding environment by using the camera 145. The serving robot 100 may identify users 81, 82 in the captured image. If the serving robot 100 identifies the users 81, 82 while food is not loaded on the loading part 20, the serving robot 100 may rotate the loading part 20 so that the display 155 is facing toward the upper direction. When the display 155 is facing toward the upper direction, the serving robot 100 may display information related to food on the display 155. The serving robot 100 may rotate the arm 10 and the loading part 20 including the display 155 by an appropriate angle for receiving orders from the users 81, 82. For example, the serving robot 100 may recognize the face areas (e.g., facial regions) of the users 81, 82 in the captured image, and may rotate the arm 10 and the loading part 20 including the display 155 based on angles determined from the location of the display 155 of the loading part 20 and the locations of the recognized face areas of the users 81, 82. That is, the serving robot 100 may rotate the arm 10 and the loading part 20 based on the captured image (e.g., based on a location of the face of a user identified in the captured image). However, the disclosure is not limited thereto, and as such, according to another embodiment, the serving robot 100 may rotate the arm 10 and the loading part 20 based on other criteria satisfied in the captured image.
As an example, if the first user 81 is an adult, and the face area of the first user 81 is at a location higher than the location of the display 155, the serving robot 100 may rotate the arm 10 forward by 10 degrees, and rotate the loading part 20 including the display 155 forward by 5 degrees. Meanwhile, if the second user 82 is a child, and the face area of the second user 82 is at a location similar to the location of the display 155, the serving robot 100 may rotate the arm 10 forward by 30 degrees, and rotate the loading part 20 including the display 155 forward by 20 degrees. Also, in case the first user 81 and the second user 82 are located adjacent to each other, the serving robot 100 may change only the direction of the display 155 so that it is facing toward the first user 81 or the second user 82. According to an embodiment, in case the first user 81 and the second user 82 are located apart from each other, when input of an order from the first user 81 is completed, the serving robot 100 may move to the second user 82 and receive input of an order from the second user 82.
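A minimal sketch of deriving the rotation from a recognized face location is given below. The coordinate convention, the fixed split of the total tilt between arm and loading part, and the function name are all hypothetical choices for illustration; the disclosure does not specify this geometry.

```python
import math

def facing_angles(display_pos, face_pos, arm_share=0.6):
    """Tilt the display toward a recognized face.

    display_pos, face_pos: (forward, up) positions in metres in the
    robot's frame (assumed coordinate convention). The total tilt
    toward the face is split between the arm and the loading part by
    arm_share, an illustrative ratio not taken from the source.
    Returns (arm_deg, tray_deg).
    """
    dx = face_pos[0] - display_pos[0]   # horizontal offset to the face
    dz = face_pos[1] - display_pos[1]   # vertical offset to the face
    total = math.degrees(math.atan2(dz, dx))  # elevation of the face
    return arm_share * total, (1.0 - arm_share) * total
```

A face below the display yields negative angles (the display tilts down), and a face above it yields positive angles, with the arm carrying the assumed 60% share of the motion.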
The serving robot 100 may receive input of an order by a touch input through the display 155 implemented as a touch screen, and receive input of an order by a voice input through the microphone 150. Accordingly, the serving robot 100 may assist the user in conveniently selecting food.
When a user selects food, the serving robot 100 may identify the user who selected the food and the selected food. The serving robot 100 may use the information on the identified user and food when serving the food. Also, the serving robot 100 may transmit the information on the selected food to an external device through the communication interface 135. For example, the external device may be an electronic device located in the kitchen. A cook may prepare the food based on the information on the food received from the electronic device located in the kitchen.
Referring to
When the serving robot 100 arrives at the users 81, 82, the serving robot 100 may serve the food by rotating the arm 10 and the loading part 20 based on the identified user and the identified food. As an example, the first user 81 may order food A, and the second user 82 may order food B. The serving robot 100 may rotate the arm 10 forward by 15 degrees for serving food A to the first user 81. Then, the serving robot 100 may rotate the loading part 20 together so that the loading part 20 on which the food is loaded maintains horizontality. Also, the serving robot 100 may output a guide message such as “Here is food A you ordered” to the first user 81 through the speaker 160. When the serving of food A to the first user 81 is completed, the serving robot 100 may serve food B to the second user 82. The serving robot 100 may rotate the arm 10 forward by 30 degrees for serving food B to the second user 82. Then, the serving robot 100 may rotate the loading part 20 together so that the loading part 20 on which the food is loaded maintains horizontality. Also, the serving robot 100 may output a guide message such as “Here is food B you ordered” to the second user 82 through the speaker 160. As described in
So far, various embodiments of the serving robot 100 were described. Hereinafter, a control method of the serving robot 100 will be described.
Referring to
For example, the serving robot may detect acceleration and deceleration information of the main body and acceleration and deceleration information of the loading part. The serving robot may rotate the arm forward or backward so that the acceleration and deceleration information of the loading part is offset based on the detected acceleration and deceleration information of the main body.
If the serving robot is accelerated, the serving robot may rotate the arm backward by a predetermined first angle centered around the axis on one end of the arm connected to the main body based on the detected acceleration information of the main body. Then, the serving robot may maintain the horizontality of the loading part by rotating the loading part based on the predetermined first angle. If the serving robot maintains a constant velocity after it is accelerated, the serving robot may rotate the arm forward centered around the axis on one end of the arm based on the detected information on the change of the velocity of the main body, and move the arm to the initial location. Then, the serving robot may maintain the horizontality of the loading part by rotating the loading part based on the forward rotation of the arm.
If the serving robot is decelerated, the serving robot may rotate the arm forward by a predetermined second angle centered around the axis on one end of the arm connected to the main body based on the detected deceleration information of the main body. Then, the serving robot may maintain the horizontality of the loading part by rotating the loading part based on the predetermined second angle. If the serving robot maintains a constant velocity after it is decelerated, the serving robot may rotate the arm backward centered around the axis on one end of the arm based on the detected information on the change of the velocity of the main body, and move the arm to the initial location. Then, the serving robot may maintain the horizontality of the loading part by rotating the loading part based on the backward rotation of the arm.
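The phase-dependent arm behavior of the control method above (rotate backward while accelerating, forward while decelerating, and back to the initial location at constant velocity) can be summarized as a simple mapping. The phase names, the sign convention (backward is negative), and the placeholder angle values are assumptions for illustration.

```python
def arm_command(phase, first_angle=10.0, second_angle=10.0):
    """Map a motion phase of the main body to an arm rotation target.

    first_angle / second_angle stand in for the predetermined first
    and second angles of the control method; the values here are
    placeholders, not values taken from the disclosure.
    """
    targets = {
        "accelerating": -first_angle,   # rotate backward by the first angle
        "decelerating": +second_angle,  # rotate forward by the second angle
        "constant": 0.0,                # return the arm to its initial location
    }
    return targets[phase]
```

In each phase the loading part would then be rotated by the negative of the returned target so that it maintains horizontality.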
Also, in case the serving robot moves on a slope, the serving robot may detect the tilt of the main body. The serving robot may rotate at least one of the arm or the loading part so that the loading part maintains horizontality based on the detected tilt of the main body.
One surface of the loading part of the serving robot may include a tray, and the other surface may include a display. The serving robot may rotate the loading part so that the tray or the display is facing toward the upper direction. When the display is facing toward the upper direction, the serving robot may display information related to food.
The serving robot may capture an image regarding the surrounding environment. The serving robot may identify a customer in the captured image while food is not loaded on the loading part. If the serving robot identifies the customer, the serving robot may rotate the loading part so that the display is facing toward the upper direction. The serving robot may receive input of an order from the customer. For example, the serving robot may receive input of an order by a touch input through the display implemented as a touch screen or a voice input through the microphone. The serving robot may identify the food selected by the customer.
The serving robot may rotate the loading part so that the tray is facing toward the upper direction for loading food. Then, the serving robot may control the arm and the loading part by using acceleration and deceleration information for safely transporting the food to the customer. When the serving robot arrives at the customer, the serving robot may rotate the arm and the loading part for serving the food to the customer based on the identified customer and the identified food.
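The overall serving sequence just described (turn the tray up, travel with compensation, then present each order) can be sketched as a small loop. The callback interface, the record layout, and the message text are assumptions made for illustration and do not appear in the disclosure.

```python
def serve(orders, rotate_tray, move_to, rotate_arm, announce):
    """Minimal serving loop over (customer, food, arm_angle) records.

    rotate_tray / move_to / rotate_arm / announce are hypothetical
    hooks into the robot's actuators and speaker; move_to is assumed
    to drive while applying the acceleration/deceleration
    compensation described above.
    """
    rotate_tray("tray_up")             # tray faces upward for loading
    for customer, food, arm_angle in orders:
        move_to(customer)              # travel to the identified customer
        rotate_arm(arm_angle)          # present the loading part
        announce(f"Here is {food} you ordered")
```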
Accordingly, the serving robot according to the disclosure can safely move items such as food, so that a user can conveniently order and receive items such as food. Also, the serving robot according to the disclosure can reinforce interaction with a user.
The effects according to the disclosure are not limited to the effects mentioned above, and other effects that were not mentioned could be clearly understood by a person skilled in the art from the following descriptions.
Also, a control method of a serving robot according to the aforementioned various embodiments can be provided as a computer program product. A computer program product may include an S/W program itself, or a non-transitory computer readable medium having an S/W program stored therein.
A non-transitory computer readable medium refers to a medium that stores data semi-permanently and is readable by machines, but not a medium that stores data for a short moment such as a register, a cache, a memory, and the like. Specifically, the aforementioned various applications or programs may be provided while being stored in a non-transitory computer readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, a ROM, and the like.
Also, while preferred embodiments of the disclosure have been shown and described, the disclosure is not limited to the aforementioned specific embodiments, and it is apparent that various modifications may be made by those having ordinary skill in the technical field to which the disclosure belongs, without departing from the gist of the disclosure as claimed by the appended claims. Further, it is intended that such modifications are not to be interpreted independently from the technical idea or prospect of the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
10-2022-0116619 | Sep 2022 | KR | national |
10-2022-0123306 | Sep 2022 | KR | national |
This application is a bypass continuation of International Application No. PCT/KR2023/013833, filed on Sep. 14, 2023, which is based on and claims priority to Korean Patent Application No. 10-2022-0116619, filed on Sep. 15, 2022, and Korean Patent Application No. 10-2022-0123306, filed on Sep. 28, 2022, in the Korean Intellectual Property Office, the disclosures of which are incorporated by reference herein in their entireties.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/KR2023/013833 | Sep 2023 | US |
Child | 18529557 | US |