The present disclosure generally relates to a baby transport and a method for operating the same.
Moving a conventional baby transport, such as a baby stroller, generally requires a caregiver to push or pull the stroller by a handle. A baby walker, on the other hand, can be unsafe when obstacles are present. Therefore, it is desirable to provide an autonomous baby transport for safety and convenience.
In one aspect of the present disclosure, a baby transport is provided. The baby transport includes a car body, a first sensing unit, a second sensing unit, and a processing unit. The car body is configured to carry a baby. The first sensing unit, coupled to the car body, is configured to sense a biological signal of the baby. The second sensing unit, coupled to the car body, is configured to sense an environment context. The processing unit, coupled to the first sensing unit and the second sensing unit, is configured to determine a target of interest according to the biological signal of the baby and the environment context; plan a route according to the environment context and the target of interest; and control the car body to move according to the route.
In another aspect of the present disclosure, a method of operating a baby transport is provided. The method includes the following actions. A biological signal of a baby is sensed by a first sensing unit. An environment context is sensed by a second sensing unit. A target of interest is determined by a processing unit according to the biological signal of the baby and the environment context. A route is planned by the processing unit according to the environment context and the target of interest. A car body is controlled, by the processing unit, to move according to the route.
The following description contains specific information pertaining to exemplary implementations in the present disclosure. The drawings in the present disclosure and their accompanying detailed description are directed to merely exemplary implementations. However, the present disclosure is not limited to merely these exemplary implementations. Other variations and implementations of the present disclosure will occur to those skilled in the art. Unless noted otherwise, like or corresponding elements among the figures may be indicated by like or corresponding reference numerals. Moreover, the drawings and illustrations in the present disclosure are generally not to scale, and are not intended to correspond to actual relative dimensions.
In one implementation, the first sensing unit 110 may include an image capturing unit (e.g., a camera) for capturing images of the baby. The first sensing unit 110 may be a depth-sensing camera with a depth sensor, an RGB camera, or an infrared (IR) camera. In some embodiments, the first sensing unit 110 may include a light source (e.g., an IR illuminator or a visible light illuminator) for lighting the environment. The camera may be a camera module that further includes an image processing unit, for example, for high dynamic range (HDR) imaging to improve the image quality, or for a format conversion such as a serializer/deserializer (SerDes). The image may be processed by a processing unit to determine the baby's status, including emotion, sleep, vigilance, comfort, health, or activity, from either a static image frame or a video consisting of an image stream. In another implementation, the first sensing unit 110 further includes a voice recording unit configured to record a voice or sound of the baby. In some implementations, the first sensing unit 110 further includes a thermal sensor configured to sense the body temperature of the baby. In some other implementations, the first sensing unit 110 further includes a heartbeat rate (HBR) monitor configured to detect the HBR of the baby. In some other implementations, the first sensing unit 110 further includes a breath monitor. The HBR monitor or the breath monitor may be realized by an image capturing device that detects the HBR via image recognition.
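For illustration only, image-based HBR detection of the kind mentioned above is commonly realized by tracking the average intensity of a skin region across frames and finding the dominant frequency in a plausible heart-rate band. The sketch below is not part of the disclosure; the function name, the assumed green-channel input, and the band limits are hypothetical.

```python
import numpy as np

def estimate_hbr(green_means, fps=30.0):
    """Estimate heartbeat rate (BPM) from the mean green-channel intensity
    of a face region over time, via the dominant FFT frequency.
    A simplified sketch of image-based HBR detection."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()  # remove the DC component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))
    # Restrict to a plausible infant heart-rate band (~60-200 BPM).
    band = (freqs >= 1.0) & (freqs <= 3.3)
    if not band.any():
        return None
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0  # Hz -> beats per minute
```

A practical implementation would additionally detrend the signal and average over multiple skin patches; this sketch only shows the frequency-domain core of the idea.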
The second sensing unit 120, coupled to the car body 140, is configured to sense an environment context. For example, the environment context may include, but is not limited to, an image of the environment, or information about the distance to an object/obstacle, or a location, a position, or a movement of an object/obstacle. In one implementation, the second sensing unit 120 may include an image capturing unit for capturing images of the environment, such as a photo sensor, a depth-sensing camera with a depth sensor, an RGB color camera, or an infrared (IR) camera. In some embodiments, the second sensing unit 120 further includes a light source (e.g., an IR illuminator or a visible light illuminator) for illuminating the environment. In another implementation, the second sensing unit 120 may include a lidar sensor, a radar, or an ultrasonic sensor for detecting object(s)/obstacle(s) and providing information about the object(s)/obstacle(s). In some implementations, the second sensing unit 120 may include a Global Positioning System (GPS) receiver and/or an inertial measurement unit (IMU) for global/local positioning and for obtaining the trajectory of the vehicle, externally applied forces, vehicle movement, and road dynamics. In some implementations, the second sensing unit 120 may include an accelerometer for detecting bumping or for positioning.
The processing unit 130 is coupled to the car body 140, the first sensing unit 110, and the second sensing unit 120. The processing unit 130 may receive data, process data, and generate instructions for the baby transport. In one embodiment, the processing unit 130 may be a hardware module comprising one or more central processing units (CPUs), microcontrollers, application-specific integrated circuits (ASICs), or a combination of the above, but is not limited thereto. The processing unit 130 may perform computer vision techniques, such as object detection and recognition, and/or image processing. In one embodiment, the processing unit 130 is configured to perform biometric detection/recognition according to the images captured by the first sensing unit 110. The biometric detection/recognition may include face detection, facial recognition, head pose detection, eye openness detection, yawning detection, gaze detection, body skeleton detection, gender detection, age detection, or a combination of the above, but is not limited thereto. In some other embodiments, the processing unit 130 may further determine a biological status, including drowsiness, sleep, microsleep, vigilance, emotion, comfort, intrigue, hunger, etc., based on the biometric detection/recognition and the biological signal sensed by the first sensing unit 110. Furthermore, the processing unit 130 may detect the breathing and/or the heartbeat rate of the baby, and/or perform other biological recognitions via computer vision techniques, to obtain the biological status of the baby and determine whether the baby is breathing normally. In addition, the processing unit 130 may monitor the movement of the baby and track the activity of the baby. In some embodiments, the processing unit 130 may further perform voice recognition based on the voice or sound of the baby recorded by the first sensing unit 110, for example, a cry or a yell.
In an embodiment, the processing unit 130 may detect and process the environment context sensed by the second sensing unit 120 and provide a representation of the environment dynamics. In one embodiment, the processing unit 130 may process the images captured by the second sensing unit 120 and perform object detection. In another embodiment, the processing unit 130 may process the sensed data of a lidar, a radar, or an ultrasonic sensor, and perform obstacle detection. In yet another embodiment, the processing unit 130 may track the object and the obstacle by a fusion of multiple sensor data. In one embodiment, the processing unit 130 may perform localization, determine an orientation of the baby transport, and build a map according to the sensed data from a camera, a GPS receiver, an IMU, and/or an encoder for motor angle and velocity. In another embodiment, the processing unit 130 may create a point cloud and a cost map according to the sensed data from the second sensing unit. In some embodiments, the processing unit 130 performs path/route planning and controls the motion of the baby transport.
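As an illustrative sketch only (not part of the disclosure), a cost map of the kind mentioned above can be built by projecting sensed obstacle points onto a grid and inflating each obstacle cell so that planning keeps a safety margin around obstacles. The grid size, cell size, inflation radius, and cost values below are hypothetical.

```python
import numpy as np

def build_cost_map(obstacle_points, grid_size=(50, 50), cell=0.1, inflation=3):
    """Project sensed obstacle points (x, y in meters) onto an occupancy
    grid and inflate each obstacle so nearby cells carry a decaying cost."""
    cost = np.zeros(grid_size)
    for x, y in obstacle_points:
        i, j = int(x / cell), int(y / cell)
        if 0 <= i < grid_size[0] and 0 <= j < grid_size[1]:
            cost[i, j] = 100  # lethal obstacle cell
    inflated = cost.copy()
    for oi, oj in np.argwhere(cost == 100):
        for di in range(-inflation, inflation + 1):
            for dj in range(-inflation, inflation + 1):
                i, j = oi + di, oj + dj
                if 0 <= i < grid_size[0] and 0 <= j < grid_size[1]:
                    d = max(abs(di), abs(dj))  # Chebyshev distance
                    inflated[i, j] = max(inflated[i, j], 100 - 25 * d)
    return inflated
```

A route planner can then prefer low-cost cells and treat lethal cells as impassable; the decay schedule (here, 25 cost units per cell) is a tuning choice.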
In some other embodiments, the baby transport 100 further includes an escalation unit configured to emit a sound or voice, and/or send an alarm or notification to a mobile device via a wireless transceiver. The wireless transceiver may be a Wi-Fi module, a 4G/5G cellular module, or a Bluetooth Low Energy (BLE) module, but is not limited thereto, for data uplink and downlink communications.
In another implementation, the target of interest is further determined according to a biological status of the baby. The processing unit determines the biological status of the baby by performing biometric detection/recognition. The biological status may include, but is not limited to, drowsiness, sleep, microsleep, emotion, comfort, hunger, intrigue, or a body language. Each biological status corresponds to an object, a person, a location, a direction, or an area. For instance, when a drowsy status is identified, the processing unit may infer the bedroom as the target of interest.
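For illustration only, the correspondence described above between a biological status and a target of interest can be sketched as a simple lookup; the table entries and function names below are hypothetical examples, not the disclosed implementation.

```python
# Hypothetical mapping from an inferred biological status to a target of
# interest (an object, person, location, direction, or area).
STATUS_TO_TARGET = {
    "drowsy": "bedroom",
    "sleep": "bedroom",
    "hungry": "kitchen",
    "intrigued": "window",
}

def infer_target(biological_status, default=None):
    """Return the target of interest associated with a detected status,
    or a default when the status has no known correspondence."""
    return STATUS_TO_TARGET.get(biological_status, default)
```

In practice such a mapping could be learned or configured per household rather than hard-coded.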
In some implementations, the first sensing unit further includes a microphone adapted to record the sound or voice of the baby. The processing unit may perform sound recognition, voice recognition, and/or speech recognition to determine the target of interest. For example, when the baby calls his/her mom or dad, the processing unit determines that the baby's target of interest is his/her mom or dad. Additionally, a baby's voice or sound may be recorded in advance to represent an object, a location, a direction, or a person, and thus the target of interest of the baby can be identified by sound recognition, voice recognition, and/or speech recognition. Moreover, the processing unit may determine the target of interest further according to other types of biological signals, such as a heartbeat rate or a breath of the baby. For example, when a drowsy or sleep status is inferred according to a heartbeat rate, the bedroom may be inferred as the target of interest.
In action 320, the processing unit plans a route according to the environment context and the target of interest inferred in action 310. For example, the processing unit may determine the coordinates of the target of interest according to a map established from the environment context and a position of the baby transport, and then plan a route accordingly. When the target of interest is an object, a person, or an area, the processing unit sets a destination near the object, the person, or the area, and plans the route to the destination coordinates accordingly. When the target of interest is a direction, the processing unit may set a destination at a specific distance along the direction. Meanwhile, the processing unit further processes the environment context sensed by the second sensing unit and provides a representation of the static or dynamic environment that includes a map, a cost map, an obstacle position and its shape, an orientation/heading of the baby transport, an object, a recognized object, an object ID, a point cloud, a context primitive, a road quality, a friction coefficient of the road surface, or a slope of a tilted road. The map may be a global map or a local map constructed by simultaneous localization and mapping (SLAM). The cost map is built from the sensed data to characterize the cost of traveling through the environment. The obstacle position and shape may be obtained from a 3D camera, a lidar, a radar, or an ultrasonic sensor. Each object may be given an object ID, and an object recognized by image recognition may be correlated to its object ID. A point cloud including sensed data points in space may be constructed from the obstacle information to establish a relationship between perceptions from different sensors, and may be used for tracking obstacle dynamics in an image stream of the environment, where each obstacle being tracked may be given an object ID.
The context primitive is a pre-defined scenario or state that represents a combination of environment context dynamics. For example, "a man is drinking water" is a pre-defined context primitive. The context primitive may also be a combination or relation of environment objects that characterizes a scenario, for instance, a congestion level or an environment safety evaluation. According to the environment context and the target of interest, the processing unit plans the route to navigate the baby transport to the target of interest without colliding with obstacles.
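As a minimal sketch of the route planning in action 320 (not the disclosed implementation), a collision-free path over a cost map can be found with an A*-style grid search, where lethal cells are impassable and other cells contribute their cost to the traversal; all names and the lethal threshold below are hypothetical.

```python
import heapq

def plan_route(cost_map, start, goal, lethal=100):
    """A*-style search over a grid cost map from start to goal.
    Cells with cost >= lethal are treated as obstacles; other cells add
    their cost to the path, so the planner prefers low-cost regions."""
    rows, cols = len(cost_map), len(cost_map[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            i, j = node[0] + di, node[1] + dj
            if 0 <= i < rows and 0 <= j < cols and cost_map[i][j] < lethal:
                step = 1 + cost_map[i][j]
                heapq.heappush(open_set, (g + step + h((i, j)), g + step,
                                          (i, j), path + [(i, j)]))
    return None  # no collision-free route exists
```

The returned path is a sequence of waypoints that a motion controller can follow one cell at a time.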
In action 330, the processing unit controls the car body to move according to the route. For instance, the processing unit may include a motor controller to generate a control command for steering the baby transport to a waypoint according to the planned route. The control command may include a steering angle, an angular velocity, a throttle command, and/or a brake command. As a result, the baby transport infers a baby's target of interest and navigates the baby to the target autonomously and safely according to the environment context, thus relieving the burden of the caregiver.
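For illustration only, generating such a control command toward the next waypoint can be sketched as a proportional steering law on the heading error; the pose convention, gain, and speed below are hypothetical, and a real controller would also handle braking and velocity profiles.

```python
import math

def control_command(pose, waypoint, speed=0.5, k_steer=1.0):
    """Compute a steering angle and throttle toward the next waypoint.
    pose = (x, y, heading in radians). The heading error is wrapped to
    [-pi, pi] and scaled by a proportional gain."""
    x, y, heading = pose
    desired = math.atan2(waypoint[1] - y, waypoint[0] - x)
    error = math.atan2(math.sin(desired - heading),
                       math.cos(desired - heading))  # wrap to [-pi, pi]
    return {"steering_angle": k_steer * error, "throttle": speed}
```

The angle wrap via `atan2(sin, cos)` avoids the discontinuity at ±180 degrees that a plain subtraction would introduce.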
In another embodiment, a reward may be used for justifying and modifying the inferred target of interest. First, the processing unit determines an inferred target according to the biological signal, the biological status, and/or the environment context. Second, the processing unit may plan the route and actuate a motor to move the baby transport toward the target of interest. When the vehicle is moving, the environment context changes dynamically, and the baby may react to the change of the environment context. As a result, the reaction of the baby may be monitored and evaluated as a reward to adjust the level of confidence in the inferred target of interest. The reaction of the baby may be, for example, feedback on his/her emotional status, where the emotion may be characterized from his/her facial appearance or expression, voice, breath rate, and/or HBR, etc. For instance, when the emotion is rated as excited or positive, the reward is given a higher value such that the baby transport remains on the navigation route, or accelerates/decelerates. On the other hand, if the baby's emotion is rated as fear or negative, the reward is given a lower value such that the baby transport may implement a maneuver, such as stopping, decelerating, steering clear of and avoiding this target of interest, making a turn to change the heading of the car body, and/or re-evaluating a new target of interest.
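The reward-driven adjustment above can be sketched, for illustration only, as an incremental confidence update; the emotion-to-reward values, learning rate, and abandonment threshold are all hypothetical choices, not part of the disclosure.

```python
def update_confidence(confidence, emotion, lr=0.2):
    """Adjust the confidence in an inferred target of interest using the
    baby's emotional reaction as a reward. Returns the clamped new
    confidence and a suggested action."""
    reward = {"excited": 1.0, "positive": 0.5,
              "negative": -0.5, "fear": -1.0}.get(emotion, 0.0)
    confidence = min(1.0, max(0.0, confidence + lr * reward))
    # Below a threshold, abandon this target and re-evaluate a new one.
    action = "keep_route" if confidence >= 0.3 else "re_evaluate"
    return confidence, action
```

Repeated negative reactions drive the confidence down until the transport switches to re-evaluating a new target, matching the maneuver described above.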
In one embodiment, when the baby transport has reached its target of interest, or no target of interest has been inferred at a given time point, the baby transport may stop and remain stationary. For instance, when the baby is detected as sleeping, the baby transport may stop or move to a designated place. In another embodiment, the baby transport may further tilt down the seat of the car body when the biological status of the baby is detected as sleep.
In another embodiment, when there is no new target of interest inferred, the baby transport may remain on the route to the last target of interest.
In yet another embodiment, when a new target of interest is inferred while the baby transport has not reached the former target of interest, the baby transport may determine a new goal and re-plan a route according to the new target of interest. For example, when the baby transport is on the way to a first goal according to a first target of interest, the processing unit may re-evaluate a second goal according to a second target of interest.
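The three goal-handling cases above (stop when the target is reached or none exists, continue when no new target appears, re-plan when a new target arrives) can be sketched, purely for illustration, as a small decision function with hypothetical names:

```python
def next_goal(current_goal, reached, new_target, asleep=False):
    """Decide the transport's next action per the embodiments above:
    stop when the baby sleeps or nothing remains to pursue, re-plan when
    a new target of interest is inferred, otherwise keep the last route."""
    if asleep:
        return ("stop", None)
    if new_target is not None:
        return ("re_plan", new_target)   # a new target overrides the old goal
    if current_goal is None or reached:
        return ("stop", None)            # target reached or none inferred
    return ("continue", current_goal)    # remain on the current route
```

Such logic would typically run every control cycle, with `reached` and `new_target` supplied by the localization and inference steps.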
In summary, the baby transport provides an autonomous function such that operation is smoothly executed without using hands, and thus the caregiver can carry other items with his/her free hands. On top of that, the baby transport considers the environment condition and the baby's interest, and thus the safety of the baby is ensured as nearby obstacles are avoided. Furthermore, giving the baby the view he/she wants brings a comfortable experience to the baby, and thus the baby transport relieves the burden of the caregiver.
Based on the above, several baby transports and methods for operating a baby transport are provided in the present disclosure. The implementations shown and described above are only examples. Even though numerous characteristics and advantages of the present technology have been set forth in the foregoing description, together with details of the structure and function of the present disclosure, the disclosure is illustrative only, and changes may be made in the detail, including in matters of shape, size and arrangement of the parts within the principles of the present disclosure up to, and including, the full extent established by the broad general meaning of the terms used in the claims.