Aspects of the present disclosure relate to self-driving luggage methods, systems, devices, and components thereof, having multiple operational modes.
Passengers in airports can experience problems and time delays. For example, it can be difficult and time-consuming for passengers to find specific locations within an airport, such as a boarding gate. Such issues can also cause passengers to miss connecting flights.
Therefore, there is a need for new and improved self-driving luggage systems that are able to assist passengers in finding and arriving at specific locations within airports.
Aspects of the present disclosure relate to self-driving luggage methods, systems, devices, and components thereof, having multiple operational modes.
In one implementation, a self-driving system includes a piece of luggage. The piece of luggage includes one or more motorized wheels. The self-driving system includes a central processing unit configured to switch between a following mode and a leading mode. In the following mode the central processing unit instructs the piece of luggage to follow a user. In the leading mode the central processing unit instructs the piece of luggage to lead the user to a destination.
In one implementation, a method of operating a self-driving system includes defaulting to a following mode for a piece of luggage. The method also includes determining if one or more leading requirements are met for a leading mode. The method also includes starting the leading mode. The method also includes moving the piece of luggage toward a destination.
So that the manner in which the above recited features of the present disclosure can be understood in detail, a more particular description of the present disclosure, briefly summarized above, may be had by reference to implementations, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical implementations of the present disclosure and are therefore not to be considered limiting of its scope, for the present disclosure may admit to other equally effective implementations.
To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one implementation may be beneficially utilized on other implementations without specific recitation.
Aspects of the present disclosure relate to self-driving luggage methods, systems, devices, and components thereof, having multiple operational modes. Although the embodiments of the self-driving systems are described and illustrated herein with respect to a luggage system, the embodiments may be used with other types of portable equipment. Additionally, although the embodiments of the self-driving systems are described and illustrated herein with respect to an airport, the embodiments may be used with other types of facilities, such as an office or a factory.
The self-driving system 100 includes a handle 110 coupled to the piece of luggage 102. The handle 110 is configured to allow a user of the self-driving system 100 to move, push, pull, and/or lift the piece of luggage 102. The handle 110 is located on a left side 108 of the piece of luggage 102, but can be located on any side of the piece of luggage 102, such as on a right side 104 that opposes the left side 108. The handle 110 includes a pull rod 112 coupled to a connecting rod 118, which is coupled to the piece of luggage 102. The pull rod 112 forms a “T” shape with, and telescopes within, the connecting rod 118. An upper portion 112a of the pull rod 112 is elongated, oriented horizontally, and perpendicular to a lower portion 112b of the pull rod 112, which is oriented vertically.
One or more sensors 120a, 120b are disposed on the upper portion 112a of the pull rod 112. The sensors 120a, 120b are cameras configured to take photographs and/or videos of objects in a surrounding environment of the piece of luggage 102. In one example, the cameras 120a, 120b take photographs and/or videos of nearby targets and/or users. The one or more cameras 120a, 120b are disposed on one or more outer elongated portions of the pull rod 112, and face outwards from the piece of luggage 102. The first sensor 120a is a front camera 120a that faces the front side 105 of the piece of luggage 102. The second sensor 120b is a back camera 120b that faces the back side 107.
The self-driving system 100 includes one or more sensors 114a-114d (four are shown) disposed on one or more of the pull rod 112 and/or the connecting rod 118 of the handle 110. The sensors 114a-114d are cameras configured to take photographs and/or videos of objects in a surrounding environment of the piece of luggage 102. In one example, the cameras 114a-114d take photographs and/or videos of nearby targets and/or users. The cameras 114a-114d are disposed on the lower portion 112b of the pull rod 112. In one example, each of the four cameras 114a-114d is coupled to a respective one of four sides of the lower portion 112b of the pull rod 112. Each of the four sides of the lower portion 112b corresponds to one of the left side 108, the right side 104, the front side 105, and the back side 107. A left side camera 114a faces the left side 108, a front camera 114b faces the front side 105, a right side camera 114c faces the right side 104, and a back camera 114d faces the back side 107.
The cameras 114a-114d and the cameras 120a, 120b are disposed on the pull rod 112 to reduce the likelihood of damage to the cameras if the piece of luggage 102 collides with an object, for example when the pull rod 112 is retracted into the piece of luggage 102.
Each of the cameras 114a-114d is configured to take images of a target, such as a user, so that the self-driving system 100 can determine a distance of the target relative to the piece of luggage 102. Each of the cameras 114a-114d may include a wide-angle lens. Images taken by a camera 114a-114d include one or more targets such that the larger a target appears in the image, the closer the target is to the piece of luggage 102 and the camera 114a-114d that took the image.
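As an illustration of this size-to-distance relationship (not part of the original disclosure), the following minimal Python sketch applies a standard pinhole camera model; the focal length and the assumed real-world target height are hypothetical values.

```python
# Minimal sketch of distance-from-apparent-size estimation under a pinhole
# camera model. The focal length and assumed target height are illustrative
# values, not parameters from the disclosure.

def estimate_distance_m(target_height_px: float,
                        focal_length_px: float = 800.0,
                        target_height_m: float = 1.7) -> float:
    """Estimate target distance from its apparent height in an image.

    Pinhole model: h_px = f_px * H_m / D, so D = f_px * H_m / h_px.
    A larger apparent height therefore means a closer target.
    """
    if target_height_px <= 0:
        raise ValueError("target must be visible in the image")
    return focal_length_px * target_height_m / target_height_px

# Example: a 1.7 m tall user spanning 400 px is ~3.4 m away.
print(estimate_distance_m(400.0))  # 3.4
```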
The self-driving system 100 includes one or more laser emitters 116a-116d disposed on the lower portion 112b of the pull rod 112 and below the cameras 114a-114d. Each of the four laser emitters 116a-116d corresponds to one of the four cameras 114a-114d. Each laser emitter 116a-116d is disposed on the same side of the lower portion 112b of the pull rod 112 as the corresponding one of the cameras 114a-114d. Each laser emitter 116a-116d is disposed on one of the four sides of the lower portion 112b of the pull rod 112. Each of the laser emitters 116a-116d is configured to emit light, such as laser beams, in an outward direction from the lower portion 112b of the pull rod 112 and towards one or more targets, such as a user. The light emitted by the laser emitters 116a-116d reflects off of the one or more targets. The light emitted by the laser emitters 116a-116d is invisible to the human eye. Each of the cameras 114a-114d includes an optical filter to identify the light emitted from the laser emitters 116a-116d and reflected off of a target to facilitate determining the proximity of the target relative to the piece of luggage 102. The cameras 114a-114d are configured to take an image of a target that includes light emitted from a respective one of the laser emitters 116a-116d that is reflected off of the target. Images taken by a camera 114a-114d include one or more targets and reflected light such that the higher the reflected light appears in the image, the farther the target is from the piece of luggage 102 and the camera 114a-114d that took the image.
The self-driving system 100 includes one or more proximity sensors 170a, 170b disposed on the piece of luggage 102. Two proximity sensors 170a, 170b are shown coupled to a side of the luggage 102 adjacent to a top end of the piece of luggage 102. Any number of proximity sensors 170a, 170b can be used and located at different positions and/or on any side of the piece of luggage 102. The proximity sensors 170a, 170b are configured to detect the proximity of one or more objects. In one example, the proximity sensors 170a, 170b detect the proximity of a user. In one example, the proximity sensors 170a, 170b detect the proximity of objects (e.g., obstacles) other than the user to facilitate the piece of luggage 102 avoiding the objects as the piece of luggage 102 follows and/or leads the user.
The proximity sensors 170a, 170b include one or more of ultrasonic sensors, sonar sensors, infrared sensors, radar sensors, and/or LiDAR sensors. The proximity sensors 170a, 170b may work with the cameras 120a, 120b, the lower cameras 114a-114d, and/or the laser emitters 116a-116d to facilitate the piece of luggage 102 avoiding obstacles (such as objects other than the user) as the piece of luggage 102 follows and/or leads the user. Obstacles may include other people or objects in the travel path of the luggage 102 when moving in a rear following position, a side following position, or a front leading position relative to the user. When an obstacle is identified, the self-driving system 100 will take corrective action to move the piece of luggage 102 and avoid a collision with the obstacle based on the information received from the self-driving system 100 components, such as one or more of the proximity sensors 170a, 170b, the cameras 120a, 120b, the lower cameras 114a-114d, and/or the laser emitters 116a-116d.
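The corrective action described above may be sketched as follows. This is a hypothetical illustration only: the reading format, thresholds, and command names are assumptions rather than elements of the disclosure.

```python
# Hypothetical sketch of obstacle-avoidance corrective action: fuse
# proximity readings and steer away from the nearest obstacle. Sensor
# interfaces, thresholds, and command names are assumptions.

from dataclasses import dataclass

@dataclass
class ProximityReading:
    bearing_deg: float   # direction of the obstacle relative to heading
    distance_m: float    # measured range to the obstacle

def corrective_action(readings: list[ProximityReading],
                      stop_m: float = 0.3,
                      avoid_m: float = 1.0) -> str:
    """Return a motion command based on the closest detected obstacle."""
    if not readings:
        return "continue"
    nearest = min(readings, key=lambda r: r.distance_m)
    if nearest.distance_m < stop_m:
        return "stop"                       # imminent collision: halt wheels
    if nearest.distance_m < avoid_m:
        # steer away from the side the obstacle is on
        return "steer_right" if nearest.bearing_deg < 0 else "steer_left"
    return "continue"
```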
The onboard ultra-wideband device 200 has a positioning device that includes a control unit 204 and one or more transceivers 202a, 202b, 202c (three are shown). In one example, the control unit 204 is a central processing unit. The onboard ultra-wideband device 200 includes a crystal oscillator 206. The crystal oscillator 206 is an electronic oscillator circuit that uses the mechanical resonance of a vibrating crystal of piezoelectric material to create an electric signal. The electric signal has a frequency that is used to keep track of time to provide a stable clock signal. The transceivers 202a, 202b, 202c share the same crystal oscillator 206 so that they each have the exact same stable clock signal. In one example, the transceivers 202a, 202b, 202c determine from which side a transmitter 402 of a mobile ultra-wideband device 400 is located by calculating the time difference of arrival based on the arrival time of the signal from the transmitter 402 as detected by each transceiver 202a, 202b, 202c relative to the other transceivers. The one or more transceivers 202a, 202b, 202c may be antennas configured to receive one or more signals, such as radio wave signals, from the mobile ultra-wideband device 400. The one or more transceivers 202a, 202b, 202c may be disposed within the onboard ultra-wideband device 200.
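The following minimal sketch (not part of the disclosure) illustrates the time-difference-of-arrival reasoning under the shared-clock arrangement described above; the timestamps and transceiver labels are illustrative only.

```python
# Minimal sketch of time-difference-of-arrival (TDOA) reasoning with a
# shared clock. Timestamps and transceiver labels are illustrative only.

def side_of_arrival(arrival_times_s: dict[str, float]) -> str:
    """Infer which transceiver the transmitter 402 is closest to.

    Because the transceivers share one crystal oscillator, their
    timestamps are directly comparable: the earliest arrival marks the
    nearest antenna, and each pairwise time difference corresponds to a
    range difference of (dt * c).
    """
    c = 299_792_458.0  # speed of light, m/s
    earliest = min(arrival_times_s, key=arrival_times_s.get)
    for name in sorted(arrival_times_s):
        extra_path_m = (arrival_times_s[name] - arrival_times_s[earliest]) * c
        print(f"{name}: +{extra_path_m:.3f} m of path versus {earliest}")
    return earliest

# Example: the signal reaches 202a ~1 ns (about 0.3 m of path) before 202b.
print(side_of_arrival({"202a": 0.0, "202b": 1.0e-9, "202c": 2.5e-9}))
```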
In one embodiment, which can be combined with other embodiments, the onboard ultra-wideband device 200 determines the angle of arrival of a signal transmitted by the transmitter 402 of the mobile ultra-wideband device 400 to determine the position of a user relative to the piece of luggage 102. The control unit 204 and the crystal oscillator 206 continuously calculate the angle at which the transmitter 402 is located relative to two of the three transceivers 202a, 202b, and 202c. The self-driving system 100 is configured to determine the position of the piece of luggage 102 relative to the mobile ultra-wideband device 400 using (1) the angle at which the transmitter 402 is located, as continuously calculated by the onboard ultra-wideband device 200 using the angle of arrival calculation, and (2) the side on which the transmitter 402 is located, as continuously calculated by the onboard ultra-wideband device 200 using the time difference of arrival calculation. When a user includes or wears the mobile ultra-wideband device 400, the self-driving system 100 is configured to determine a position of the piece of luggage 102 relative to the user. In one example, a user wears the mobile ultra-wideband device 400 on a waist of the user, such as on a belt of the user. In one example, a user wears the mobile ultra-wideband device 400 on an arm of the user, such as on a wrist of the user.
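A hedged sketch of the angle of arrival calculation follows; the antenna spacing is an assumed value, and the far-field geometry used here is one common way such an angle is derived (real UWB systems often derive it from carrier phase difference, which reduces to the same geometry).

```python
# Hedged sketch of an angle-of-arrival (AoA) estimate from two of the
# three transceivers. The antenna spacing is an assumed value.

import math

def angle_of_arrival_deg(dt_s: float, spacing_m: float = 0.05) -> float:
    """Far-field AoA: path difference d*sin(theta) = c*dt."""
    c = 299_792_458.0
    ratio = max(-1.0, min(1.0, c * dt_s / spacing_m))  # clamp numeric noise
    return math.degrees(math.asin(ratio))

# Example: ~0.1 ns lead across a 5 cm baseline -> roughly 37 degrees.
print(f"{angle_of_arrival_deg(1.0e-10):.1f} deg")
```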
In one example, the transmitter 402 is integrated into the mobile ultra-wideband device 400. The transmitter 402 may be in the form of hardware disposed within the mobile ultra-wideband device 400 and/or software programmed into the mobile ultra-wideband device 400.
When the self-driving system 100 is in the vision monitoring mode, one or more laser emitters 116a-116d emit one or more flat beams of light 140 towards a user 500. The wavelength of the flat beams of light 140 (such as laser beams) emitted by the laser emitters 116a-116d is within a range of 800 nm to 815 nm, such as 803 nm to 813 nm. One or more of the cameras 114a-114d and/or one or more of the cameras 120a, 120b take one or more images of the user 500. The one or more beams of light 140 reflect off of the user 500 as a horizontal line 142 and at a height h1, as illustrated in the image 150. The one or more images, such as the image 150, taken by the cameras 114a-114d include the user 500 and the horizontal line 142 of light reflected off of the user 500. The one or more cameras 114a-114d and/or the one or more cameras 120a, 120b continuously take images of the user 500 and the surrounding environment of the piece of luggage 102.
The image 150 includes the horizontal line 142 of light being reflected off of the user 500 at the height h1. In the vision monitoring mode, the self-driving system 100 determines a distance D between the user 500 and the piece of luggage 102 using the height h1 at which the horizontal line 142 of light is reflected off of the user 500.
In response to the images taken by the cameras 114a-114d, the self-driving system 100 instructs one or more motorized wheels 106a-106d to move the luggage 102 in a given direction, such as in a given direction towards the user 500 or in a given direction towards a destination. In an example where the position of the user 500 relative to the piece of luggage 102 is determined by the self-driving system 100, the self-driving system 100 will continuously monitor and follow and/or lead the user 500 in a rear following position, a side following position, or a front leading position. In one embodiment, which can be combined with other embodiments, the laser emitters 116a-116d emit light towards a plurality of targets (such as the user 500 and an object). The self-driving system 100 instructs the piece of luggage 102 to follow the target (such as the user 500) that has the smallest height of a horizontal line of reflected light off of that target (such as the height h1 of the horizontal line 142 that is less than a height of an object, such as an obstacle). In one example, the self-driving system 100 instructs the one or more motorized wheels 106a-106d to move the luggage 102 in a given direction towards the target having the smallest height of a horizontal line of reflected light off of that target.
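The two vision rules above, triangulating distance from the image height of the reflected line and following the target whose line reflects lowest (nearest), can be sketched as follows. This is a minimal illustration, not the disclosed implementation; all camera intrinsics and mounting heights are assumed values.

```python
# Minimal sketch: (1) triangulate distance from the image height of the
# reflected laser line, and (2) follow the target whose line is lowest in
# the image (the nearest target). All constants are assumed values.

import math

CAM_HEIGHT_M = 0.9      # assumed camera height on the pull rod
LASER_HEIGHT_M = 0.7    # assumed emitter height below the camera
FOCAL_PX = 800.0        # assumed focal length in pixels
IMG_CENTER_PX = 240.0   # assumed image vertical center (from the bottom)

def line_height_to_distance(h_px: float) -> float:
    """Triangulate distance from the reflected line's image height h_px,
    measured upward from the bottom of the image (like the height h1).

    The camera sits CAM_HEIGHT_M - LASER_HEIGHT_M above the laser plane,
    so nearer targets reflect the line lower in the image (smaller h_px)
    and farther targets reflect it higher, approaching image center.
    """
    angle = math.atan((IMG_CENTER_PX - h_px) / FOCAL_PX)  # below optical axis
    if angle <= 0:
        return float("inf")  # at/above center: effectively very far
    return (CAM_HEIGHT_M - LASER_HEIGHT_M) / math.tan(angle)

def pick_followed_target(line_heights_px: dict[str, float]) -> str:
    """Follow the target whose reflected line has the smallest height."""
    return min(line_heights_px, key=line_heights_px.get)

heights = {"user": 150.0, "obstacle": 220.0}
print(pick_followed_target(heights))              # "user" (nearest)
print(f"{line_height_to_distance(150.0):.2f} m")  # ~1.78 m
```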
The self-driving system 100 uses the position of the user 500 relative to the piece of luggage 102 to calculate the distance D between the user 500 and the piece of luggage 102. In response to the information received from the onboard ultra-wideband device 200, the self-driving system 100 may instruct one or more motorized wheels 106a-106d to move the luggage 102 in a given direction.
The self-driving system 100 is configured to switch between a following mode and a leading mode. In the following mode, the self-driving system 100 instructs the motorized wheels 106a-106d to move the piece of luggage 102 in a given direction towards the user 500. In the following mode, the piece of luggage 102 follows the user 500. In the leading mode, the self-driving system 100 instructs the motorized wheels 106a-106d to move the piece of luggage 102 in a given direction towards a destination, such as a location within an airport, for example a boarding gate in an airport. In the leading mode, the piece of luggage 102 leads the user 500 such that the user 500 may follow the piece of luggage 102.
In each of the following mode and the leading mode, the self-driving system 100 may be in the vision monitoring mode or the radio wave monitoring mode.
The cellular phone 499 is used by the user 500 described above and below. The transmitter 498 is configured to transmit ultra-wideband signals. The mobile ultra-wideband device 400 having the transmitter 402 and the cellular phone 499 having the transmitter 498 may communicate with the communication modules 75, 61, respectively, via ultra-wideband, radio frequency identification (active and/or passive), Bluetooth (low energy), WiFi, and/or any other form of communication known in the art. The cellular phone 499 and the mobile ultra-wideband device 400 are configured to receive information from the CPU 124 regarding the operation of the self-driving system 100. The mobile ultra-wideband device communication module 75 and the phone communication module 61 may each be a separate unit from, or integrated into, the onboard ultra-wideband device 200. The cellular phone 499 may perform one or more of the same functions as the mobile ultra-wideband device 400.
The CPU 124 is configured to switch between the following mode and the leading mode, each of which is discussed above. The CPU 124 defaults to the following mode. The CPU 124 of the self-driving system 100 is configured to switch between the vision monitoring mode and the radio wave monitoring mode, each of which is discussed above.
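A plausible sketch of this mode logic follows: default to the following mode, and switch to the leading mode only when the user opts in and the leading requirements are met. The class and method names are hypothetical, not from the disclosure.

```python
# Hypothetical sketch of the CPU 124 mode logic described above.

from enum import Enum, auto

class Mode(Enum):
    FOLLOWING = auto()   # default: the luggage follows the user
    LEADING = auto()     # the luggage leads the user to a destination

class ModeController:
    def __init__(self):
        self.mode = Mode.FOLLOWING  # defaults to the following mode

    def request_leading(self, requirements_met: bool, user_opted_in: bool):
        """Switch to leading only if requirements are met and user agrees."""
        if requirements_met and user_opted_in:
            self.mode = Mode.LEADING

    def revert_to_following(self):
        """E.g., on arrival at the destination or on user request."""
        self.mode = Mode.FOLLOWING

ctrl = ModeController()
ctrl.request_leading(requirements_met=True, user_opted_in=True)
assert ctrl.mode is Mode.LEADING
```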
When the self-driving system 100 is in the vision monitoring mode, the CPU 124 is configured to receive from the one or more cameras 114a-114d one or more images (such as image 150) of a target (such as user 500) that include the light reflected off of the target (such as the horizontal line 142 of light that is reflected off of the user 500). In response to receiving the images from the one or more cameras 114a-114d, the CPU 124 is configured to determine a distance (such as the distance D) to the target based on a height (such as the height h1) at which the light emitted by a laser emitter 116a-116d is reflected off of the target. The CPU 124 is configured to generate instructions regarding a position of the piece of luggage 102 in relation to the user 500 using the distance D and/or the first height h1. The present disclosure contemplates that the self-driving system 100 described throughout the present disclosure may include a graphics processing unit (GPU) that includes one or more of the aspects, features, and/or components of the CPU 124 described throughout the present disclosure. The self-driving system 100 may include a GPU that performs one or more of the functions performed by the CPU 124 described throughout the present disclosure. As an example, the self-driving system 100 may include a GPU that is configured to receive from the one or more cameras 114a-114d one or more images (such as image 150) of a target (such as user 500) that include the light reflected off of the target, when the self-driving system 100 is in the vision monitoring mode.
When in the radio wave monitoring mode, the CPU 124 receives information from one or more of the onboard ultra-wideband device 200 (such as from the control unit 204) and/or the mobile ultra-wideband device 400 regarding a position of the mobile ultra-wideband device 400 relative to the piece of luggage 102. The CPU 124 uses the information regarding the position of the mobile ultra-wideband device 400 relative to the piece of luggage 102 to determine a distance (such as the distance D) between the piece of luggage 102 and the mobile ultra-wideband device 400. The CPU 124 is configured to generate instructions regarding a position of the piece of luggage 102 in relation to the user 500 using the information regarding the position of the mobile ultra-wideband device 400 relative to the piece of luggage 102 and/or the determined distance between the piece of luggage 102 and the mobile ultra-wideband device 400.
In one example, the CPU 124 and the control unit 204 of the onboard ultra-wideband device 200 are separate units. In one example, the CPU 124 and the control unit 204 are integrated into a single processing unit disposed on the piece of luggage 102. In one example, the CPU 124 and the onboard ultra-wideband device 200 are separate units. In one example, the CPU 124 and the onboard ultra-wideband device 200 are integrated into a single processing unit disposed on the piece of luggage 102.
The CPU 124 sends the generated instructions regarding the position of the piece of luggage 102 in relation to the user 500 to a wheel control module 160. In the following mode the CPU 124 generates and sends instructions for the wheel control module 160 to move the piece of luggage 102 in a given direction at a given speed towards the user 500. In the leading mode the CPU 124 generates and sends instructions for the wheel control module 160 to move the piece of luggage 102 in a given direction at a given speed towards the destination within the airport at which the piece of luggage 102 is located.
Upon receiving instructions from the CPU 124, the wheel control module 160 is configured to control the direction and/or speed of the piece of luggage 102 relative to the user 500 and/or the surrounding environment based on the instructions received from the CPU 124. The wheel control module 160 communicates with a wheel speed sensor 162 and a wheel rotating motor 164. The wheel control module 160 also communicates information regarding the one or more motorized wheels 106a-106d to the CPU 124. Although only one wheel control module 160 is shown, each of the one or more motorized wheels 106a-106d may include a separate wheel control module 160 in communication with the CPU 124. Each of the one or more motorized wheels 106a-106d may include a separate wheel rotating motor 164. In one example, the wheel control module 160 can be integrated into the CPU 124 as a single processing unit. In one example, the CPU 124 includes a single wheel control module 160 to control each of the one or more motorized wheels 106a-106d.
The wheel control module 160 controls the direction and/or speed of the piece of luggage 102 by increasing, decreasing, or stopping power supplied to one or more of the motorized wheels 106a-106d and/or by controlling the direction of the one or more motorized wheels 106a-106d with the wheel rotating motor 164. In one example, one or more of the power distribution module 71, the CPU 124, the onboard ultra-wideband device 200, and the wheel control module 160 are integrated into a single processing unit coupled to the luggage 102.
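The power-based steering described above can be illustrated with a short sketch that assumes a differential-drive layout; the disclosure does not specify the drive geometry, so this is one plausible realization.

```python
# Hedged sketch of speed/direction control, assuming a differential-drive
# layout. A forward speed and a turn rate are mixed into per-side wheel
# power and clamped to the supported range.

def mix_wheel_power(speed: float, turn: float) -> tuple[float, float]:
    """Map (speed, turn), each in [-1, 1], to (left, right) wheel power.

    Positive turn slows the right side to steer right; cutting power to
    one side entirely pivots the piece of luggage in place.
    """
    left = max(-1.0, min(1.0, speed + turn))
    right = max(-1.0, min(1.0, speed - turn))
    return left, right

print(mix_wheel_power(0.5, 0.0))  # (0.5, 0.5) straight ahead
print(mix_wheel_power(0.5, 0.3))  # (0.8, 0.2) gentle right turn
```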
A positioning module 74 communicates information regarding the position of the luggage 102 to the CPU 124, the onboard ultra-wideband device 200, and/or the user 500 (via the cellular phone 499 and/or the mobile ultra-wideband device 400 for example). The positioning module 74 may be a separate unit or integrated into the onboard ultra-wideband device 200. The positioning module 74 may include one or more of a computer vision based module, GPS module, 4G module, 5G module, WiFi module, iBeacon module, Zigbee module, and/or Bluetooth module so that the user 500 can find the location of the self-driving system 100 at any time, such as in the event that the self-driving system 100 is lost.
An accelerometer 51 is configured to communicate information regarding the overall acceleration and/or speed of the self-driving system 100 to the CPU 124. A wheel orientation sensor 166 is configured to communicate information regarding the orientation of the one or more motorized wheels 106a-106d to the CPU 124. The CPU 124 is also in communication with an inertial measurement unit (IMU) 77 and the proximity sensors 170a, 170b. The IMU 77 communicates information regarding the dynamic movements of the self-driving system 100, such as the pitch, roll, yaw, acceleration, and/or angular rate of the self-driving system 100, to the CPU 124. In one example, when the IMU 77 detects that the self-driving system 100 is tilting or about to fall over, the CPU 124 will instruct the wheel control module 160 to cut power to one or more of the motorized wheels 106a-106d to prevent the self-driving system 100 from falling over. The proximity sensors 170a, 170b are configured to communicate information regarding the presence of targets near the self-driving system 100 to the CPU 124.
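A minimal sketch of the tip-over safeguard follows; the 30-degree threshold is an assumption, not a value from the disclosure.

```python
# Minimal sketch of the IMU 77 tip-over safeguard: if the reported tilt
# is excessive, cut wheel power. The threshold is an assumed value.

import math

TILT_LIMIT_DEG = 30.0  # assumed safety threshold

def should_cut_power(pitch_deg: float, roll_deg: float) -> bool:
    """Combine pitch and roll into one tilt magnitude and compare."""
    tilt = math.hypot(pitch_deg, roll_deg)
    return tilt > TILT_LIMIT_DEG

# Example: 25 deg pitch with 20 deg roll exceeds the limit -> cut power.
print(should_cut_power(25.0, 20.0))  # True (combined tilt ~= 32 deg)
```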
The CPU 124 is in communication with the status indicator 300 and the one or more infrared sensors 310. The CPU 124 is configured to generate instructions regarding a status of the piece of luggage 102. The status of the piece of luggage 102 is determined by the CPU 124 based on information received from the various components (e.g., one or more of cameras 120a, 120b, proximity sensors 170a, 170b, cameras 114a-114d, laser emitters 116a-116d, the various modules 61, 74, 75, 160, the mobile ultra-wideband device 400, and/or the onboard ultra-wideband device 200) of the self-driving system 100. The CPU 124 is configured to automatically switch to a manual pull mode in response to a detection by the infrared sensors 310a, 310b.
The self-driving system 100 includes a data storage 320. The data storage 320 stores data, such as data relating to the airport at which the piece of luggage 102 is located. The data storage 320 stores map data 321 relating to a map of the airport. The data storage 320 also stores a plurality of image feature points 322 for the airport.
The self-driving system 100 includes a remote server 340. The remote server 340 may include data regarding the airport at which the piece of luggage 102 is located, such as map data relating to the map of the airport and a plurality of image feature points for the airport. The remote server 340 may also emit radio wave signals. The self-driving system 100 includes a direct communication module 350. The direct communication module 350 may include one or more of a computer vision based module, GPS module, 4G module, 5G module, WiFi module, iBeacon module, Zigbee module, and/or Bluetooth module. The CPU 124 may communicate with the remote server 340 using the cellular phone 499 and/or the direct communication module 350. In one example, data and/or radio wave signals are sent from the remote server 340 to the cellular phone 499 of the user 500, and then relayed through the phone communication module 61 to the CPU 124. In one example, the data and/or radio wave signals are sent from the remote server 340 to the direct communication module 350, and then relayed to the CPU 124. Data received from the remote server 340, such as map data and image feature points, may be stored in the data storage 320.
In one example, the image 419 is taken at the current location of the piece of luggage 102. The plurality of image feature points 420 are associated with a set of the plurality of image feature points stored in the data storage 320 and/or provided by the remote server 340 to determine the current location of the piece of luggage 102. In one example, the CPU 124 associates the plurality of image feature points 420 of the image 419 with the plurality of image feature points stored in the data storage 320 that correspond to the second location 412. The CPU 124 hence determines that the current location of the piece of luggage 102 is at the second location 412.
Images 419 may be taken along a path from the current location (e.g., the second location 412) to the destination (e.g., the first location 411) to determine if the image feature points along the path correspond to the plurality of image feature points 322 stored in the data storage 320 for locations along the path.
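The feature-point localization described above can be sketched using OpenCV ORB features as one concrete choice; the disclosure does not name a feature type, so the descriptor, matcher, and threshold below are assumptions.

```python
# Sketch of feature-point localization: match the current image against
# stored feature sets for known locations; the best match wins. ORB is
# one assumed feature type; the match-distance threshold is illustrative.

import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def localize(current_img, stored_descriptors: dict):
    """Return the stored location whose feature set best matches the image.

    stored_descriptors maps a location name (e.g., "location_412") to ORB
    descriptors previously computed for images of that location.
    """
    _, desc = orb.detectAndCompute(current_img, None)
    if desc is None:
        return None  # no usable feature points in the current image
    best_loc, best_score = None, 0
    for location, ref_desc in stored_descriptors.items():
        matches = matcher.match(desc, ref_desc)
        # count only sufficiently close descriptor matches
        score = sum(1 for m in matches if m.distance < 40)
        if score > best_score:
            best_loc, best_score = location, score
    return best_loc
```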
If the one or more leading requirements are met, then the self-driving system 100 prompts the user 500 to switch to the leading mode at block 509. The self-driving system 100 prompts the user 500 by sending a prompt to the user's cellular phone 499. A message is also displayed on the cellular phone 499 informing the user 500 that the leading mode is ready. On the cellular phone 499 and in response to the prompt, the user 500 may select a destination, whether to turn a follower proximity function on, and/or whether to switch the self-driving system 100 from the following mode to the leading mode. The user 500 may also select other parameters in response to the prompt, such as an obstacle avoidance mode and a speed for the piece of luggage 102. At block 511, the self-driving system 100 receives user input from the cellular phone 499 of the user 500. The user input includes the user's selections, such as the destination and a decision to switch to the leading mode. The destination may be a location within the airport in which the piece of luggage 102 is located, such as a boarding gate or an information desk.
At block 513, the leading mode is started. The leading mode is started by using the CPU 124 to switch from the following mode to the leading mode. At block 515, the CPU 124 instructs the one or more motorized wheels 106a-106d to move the luggage 102 in a given direction towards the destination of the user input. In the leading mode, the self-driving system 100 instructs the piece of luggage 102 to lead the user 500 to the destination. At block 517, the self-driving system 100 determines if the follower proximity function is on. If the follower proximity function is not on, the piece of luggage 102 proceeds to lead the user 500 to the destination until the piece of luggage 102 arrives at the destination at block 521. If the follower proximity function is on, the self-driving system 100 monitors a proximity of the user 500 relative to the piece of luggage 102 at block 519. In the leading mode, one or more of the sensors 114a-114d (such as the back sensor 114d) and/or one or more of the sensors 120a, 120b (such as the second sensor 120b) may monitor the proximity of the user 500 by taking one or more images of the user 500. One or more of the sensors 114a-114d (such as the front sensor 114b) and/or one or more of the sensors 120a, 120b (such as the first sensor 120a) may monitor the front side 105 of the piece of luggage 102 to avoid obstacles.
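A plausible sketch of the leading-mode flow at blocks 513-521 follows. The pause behavior when the user falls behind is an assumption; the disclosure states only that the user's proximity is monitored when the follower proximity function is on. All helper methods are hypothetical.

```python
# Hypothetical sketch of the leading-mode flow at blocks 513-521.

def leading_loop(system, destination, follower_proximity_on: bool,
                 max_gap_m: float = 3.0):
    """Drive toward the destination, optionally watching the follower."""
    while not system.arrived_at(destination):          # block 521
        system.step_toward(destination)                # block 515
        if follower_proximity_on:                      # block 517
            gap = system.user_distance_m()             # block 519
            if gap > max_gap_m:
                system.pause()   # assumed response: wait for the user
            else:
                system.resume()
```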
The distance D between the user 500 and the piece of luggage 102 may be used to monitor the proximity of the user 500 when the follower proximity function is on.
At block 537, the self-driving system 100 determines if at least one of a vision based navigation or a radio wave based navigation is available for the airport determined at block 527. Determining if the vision based navigation is available includes determining whether a map and a plurality of image feature points of the airport are available, and determining a current location of the piece of luggage 102 using the map and the plurality of image feature points. Determining if the map and the plurality of image feature points of the airport are available includes determining if the map and the plurality of image feature points are stored in the data storage 320. If the map and the plurality of image feature points are not stored in the data storage 320, the map and the plurality of image feature points are downloaded from the remote server 340. In one example, the map and the plurality of image feature points are downloaded through the cellular phone 499 of the user 500.
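This cache-then-download check can be sketched as follows; the storage and server interfaces are hypothetical placeholders for the data storage 320 and the remote server 340.

```python
# Minimal sketch of the availability check: use cached map data if
# present, otherwise download from the remote server and cache it.
# The storage/server interfaces are hypothetical.

def get_airport_map_data(airport_id: str, data_storage, remote_server):
    """Return (map_data, feature_points) for vision based navigation."""
    cached = data_storage.get(airport_id)
    if cached is not None:
        return cached
    downloaded = remote_server.download_map_and_features(airport_id)
    if downloaded is None:
        return None  # vision based navigation unavailable for this airport
    data_storage.put(airport_id, downloaded)  # cache for later use
    return downloaded
```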
The determining the current location of the piece of luggage 102 includes taking one or more images 149 using one or more of the cameras 114a-114d and/or one or more of the cameras 120a, 120b. The images 149 include a plurality of image feature points 420. The plurality of image feature points 420 are associated with the downloaded and/or stored plurality of image feature points of a location of the airport to determine the current location of the piece of luggage 102. That is, the downloaded and/or stored plurality of image feature points that match up with the plurality of image feature points 420 correspond to the location that is the current location of the piece of luggage 102.
Determining if the radio wave based navigation is available includes prompting the remote server 340 by asking whether the airport determined at block 527 supports radio wave based navigation. If the airport supports radio wave based navigation, the remote server 340 will transmit a radio wave signal. The radio wave signal is received and the self-driving system 100 determines if the radio wave signal is sufficient to determine a current location of one or more of the user 500 and/or the piece of luggage 102. If the radio wave signal is sufficient, the current location is determined. If the radio wave signal is insufficient, or if the radio wave signal is not received by the self-driving system 100, then a message is displayed on the cellular phone 499 of the user 500 for the user 500 to move to a new location so that the remote server 340 may be prompted again. The new location is different from the current location. The piece of luggage 102 may also be prompted to move to the new location.
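This prompt-measure-relocate flow can be sketched as follows; the signal-strength threshold, retry cap, and interfaces are assumptions rather than elements of the disclosure.

```python
# Hedged sketch of the radio wave availability check: prompt the server,
# test signal sufficiency, and ask the user to relocate and retry when the
# signal is missing or too weak. Threshold and retry cap are assumed.

def radio_navigation_available(server, phone, min_signal: float = 0.2,
                               max_retries: int = 3) -> bool:
    if not server.supports_radio_navigation():
        return False
    for _ in range(max_retries):
        signal = server.measure_signal()     # None if nothing received
        if signal is not None and signal >= min_signal:
            return True                      # sufficient: location fixable
        phone.display("Weak signal - please move to a new location.")
        phone.wait_for_user_move()           # then prompt the server again
    return False
```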
The CPU 124 of the self-driving system 100 may prompt the remote server 340 for the radio wave signal and/or receive the radio wave signal from the remote server 340 using one or more of the cellular phone 499, the direct communication module 350, and/or the positioning module 74.
If the CPU 124 determines that the vision based navigation is available, the vision based navigation is used to navigate the piece of luggage 102 through the airport during the leading mode after the leading mode starts at block 513. If the CPU 124 determines that the radio wave based navigation is available, the radio wave based navigation is used to navigate the piece of luggage 102 through the airport during the leading mode after the leading mode starts at block 513.
If the vision based navigation is used during the leading mode, the back camera 114d may be used to monitor the proximity of the user 500 by taking one or more images 150 of the user 500. The front camera 114b and left and right side cameras 114a, 114c may be used to avoid obstacles and navigate through the airport toward the destination by taking one or more images 419 of the airport. A computer vision based module may be used as the positioning module 74 to navigate through the airport for vision based navigation.
If the radio wave based navigation is used during the leading mode, the back camera 114d may be used to monitor the proximity of the user 500 by taking one or more images 150 of the user 500. The front camera 114b and left and right side cameras 114a, 114c may be used to avoid obstacles by taking one or more images 419 of the airport having the obstacles. A radio wave module, such as a 4G module, 5G module, iBeacon module, and/or Zigbee module, may be used as the positioning module 74 to navigate through the airport for radio wave based navigation.
Different cameras of the one or more sensors 120a, 120b and/or the one or more sensors 114a-114d may monitor the proximity of the user 500 as the self-driving system 100 switches between the leading mode and the following mode.
For example, the left side camera 114a may be used to monitor the proximity of the user 500 at block 505 in the following mode by taking one or more images of the user 500. At block 505, the piece of luggage 102 may follow the user 500 on a right side of the user 500 such that the left side camera 114a faces the user 500. At block 513, during the leading mode, another camera, such as the back camera 114d, may face the user 500 and be used to monitor the proximity of the user 500.
In another example, the right side camera 114c is used to monitor the proximity of the user 500 in the following mode by taking one or more images of the user 500 while the front camera 114b, left side camera 114a, and/or back camera 114d may be used for positioning, navigation, and/or obstacle avoidance. In such an example, a first camera (e.g., the back camera 114d) is used by the CPU 124 to monitor the proximity of the user 500 in the leading mode and a second camera (e.g., the right side camera 114c) is used in the following mode.
As the user 500 walks from the third position 549 to the fourth position 550, the front camera 114b faces the user and is used to monitor the proximity of the user 500 while the left side camera 114a, a right side camera 114c, and/or back camera 114d may be used for positioning, navigation, and/or obstacle avoidance.
The leading mode, and the ability to switch between the leading mode and the following mode, of the self-driving system 100 facilitate effectively and efficiently finding destinations in an airport. Benefits of the present disclosure include effectively and efficiently finding destinations, such as boarding gates, in airports; time savings; ease of finding destinations; reduced or eliminated probability of missing a connecting flight; and reduced or eliminated probability of damage to cameras. It is contemplated that one or more of the aspects disclosed herein may be combined. Moreover, it is contemplated that one or more of the aspects disclosed herein may include some or all of the aforementioned benefits.
While the foregoing is directed to embodiments of the present disclosure, other and further embodiments of the present disclosure may be devised without departing from the basic scope thereof. The present disclosure also contemplates that one or more aspects of the embodiments described herein may be substituted in for one or more of the other aspects described. The scope of the present disclosure is determined by the claims that follow.
Number | Date | Country | Kind
202010012417.5 | Jan 2020 | CN | national