The present specification generally relates to autonomous navigation and, more specifically, to systems and methods for autonomous navigation to locate a user with ultra-wideband (UWB) sensing.
Autonomous vehicles may employ technology that utilizes Global Positioning System (GPS) data to locate passengers. However, such GPS data may not be accurate enough to locate passengers with sufficient specificity. For instance, GPS data may not be able to differentiate which side of a street a user is on. For manually driven ridesharing vehicles, users may compensate for inaccurate GPS data by verifying with the driver that they are the right passenger, calling the driver, and the like. However, such options to contact a driver are not typically available for autonomously operated ridesharing vehicles involving autonomous navigation. A user may therefore desire a manner by which an autonomous vehicle can accurately locate the user beyond the use of GPS data, to compensate for potentially inaccurate GPS data.
In one embodiment, a navigation system may include one or more processors, a non-transitory memory communicatively coupled to the one or more processors, and machine readable instructions stored in the non-transitory memory. The machine readable instructions cause the navigation system to perform at least the following when executed by the one or more processors: interact, via an ultra-wideband (UWB) sensor of an autonomous vehicle, with a smart mobile device of a user to locate the user for pick up at a pickup position by the autonomous vehicle, locate the user for pick up by the autonomous vehicle based on the UWB sensor interaction, and navigate to and pick up the user at the pickup position via the autonomous vehicle. The machine readable instructions further cause the navigation system to perform at least the following when executed by the one or more processors: detect, via the UWB sensor, when the user is in the autonomous vehicle in a collected position, and operate the autonomous vehicle when the user is detected via the UWB sensor to be in the autonomous vehicle in the collected position.
In another embodiment, a method for autonomous navigation may include interacting, via an ultra-wideband (UWB) sensor of an autonomous vehicle, with a smart mobile device of a user to locate the user for pick up at a pickup position by the autonomous vehicle, locating the user for pick up by the autonomous vehicle based on the UWB sensor interaction, navigating to and picking up the user at the pickup position via the autonomous vehicle, detecting, via the UWB sensor, when the user is in the autonomous vehicle in a collected position, and operating the autonomous vehicle when the user is detected via the UWB sensor to be in the autonomous vehicle in the collected position.
In yet another embodiment, an autonomous vehicle may include a navigation system communicatively coupled to the autonomous vehicle, an ultra-wideband (UWB) sensor, one or more processors, a non-transitory memory communicatively coupled to the one or more processors, and machine readable instructions stored in the non-transitory memory. The machine readable instructions cause the autonomous vehicle to perform at least the following when executed by the one or more processors: interact, via the UWB sensor of the autonomous vehicle, with a smart mobile device of a user to locate the user for pick up at a pickup position by the autonomous vehicle, locate the user for pick up by the autonomous vehicle based on the UWB sensor interaction, and, via the navigation system, navigate to and pick up the user at the pickup position via the autonomous vehicle. The machine readable instructions further cause the autonomous vehicle to perform at least the following when executed by the one or more processors: detect, via the UWB sensor, when the user is in the autonomous vehicle in a collected position, and operate the autonomous vehicle when the user is detected via the UWB sensor to be in the autonomous vehicle in the collected position.
These and additional features provided by the embodiments of the present disclosure will be more fully understood in view of the following detailed description, in conjunction with the drawings.
The embodiments set forth in the drawings are illustrative and exemplary in nature and not intended to limit the disclosure. The following detailed description of the illustrative embodiments can be understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
Embodiments of the present disclosure are directed to ridesharing technology that enables an autonomous ridesharing vehicle to locate passengers with increased precision through the use of ultra-wideband (UWB) sensing. Such an autonomous ridesharing vehicle equipped with UWB technology may be configured to interact with a smart mobile device of a user, for instance. The terms “interact” and “interaction” as referenced herein describe an electronic communication including a transmission and reception of electronic data signals between at least two electronic devices, such as through wired or wireless electronic communication networks. Through the use of UWB technology, the autonomous ridesharing vehicle may more accurately determine the location of the user who requested the vehicle.
As a non-limiting example, the embodiments disclosed herein may be directed to locating users with ultra-wideband (UWB) sensing when a vehicle is in autonomous mode. Such UWB sensing utilizes UWB technology, such as UWB sensors, configured to implement radio technology via a pulse-based system that uses a low energy level for short-range, high-bandwidth communication over a broad portion of a radio spectrum, such as for transmission of information over a large bandwidth of greater than 500 MHz, as understood by one of ordinary skill in the art. Such UWB technology may employ transmit-time communication schemes, such as Time of Flight (ToF), instead of signal strength measurement of electronic communications. The autonomous vehicles may, through a navigation system, locate a user with UWB sensing between the autonomous vehicle and the smart mobile device of the user, collect the user, determine the user is collected, and, based on the determination, navigate to deliver the collected user to a destination. Various autonomous navigation methods and systems will be described in more detail herein with specific reference to the corresponding drawings.
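By way of illustration only, the following minimal Python sketch shows how such a ToF scheme might estimate the distance between a UWB sensor and a smart mobile device from transmit-time measurements rather than signal strength. The function names and the poll/response timing convention are assumptions of the sketch and do not form part of this disclosure.

```python
# Minimal sketch of single-sided two-way ranging (a ToF scheme), assuming the
# vehicle's UWB sensor timestamps a poll/response exchange with the smart
# mobile device. All names here are illustrative, not from this disclosure.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(t_round_s: float, t_reply_s: float) -> float:
    """Estimate range from the round-trip time measured at the vehicle and
    the device's known reply delay; the one-way time of flight is half the
    difference between the two."""
    time_of_flight_s = (t_round_s - t_reply_s) / 2.0
    return time_of_flight_s * SPEED_OF_LIGHT_M_PER_S

# Example: a round trip 100 ns longer than the device's reply delay
# corresponds to roughly 15 m between the vehicle and the device.
print(round(tof_distance_m(1.0001e-3, 1.0e-3), 2))  # ~14.99
```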
Referring to
A user 106 may use a smart mobile device 108 to request pick up and a ride by the autonomous vehicle 102. The user 106 may request pick up at a pickup position 112. The autonomous vehicle 102 may use a UWB sensor 110 to interact with the smart mobile device 108 to locate the user 106 at the pickup position 112, as described in greater detail further below. In an embodiment, the autonomous vehicle 102, such as through a navigation system 300 described in greater detail below with respect to
When the user 106 is picked up and seated in the autonomous vehicle 102, as shown in
Referring again to
Referring to
In block 204, the user 106 is located for pick up by the autonomous vehicle 102 based on the UWB sensor interaction of block 202. In an embodiment, based on the UWB sensor interaction, the pickup position 112 is determined to be the first side 104A or the second side 104B of the roadway surface 104.
In block 206, the autonomous vehicle 102 navigates to and picks up the user 106 at the pickup position 112. As a non-limiting example, the autonomous vehicle 102 navigates to and picks up the user 106 at the pickup position 112 that is determined to be the first side 104A or the second side 104B of the roadway surface 104. The autonomous vehicle 102 may be configured to detect, via the interaction between the UWB sensor 110 and the smart mobile device 108 of the user 106, a distance to and a direction toward the user 106 to locate the user 106 with accuracy. Thus, the autonomous vehicle 102 may determine how far away the user 106 is and on which side of the roadway surface 104 the user 106 is located.
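Purely as an illustrative sketch, and assuming a vehicle coordinate frame and names not found in this disclosure, the detected distance and direction might be resolved into a lateral offset indicating the user's side of the roadway surface 104:

```python
import math

# Hypothetical sketch: resolve a UWB range and bearing reading into a
# lateral offset, then classify the user's side relative to the vehicle.
# Assumes a vehicle frame with x pointing forward and y pointing left.

def side_of_roadway(range_m: float, bearing_deg: float) -> str:
    """Positive lateral offsets fall to the vehicle's left; negative to its
    right. The mapping to the first side 104A or second side 104B depends
    on the vehicle's heading relative to the roadway surface."""
    lateral_offset_m = range_m * math.sin(math.radians(bearing_deg))
    return "left of vehicle" if lateral_offset_m >= 0.0 else "right of vehicle"

print(side_of_roadway(15.0, -20.0))  # user ~5.1 m to the right of the vehicle
```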
In block 208, via the UWB sensor 110, the autonomous vehicle 102 detects when the user 106 is in the autonomous vehicle 102 in the collected position 114. Based on the interaction between the UWB sensor 110 and the smart mobile device 108 of the user 106 to determine the distance to and the direction toward the user 106, the autonomous vehicle 102 may determine when the user 106 is in the collected position 114 and seated in the autonomous vehicle 102. The autonomous vehicle 102 may further be configured to use other vehicle sensors to determine when the user 106 is in the collected position 114. By way of example, and not as a limitation, the autonomous vehicle 102 may include a knowledge database comprising one or more internal vehicle characteristics, such as, but not limited to, cabin space dimensions, rear seat distance dimensions, other internal spacing dimensions, and the like, to determine when the user 106 is in the vehicle. Other in-vehicle sensors (e.g., seat sensors and/or cameras inside the vehicle) may further be utilized to assist with a confidence value of the determination that the user 106 is in the autonomous vehicle 102. The autonomous vehicle 102 may be configured to determine the confidence value and/or a probability calculation based on the UWB sensor 110 interaction with the smart mobile device 108, use of the knowledge database of one or more internal vehicle characteristics, other vehicle sensors in the autonomous vehicle 102, or combinations thereof. The confidence value may be representative of a determined confidence associated with a determination that the user 106 is in the collected position 114, and the probability calculation may be representative of a probability that the user 106 is in the collected position 114 associated with the determination by the autonomous vehicle 102. In embodiments, when the confidence value and/or the probability calculation is above a threshold, the autonomous vehicle 102 may then provide the determination that the user 106 is in the collected position 114.
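As one hedged illustration of such a determination, the sketch below fuses the UWB cue with other in-vehicle sensor cues into a confidence value compared against a threshold; the weights, the threshold, and all names are assumptions of the sketch, not values from this disclosure.

```python
# Hypothetical sketch of fusing cues into a confidence value that the user
# is in the collected position. The weights and the 0.8 threshold are
# assumed for illustration; a deployed system would calibrate them.

def collected_confidence(uwb_in_cabin: bool,
                         seat_sensor_occupied: bool,
                         camera_detects_user: bool) -> float:
    weights = {"uwb": 0.5, "seat": 0.3, "camera": 0.2}
    score = 0.0
    if uwb_in_cabin:           # UWB range/direction falls within the cabin
        score += weights["uwb"]  # dimensions from the knowledge database
    if seat_sensor_occupied:
        score += weights["seat"]
    if camera_detects_user:
        score += weights["camera"]
    return score

CONFIDENCE_THRESHOLD = 0.8

def user_in_collected_position(uwb_in_cabin: bool,
                               seat_sensor_occupied: bool,
                               camera_detects_user: bool) -> bool:
    return collected_confidence(
        uwb_in_cabin, seat_sensor_occupied, camera_detects_user
    ) >= CONFIDENCE_THRESHOLD
```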
Based on a determination that the user 106 is in the collected position 114, the autonomous vehicle 102 can start navigating to the next destination. Prior to navigation, the autonomous vehicle 102 may request feedback from the user 106, such as requesting the user 106 to confirm that the user 106 is in the autonomous vehicle 102. Such feedback may be requested through the smart mobile device 108 of the user 106, through a display within the autonomous vehicle 102, through audio technology within the autonomous vehicle 102, or combinations thereof. The feedback may be requested at multiple stages, including when the user 106 has reached the destination requested by the user 106. For instance, the autonomous vehicle 102 may request the user 106 to confirm that the user 106 has reached the destination requested by the user 106.
In block 210, the autonomous vehicle 102 begins operation, such as starting navigation, when the user 106 is detected via the UWB sensor 110 to be in the autonomous vehicle 102 in the collected position 114. The autonomous vehicle 102 may delay operation for a predetermined time period to provide the user 106 with sufficient time to be seated prior to vehicle operation, may request user feedback that the user 106 is in the collected position 114, such as through the smart mobile device 108 or a feedback console within the autonomous vehicle 102, or combinations thereof. The autonomous vehicle 102 may then navigate to the received destination when the user 106 is detected in the collected position 114. In an embodiment in which multiple passengers are riding in the autonomous vehicle 102, the application tool utilized by the smart mobile device 108 to interact with the autonomous vehicle 102 may need to be informed about the multiple passengers.
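For illustration only, the pre-departure gating described above might be sketched as follows; the delay length and the confirmation callable are assumptions standing in for a prompt via the smart mobile device 108, an in-vehicle display, or audio technology.

```python
import time

# Hypothetical sketch of the departure gate: wait a predetermined settling
# period, optionally ask the user to confirm, then begin navigation.
# request_user_confirmation() is a placeholder, not an interface from this
# disclosure.

SETTLING_DELAY_S = 5.0  # assumed predetermined time period

def begin_trip(user_in_collected_position: bool,
               request_user_confirmation) -> bool:
    if not user_in_collected_position:
        return False                  # keep waiting; do not operate
    time.sleep(SETTLING_DELAY_S)      # give the user time to be seated
    if not request_user_confirmation("Are you seated and ready to depart?"):
        return False                  # user declined; hold position
    return True                       # safe to navigate to the destination
```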
When the user 106 reaches the destination requested by the user 106, the autonomous vehicle 102 may determine that the smart mobile device 108 remains within the autonomous vehicle 102 while the user 106 has left the autonomous vehicle 102. The autonomous vehicle 102 may make such a determination through a combination of interaction between the UWB sensor 110 and the smart mobile device 108 as well as use of vehicle sensors as described herein to determine the user 106 is not within the autonomous vehicle 102. In an embodiment, the autonomous vehicle 102, such as through the navigation system 300, is configured to navigate to a destination when the user 106 is detected in the collected position 114, determine as a user exit determination that the user 106 has exited the autonomous vehicle 102 based on one or more vehicle sensors, determine as a device determination that the smart mobile device 108 is within the autonomous vehicle 102 based on the interaction between the UWB sensor 110 and the smart mobile device 108, and alert the user that the smart mobile device 108 is within the autonomous vehicle 102 via an alert notification based on the user exit determination and the device determination.
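A minimal sketch of the two determinations described above, with names assumed for illustration, might be:

```python
# Hypothetical sketch of the left-behind-device check: alert only when the
# user exit determination and the device determination both hold.

def should_alert_device_left(user_exited: bool,
                             device_in_vehicle: bool) -> bool:
    """user_exited is the user exit determination (from vehicle sensors,
    e.g., seat sensors and cameras); device_in_vehicle is the device
    determination (UWB ranging places the smart mobile device within the
    cabin dimensions)."""
    return user_exited and device_in_vehicle
```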
In embodiments, the UWB sensor 110 of the autonomous vehicle 102 may interact with the smart luggage 116 of the user 106 to sense whether, after the user 106 and the smart mobile device 108 have left the autonomous vehicle 102, the user 106 has collected the smart luggage 116 from, for example, a trunk of the autonomous vehicle 102 within a predetermined period of time. The autonomous vehicle 102 may sense whether the smart luggage 116 has been collected from the trunk based on internal trunk characteristics in the knowledge database, knowledge of whether the trunk is open or closed via associated vehicle sensors such as trunk sensors, or combinations thereof. If the autonomous vehicle 102 detects that the smart luggage 116 has not been collected, such as after the predetermined period of time, the autonomous vehicle 102 may alert the user 106 through the alert technology 118, a notification message to the application tool of the smart mobile device 108, or combinations thereof and as described herein. The autonomous vehicle 102, such as through the navigation system 300, may be configured to navigate to a destination when the user 106 is detected in the collected position 114, determine as the user exit determination that the user 106 has exited the autonomous vehicle 102 based on one or more vehicle sensors, determine as a luggage determination that the smart luggage 116 of the user 106 is within the autonomous vehicle 102 based on the interaction between the UWB sensor 110 and the smart luggage 116, and alert the user 106 that the smart luggage 116 is within the autonomous vehicle 102 via an alert notification based on the user exit determination and the luggage determination.
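By way of illustration, the luggage check might be sketched as follows, where the 60-second window and the polling callables are assumptions rather than values or interfaces from this disclosure:

```python
import time

# Hypothetical sketch of the luggage check: after the user and device have
# left, start a timer; if UWB ranging still places the smart luggage in the
# trunk once the predetermined period elapses, raise an alert.

COLLECTION_WINDOW_S = 60.0  # assumed predetermined period of time

def monitor_luggage(luggage_in_trunk, trunk_closed, raise_alert) -> None:
    """luggage_in_trunk() and trunk_closed() poll UWB ranging and the trunk
    sensor respectively; raise_alert() triggers the alert technology 118
    and/or a notification to the application tool."""
    deadline = time.monotonic() + COLLECTION_WINDOW_S
    while time.monotonic() < deadline:
        if not luggage_in_trunk():
            return                    # luggage collected; nothing to do
        time.sleep(1.0)               # poll periodically
    if luggage_in_trunk() and trunk_closed():
        raise_alert("Smart luggage remains in the trunk.")
```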
The alert notifications described herein may include an alert through the alert technology 118 associated with the autonomous vehicle 102, a notification message transmitted to an application tool of the smart mobile device 108, or combinations thereof. In embodiments, the alert technology 118 includes sound technology, tactile technology, visual technology, or combinations thereof. The alert technology 118 may be configured to generate the alert through the autonomous vehicle 102 that includes a sound alert, a haptic alert, a visual alert, or combinations thereof. The visual alert may include one or more vehicle lights flashing, and the sound alert may include vehicle honking, an audio message transmitted by audio speakers, or combinations thereof.
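As a final illustrative sketch, an alert notification might be dispatched across these modalities as follows; the callables stand in for vehicle interfaces and the application tool and are not interfaces defined by this disclosure.

```python
# Hypothetical sketch of dispatching an alert across the modalities listed
# above. Each callable is a placeholder for a vehicle interface (lights,
# horn, speakers) or the application tool of the smart mobile device.

def dispatch_alert(message: str, flash_lights, honk, play_audio, notify_app):
    flash_lights()       # visual alert: one or more vehicle lights flashing
    honk()               # sound alert: vehicle honking
    play_audio(message)  # sound alert: audio message via audio speakers
    notify_app(message)  # notification message to the smart mobile device
```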
The autonomous vehicle 102 and the navigation system 300 may thus be configured to (1) interact, via the UWB sensor 110, with a smart mobile device 108 of a user 106 in a pickup position 112 to locate the user 106 for pick up at the pickup position 112 by the autonomous vehicle 102 (
Referring to
The navigation system 300 includes machine readable instructions stored in non-transitory memory that cause the navigation system 300 to perform one or more instructions when executed by the one or more processors, as described in greater detail below. The navigation system 300 includes a communication path 302, one or more processors 304, a memory component 306, a localization component 312, a storage or database 314, an alert component 316, network interface hardware 318, a server 320, a network 322, and at least one computer 324. The various components of the navigation system 300 and the interaction thereof will be described in detail below.
In some embodiments, the navigation system 300 is implemented using a wide area network (WAN) or network 322, such as an intranet or the Internet, or other wired or wireless communication network that may include a cloud computing-based network configuration. The computer 324 may include digital systems and other devices permitting connection to and navigation of the network 322, such as the smart mobile device 108. Other navigation system 300 variations allowing for communication between various geographically diverse components are possible. The lines depicted in
As noted above, the navigation system 300 includes the communication path 302. The communication path 302 may be formed from any medium that is capable of transmitting a signal such as, for example, conductive wires, conductive traces, optical waveguides, or the like, or from a combination of mediums capable of transmitting signals. The communication path 302 communicatively couples the various components of the navigation system 300. As used herein, the term “communicatively coupled” means that coupled components are capable of exchanging data signals with one another such as, for example, electrical signals via conductive medium, electromagnetic signals via air, optical signals via optical waveguides, and the like.
As noted above, the navigation system 300 includes the processor 304. The processor 304 can be any device capable of executing machine readable instructions. Accordingly, the processor 304 may be a controller, an integrated circuit, a microchip, a computer, or any other computing device. The processor 304 is communicatively coupled to the other components of the navigation system 300 by the communication path 302. Accordingly, the communication path 302 may communicatively couple any number of processors with one another, and allow the modules coupled to the communication path 302 to operate in a distributed computing environment. Specifically, each of the modules can operate as a node that may send and/or receive data. The processor 304 may process the input signals received from the system modules and/or extract information from such signals.
As noted above, the navigation system 300 includes the memory component 306 which is coupled to the communication path 302 and communicatively coupled to the processor 304. The memory component 306 may be a non-transitory computer readable medium or non-transitory computer readable memory and may be configured as a nonvolatile computer readable medium. The memory component 306 may comprise RAM, ROM, flash memories, hard drives, or any device capable of storing machine readable instructions such that the machine readable instructions can be accessed and executed by the processor 304. The machine readable instructions may comprise logic or algorithm(s) written in any programming language such as, for example, machine language that may be directly executed by the processor, or assembly language, object-oriented programming (OOP), scripting languages, microcode, etc., that may be compiled or assembled into machine readable instructions and stored on the memory component 306. Alternatively, the machine readable instructions may be written in a hardware description language (HDL), such as logic implemented via either a field-programmable gate array (FPGA) configuration or an application-specific integrated circuit (ASIC), or their equivalents. Accordingly, the methods described herein may be implemented in any conventional computer programming language, as pre-programmed hardware elements, or as a combination of hardware and software components. In embodiments, the navigation system 300 may include the processor 304 communicatively coupled to the memory component 306 that stores instructions that, when executed by the processor 304, cause the processor to perform one or more functions as described herein.
Still referring to
The navigation system 300 may comprise: (i) the localization component 312 of the autonomous vehicle 102 to locate a user 106 via UWB sensing between a smart mobile device 108 of the user 106 and a UWB sensor 110 of the autonomous vehicle 102 and (ii) the alert component 316 to alert the user 106 to an event, such as a determination that the smart mobile device 108 and/or the smart luggage 116 remains in the autonomous vehicle 102 when the user 106 is determined to not be within the autonomous vehicle 102, as described herein. Further, an artificial intelligence component may be used in embodiments to train and provide machine learning capabilities to a neural network to aid with improving the confidence of a determination that the user 106 is seated in the autonomous vehicle 102, as described herein. The localization component 312 and the alert component 316 are coupled to the communication path 302 and communicatively coupled to the processor 304. The processor 304 may process the input signals received from the system modules and/or extract information from such signals.
Data stored and manipulated in the navigation system 300 as described herein may be utilized by the artificial intelligence component, which is able to leverage a cloud computing-based network configuration, such as the cloud 323, to apply machine learning and artificial intelligence. This machine learning application may create models that can be applied by the navigation system 300 to make it more efficient and intelligent in execution. As an example and not a limitation, the artificial intelligence component may include components selected from the group consisting of an artificial intelligence engine, a Bayesian inference engine, and a decision-making engine, and may have an adaptive learning engine further comprising a deep neural network learning engine.
The navigation system 300 includes the network interface hardware 318 for communicatively coupling the navigation system 300 with a computer network such as network 322. The network interface hardware 318 is coupled to the communication path 302 such that the communication path 302 communicatively couples the network interface hardware 318 to other modules of the navigation system 300. The network interface hardware 318 can be any device capable of transmitting and/or receiving data via a wireless network. Accordingly, the network interface hardware 318 can include a communication transceiver for sending and/or receiving data according to any wireless communication standard. For example, the network interface hardware 318 can include a chipset (e.g., antenna, processors, machine readable instructions, etc.) to communicate over wired and/or wireless computer networks such as, for example, wireless fidelity (Wi-Fi), WiMax, Bluetooth, IrDA, Wireless USB, Z-Wave, ZigBee, or the like.
Still referring to
The network 322 can include any wired and/or wireless network such as, for example, wide area networks, metropolitan area networks, the Internet, an Intranet, the cloud 323, satellite networks, or the like. Accordingly, the network 322 can be utilized as a wireless access point by the computer 324 to access one or more servers (e.g., a server 320). The server 320 and any additional servers such as a cloud server generally include processors, memory, and chipset for delivering resources via the network 322. Resources can include providing, for example, processing, storage, software, and information from the server 320 to the navigation system 300 via the network 322. Additionally, it is noted that the server 320 and any additional servers can share resources with one another over the network 322 such as, for example, via the wired portion of the network, the wireless portion of the network, or combinations thereof.
It is noted that the terms “substantially” and “about” may be utilized herein to represent the inherent degree of uncertainty that may be attributed to any quantitative comparison, value, measurement, or other representation. These terms are also utilized herein to represent the degree by which a quantitative representation may vary from a stated reference without resulting in a change in the basic function of the subject matter at issue.
While particular embodiments have been illustrated and described herein, it should be understood that various other changes and modifications may be made without departing from the spirit and scope of the claimed subject matter. Moreover, although various aspects of the claimed subject matter have been described herein, such aspects need not be utilized in combination. It is therefore intended that the appended claims cover all such changes and modifications that are within the scope of the claimed subject matter.