This document relates to vehicle monitoring and control.
Autonomous vehicle navigation is a technology for sensing the position and movement of a vehicle and, based on that sensing, autonomously controlling the vehicle as it navigates towards a destination. Autonomous vehicle navigation can have important applications in the transportation of people, goods and services. One of the components of autonomous driving, which ensures the safety of the vehicle and its passengers, as well as people and property in the vicinity of the vehicle, is precise perception of the area around the rear of the vehicle.
Disclosed are devices, systems and methods for a rear-facing perception system for vehicles. In some embodiments, a self-powered and automatically synchronizing rear-facing perception system, which includes a center unit and two corner units, is installed at the rear of a vehicle or on a semi-trailer truck (commonly referred to as a tractor-trailer or a “semi”) and provides an unobstructed view of an area around a rear of the vehicle. The corner and center units wirelessly communicate with a control unit located adjacent to the driver of the vehicle.
In one aspect, the disclosed technology can be used to provide a method for operating a rear-facing perception unit in a vehicle. An exemplary rear-facing perception system contains two corner units and a center unit, with each of the two corner units and the center unit including a camera module, a dual-band transceiver and a power module, where the power module includes a battery. This method includes pairing with a control unit by communicating, using the dual-band transceiver, over at least a first frequency band; transmitting, upon successfully pairing, a first trigger signal to the two corner units, where the first trigger signal is transmitted over a second frequency band non-overlapping with the first frequency band; and switching, subsequent to transmitting the first trigger signal, to an active mode, where the first trigger signal causes the two corner units to switch to the active mode, and where the active mode includes orienting the camera modules on the center unit and the two corner units to provide an unobstructed view of an area around a rear of the vehicle.
In another aspect, the above-described method is embodied in the form of processor-executable code and stored in a computer-readable program medium.
In yet another aspect, a device that is configured or operable to perform the above-described method is disclosed. The device may include a processor that is programmed to implement this method.
The above and other aspects and features of the disclosed technology are described in greater detail in the drawings, the description and the claims.
Backover accidents cause thousands of injuries every year because there is a blind spot behind every car and truck. Typically, this blind spot stretches from 15 feet in smaller cars to more than 25 feet in larger pickups and SUVs, and to even longer distances for tractor-trailers. Human drivers often rely on a visual inspection of the area behind their vehicle, performed before entering the vehicle to drive. However, autonomous vehicles do not have such a mechanism. The increasing use of autonomous vehicles makes it important to precisely perceive objects at and around the rear of a vehicle, especially in the case of tractor-trailers that maneuver around static barriers, loading docks, etc., as part of their daily operations.
A backup camera (or more generally, a rear-facing perception system) may help a driver maintain control and visibility while operating a vehicle. For example, it can prevent backing over someone (with toddlers and pets being particularly susceptible) or something, and may assist with parallel parking while avoiding damage to adjacent vehicles. In another example, if the vehicle were run into by another vehicle, the rear-facing perception system may provide evidence of the incident for insurance purposes.
A rear-facing perception system can increase the operational safety of autonomous vehicles. Unlike passenger cars, where a rear-facing camera is permanently installed in the vehicle, some applications of autonomous vehicles may use rear perception systems only occasionally. For example, in the cargo delivery business, truck cabs (cabins or tractors) are often attached to different cargo holds or trailers. Therefore, it would be useful to include a rear-perception system that requires very little or no installation when a truck cab is attached to another trailer. In particular, it would be beneficial if the truck driver did not have to attach cables from the rear perception system portion on the trailer to a display and other electronics in the cabin. Easy formation and activation of a rear perception system in such cases saves the truck driver additional time and work and eliminates manual errors in connecting and setting up the system. Furthermore, transport service providers often manage a large inventory of trailers and cabins, and truck trailers may often simply sit in a parking lot when not being used. In such a case, it would be desirable if the electronics associated with a rear perception system could be put to use by simply detaching them from a trailer that is not in use and attaching them to another trailer that will be paired with a truck cabin and used.
Some desirable features of such a system may include: (i) the ability to be installed on the trailers of semi-trailer trucks, thereby providing complete rear perception without blind spots, (ii) low-cost, self-powered operation, (iii) easy installation and detachment, (iv) wireless communication with the driver, (v) precise perception of the locations of all objects in the backup route, and (vi) automated calibration and synchronization.
The disclosed technology in this document provides solutions that can be used to address the above and other technical issues in the rear-facing monitoring and control of fully- and semi-autonomous vehicles.
In some embodiments, a subset of components of the rear-facing perception system (the CIU, LIU and RIU) shown in
In some embodiments, the CIU, LIU and RIU include an attachment mechanism that is configured to enable easy attachability and detachability of the integration units to the vehicle or trailer of a tractor-trailer. For example, the attachment mechanism may be a magnetic panel that may be used to easily affix the integration units to the trailer. In another example, a suction panel or a hook-and-fastener system may be used for the attachment mechanism.
In some embodiments, and as shown in
In some embodiments, and as shown in
In some embodiments, different frequency bands and alternate wireless protocols and devices may be employed in the operation of the rear-facing perception system. For example, the VCU and integration units may use Bluetooth, Zigbee, dedicated short-range communications (DSRC) based on the 802.11p wireless standard, and so on.
In some embodiments, the ultrasonic module 311 may include a left sensor 341 and a right sensor 342, which are installed on the left and right corners of the trailer, respectively. In an example, the ultrasonic module 311 may be used not only to detect the presence of objects, but also to ascertain their distance from the vehicle. In some embodiments, the camera module 317 may include a normal camera 361 and/or a fisheye camera 362 (e.g., to provide extremely wide angles of view ranging from 100° to 180°) to provide a view of the area around the rear of the vehicle. For example, the inputs of the ultrasonic module 311 and/or the camera module 317 may be used independently to provide precise perception of any obstacles behind the vehicle.
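To make the distance-measurement role of the ultrasonic module concrete, the following minimal sketch converts an echo's round-trip time into a distance estimate. The function name and the assumed speed of sound (343 m/s in air at roughly room temperature) are illustrative assumptions, not details taken from this document.

```python
def ultrasonic_distance_m(echo_round_trip_s: float,
                          speed_of_sound_m_s: float = 343.0) -> float:
    """Estimate the distance to an object from an ultrasonic echo.

    The pulse travels to the object and back, so the one-way distance
    is half of the total path length covered during the round trip.
    """
    return speed_of_sound_m_s * echo_round_trip_s / 2.0


# Example: an echo received about 11.66 ms after the pulse was emitted
# corresponds to an object roughly 2 meters behind the vehicle.
print(f"{ultrasonic_distance_m(0.01166):.2f} m")  # ~2.00 m
```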
In some embodiments, the inputs from the camera module 317 and the ultrasonic module 311 may be combined, or fused, prior to being used for detection of obstacles and people behind the vehicle. For example, augmenting the camera module 317 with the ultrasonic module 311 enables the rear-facing perception system to accurately determine the distances of objects from the vehicle, as well as to operate in low light conditions or even in full darkness.
In some embodiments, the radar module 313 may include a 24 GHz radar module 351, which is configured to augment the camera module 317. For example, the radar module 313 is able to detect obstacles in all weather conditions. Combining the camera and radar modules enables the rear-facing perception system to operate accurately in the presence of fog or mist, thereby ensuring the safety of autonomous vehicles even in adverse situations.
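As a sketch of how the camera, ultrasonic and radar inputs might be fused, the hypothetical code below associates each camera detection with the nearest ultrasonic range measurement and flags whether a radar return confirms it. The data structures, the bearing-based association rule and the tolerance are all assumptions made for illustration; the document does not specify a particular fusion algorithm.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class FusedObstacle:
    bearing_deg: float           # from the camera detection
    distance_m: Optional[float]  # from ultrasonic, if one associates
    radar_confirmed: bool        # a radar return at a similar bearing


def fuse(camera_bearings_deg: List[float],
         ultrasonic_ranges: List[Tuple[float, float]],
         radar_returns: List[Tuple[float, float]],
         bearing_tol_deg: float = 10.0) -> List[FusedObstacle]:
    """Associate camera detections with ultrasonic and radar data.

    ultrasonic_ranges and radar_returns are (bearing_deg, distance_m)
    pairs; association is by nearest bearing within a tolerance.
    """
    fused = []
    for cam_b in camera_bearings_deg:
        # Nearest ultrasonic measurement in bearing, if close enough.
        near = [(abs(b - cam_b), d) for b, d in ultrasonic_ranges
                if abs(b - cam_b) <= bearing_tol_deg]
        dist = min(near)[1] if near else None
        # A radar return nearby keeps the detection usable in fog or rain.
        radar_hit = any(abs(b - cam_b) <= bearing_tol_deg
                        for b, _ in radar_returns)
        fused.append(FusedObstacle(cam_b, dist, radar_hit))
    return fused


# One obstacle at -5 deg seen by all three sensors; one at 20 deg seen
# only by the camera.
print(fuse([-5.0, 20.0], [(-4.0, 1.8)], [(-6.0, 1.9)]))
```

In this simple association scheme, a camera detection that carries an ultrasonic distance and a radar confirmation remains usable in darkness or fog, mirroring the complementary sensor roles described in the preceding paragraphs.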
In some embodiments, the HMI module 312 may include a synchronization button 331, a warning light 332 and a warning buzzer 333. For example, and as will be discussed later in this document, the synchronization button 331 may be configured to activate the corner integration units as part of the manual synchronization process between the VCU and the CIU, LIU and RIU. In an example, the warning light 332 and warning buzzer 333 may be configured to provide a visual and aural stimulus when the rear-facing perception system is operational.
In some embodiments, the computing module 314 includes a microcontroller unit (MCU, 381) and a Micro-Electro-Mechanical System (MEMS) based Inertial Measurement Unit (IMU). An IMU is an electronic device that measures and reports a body's specific force, angular rate, and sometimes the magnetic field surrounding the body, using a combination of accelerometers and gyroscopes, and sometimes also magnetometers. In an example, the MCU may control the MEMS IMU and/or other components so that the rear-facing perception system operates in an energy-efficient manner while ensuring vehicular and personnel safety.
In some embodiments, the RF module 316 includes a dual-band transceiver that supports RF communications in UHF and LF bands. The RF-433M transceiver 321 may be used to communicate with the VCU, whereas the 125 kHz RFID module 322 may be used to communicate with the LIU and RIU. In this example, the UHF and LF frequency bands are non-overlapping in frequency. In other embodiments, the dual-band communications may be orthogonal in time (e.g., using TDMA (time division multiple access)) or in structure (e.g., using CDMA (code division multiple access)).
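The sketch below models this dual-band arrangement in code: a hypothetical routing table sends VCU traffic over the UHF link and corner-unit traffic over the LF link, and a small helper checks that two bands are non-overlapping. The band edges used in the check (roughly 433.05-434.79 MHz for the 433 MHz ISM band and roughly 119-135 kHz for LF RFID) are typical values assumed for illustration.

```python
from enum import Enum


class Band(Enum):
    UHF_433MHZ = "RF-433M link to the VCU"
    LF_125KHZ = "125 kHz RFID link to the LIU and RIU"


# Hypothetical routing: the center unit reaches the VCU over UHF and
# the corner units over the non-overlapping LF band.
ROUTES = {"VCU": Band.UHF_433MHZ, "LIU": Band.LF_125KHZ, "RIU": Band.LF_125KHZ}


def bands_overlap(lo1_hz: float, hi1_hz: float,
                  lo2_hz: float, hi2_hz: float) -> bool:
    """Two bands overlap unless one ends below where the other begins."""
    return not (hi1_hz < lo2_hz or hi2_hz < lo1_hz)


print(bands_overlap(433.05e6, 434.79e6, 119e3, 135e3))  # False
print(ROUTES["RIU"])  # Band.LF_125KHZ
```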
In some embodiments, the power module 318 includes a battery 371 and/or a solar panel 372. More generally, the power module 318 is a self-powered source for the various components of the rear-facing perception system that does not require a constant connection to the power grid. In some embodiments, the power module 318 may be periodically re-charged to ensure that the various components function as intended during operation of the vehicle.
In some embodiments, the camera module 417 includes mechanisms to keep the respective normal and fisheye cameras of the CIU, LIU and RIU clean. For example, a camera cleaning system may be configured to spray a cleaning solvent (or water) onto the camera lens, followed by a puff of air to dry the camera lens. In another example, the VCU may provide the driver of the vehicle with an indication or reminder to ensure that the cameras are clean prior to operating the vehicle.
In some embodiments, step 510 may be followed by pressing the synchronization button on the tractor (e.g., the synchronization button 231 in
Typically, the manual synchronization process continues with step 550, wherein the CIU, LIU and RIU send their identification information to the VCU using, for example, the RF-433M module in the dual-band transceiver. If this process is successful (step 560), the VCU stores the IDs in the EEPROM (e.g., EEPROM 224 in
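One way the manual synchronization exchange could be structured is sketched below: pressing the button prompts the integration units to report their IDs, which the VCU then persists. The class names, the message handling, and the in-memory stand-in for the EEPROM are illustrative assumptions, not the system's actual implementation.

```python
# Hypothetical sketch of the manual synchronization flow: the driver
# presses the synchronization button, the integration units report
# their IDs over the UHF link, and the VCU persists them.

class Vcu:
    def __init__(self):
        self.eeprom_ids = []       # stand-in for EEPROM storage
        self.synchronized = False

    def on_button_press(self, units):
        """Collect IDs from the integration units and store them."""
        reported = [unit.send_id() for unit in units]
        if all(reported):
            self.eeprom_ids = reported
            self.synchronized = True
        return self.synchronized


class IntegrationUnit:
    def __init__(self, unit_id):
        self.unit_id = unit_id

    def send_id(self):
        # In the described system this would go over the RF-433M link.
        return self.unit_id


vcu = Vcu()
units = [IntegrationUnit(u) for u in ("CIU-01", "LIU-01", "RIU-01")]
print(vcu.on_button_press(units), vcu.eeprom_ids)
```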
The automatic synchronization process 600 begins at step 610, wherein movement of the VCU is detected in the form of signals from one or more axes of the IMU of the VCU. In an example, if the detected signals correspond to an acceleration of less than 1 g (step 620), then the automatic synchronization process terminates (step 670). However, if the detected signals correspond to an acceleration of greater than or equal to 1 g (e.g., an acceleration of 1 g equates to a rate of change in velocity of approximately 35 kilometers per hour (or 22 miles per hour) for each second that elapses), then the VCU transmits a synchronization signal using, for example, the RF-433M module in the dual-band transceiver (step 630).
This synchronization signal is received by the integration units (CIU, LIU, RIU), which then transmit signals corresponding to the axes of their corresponding IMUs (e.g., IMUs 382 and 482 in
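The sketch below illustrates the acceleration test; note that 1 g (about 9.81 m/s²) corresponds to a change in speed of 9.81 × 3.6 ≈ 35.3 km/h per elapsed second, consistent with the figure quoted above. The signal-matching helper reflects a plausible continuation in which the VCU compares the units' IMU signals against its own; that comparison rule and its tolerance are assumptions for illustration.

```python
G = 9.80665  # standard gravity in m/s^2; 1 g ~ 35.3 km/h per second


def accel_magnitude(ax: float, ay: float, az: float) -> float:
    """Magnitude of the IMU's acceleration reading, in m/s^2."""
    return (ax * ax + ay * ay + az * az) ** 0.5


def should_start_sync(ax: float, ay: float, az: float,
                      threshold_g: float = 1.0) -> bool:
    """Step 620: proceed only at or above the 1 g threshold."""
    return accel_magnitude(ax, ay, az) >= threshold_g * G


def signals_match(vcu_axes, unit_axes, tol: float = 0.5) -> bool:
    """Assumed comparison: a unit's per-axis IMU signals should track
    the VCU's to within a tolerance (in m/s^2)."""
    return all(abs(a - b) <= tol for a, b in zip(vcu_axes, unit_axes))


print(should_start_sync(10.2, 0.1, 0.0))                   # True
print(signals_match((10.2, 0.1, 0.0), (10.0, 0.2, 0.1)))   # True
```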
The manual and automatic synchronization processes described in the context of
In some embodiments, the disclosed technology implements wakeup and sleep processes to operate in a power-efficient manner, which enables the self-powered CIU, LIU and RIU to operate without frequent recharging.
The method 700 includes, at step 720, the VCU transmitting a wakeup signal using, for example, the RF-433M module in the dual-band transceiver. In some embodiments, the wakeup signal is directed toward the CIU, which then wakes up (step 730) upon receiving the wakeup signal from the VCU.
In some embodiments, the CIU transmits a signal to the VCU in response to the wakeup signal. Additionally, the CIU sends a trigger signal (e.g., a wakeup signal that may be identical to or different from the wakeup signal received from the VCU) to the LIU and RIU using, for example, the 125 kHz RFID module in the dual-band transceiver (step 740). Upon receiving this trigger signal, the LIU and RIU wake up (step 750).
The wakeup process concludes with the VCU receiving a signal from the CIU, LIU and RIU (step 760), acknowledging their active mode of operation. Embodiments of the disclosed technology may use the wakeup process to ensure that the rear-facing perception system operates in a power-efficient manner.
When the sleep event is detected or activated, the VCU transmits a sleep signal using, for example, the RF-433M module in the dual-band transceiver (step 820) to the integration units at the rear of the trailer. Upon receiving the sleep signal, the CIU, LIU and RIU switch to sleep mode (step 830). In some embodiments, switching to sleep mode includes the power module (e.g., 318 and 418 in
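A hypothetical sketch of the two-hop wakeup/sleep relay described above follows: the VCU reaches the CIU over the UHF link, and the CIU forwards the command to the corner units over the LF link, returning acknowledgments for all three units. The class names and the string-based command format are assumptions made for illustration.

```python
class Unit:
    """A corner integration unit with a simple sleep/active mode."""

    def __init__(self, name):
        self.name = name
        self.mode = "sleep"

    def receive(self, command):
        self.mode = "active" if command == "wakeup" else "sleep"
        return f"{self.name}:{self.mode}"  # acknowledgment


class CenterUnit(Unit):
    """The CIU: obeys the command, then relays it over the LF band."""

    def __init__(self, name, corner_units):
        super().__init__(name)
        self.corner_units = corner_units

    def receive(self, command):
        ack = super().receive(command)
        # Relay the trigger signal to the LIU and RIU.
        corner_acks = [u.receive(command) for u in self.corner_units]
        return [ack] + corner_acks


ciu = CenterUnit("CIU", [Unit("LIU"), Unit("RIU")])
print(ciu.receive("wakeup"))  # the VCU collects all three acknowledgments
print(ciu.receive("sleep"))
```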
The method 900 includes, at step 920, transmitting, upon successfully pairing, a first trigger signal to the two corner units. In some embodiments, the first trigger signal is transmitted over a second frequency band that is non-overlapping with the first frequency band.
The method 900 includes, at step 930, switching to an active mode after having transmitted the first trigger signal. In some embodiments, the first trigger signal causes the two corner units to switch to the active mode. In some embodiments, the active mode includes orienting the camera modules on the center unit and the two corner units to provide an unobstructed view of an area around a rear of the vehicle.
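Putting these steps together, a minimal sketch of the center unit's flow in method 900 might look as follows; the transceiver stub, the message strings, and the pairing check are illustrative assumptions rather than the method's actual implementation.

```python
class FakeTransceiver:
    """Stand-in for the dual-band transceiver; replies as a paired VCU."""

    def send(self, band, message):
        return "pair-ack" if message == "pair-request" else "ok"


def center_unit_flow(transceiver, corner_units):
    """Illustrative flow for the center unit: pair, trigger, go active."""
    # Pair with the control unit over the first frequency band (UHF).
    if transceiver.send("uhf", "pair-request") != "pair-ack":
        return "idle"
    # Step 920: send the first trigger signal to the corner units over
    # the second, non-overlapping frequency band (LF).
    for unit in corner_units:
        transceiver.send("lf", f"trigger:{unit}")
    # Step 930: switch to active mode; the trigger causes the corner
    # units to switch to active mode as well.
    return "active"


print(center_unit_flow(FakeTransceiver(), ["LIU", "RIU"]))  # active
```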
In some embodiments, and as described in the context of at least
In some embodiments, and as described in the context of
In some embodiments, and as described in the context of
In some embodiments, and as described in the context of
In some embodiments, and as described in the context of
More generally, and in some embodiments, the VCU may communicate directly with the CIU over the first frequency band. Subsequently, the CIU may communicate with the LIU and RIU (e.g., relay the information received from the VCU) over the second frequency band. In other embodiments, the VCU may communicate directly with each of the CIU, LIU and RIU.
The method 1000 includes, at step 1020, receiving, upon successfully pairing, a trigger signal from the center unit over a second frequency band non-overlapping with the first frequency band.
The method 1000 includes, at step 1030, switching, subsequent to receiving the trigger signal, to an operating mode that matches the operating mode of the center unit. In some embodiments, the operating mode may be an active mode or a power-saving mode. For example, the power-saving mode may include turning off the camera module, and the active mode may include orienting the camera modules on the center unit and the two corner units to provide an unobstructed view of an area around a rear of the vehicle.
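A corresponding sketch for a corner unit in method 1000 is shown below; the class, the mode strings, and the camera flag are illustrative assumptions chosen to mirror steps 1020 and 1030.

```python
class CornerUnit:
    """Illustrative corner-unit behavior for method 1000."""

    def __init__(self, name):
        self.name = name
        self.mode = "sleep"
        self.camera_on = False

    def on_trigger(self, center_unit_mode):
        # Steps 1020-1030: after the trigger arrives over the second
        # frequency band, match the center unit's operating mode. The
        # power-saving mode turns the camera module off; the active
        # mode orients it for an unobstructed rear view.
        self.mode = center_unit_mode
        self.camera_on = (self.mode == "active")
        return self.mode


liu = CornerUnit("LIU")
print(liu.on_trigger("active"), liu.camera_on)        # active True
print(liu.on_trigger("power-saving"), liu.camera_on)  # power-saving False
```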
Implementations of the subject matter and the functional operations and modules described in this patent document can be implemented in various systems, digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the subject matter described in this specification can be implemented as one or more computer program products, e.g., one or more modules of computer program instructions encoded on a tangible and non-transitory computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing unit” or “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., FPGAs (field programmable gate arrays) or ASICs (application specific integrated circuits).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of nonvolatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
It is intended that the specification, together with the drawings, be considered exemplary only, where exemplary means an example. As used herein, “or” is intended to include “and/or”, unless the context clearly indicates otherwise.
While this patent document contains many specifics, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this patent document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Moreover, the separation of various system components in the embodiments described in this patent document should not be understood as requiring such separation in all embodiments.
Only a few implementations and examples are described and other implementations, enhancements and variations can be made based on what is described and illustrated in this patent document.
This patent document is a continuation of U.S. application Ser. No. 16/125,531, entitled “REAR-FACING PERCEPTION SYSTEM FOR VEHICLES,” filed on Sep. 7, 2018. The entire contents of the above patent application are incorporated by reference as part of the disclosure of this patent document.
Number | Name | Date | Kind |
---|---|---|---|
6317035 | Berberich et al. | Nov 2001 | B1 |
6975923 | Spriggs | Dec 2005 | B2 |
7742841 | Sakai et al. | Jun 2010 | B2 |
8346480 | Trepagnier et al. | Jan 2013 | B2 |
8706394 | Trepagnier et al. | Apr 2014 | B2 |
8718861 | Montemerlo et al. | May 2014 | B1 |
8983708 | Choe et al. | Mar 2015 | B2 |
9088744 | Grauer et al. | Jul 2015 | B2 |
9214084 | Grauer et al. | Dec 2015 | B2 |
9219873 | Grauer et al. | Dec 2015 | B2 |
9282144 | Tebay et al. | Mar 2016 | B2 |
9317033 | Ibanez-Guzman et al. | Apr 2016 | B2 |
9347779 | Lynch | May 2016 | B1 |
9418549 | Kang et al. | Aug 2016 | B2 |
9494935 | Okumura et al. | Nov 2016 | B2 |
9507346 | Levinson et al. | Nov 2016 | B1 |
9513634 | Pack et al. | Dec 2016 | B2 |
9538113 | Grauer et al. | Jan 2017 | B2 |
9547985 | Tuukkanen | Jan 2017 | B2 |
9549158 | Grauer et al. | Jan 2017 | B2 |
9599712 | Van Der Tempel et al. | Mar 2017 | B2 |
9600889 | Boisson et al. | Mar 2017 | B2 |
9602807 | Crane et al. | Mar 2017 | B2 |
9620010 | Grauer et al. | Apr 2017 | B2 |
9625569 | Lange | Apr 2017 | B2 |
9628565 | Stenneth et al. | Apr 2017 | B2 |
9649999 | Amireddy et al. | May 2017 | B1 |
9690290 | Prokhorov | Jun 2017 | B2 |
9701023 | Zhang et al. | Jul 2017 | B2 |
9712754 | Grauer et al. | Jul 2017 | B2 |
9723233 | Grauer et al. | Aug 2017 | B2 |
9726754 | Massanell et al. | Aug 2017 | B2 |
9729860 | Cohen et al. | Aug 2017 | B2 |
9739609 | Lewis | Aug 2017 | B1 |
9753128 | Schweizer et al. | Sep 2017 | B2 |
9753141 | Grauer et al. | Sep 2017 | B2 |
9754490 | Kentley et al. | Sep 2017 | B2 |
9760837 | Nowozin et al. | Sep 2017 | B1 |
9766625 | Boroditsky et al. | Sep 2017 | B2 |
9769456 | You et al. | Sep 2017 | B2 |
9773155 | Shotton et al. | Sep 2017 | B2 |
9779276 | Todeschini et al. | Oct 2017 | B2 |
9785149 | Wang et al. | Oct 2017 | B2 |
9805294 | Liu et al. | Oct 2017 | B2 |
9810785 | Grauer et al. | Nov 2017 | B2 |
9823339 | Cohen | Nov 2017 | B2 |
20050062590 | Lang et al. | Mar 2005 | A1 |
20050174429 | Yanai | Aug 2005 | A1 |
20060050149 | Lang et al. | Mar 2006 | A1 |
20080055407 | Abe | Mar 2008 | A1 |
20080238636 | Birging et al. | Oct 2008 | A1 |
20080292146 | Breed et al. | Nov 2008 | A1 |
20090275296 | Huang et al. | Nov 2009 | A1 |
20100156667 | Bennie et al. | Jun 2010 | A1 |
20120007712 | Tung | Jan 2012 | A1 |
20120146779 | Hu et al. | Jun 2012 | A1 |
20150172518 | Lucas et al. | Jun 2015 | A1 |
20150302733 | Witkowski et al. | Oct 2015 | A1 |
20150302737 | Geerlings et al. | Oct 2015 | A1 |
20160052453 | Nalepka et al. | Feb 2016 | A1 |
20160334230 | Ross et al. | Nov 2016 | A1 |
20170083771 | Clark | Mar 2017 | A1 |
20170262717 | Drazan et al. | Sep 2017 | A1 |
20170301201 | Siann et al. | Oct 2017 | A1 |
20180121742 | Son et al. | May 2018 | A1 |
20200082175 | Han | Mar 2020 | A1 |
Number | Date | Country |
---|---|---|
107433950 | Dec 2017 | CN |
108270970 | Jul 2018 | CN |
202004013984 | Jan 2006 | DE |
102016209418 | Nov 2017 | DE |
2753422 | Mar 1998 | FR |
2551331 | Dec 2017 | GB |
2014208521 | Nov 2014 | JP |
101736411 | May 2017 | KR |
2016186355 | Nov 2016 | WO |
2018140701 | Aug 2018 | WO |
Entry |
---|
Park, Tae Wook. International Application No. PCT/US2019/050212, International Search Report and Written Opinion, dated Dec. 20, 2019. (pp. 1-13). |
Chinese Patent Office, Search Report for Appl. No. 2019800582997, dated Apr. 24, 2023, 5 pages. |
Chinese Patent Office, 1st Office Action for Appl. No. 2019800582997, dated Apr. 24, 2023, 5 pages. |
Dondi, D., et al., “An autonomous wireless sensor network device powered by a RF energy harvesting system,” IECON 2012—38th Annual Conference on IEEE Industrial Electronics Society, Oct. 25-28, 2012. |
Number | Date | Country | |
---|---|---|---|
20210271899 A1 | Sep 2021 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16125531 | Sep 2018 | US |
Child | 17325160 | US |