The technology of the disclosure relates generally to using sensors in a vehicle for multiple purposes.
The automobile was in widespread use before the advent of computers. Early computing devices were too large and cumbersome to be practical for incorporation into automobiles. However, as the size and cost of computing devices have come down, vehicles, and automobiles in particular, have embraced the incorporation of computing devices into the regular operation of the vehicles.
While engine management and exhaust control saw the first widespread use of computing devices in automobiles, more recent automobiles have seen computing devices proliferate into almost every system, with sensors capable of monitoring almost any function related to operation of the vehicle, as well as sophisticated audiovisual systems capable of providing robust multimedia experiences for operators and passengers. This proliferation of computing power and computing devices has led to increased efforts to assist in the safe operation of such vehicles.
One early effort to assist in the safe operation of a vehicle was the introduction of the backup camera. The operator is able to supplement the view available in the rear view mirror and direct viewing through a rear window with the images from the camera. In many cases, so-called blind spots may be eliminated. More recent advances have used cameras to assist in parking cars, and even more recent advances have seen the testing of self-driving or autonomous vehicles. While cameras may be used in each of these activities, there may be different processing requirements for images that are used for human consumption (e.g., the backup camera view) relative to images that are used for machine consumption (e.g., self-parking or self-driving uses). Current approaches to these different processing requirements may use duplicative cameras or may use a single integrated circuit (IC) to perform both processing activities with a shared image processing pipeline. Other sensors may be used in the self-driving process, such as radar, sonar, light detection and ranging (LIDAR), infrared, or the like. Likewise, other sensors, such as sensors that measure speed, engine revolutions, exhaust, or the like, may be used both for self-driving purposes and for performance calculations. In most cases where sensors are dual-use, there may be duplicative sensors or a single IC performing calculations for both uses. While acceptable, each of these solutions involves compromises. Accordingly, a more optimized solution to these processing requirements is desirable.
Aspects disclosed in the detailed description include methods and systems to broadcast sensor outputs in an automotive environment. In particular, sensors such as cameras output relatively unprocessed (raw) data to two or more different processing circuits where the processing circuits are located in separate and distinct embedded control units (ECUs). A first one of the two or more different processing circuits processes the raw data for human consumption. A second one of the two or more different processing circuits processes the raw data for machine utilization such as for autonomous driving functions. Such an arrangement allows for greater flexibility in utilization of the data from the sensors without imposing undue latency in the processing stream and without compromising key performance indices for human use and machine use. In particular, different processing circuits may be differently optimized for such processing and may come from different vendors if desired. Still further, the processing circuits may have different levels of safety certifications depending on use. In a particularly contemplated aspect, the sensors are cameras, and the processing circuits are image processing circuits. While the data is provided to two such image processing circuits, the overall connection requirements may be reduced. Still further, by duplicating the data to the two different image processing circuits, the integrity of the data is not compromised by unnecessary encoding and decoding when transferred between two integrated circuits (ICs).
In this regard in one aspect, a vehicle is disclosed. The vehicle includes a sensor configured to sense data related to the vehicle and output raw data. The vehicle also includes a first ECU including a first processing circuit communicatively coupled to the sensor and configured to receive the raw data. The vehicle also includes a second ECU separate and distinct from the first ECU. The second ECU includes a second processing circuit communicatively coupled to the sensor and is configured to receive the raw data.
In another aspect, a vehicle is disclosed. The vehicle includes an image capturing sensor configured to sense image data related to the vehicle and output raw image data. The vehicle also includes a first ECU including a first image processing circuit communicatively coupled to the image capturing sensor and configured to receive the raw image data and output a visual representation of the raw image data on a display within the vehicle. The vehicle also includes a second ECU separate and distinct from the first ECU. The second ECU includes a second image processing circuit communicatively coupled to the image capturing sensor and is configured to receive the raw image data and process the raw image data for machine utilization.
In another aspect, a method is disclosed. The method includes capturing an image with a camera on a vehicle. The method also includes providing raw image data from the camera to a first image processing circuit in a first ECU. The method also includes providing the raw image data from the camera to a second image processing circuit in a second ECU separate and distinct from the first ECU. The method also includes presenting processed image data on a display within the vehicle after processing by the first image processing circuit.
In another aspect, an ECU for a vehicle is disclosed. The ECU includes a camera configured to capture images external to a vehicle. The ECU also includes a first output configured to provide raw image data from the camera to a first image processing circuit. The ECU also includes a second output configured to provide the raw image data from the camera to a second image processing circuit.
With reference now to the drawing figures, several exemplary aspects of the present disclosure are described. The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any aspect described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects.
Aspects disclosed in the detailed description include methods and systems to broadcast sensor outputs in an automotive environment. In particular, sensors such as cameras output relatively unprocessed (raw) data to two or more different processing circuits where the processing circuits are located in separate and distinct embedded control units (ECUs). A first one of the two or more different processing circuits processes the raw data for human consumption. A second one of the two or more different processing circuits processes the raw data for machine utilization such as for autonomous driving functions. Such an arrangement allows for greater flexibility in utilization of the data from the sensors without imposing undue latency in the processing stream and without compromising key performance indices for human use and machine use. In particular, different processing circuits may be differently optimized for such processing and may come from different vendors if desired. Still further, the processing circuits may have different levels of safety certifications depending on use. In a particularly contemplated aspect, the sensors are cameras, and the processing circuits are image processing circuits. While the data is provided to two such image processing circuits, the overall connection requirements may be reduced. Still further, by duplicating the data to the two different image processing circuits, the integrity of the data is not compromised by unnecessary encoding and decoding when transferred between two integrated circuits (ICs).
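The broadcast arrangement described above can be illustrated with a minimal sketch. The class and function names below (`RawFrame`, `SensorBroadcaster`, and the consumer callbacks) are invented for illustration and do not appear in the disclosure; the point is simply that one sensor delivers the same unmodified raw payload to each registered processing path, so neither path depends on, or re-encodes, the other's output.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class RawFrame:
    """One raw capture from a sensor (raw Bayer bytes, in the camera case)."""
    sensor_id: str
    payload: bytes


@dataclass
class SensorBroadcaster:
    """Fans a raw frame out to every attached processing circuit unchanged."""
    consumers: List[Callable[[RawFrame], None]] = field(default_factory=list)

    def attach(self, consumer: Callable[[RawFrame], None]) -> None:
        self.consumers.append(consumer)

    def publish(self, frame: RawFrame) -> None:
        # Each consumer receives the identical raw payload -- no intermediate
        # encode/decode step between the two processing paths.
        for consume in self.consumers:
            consume(frame)


human_path, machine_path = [], []
bus = SensorBroadcaster()
bus.attach(lambda f: human_path.append(("display", f.payload)))   # first ECU path
bus.attach(lambda f: machine_path.append(("adas", f.payload)))    # second ECU path
bus.publish(RawFrame("cam0", b"\x10\x20\x30"))
```

In a real system the two consumers would sit in separate ECUs behind serializer/deserializer links rather than in one process; the sketch only models the fan-out of identical raw data.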
While much of the present disclosure is presented in the context of cameras and image processing, the present disclosure is not so limited, and the reader should appreciate that other sorts of image sensors, such as radar, light detection and ranging (LIDAR), sonar, infrared, microwave, millimeter wave, 3D point cloud, and the like, are also readily used in the systems and methods set forth herein. For example, raw radar data could be converted to a B-scan and presented to an operator while, concurrently, the raw data could be used for machine vision perception and planning. Likewise, while image sensors and image processing are specifically contemplated, other sorts of sensors that may be used in multiple contexts may also benefit from the present disclosure. For example, data from a sensor that may be used to control an engine may also be presented for human perception. Other sensors that produce such operational control and informational data may also benefit from the present disclosure.
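The radar example above can be sketched as follows. The array shapes, the amplitude scaling, and the simple thresholding detector are all illustrative assumptions, not the disclosure's method; the sketch only shows the same raw return matrix feeding a human-viewable B-scan image and, separately, a machine perception step.

```python
import numpy as np

# Stand-in raw radar data: sweeps x range bins, floating-point amplitudes.
rng = np.random.default_rng(0)
raw_returns = rng.random((8, 16))

# Human path: scale raw amplitudes into an 8-bit B-scan image for a display.
b_scan = (255 * raw_returns / raw_returns.max()).astype(np.uint8)

# Machine path: operate on the unscaled raw data, e.g. a crude
# threshold-above-noise-floor detector (illustrative, not CFAR-grade).
threshold = raw_returns.mean() + 2 * raw_returns.std()
detections = raw_returns > threshold
```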
The network 110 may be a single homogenous network such as a common bus having a multi-drop or ring topology, or may be formed from distinct communication links such as separate point-to-point cables.
In practice, the cameras 108(1)-108(M) may provide a backup view to an operator on one of the displays 116 as well as provide data to a control system to assist in an advanced driver assistance system (ADAS). The raw camera sensor output may be converted to YUV for human consumption or to grayscale for machine consumption. The raw camera sensor output (RGGB, RCCB, RCCC, RCCG) may even be fed directly to a deep neural network for object detection and tracking in an ADAS.
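The two conversion paths mentioned here can be sketched with a toy raw frame. The 4x4 mosaic values, the naive 2x2-binning demosaic, and the BT.601 weights are assumptions for illustration; production pipelines use higher bit depths and far more sophisticated demosaicing.

```python
import numpy as np

# Hypothetical 4x4 raw mosaic in RGGB order (8-bit values for simplicity;
# real sensors typically emit 10- or 12-bit Bayer data).
raw = np.array([[200,  90, 180,  80],
                [ 95,  60, 100,  70],
                [190,  85, 170,  75],
                [ 90,  50,  95,  65]], dtype=np.float64)

def demosaic_rggb(mosaic):
    """Naive demosaic: average each 2x2 RGGB quad into one RGB pixel."""
    r  = mosaic[0::2, 0::2]   # red sample of each quad
    g1 = mosaic[0::2, 1::2]   # first green sample
    g2 = mosaic[1::2, 0::2]   # second green sample
    b  = mosaic[1::2, 1::2]   # blue sample
    return np.stack([r, (g1 + g2) / 2.0, b], axis=-1)

rgb = demosaic_rggb(raw)

# Human-consumption path: RGB -> YUV (BT.601 weights) for a display.
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
y = 0.299 * r + 0.587 * g + 0.114 * b
u = 0.492 * (b - y)
v = 0.877 * (r - y)

# Machine-consumption path: keep only the luma plane as grayscale input
# to a detector or neural network.
gray = y
```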
In conventional systems, a single integrated circuit (IC) may operate as the control system. Such an approach imposes a substantial burden on the IC, requiring a relatively large circuit with a large and/or costly silicon area and extensive packaging requirements. Such large silicon elements may have low yield due to the large die area. Likewise, such large multi-purpose circuits may result in independent processing functions competing for access to the associated shared memory, which may affect performance and reliability and/or require additional links between the circuit and the memory. Other conventional systems may connect more than one IC together via a shared bus link, such as Peripheral Component Interconnect (PCI) express (PCIe), requiring careful partitioning of processing tasks and transfer of data across the collection of ICs, as well as consideration of shared memory spaces and available bus communication data rates. Exemplary aspects of the present disclosure allow multiple distinct data processing circuits to interoperate with the sensors, reducing the need for such large multi-purpose circuits. The ability to use multiple data processing circuits allows each data processing circuit to be optimized for particular functions and keeps different functions from competing for the same shared memory resource, which in turn may allow different safety certifications for different data processing circuits. Cost savings may be possible because the expense of certification testing may not be required for each of the circuits. As noted, in particularly contemplated aspects, the data processing circuits are image processing circuits, and the data is image data that may be processed differently depending on whether the image processing circuit is associated with machine consumption or human consumption.
In this regard, exemplary aspects of the present disclosure allow for the cameras 108(1)-108(M) to broadcast raw image data to multiple image processing circuits. Four exemplary network structures are illustrated in
With reference to
A close variant of the alternate camera system 500 is alternate camera system 500B illustrated in
A fourth camera system 600 is illustrated in
It should be appreciated that while only two uses of sensor data are illustrated in
A flowchart of the method of operation is provided with reference to
As used herein, raw image data includes, but is not limited to, Bayer RGB image data, RCCB, RCCC, RCCG, and monochrome.
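As a small illustration of one of these formats, the sketch below pulls the two "clear" (C) samples out of each 2x2 quad of an RCCB mosaic. In RCCB the two green filters of RGGB are replaced by clear filters that pass more light, which can help low-light machine perception; the stand-in data and the simple averaging are assumptions for illustration only.

```python
import numpy as np

# Stand-in 4x4 RCCB raw mosaic (values are arbitrary placeholders).
mosaic = np.arange(16, dtype=np.uint16).reshape(4, 4)

# In each 2x2 quad laid out as R C / C B, the clear samples sit at the
# even-row/odd-column and odd-row/even-column positions.
c1 = mosaic[0::2, 1::2]
c2 = mosaic[1::2, 0::2]

# Crude luma-like plane from the two clear samples of each quad.
clear = (c1.astype(np.float64) + c2) / 2.0
```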
While particularly contemplated as being appropriate for an automobile, it should be appreciated that the concepts disclosed herein are also applicable to other vehicles.
While not central to the present disclosure, it should be appreciated that in many instances there is a virtual backchannel or other backchannel present between ECUs. Thus, while the above discussion may focus on the serializer portion of the link from the camera to the deserializer portion of the link at the computer vision SoC end, the backchannel may allow data to pass from the SoC to the camera.
Those of skill in the art will further appreciate that the various illustrative logical blocks, modules, circuits, and algorithms described in connection with the aspects disclosed herein may be implemented as electronic hardware, instructions stored in memory or in another computer readable medium and executed by a processor or other processing device, or combinations of both. The devices described herein may be employed in any circuit, hardware component, IC, or IC chip, as examples. Memory disclosed herein may be any type and size of memory and may be configured to store any type of information desired. To clearly illustrate this interchangeability, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. How such functionality is implemented depends upon the particular application, design choices, and/or design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration).
The aspects disclosed herein may be embodied in hardware and in instructions that are stored in hardware, and may reside, for example, in Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), registers, a hard disk, a removable disk, a CD-ROM, or any other form of computer readable medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a remote station. In the alternative, the processor and the storage medium may reside as discrete components in a remote station, base station, or server.
It is also noted that the operational steps described in any of the exemplary aspects herein are described to provide examples and discussion. The operations described may be performed in numerous different sequences other than the illustrated sequences. Furthermore, operations described in a single operational step may actually be performed in a number of different steps. Additionally, one or more operational steps discussed in the exemplary aspects may be combined. It is to be understood that the operational steps illustrated in the flowchart diagrams may be subject to numerous different modifications as will be readily apparent to one of skill in the art. Those of skill in the art will also understand that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations. Thus, the disclosure is not intended to be limited to the examples and designs described herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The present application claims priority to U.S. Provisional Patent Application Ser. No. 62/578,775 filed on Oct. 30, 2017 and entitled “METHODS AND SYSTEMS TO BROADCAST CAMERA OUTPUTS IN AN AUTOMOTIVE ENVIRONMENT,” the contents of which are incorporated herein by reference in their entirety.