The present disclosure generally relates to automotive technology and, more particularly, to an apparatus and a method for notifying an expected motion of a vehicle.
Autonomous driving is a relatively new technological field for the automotive industry. With autonomous driving, vehicles are capable of sensing their environment and navigating without human operation. Autonomous cars use a variety of technologies to detect their surroundings, such as radar, laser, GPS, odometry, and computer vision. Advanced control systems of autonomous vehicles can interpret sensory data to identify appropriate navigation paths, as well as obstacles and relevant signage.
Although autonomous vehicles have already driven millions of miles on public roads, road safety is still a main concern. Thus, there is a need for further improvement.
According to a first aspect of embodiments of the present disclosure, an apparatus for notifying an expected motion of a vehicle is provided. The apparatus may include: a light projection module disposed on a body of the vehicle and operable to project a visual light pattern to an external environment in which the vehicle is driving; and a processor configured to: obtain trajectory information indicative of an expected motion of the vehicle; and generate, according to the trajectory information, a projection control signal and transmit the projection control signal to the light projection module, the projection control signal being for controlling the light projection module to project the visual light pattern onto a surface in the external environment such that the expected motion of the vehicle can be visually observed from the external environment.
According to a second aspect of embodiments of the present disclosure, a vehicle is provided. The vehicle may include: a body; a light projection module disposed on the body of the vehicle and operable to project a visual light pattern onto an external environment in which the vehicle is driving; and a processor configured to: obtain trajectory information indicative of an expected motion of the vehicle; and generate, according to the trajectory information, a projection control signal for controlling the light projection module to project the visual light pattern onto a surface in the external environment such that the expected motion of the vehicle can be visually observed from the external environment.
According to a third aspect of embodiments of the present disclosure, a method for notifying an expected motion of a vehicle is provided. The method may include: obtaining trajectory information indicative of an expected motion of the vehicle; generating, according to the trajectory information, a projection control signal for controlling a light projection module to project a visual light pattern onto a surface in an external environment in which the vehicle is driving such that the expected motion of the vehicle can be visually observed from the external environment; and transmitting the projection control signal to the light projection module.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only, and are not restrictive of the invention. Further, the accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain principles of the invention.
The drawings referenced herein form a part of the specification. Features shown in the drawings illustrate only some embodiments of the disclosure, and not all embodiments of the disclosure, unless the detailed description explicitly indicates otherwise, and readers of the specification should not draw inferences to the contrary.
The same reference numbers will be used throughout the drawings to refer to the same or like parts.
The following detailed description of exemplary embodiments of the disclosure refers to the accompanying drawings that form a part of the description. The drawings illustrate specific exemplary embodiments in which the disclosure may be practiced. The detailed description, including the drawings, describes these embodiments in sufficient detail to enable those skilled in the art to practice the disclosure. Those skilled in the art may further utilize other embodiments of the disclosure, and make logical, mechanical, and other changes without departing from the spirit or scope of the disclosure. Readers of the following detailed description should, therefore, not interpret the description in a limiting sense, and only the appended claims define the scope of the embodiments of the disclosure.
In this application, the use of the singular includes the plural unless specifically stated otherwise. In this application, the use of “or” means “and/or” unless stated otherwise. Furthermore, the use of the term “including” as well as other forms such as “includes” and “included” is not limiting. In addition, terms such as “element” or “component” encompass both elements and components comprising one unit, and elements and components that comprise more than one subunit, unless specifically stated otherwise. Additionally, the section headings used herein are for organizational purposes only, and are not to be construed as limiting the subject matter described.
As depicted in
The apparatus 100 may include at least one program function module in the form of software or firmware stored or embedded in the memory 102 and executed by the processor 104. The processor 104 is used for executing instructions and programs stored in the memory 102. The memory 102 is used for storing various types of data of the apparatus 100. The memory 102 may be an internal memory of the apparatus 100, or a removable memory. For example, the memory 102 may include, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and the like.
The processor 104 may be an integrated circuit chip with signal and data processing capability. The processor 104 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like. The processor 104 can also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The processor 104 can execute or implement the methods, steps, and logic diagrams disclosed in embodiments of the present disclosure. The processor 104 may be a microprocessor or any conventional processor.
The light projection module 106 may be disposed on a body of the vehicle and operable to project a visual light pattern to an external environment in which the vehicle is driving. The visual light pattern can indicate expected motions to be taken by the vehicle. When projected onto the external environment, such as the ground, the visual light pattern can be observed by people nearby, who can thus be well aware of the expected motions of the vehicle. In some embodiments, the light projection module 106 includes a light source and a mechanical member. The mechanical member can move to change a direction and/or focus of a light beam emitted from the light source. In this way, the light pattern from the light projection module 106 can change accordingly. In some other embodiments, the light projection module 106 can also include a power adjusting member for adjusting the power of the light beam emitted from the light source.
In some embodiments, the light projection module 106 is a digital light processing projector based on optical micro-electro-mechanical technology that uses a digital micromirror device (DMD). In the digital light processing projector, the light pattern or image is created by microscopically small mirrors laid out in a matrix on a semiconductor chip. The DMD is driven by a digital video or graphic signal in which each digital pixel corresponds to a single mirror on the DMD. The number of mirrors corresponds to the resolution of the projected image. These mirrors can be repositioned rapidly to reflect light either through the lens or onto a heat sink. Rapidly toggling each mirror between the two orientations produces grayscales. In an embodiment, to obtain color, a rotating color wheel (with red, green, and blue filters) is put between the light source and the DMD. A separate signal is delivered for each of the three colors, and each mirror (i.e., each pixel) is switched on and off as the wheel rotates each filter between the lamp and the DMD. In other embodiments, different methods may be used to create a color image, and the present disclosure is not limited thereto.
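By way of illustration only, the grayscale principle described above can be modeled as pulse-width modulation of each mirror: the brighter a pixel, the longer its mirror reflects light through the lens within a refresh period. The following Python sketch assumes an 8-bit input frame and an arbitrary 120 Hz refresh period; real DMD controllers sequence binary bit planes in dedicated hardware, so the names and structure here are illustrative assumptions, not a vendor API.

```python
# Simplified model of DMD grayscale generation (illustrative only).
import numpy as np

REFRESH_PERIOD_US = 8333  # assumed ~120 Hz frame period

def mirror_on_times(frame_8bit: np.ndarray) -> np.ndarray:
    """Map each 0-255 pixel value to the time, in microseconds, that
    its mirror reflects light through the lens; for the remainder of
    the period the mirror directs light onto the heat sink."""
    duty = frame_8bit.astype(np.float64) / 255.0
    return duty * REFRESH_PERIOD_US

# Example: black, mid-gray, light-gray, and white pixels.
frame = np.array([[0, 128], [192, 255]], dtype=np.uint8)
print(mirror_on_times(frame))  # 0, ~4183, ~6275, and 8333 microseconds
```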
In some embodiments, the light projection module 106 includes a laser light source. The laser light source can produce a richer, more vibrant color palette than conventional light sources. In some embodiments, the light projection module 106 includes a light-emitting diode (LED) light source or an ultra-high performance (UHP) lamp.
The function device 108 may include a camera 108a, a sensor 108b, and the like. The function device 108 is used by the apparatus 100 to perform specific operations (for example, taking pictures of the external environment, telemetering with infrared, etc.). In some embodiments, the camera 108a may be used to monitor the visual light pattern projected onto the surface in the external environment, such that the processor 104 can control the light projection module 106 to adjust the visual light pattern when a substantial portion of the visual light pattern is not projected onto the surface. In some examples, a substantial portion of the visual light pattern not being projected onto the surface means that a ratio greater than 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, or 90% of the area of the visual light pattern cannot be projected onto a surface of the environment. In some other examples, it means that an intensity of the visual light pattern is substantially equal to or lower than an intensity of the environmental lighting, such that people nearby cannot observe the visual light pattern. In some other examples, it means that an essential part of the visual light pattern indicating the expected motion of the vehicle (such as the head of an arrow) cannot be projected onto the surface together with the other part of the visual light pattern. In some embodiments, the sensor 108b may be a distance detection sensor, and is used for detecting a distance between the vehicle and an object in the external environment.
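As an illustration of how the three example criteria above might be combined, the following Python sketch checks a camera observation of the projected pattern; the threshold values, mask inputs, and helper names are assumptions for illustration, not the actual implementation.

```python
# Hypothetical check for "a substantial portion of the visual light
# pattern is not projected onto the surface" (illustrative only).
import numpy as np

AREA_RATIO_THRESHOLD = 0.30  # assumed: 30% of the pattern area missing
CONTRAST_THRESHOLD = 1.2     # assumed: pattern must be 20% above ambient

def pattern_needs_adjustment(expected_mask: np.ndarray,
                             observed_mask: np.ndarray,
                             pattern_intensity: float,
                             ambient_intensity: float,
                             arrow_head_visible: bool) -> bool:
    """expected_mask/observed_mask are boolean images of where the
    pattern should appear versus where the camera actually sees it."""
    # Area criterion: too much of the pattern is missing.
    missing_ratio = 1.0 - observed_mask[expected_mask].mean()
    if missing_ratio > AREA_RATIO_THRESHOLD:
        return True
    # Visibility criterion: pattern is washed out by ambient light.
    if pattern_intensity <= CONTRAST_THRESHOLD * ambient_intensity:
        return True
    # Essential-part criterion: e.g., the head of the arrow is lost.
    if not arrow_head_visible:
        return True
    return False
```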
The I/O unit 110 is an interface for data transmission of the apparatus 100. In some embodiments, the I/O unit 110 may be used to receive a user's input. For example, the I/O unit 110 may include a touch screen, a button, a voice sensor for receiving the user's voice command, and/or an image capturing device for detecting the user's hand gesture or body language.
As shown in
In Step S302, a trigger signal for controlling the apparatus 100 to enter into a projection mode is generated.
After the apparatus 100 has entered the projection mode, the apparatus 100 may perform subsequent steps for notifying one or more expected motions of the vehicle. It should be noted that, in some embodiments, the projection mode is always activated, in which case Step S302 may be omitted.
In some embodiments, the trigger signal is generated in response to a user input. For example, a user or driver of the vehicle may input a projection instruction to the apparatus 100. The projection instruction may be input by the user triggering a button on the apparatus 100, sending a voice command, or performing a specific action within a capturing area of an image capturing device. As to inputting the projection instruction by triggering the button, the user may press the relevant button(s), and the projection instruction is then transmitted to the processor 104 in the form of an electrical signal. As to inputting the projection instruction by voice control, the user may speak a specific voice command (for example, "start projection", etc.), and the apparatus 100 can then receive the voice command as the projection instruction through a microphone or the like, which picks up the voice and converts the voice command into an electrical signal. The electrical signal can be further transmitted to the processor 104. As to inputting the projection instruction through a specific action (for example, a predetermined gesture, a unique hand gesture, or body language), the user may perform the specific action within the capturing area, and the image capturing device can then take the captured specific action as the projection instruction, convert the projection instruction into an electrical signal, and send the signal to the processor 104. After receiving the signal corresponding to the projection instruction, the processor 104 may generate the trigger signal.
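The several input paths described above might converge on a single trigger decision as in the following sketch; the event names and recognizer outputs are hypothetical, chosen only to mirror the examples in this paragraph.

```python
# Illustrative dispatch of user inputs to the trigger signal (Step S302).
VOICE_COMMANDS = {"start projection"}   # assumed recognized phrases
GESTURES = {"projection_gesture"}       # assumed gesture labels

def user_input_triggers_projection(event_type: str, payload: str) -> bool:
    """Return True when a button press, voice command, or captured
    gesture should be treated as the projection instruction."""
    if event_type == "button" and payload == "projection_button":
        return True
    if event_type == "voice" and payload.strip().lower() in VOICE_COMMANDS:
        return True
    if event_type == "gesture" and payload in GESTURES:
        return True
    return False
```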
In some embodiments, the trigger signal is generated in response to a pedestrian's request. For example, an imaging device mounted on the vehicle is used to detect pedestrians' actions. If a specific gesture of a pedestrian (for example, a sweeping gesture) is detected, it can be determined that the pedestrian requests the vehicle to show its expected motions, and thus the trigger signal is generated. In some other examples, a microphone is used to detect pedestrians' voices. If a specific voice of a pedestrian is detected, it can also be determined that the pedestrian requests the vehicle to show its expected motions, and thus the trigger signal is generated.
In some embodiments, the trigger signal is generated automatically by the vehicle using a specific algorithm, which can determine specific conditions under which motions of the vehicle should be visually observable from the external environment. For example, an imaging device may be used to take a picture or video of the external environment, and the trigger signal is generated when the number of objects in the external environment is greater than a predetermined number. In some embodiments, a radar may be used to detect a distance between the vehicle and an object in the external environment, and the trigger signal is generated when the distance is smaller than a predetermined distance. The object in the external environment may be a vehicle, a pedestrian, a bicyclist, or the like.
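Such an algorithm might reduce to simple threshold tests, as in the following non-limiting sketch; the threshold values and input formats are assumed for illustration.

```python
# Illustrative automatic trigger conditions (example values only).
OBJECT_COUNT_THRESHOLD = 3    # assumed number of nearby objects
DISTANCE_THRESHOLD_M = 15.0   # assumed radar distance threshold

def should_trigger(detected_objects: list,
                   radar_distances_m: list) -> bool:
    """Trigger when the scene is crowded or an object is close."""
    if len(detected_objects) > OBJECT_COUNT_THRESHOLD:
        return True
    if radar_distances_m and min(radar_distances_m) < DISTANCE_THRESHOLD_M:
        return True
    return False
```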
In some embodiments, the trigger signal is generated when a series of car honks is received through a microphone or a voice sensor. The series of honks may be made by the vehicle equipped with the apparatus 100, or by another vehicle in the external environment. In some embodiments, the trigger signal is generated when a message is received from a V2X (vehicle-to-everything) mechanism running on another vehicle or on infrastructure in the external environment, or from software running on a mobile device. Both the series of honks and the message may indicate that the expected motion of the vehicle should be visually observed from the external environment.
In Step S304, trajectory information indicative of an expected motion of the vehicle is obtained.
The trajectory information may include an expected trajectory of the vehicle and/or an expected turn to be made by the vehicle. It can be readily appreciated that the trajectory information may include two or more expected motions of the vehicle in a sequence.
In some embodiments, the trajectory information may be determined based on one or more parameters of the vehicle, such as a current position of the vehicle, a speed of the vehicle, a wheel track of the vehicle, and/or a steering angle of the vehicle. These parameters can be collected through a motion detecting system (e.g., an inertial sensor) of the vehicle and then transmitted to the processor 104.
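Although the disclosure does not prescribe a particular model, a kinematic bicycle model is one standard way to turn such parameters into an expected trajectory; in the following sketch the wheelbase, prediction horizon, and time step are assumed values used only for illustration.

```python
# Kinematic bicycle model sketch: predicts the (x, y) points that a
# projected arrow could trace from the vehicle's current state.
import math

def predict_trajectory(x: float, y: float, heading: float,
                       speed: float, steering_angle: float,
                       wheelbase: float = 2.7,   # assumed, in meters
                       horizon_s: float = 2.0,   # assumed look-ahead
                       dt: float = 0.1) -> list:
    points = []
    steps = int(horizon_s / dt)
    for _ in range(steps):
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        heading += (speed / wheelbase) * math.tan(steering_angle) * dt
        points.append((x, y))
    return points

# Example: 5 m/s with a slight left steering angle traces a left arc.
print(predict_trajectory(0.0, 0.0, 0.0, 5.0, math.radians(10))[:3])
```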
In some embodiments, the vehicle is an autonomous vehicle, and the trajectory information may be obtained from a control system of the autonomous vehicle. Generally, autonomous vehicles use a variety of technologies (such as radar, laser light, GPS, odometry, and computer vision) to detect their surroundings, such that the control system can use this sensory information to plan a trajectory and a turn.
In Step S306, a projection control signal is generated according to the trajectory information and then transmitted to the light projection module, for controlling the light projection module to project the visual light pattern onto a surface in the external environment such that the expected motion of the vehicle can be visually observed from the external environment.
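One plausible way to realize such a projection control signal is to map trajectory points on the ground plane into projector pixel coordinates through a planar homography, as sketched below; the calibration matrix H is a placeholder that a real system would obtain from projector-to-ground calibration, not a value taken from this disclosure.

```python
# Sketch: map ground-plane trajectory points to projector pixels.
import numpy as np

H = np.array([[120.0,   0.0, 640.0],   # placeholder calibration matrix
              [  0.0, -90.0, 700.0],
              [  0.0,   0.02,  1.0]])

def ground_to_pixels(points_m: np.ndarray) -> np.ndarray:
    """points_m: Nx2 array of (x, y) ground coordinates in meters;
    returns the corresponding Nx2 projector pixel coordinates."""
    homogeneous = np.hstack([points_m, np.ones((len(points_m), 1))])
    projected = homogeneous @ H.T
    return projected[:, :2] / projected[:, 2:3]

# Example: three points along an expected trajectory, 2-6 m ahead.
trajectory = np.array([[0.0, 2.0], [0.2, 4.0], [0.6, 6.0]])
print(ground_to_pixels(trajectory))
```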
Referring to
In some embodiments, as shown in
In some embodiments, the visual light pattern may include at least one portion identified with a color indicative of the urgency of the expected motion of the vehicle. For example, as shown in
In some embodiments, the visual light pattern may include a static pattern or a dynamic pattern. For example, as shown in
Various examples of the visual light pattern have been described herein with reference to the accompanying drawings. However, persons of ordinary skill in the art will recognize that the visual light pattern may have other features as required without departing from the spirit or scope of the present disclosure.
Referring to
In some embodiments, if a manhole cover or an obstacle lies in the expected trajectory of the vehicle 200, the visual light pattern projected on the ground surface would be incomplete or distorted. In this context, the camera 108a described above may be used to monitor the visual light pattern projected onto the surface, such that the processor 104 can adjust the projection control signal in Step S308 when a substantial portion of the visual light pattern is not projected onto the surface.
In some examples, a substantial portion of the visual light pattern not being projected onto the surface means that a ratio greater than 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, or 90% of the area of the visual light pattern cannot be projected onto a surface of the environment. In this case, the processor 104 may adjust the projection control signal, such that the light projection module 106 is controlled to change the scale of the visual light pattern or to project the visual light pattern onto other regions of the ground surface.

In some other examples, a substantial portion of the visual light pattern not being projected onto the surface means that an intensity of the visual light pattern is substantially equal to or lower than an intensity of the environmental lighting, such that people nearby cannot observe the visual light pattern. In this case, the processor 104 may adjust the projection control signal, such that the light projection module 106 is controlled to increase the intensity of the visual light pattern or to increase the contrast of the visual light pattern.

In some other examples, a substantial portion of the visual light pattern not being projected onto the surface means that an essential part of the visual light pattern indicating the expected motion of the vehicle (such as the head of an arrow) cannot be projected onto the surface together with the other part of the visual light pattern. In this case, the processor 104 may adjust the projection control signal, such that the light projection module 106 is controlled to change the shape of the visual light pattern or to project the essential part of the visual light pattern onto other regions of the ground surface.
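Combining the three example cases above, the adjustment made in Step S308 might be selected as in the following sketch; the function and command names are hypothetical and serve only to summarize the remedies just described.

```python
# Illustrative selection of a projection adjustment for Step S308.
def choose_adjustment(missing_area_ratio: float,
                      contrast_ok: bool,
                      essential_part_visible: bool) -> str:
    """Return a label for the remedy the processor 104 might apply."""
    if not essential_part_visible:
        # Reshape the pattern or move its essential part elsewhere.
        return "reshape_or_relocate_essential_part"
    if missing_area_ratio > 0.30:  # assumed threshold
        # Rescale the pattern or project onto another ground region.
        return "rescale_or_shift_pattern"
    if not contrast_ok:
        # Raise the beam power to increase intensity or contrast.
        return "increase_intensity_or_contrast"
    return "no_adjustment"
```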
It should be noted that, in some embodiments, even when a manhole cover or an obstacle lies in the expected trajectory of the vehicle, the portion of the visual light pattern projected onto the manhole cover or the obstacle can still be visually observed from the external environment. In this case, Step S308 may be omitted.
It should be noted that the apparatus and methods disclosed in the embodiments of the present disclosure can be implemented in other ways. The aforementioned apparatus and method embodiments are merely illustrative. For example, the flow charts and block diagrams in the figures show the architecture and functional operation of apparatus, methods, and computer program products according to a plurality of embodiments of the present disclosure. In this regard, each block of the flow charts or block diagrams may represent a module, a program segment, or a portion of program code. The module, the program segment, or the portion of program code includes one or more executable instructions for implementing a predetermined logical function. It should also be noted that, in some alternative embodiments, the functions described in the blocks can occur in an order different from that shown in the figures. For example, two consecutive blocks may actually be executed substantially concurrently, or they may sometimes be performed in reverse order, depending on the functionality. It should also be noted that each block of the block diagrams and/or flow charts, and combinations of blocks in the block diagrams and/or flow charts, can be implemented by a dedicated hardware-based system that executes the predetermined function or operation, or by a combination of dedicated hardware and computer instructions.
If the functions are implemented in the form of software modules and sold or used as a standalone product, the functions can be stored in a computer readable storage medium. Based on this understanding, the technical nature of the present disclosure, the part contributing to the prior art, or part of the technical solutions may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, or network equipment) to perform all or part of the steps of the various embodiments of the present disclosure. The aforementioned storage media include: a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a floppy disk, or a CD-ROM, which can store a variety of program codes.
Various embodiments have been described herein with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the broader scope of the invention as set forth in the claims that follow.
Further, other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of one or more embodiments of the invention disclosed herein. It is intended, therefore, that this disclosure and the examples herein be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following listing of exemplary claims.
The present application claims the benefit of U.S. provisional patent application 62/661,657, filed Apr. 24, 2018, the disclosure of which is incorporated herein by reference in the entirety.