This application claims the benefit of Korean Patent Application No. 10-2023-0167484, filed on Nov. 28, 2023, which is hereby incorporated by reference as if fully set forth herein.
The embodiments of the present disclosure relate to a vehicle, and more particularly to an apparatus and method for determining whether a warning light is displayed on an integrated monitor of a vehicle and displaying an alternative warning light on a portion of the integrated monitor when the warning light is not turned on.
As the technologies of electric vehicles and autonomous driving have rapidly developed, it is expected that various occupants (hereinafter referred to as “users”) will be able to engage in various activities within the vehicle. For example, a user may be a fallback-ready user (FRU) and thus be able to watch videos or participate in video conferences, as long as he or she remains aware of the vehicle's surroundings and can be ready for fallback.
With the advent of the autonomous driving era, various sensors are being installed within vehicles. For example, cameras, microphone sensors, and heat detection sensors may be installed to check objects within the vehicle.
Within the above-described environment, it is expected that there will be a need to provide interfaces to users in vehicles in various ways.
Against this background, as integrated monitors have rapidly become popular in vehicles, the present disclosure provides a method for guaranteeing safe lighting operation of one or more warning lights of a vehicle on such integrated monitors. The integrated monitor may be referred to as an integrated screen, an integrated display device, etc., and may be controlled by an integrated controller that integrally controls a cluster and a multimedia monitor, instead of by a cluster controller that controls only the cluster where the warning light was turned on. When an unexpected operation occurs in which the warning light is not turned on, a countermeasure for the non-lighting needs to be implemented by using the integrated controller.
Technical subjects to be solved by the present disclosure are not limited to the above-mentioned technical subjects, and other technical subjects not described above can be understood by those skilled in the art from the description of the present disclosure below.
In accordance with an embodiment of the present disclosure, an apparatus for detecting non-lighting of a warning light may include: an integrated display device configured to include a cluster window and a multimedia window; and an integrated controller operatively connected to the integrated display device and configured to control the integrated display device. The integrated controller is configured to, in response to a reception of an event-related signal for triggering a display of the warning light, transmit the event-related signal to a first application for the cluster window and a second application for the multimedia window; and determine whether a warning light corresponding to the event-related signal is displayed on the cluster window.
In accordance with another embodiment of the present disclosure, a method for detecting non-lighting of a warning light, the method being performed by an apparatus that includes an integrated display device having a cluster window and a multimedia window and an integrated controller controlling the integrated display device, may include: receiving an event-related signal for triggering a display of the warning light; transmitting the event-related signal to an application for the cluster window and another application for the multimedia window; and determining whether a warning light corresponding to the event-related signal is displayed on the cluster window.
The above-described solutions of the present disclosure are some of the embodiments of the present disclosure. Various solutions other than the above-described solutions can be derived and understood based on the detailed description of the present disclosure to be described below.
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the disclosure and together with the description serve to explain the principle of the disclosure.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that the present disclosure may be easily realized by those skilled in the art. However, the present disclosure may be achieved in various different forms and is not limited to the embodiments described herein. In the drawings, parts that are not related to a description of the present disclosure are omitted to clearly explain the present disclosure and similar reference numbers will be used throughout this specification to refer to similar parts.
In the specification, when a part “includes” an element, it means that the part may further include another element rather than excluding another element unless otherwise mentioned.
First, a structure and function of an autonomous driving control system (e.g., an autonomous driving vehicle) to which an autonomous driving apparatus according to the present embodiments is applicable will be described with reference to
As illustrated in
The autonomous driving integrated controller 600 may obtain, through the driving information input interface 101, driving information based on manipulation of an occupant for a user input unit 100 in an autonomous driving mode or manual driving mode of a vehicle. As illustrated in
For example, a driving mode (i.e., an autonomous driving mode/manual driving mode or a sports mode/eco mode/safety mode/normal mode) of the vehicle determined by manipulation of the occupant for the driving mode switch 110 may be transmitted to the autonomous driving integrated controller 600 through the driving information input interface 101 as the driving information.
Furthermore, navigation information, such as the destination of the occupant input through the control panel 120 and a path up to the destination (e.g., the shortest path or preference path, selected by the occupant, among candidate paths up to the destination), may be transmitted to the autonomous driving integrated controller 600 through the driving information input interface 101 as the driving information.
The control panel 120 may be implemented as a touchscreen panel that provides a user interface (UI) through which the occupant inputs or modifies information for autonomous driving control of the vehicle. In this case, the driving mode switch 110 may be implemented as touch buttons on the control panel 120.
In addition, the autonomous driving integrated controller 600 may obtain traveling information indicative of a driving state of the vehicle through the traveling information input interface 201. The traveling information may include a steering angle formed when the occupant manipulates a steering wheel, an accelerator pedal stroke or brake pedal stroke formed when the occupant depresses an accelerator pedal or brake pedal, and various types of information indicative of driving states and behaviors of the vehicle, such as a vehicle speed, acceleration, a yaw, a pitch, and a roll formed in the vehicle. The traveling information may be detected by a traveling information detection unit 200, including a steering angle sensor 210, an accelerator position sensor (APS)/pedal travel sensor (PTS) 220, a vehicle speed sensor 230, an acceleration sensor 240, and a yaw/pitch/roll sensor 250, as illustrated in
Furthermore, the traveling information of the vehicle may include location information of the vehicle. The location information of the vehicle may be obtained through a global positioning system (GPS) receiver 260 applied to the vehicle. Such traveling information may be transmitted to the autonomous driving integrated controller 600 through the traveling information input interface 201 and may be used to control the driving of the vehicle in the autonomous driving mode or manual driving mode of the vehicle.
The autonomous driving integrated controller 600 may transmit driving state information provided to the occupant to an output unit 300 through the occupant output interface 301 in the autonomous driving mode or manual driving mode of the vehicle. That is, the autonomous driving integrated controller 600 transmits the driving state information of the vehicle to the output unit 300 so that the occupant may check the autonomous driving state or manual driving state of the vehicle based on the driving state information output through the output unit 300. The driving state information may include various types of information indicative of driving states of the vehicle, such as a current driving mode, transmission range, and speed of the vehicle.
If it is determined that it is necessary to warn a driver in the autonomous driving mode or manual driving mode of the vehicle along with the above driving state information, the autonomous driving integrated controller 600 transmits warning information to the output unit 300 through the occupant output interface 301 so that the output unit 300 may output a warning to the driver. In order to output such driving state information and warning information acoustically and visually, the output unit 300 may include a speaker 310 and a display 320 as illustrated in
Furthermore, the autonomous driving integrated controller 600 may transmit control information for driving control of the vehicle to a lower control system 400, applied to the vehicle, through the vehicle control output interface 401 in the autonomous driving mode or manual driving mode of the vehicle. As illustrated in
As described above, the autonomous driving integrated controller 600 according to the present embodiment may obtain the driving information based on manipulation of the driver and the traveling information indicative of the driving state of the vehicle through the driving information input interface 101 and the traveling information input interface 201, respectively, and transmit the driving state information and the warning information, generated based on an autonomous driving algorithm, to the output unit 300 through the occupant output interface 301. In addition, the autonomous driving integrated controller 600 may transmit the control information generated based on the autonomous driving algorithm to the lower control system 400 through the vehicle control output interface 401 so that driving control of the vehicle is performed.
In order to guarantee stable autonomous driving of the vehicle, it is necessary to continuously monitor the driving state of the vehicle by accurately measuring a driving environment of the vehicle and to control driving based on the measured driving environment. To this end, as illustrated in
The sensor unit 500 may include one or more of a LiDAR sensor 510, a radar sensor 520, or a camera sensor 530, in order to detect a nearby object outside the vehicle, as illustrated in
The LiDAR sensor 510 may transmit a laser signal to the periphery of the vehicle and detect a nearby object outside the vehicle by receiving a signal reflected and returning from a corresponding object. The LiDAR sensor 510 may detect a nearby object located within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof. The LiDAR sensor 510 may include a front LiDAR sensor 511, a top LiDAR sensor 512, and a rear LiDAR sensor 513 installed at the front, top, and rear of the vehicle, respectively, but the installation location of each LiDAR sensor and the number of LiDAR sensors installed are not limited to a specific embodiment. A threshold for determining the validity of a laser signal reflected and returning from a corresponding object may be previously stored in a memory (not illustrated) of the autonomous driving integrated controller 600. The autonomous driving integrated controller 600 may determine a location (including a distance to a corresponding object), speed, and moving direction of the corresponding object using a method of measuring the time taken for a laser signal, transmitted through the LiDAR sensor 510, to be reflected and return from the corresponding object.
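The time-of-flight relation underlying this distance measurement can be sketched as follows; the function name and the sample timing are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the time-of-flight distance estimate described above.
SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed of the laser signal (m/s)

def lidar_distance_m(round_trip_time_s: float) -> float:
    """One-way distance to the reflecting object: the laser travels to the
    object and back, so the measured time covers twice the distance."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A return received about 667 ns after emission corresponds to roughly 100 m.
distance = lidar_distance_m(667e-9)
```

Speed and moving direction would follow from successive distance measurements over time; only the per-pulse distance is shown here.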
The radar sensor 520 may radiate electromagnetic waves around the vehicle and detect a nearby object outside the vehicle by receiving a signal reflected and returning from a corresponding object. The radar sensor 520 may detect a nearby object within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof. The radar sensor 520 may include a front radar sensor 521, a left radar sensor 522, a right radar sensor 523, and a rear radar sensor 524 installed at the front, left, right, and rear of the vehicle, respectively, but the installation location of each radar sensor and the number of radar sensors installed are not limited to a specific embodiment. The autonomous driving integrated controller 600 may determine a location (including a distance to a corresponding object), speed, and moving direction of the corresponding object using a method of analyzing power of electromagnetic waves transmitted and received through the radar sensor 520.
The camera sensor 530 may detect a nearby object outside the vehicle by photographing the periphery of the vehicle and detect a nearby object within the ranges of a preset distance, a preset vertical field of view, and a preset horizontal field of view, which are predefined depending on specifications thereof.
The camera sensor 530 may include a front camera sensor 531, a left camera sensor 532, a right camera sensor 533, and a rear camera sensor 534 installed at the front, left, right, and rear of the vehicle, respectively, but the installation location of each camera sensor and the number of camera sensors installed are not limited to a specific embodiment. The autonomous driving integrated controller 600 may determine a location (including a distance to a corresponding object), speed, and moving direction of the corresponding object by applying predefined image processing to an image captured by the camera sensor 530.
In addition, an internal camera sensor 535 for capturing the inside of the vehicle may be mounted at a predetermined location (e.g., rear view mirror) within the vehicle. The autonomous driving integrated controller 600 may monitor a behavior and state of the occupant based on an image captured by the internal camera sensor 535 and output guidance or a warning to the occupant through the output unit 300.
As illustrated in
Furthermore, in order to determine a state of the occupant within the vehicle, the sensor unit 500 may further include a bio sensor for detecting bio signals (e.g., heart rate, electrocardiogram, respiration, blood pressure, body temperature, electroencephalogram, photoplethysmography (or pulse wave), and blood sugar) of the occupant. The bio sensor may include a heart rate sensor, an electrocardiogram sensor, a respiration sensor, a blood pressure sensor, a body temperature sensor, an electroencephalogram sensor, a photoplethysmography sensor, and a blood sugar sensor.
Finally, the sensor unit 500 further includes a microphone 550 having an internal microphone 551 and an external microphone 552, which are used for different purposes.
The internal microphone 551 may be used, for example, to analyze the voice of the occupant in the autonomous driving vehicle 1000 based on AI or to immediately respond to a direct voice command of the occupant.
In contrast, the external microphone 552 may be used, for example, to support safe driving by analyzing various sounds generated outside the autonomous driving vehicle 1000 using various analysis tools such as deep learning.
For reference, the symbols illustrated in
The present disclosure proposes a new method for satisfying requirements for functional safety related to clusters for vehicles.
The cluster may have a warning light (also referred to as “telltale”) lighting function that receives signals through CAN communication and displays the received signals on a cluster display. Examples of functional safety requirements for these warning lights (telltales) are as follows:
Failure to satisfy these requirements may lead to a hazardous situation caused by the driver's unawareness of vehicle conditions.
In the case of a conventional vehicle cluster, a cluster controller 611 may be connected to a cluster display driving board 612 to drive a display module 321. The cluster controller 611 and the cluster display driving board 612 may be collectively referred to as a cluster platform 610.
Referring to
The term “integrated monitor” may refer to a single display that displays all of the cluster information; information related to navigation, audio, and communication devices and/or other vehicle-related information; and multimedia information required to display images or video. However, this single display does not necessarily mean one physically integrated display panel. The integrated monitor may be referred to by several terms, including an integrated screen, an integrated cluster, an integrated display device, etc.
Referring to
On the other hand, referring to
Referring to
Referring to
In order to address the above-described issues, the present disclosure proposes a method for adding the path P2 to the above-described example. The hypervisor may perform video capture of the cluster window only for signals related to the telltale trigger event for turning on the telltale (warning light), and may transmit the captured result to a safety detector of the multimedia app. Here, the video capture may be performed after a lapse of a predetermined time from the time point at which the signals related to the telltale trigger event are received, in consideration of the time delay required to drive or render the cluster window.
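A hedged sketch of this behavior follows; the signal schema, delay value, and callback names are assumptions for illustration, not the disclosed hypervisor interface.

```python
import time

CAPTURE_DELAY_S = 0.05  # assumed delay allowing the cluster window to finish rendering

def on_signal(signal: dict, capture_fn, safety_detector) -> None:
    """Capture the cluster window and forward the frame to the safety detector,
    but only for telltale trigger events; other signals are ignored."""
    if signal.get("type") != "telltale_trigger":
        return  # capturing only for telltale events reduces resource consumption
    time.sleep(CAPTURE_DELAY_S)            # wait out the cluster-window rendering delay
    frame = capture_fn("cluster_window")   # capture the cluster window only
    safety_detector(signal, frame)         # deliver to the multimedia app's detector
```

Here `capture_fn` stands in for the hypervisor's video-capture facility, and `safety_detector` for the safety detector of the multimedia app.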
Referring to
Referring to
Accordingly, as shown in
The operation of the integrated controller 630 for this purpose will now be described in detail with reference to the attached drawings.
Upon receiving a signal related to the telltale trigger event, the integrated controller 630 may perform video capture on the aforementioned cluster window 321 or may perform video capture on a display including the cluster window 321, and may thus obtain a captured image.
The integrated controller 630 may detect a telltale (warning light) image from the captured image. Since the area in which the telltale (warning light) is to be turned on is set in the cluster window 321, the integrated controller 630 may attempt to detect the telltale image in the telltale (warning light) region.
If the telltale image is successfully detected, the integrated controller 630 may compare the detected telltale image with the pre-stored telltale information, and may thus identify a telltale (warning light) corresponding to the detected telltale image.
If detection of the telltale image is not successful or identification of the telltale corresponding to the detected telltale image is not successful, the integrated controller 630 may identify telltale (warning light) information corresponding to the signal related to the received telltale trigger event. The integrated controller 630 may detect or select the identified telltale information from the pre-stored telltale information, and may control the telltale image corresponding to the identified telltale to be displayed on the multimedia window 322.
That is, when the integrated controller 630 confirms that the telltale (warning light) scheduled to be displayed on the cluster window 321 is not turned on, the integrated controller 630 may enable the corresponding telltale (S) to be displayed on the multimedia window 322.
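The non-lighting countermeasure described above can be condensed into a small sketch. The telltale store and the glyph-set comparison below are placeholders standing in for the pre-stored telltale information and for real image matching; all names are illustrative.

```python
# Illustrative fallback logic; names and the matching step are assumptions.
KNOWN_TELLTALES = {"low_oil_pressure": "OIL", "engine_check": "ENG"}  # pre-stored info

def detect_telltale(captured_glyphs: set, telltale_glyph: str) -> bool:
    # Placeholder for matching the telltale image within the captured frame.
    return telltale_glyph in captured_glyphs

def handle_trigger(event_id: str, captured_glyphs: set, show_on_multimedia) -> str:
    expected = KNOWN_TELLTALES[event_id]      # telltale identified from the event signal
    if detect_telltale(captured_glyphs, expected):
        return "cluster_ok"                   # warning light lit normally; nothing to do
    show_on_multimedia(expected)              # non-lighting: display on multimedia window
    return "fallback_displayed"
```

For example, if the `engine_check` event fires but the captured cluster window shows no matching telltale, `show_on_multimedia` is invoked with the expected telltale as the alternative display.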
Referring to
The apparatus may transmit the signal related to the telltale trigger event to the application (app) for the cluster window and another application (app) for the multimedia window (S920).
The apparatus may determine whether a telltale (warning light) corresponding to the signal related to the telltale trigger event is displayed on the cluster window (S930).
If the telltale (warning light) corresponding to the signal related to the telltale trigger event is turned on, the apparatus may finish performing the present method, because the telltale (warning light) was normally displayed on the integrated display device.
A detailed procedure for determining whether the telltale (warning light) corresponding to the signal related to the telltale trigger event is turned on will hereinafter be described in detail.
After a preset time delay from the time point at which the signal related to the trigger event was transmitted, the apparatus may capture video or images displayed on the integrated display device. Video capture can be performed only for the cluster window of the integrated display device, rather than for the entire integrated display device.
The apparatus can deliver the captured video to the application (app) for the multimedia window. This is because a normal operation of the app (cluster app) for the cluster window cannot be guaranteed.
The apparatus may determine whether the telltale (warning light) is not displayed on the cluster window based on the telltale (warning light) trigger event and the video captured by the app for the multimedia window. The apparatus may store information about all telltale (warning light) trigger events in advance. Accordingly, the apparatus can recognize which one of the telltale (warning light) trigger events is related to the signal related to the telltale trigger event, and can recognize the telltale (warning light) corresponding to the recognized telltale trigger event. The apparatus may attempt to detect previously stored telltale (warning light) information from the captured video. If detection of telltale (warning light) information in the captured video is not successful, the apparatus may determine that the telltale (warning light) is not displayed on the cluster window.
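The detection attempt can be illustrated with a naive exact-match template search over the captured frame. This is a sketch, not the disclosed algorithm: a production detector would use tolerant image matching rather than exact pixel equality.

```python
def find_template(frame, template):
    """Return the top-left (row, col) where `template` appears in `frame`
    (both 2-D lists of pixel values), or None if the telltale is not displayed."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            if all(frame[y + dy][x + dx] == template[dy][dx]
                   for dy in range(th) for dx in range(tw)):
                return (y, x)   # telltale image detected at this position
    return None                 # detection failed: treat as non-lighting
```

A `None` result here corresponds to the case where the apparatus determines that the telltale is not displayed on the cluster window.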
As the telltale (warning light) is not displayed on the cluster window, the apparatus may control the telltale (warning light) corresponding to the telltale (warning light) trigger event to be displayed on the multimedia window.
Meanwhile, the app for the cluster window and the app for the multimedia window may be implemented, installed, or operated on different guest operating systems (OSs).
In addition, the apparatus can perform operations of the integrated controller related to the present disclosure described in
Although the above-described embodiments of the present disclosure have disclosed, for convenience of description, that the device (or apparatus) for controlling a user interface (UI) and components included therein perform such control, these designations are names only, and the scope of rights is not dependent thereon.
In other words, the proposed technology of the present disclosure may be performed by devices having names other than the control device. In addition, the method, scheme, or the like described above may be performed by software or code readable by a computer or other machine or device for vehicle control.
In addition, as another aspect of the present disclosure, the operation of the proposed technology described above may be provided as code that may be implemented, realized, or executed by a “computer” (a generic concept including a system on chip (SoC) or a (micro) processor) or a computer-readable storage medium, a computer program product, or the like storing or containing the code. The scope of the present disclosure is extendable to the code or the computer-readable storage medium or the computer program product storing or containing the code.
Detailed descriptions of preferred embodiments of the present disclosure disclosed as described above have been provided such that those skilled in the art may implement and realize the present disclosure.
Although the present disclosure has been described above with reference to preferred embodiments, those skilled in the art will understand that various modifications and changes can be made to the present disclosure set forth in the claims below.
Accordingly, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
As is apparent from the above description, according to the solutions to the above-described problems, the present disclosure can prevent occurrence of a situation in which one or more warning lights are not displayed on the integrated monitor.
In addition, according to the present disclosure, a situation in which the warning lights are not displayed on the cluster area of the integrated monitor can be detected early, and a countermeasure for solving this situation can be started.
Additionally, the present disclosure can perform the proposed function only for an event for turning on the warning lights, thereby reducing resource consumption.
The effects of the present disclosure are not limited to the effects described above. Other effects not described above can be understood by those skilled in the art from the description of the present disclosure below.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present disclosure without departing from the spirit or scope of the disclosure. Thus, it is intended that the present disclosure covers the modifications and variations of this disclosure provided they come within the scope of the appended claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---
10-2023-0167484 | Nov 2023 | KR | national |