This application claims priority from Korean Patent Application No. 10-2019-0175540 filed on Dec. 26, 2019, which application is herein incorporated by reference in its entirety.
The present disclosure relates to an apparatus for monitoring surroundings of a vehicle, and more specifically, to an apparatus for monitoring surroundings of a vehicle that provides images of the area around the vehicle to allow a driver to more easily monitor the surroundings of the vehicle.
Generally, in a vehicle, an inside mirror allows a driver to secure a rear view of the vehicle, and outside mirrors are installed on both sides of the vehicle. The driver perceives surrounding vehicles or pedestrians in situations such as reversing the vehicle, passing, or changing lanes based on the view acquired through the inside mirror or the outside mirrors.
Recently, cameras have been installed on vehicles in place of outside mirrors to reduce aerodynamic drag and to reduce the possibility of damage caused by external impacts while the vehicle is operating. An image acquired by the camera is displayed through a display device provided inside the vehicle. Accordingly, a driver may easily perceive the surrounding situation of the vehicle.
Generally, the displayed image is obtained by extracting a portion of the image acquired by the camera, and the extracted portion is output via the display device. A driver secures an optimal field of view by adjusting the extracted area within the image acquired by the camera. However, the driver cannot perceive the relative positional relationship between the image acquired by the camera and the extracted area, and therefore must repeatedly adjust the extracted area until the desired area is extracted.
Accordingly, there is a need for a method that enables a driver to easily perceive the relative positional relationship between an image acquired by a camera and an area to be extracted.
Aspects of the present disclosure provide an apparatus for monitoring surroundings of a vehicle, which enables a driver to more easily perceive a relative positional relationship between an image acquired by an imaging device and an area to be extracted.
Problems of the present disclosure are not limited to the above-mentioned problem, and other problems not mentioned may be clearly understood by a person skilled in the art from the following description.
However, aspects of the present disclosure are not restricted to those set forth herein. The above and other aspects of the present disclosure will become more apparent to one of ordinary skill in the art to which the present disclosure pertains by referencing the detailed description of the present disclosure given below.
According to an aspect of the present disclosure, an apparatus for monitoring surroundings of a vehicle may include an imaging device that acquires an original image for at least one direction around the vehicle; an image processor configured to extract a monitoring image corresponding to a set area in the original image; and an image display that outputs the extracted monitoring image. The image processor may be configured to display a guide map that indicates a relative positional relationship between the original image and the monitoring image on the monitoring image.
The guide map may comprise a first display area corresponding to the original image, and a second display area corresponding to the monitoring image, and the image processor may be configured to cause the second display area to be moved and displayed within the first display area in accordance with a position of the set area.
The first display area may comprise a line representing a vehicle body line.
The image processor may be configured to display a captured image of the original image in the first display area when the position of the set area is adjusted. The captured image may be the original image that is captured at the time when the guide map is activated.
Alternatively, the image processor may be configured to output the original image in the first display area. Further, the image processor may be configured to display an angle of view of at least one of a horizontal direction or a vertical direction in the guide map.
The first display area and the second display area may have different image properties. The image properties may comprise at least one of hue, saturation, brightness, or transparency of image.
A user interface may be further provided for adjusting a position of the set area, and the image processor may be configured to display the guide map on the monitoring image in response to an operation signal being input via the user interface. Further, the image processor may be configured to remove the guide map from the monitoring image when no operation signal is input for a predetermined period of time or longer.
Another aspect of the present disclosure provides a non-transitory computer readable medium containing program instructions executed by a processor or controller. The program instructions, when executed by the processor or controller, may be configured to acquire, using an imaging device, an original image for at least one direction around a vehicle; display, in an image display, a monitoring image that is extracted from the original image to correspond to a set area within the original image; and display, in the image display, a guide map that indicates a relative positional relationship between the original image and the monitoring image.
The guide map may comprise a first display area that shows the original image, and a second display area that shows the monitoring image, and the program instructions may be configured to allow the second display area to be moved and displayed within the first display area in accordance with a position of the set area relative to the original image.
The program instructions may be configured to display the guide map on the monitoring image in response to receiving an operation signal via a user interface. The program instructions may be configured to remove the guide map from the monitoring image when no operation signal is input for a predetermined period of time or longer. Further, the program instructions may be configured to display a captured image of the original image in the first display area when the position of the set area is adjusted, the captured image being the original image at a time when the guide map is activated.
The program instructions may be further configured to display, in the first display area, a line that represents a vehicle body line. The program instructions may be configured to display an angle of view of at least one of a horizontal direction or a vertical direction in the guide map.
The first display area and the second display area may have different image properties, which comprise at least one of hue, saturation, brightness, or transparency of image.
An apparatus for monitoring surroundings of a vehicle according to the present disclosure has one or more of the following benefits. A driver's convenience may be improved by displaying the relative positional relationship between an original image acquired by an imaging device and a set area within the original image that corresponds to a monitoring image.
The benefits of the present disclosure are not limited to the above-mentioned benefits, and other benefits not mentioned may be clearly understood by a person skilled in the art from the claims.
The above and other aspects and features of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings, in which:
Advantages and features of the present disclosure and methods of accomplishing the same may be understood more readily by reference to the following detailed description of exemplary embodiments and the accompanying drawings. The present disclosure may, however, be embodied in many different forms and should not be construed as being limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of the disclosure to those skilled in the art, and the present disclosure will only be defined by the appended claims. Throughout the specification, like reference numerals in the drawings denote like elements.
In some exemplary embodiments, well-known steps, structures and techniques will not be described in detail to avoid obscuring the disclosure.
The terminology used herein is for the purpose of describing particular exemplary embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Exemplary embodiments of the disclosure are described herein with reference to plan and cross-section illustrations that are schematic illustrations of idealized exemplary embodiments of the disclosure. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, exemplary embodiments of the disclosure should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. In the drawings, respective components may be enlarged or reduced in size for convenience of explanation.
Hereinafter, the present disclosure will be described with reference to the drawings for an apparatus for monitoring surroundings of a vehicle according to exemplary embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, the image acquisition unit 100 may be installed on or near the front doors of both sides of the vehicle as shown in
In an exemplary embodiment of the present disclosure, the image acquisition unit 100 installed on the driver side among both sides of the vehicle will be described as an example. The image acquisition unit 100 installed on the passenger side (i.e., the side opposite the driver side) may be similarly configured, although there may be some differences in installation position. Herein, the driver side and the passenger side respectively refer to the side where the driver of the vehicle sits and the side opposite the driver side. In the United States, the left side of the vehicle is typically referred to as the driver side, and the right side of the vehicle is typically referred to as the passenger side. However, which of the left and right sides corresponds to the driver side and the passenger side may vary depending on road-use customs and local regulations.
The image acquisition unit 100 may use at least one imaging device (e.g., a camera) having any of various angles of view (e.g., viewing angle, field of view, or the like), such as a narrow-angle camera or a wide-angle camera, depending on the field of view that the driver needs to monitor. In an exemplary embodiment of the present disclosure, the image acquisition unit 100 may acquire an image exhibiting an angle of view of θ1 in the horizontal direction as shown in
A size of the image acquired by the image acquisition unit 100 may be defined by the angle of view θ1 in the horizontal direction and the angle of view θ2 in the vertical direction. Hereinafter, the image acquired by the image acquisition unit 100 in an exemplary embodiment of the present disclosure will be referred to as an “original image.”
The image processor 200 may be configured to extract a monitoring image corresponding to a set area A′ of the original image A as shown in
In other words, because the monitoring image is output at the fixed size of the image display, as the set area A′ increases, the size of an object appearing in the monitoring image decreases and thus the magnification decreases. Conversely, the smaller the set area A′ is, the larger the size of the object appearing in the monitoring image becomes, and thus the magnification increases. The set area A′ may be determined to have a magnification that allows the driver to appropriately recognize the size of an object appearing in the monitoring image or the distance to the object.
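As a purely illustrative sketch, and not the claimed implementation, the extraction step can be thought of as cropping the set area A′ from the original image A and scaling the crop to the fixed size of the image display; the function name, the constant names, the example display size, and the use of OpenCV below are assumptions:

```python
import cv2  # assumed image-processing library

DISPLAY_W, DISPLAY_H = 1280, 720  # example fixed size of the image display 310

def extract_monitoring_image(original, x, y, w, h):
    """Crop the set area A' (top-left x, y and size w, h) from the original
    image A and scale it to the display size; because the output size is
    fixed, a smaller set area yields a higher magnification."""
    cropped = original[y:y + h, x:x + w]
    return cv2.resize(cropped, (DISPLAY_W, DISPLAY_H))
```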
The original image A may have a larger size than the set area A′. In other words, the set area A′ may be set to a portion of the original image A. This prevents image distortion in the monitoring image, because distortion is more likely to occur in an edge region of the original image A than in its central region, and it also allows the set area A′ to be adjusted according to the driver's preference.
The set area A′ may be defined with respect to an extraction angle αh in the horizontal direction and an extraction angle αv in the vertical direction as shown in
Herein, the extraction angle αh in the horizontal direction and the extraction angle αv in the vertical direction may be determined to provide a required magnification based on a distance or angle between the image output unit 300 and the driver's view point (e.g., a location of the driver's eyes).
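For illustration only, under an idealized rectilinear (pinhole) projection the extraction angles can be mapped to a pixel rectangle in the original image; a real wide-angle camera would typically require lens-distortion correction first, and the function name, parameter names, and coordinate conventions below are assumptions rather than the disclosed method:

```python
import math

def angles_to_region(img_w, img_h, theta1, theta2, alpha_h, alpha_v,
                     pan_deg=0.0, tilt_deg=0.0):
    """Map the extraction angles (alpha_h, alpha_v), centered at the given
    pan/tilt angles, to a pixel rectangle (x, y, w, h) in an original image
    whose full angles of view are theta1 (horizontal) and theta2 (vertical),
    all in degrees, assuming an idealized rectilinear projection."""
    fx = (img_w / 2) / math.tan(math.radians(theta1) / 2)  # focal length in pixels, horizontal
    fy = (img_h / 2) / math.tan(math.radians(theta2) / 2)  # focal length in pixels, vertical

    def span(f, center_deg, extract_deg):
        lo = f * math.tan(math.radians(center_deg - extract_deg / 2))
        hi = f * math.tan(math.radians(center_deg + extract_deg / 2))
        return lo, hi

    x_lo, x_hi = span(fx, pan_deg, alpha_h)
    y_lo, y_hi = span(fy, tilt_deg, alpha_v)
    x = int(img_w / 2 + x_lo)
    y = int(img_h / 2 + y_lo)
    return x, y, int(x_hi - x_lo), int(y_hi - y_lo)
```

In such a model, a smaller extraction angle yields a smaller rectangle and therefore, after scaling to the fixed display size, a higher magnification.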
The image output unit 300 may include an image display 310 (e.g., a screen) having a predetermined size on which the monitoring image is output or displayed. In an exemplary embodiment of the present disclosure, the image acquisition units 100 may be installed on both sides of the vehicle, respectively. Therefore, as shown in
The operation unit 400 may allow the driver to activate a guide map for adjusting a position of the set area A′, and may enable the driver to adjust the position of the set area A′. The operation unit 400 may include a user interface and may be provided in the vehicle in the form of a button, a switch, a joystick, or the like. However, the present disclosure is not limited thereto, and when the image output unit 300 is configured as a touch display panel, the operation unit 400 may be provided as a touch button. In such exemplary embodiments, the image display and the user interface may be provided as a single unit, such as a touch screen.
For example, when the driver wants to widen the view toward the side of the vehicle in the monitoring image currently being output via the image output unit 300, or to reduce the proportion of the monitoring image occupied by the vehicle's own body, the driver may call or activate the guide map via the operation unit 400 so that the guide map is displayed, and then adjust the position of the set area A′ in the original image A.
As described above, when the driver adjusts the position of the set area A′ using the operation unit 400, it may be difficult for the driver to perceive the relative position of the set area A′ with respect to the actual original image A. Therefore, the driver may attempt to adjust the position of the set area A′ even when an edge of the set area A′ already coincides with an edge of the original image A and the set area A′ cannot be moved any further in that direction. In such a circumstance, unnecessary operations may occur, thereby reducing the driver's convenience.
In other words, when the driver is unaware of the relative positional relationship between the original image A and the set area A′, unnecessary operations may be frequently input by the driver even though the position of the set area A′ cannot be adjusted further. To address this, in an exemplary embodiment of the present disclosure, information that allows the driver to know the relative positional relationship between the original image A and the set area A′ may be displayed on the monitoring image, thereby improving the driver's convenience.
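By way of illustration, the positional limit noted above, namely that the set area A′ cannot be moved past an edge of the original image A, amounts to saturating the set-area position at the image boundary; the following sketch uses assumed names and a pixel-coordinate convention and is not the claimed implementation:

```python
def clamp_set_area(x, y, w, h, img_w, img_h):
    """Keep the set area A' (top-left x, y and size w, h) fully inside the
    original image A; movement requests past an edge simply saturate there."""
    x = max(0, min(x, img_w - w))
    y = max(0, min(y, img_h - h))
    return x, y
```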
In an exemplary embodiment of the present disclosure, the image processor 200 may be configured to synthesize the guide map 500 with the monitoring image, e.g., by inserting the guide map 500 within the monitoring image, when an operation signal is input via the operation unit 400, and may be configured to remove the guide map when no operation signal is input for a certain period of time or longer. However, the present disclosure is not limited thereto, and the guide map 500 may be displayed even when no operation signal is input.
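As an illustration only, the show-on-input and hide-after-inactivity behavior described above could be tracked with a simple timer; the class name, the timeout value, and the use of a monotonic clock below are assumptions rather than the disclosed implementation:

```python
import time

GUIDE_MAP_TIMEOUT_S = 5.0  # illustrative inactivity period; the actual value is a design choice

class GuideMapVisibility:
    """Show the guide map while operation signals arrive and hide it after
    a period of inactivity (a sketch of the behavior described above)."""

    def __init__(self):
        self._last_input = None

    def on_operation_signal(self):
        # Called whenever the operation unit 400 produces an input.
        self._last_input = time.monotonic()

    def is_visible(self):
        return (self._last_input is not None and
                time.monotonic() - self._last_input < GUIDE_MAP_TIMEOUT_S)
```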
The guide map 500 may include a first display area 510 that shows the original image A and a second display area 520 that shows the set area A′, which corresponds to the monitoring image. As shown in
The first display area 510 and the second display area 520 may have different image properties, for example, hue, saturation, brightness, transparency, and the like, to secure the driver's visibility. By way of example, the first display area 510 may be displayed as a black-and-white image (i.e., with substantially reduced saturation), and the second display area 520 may be displayed in color. The driver may adjust the position of the set area A′ by operating the operation unit 400 while checking the position of the set area A′ relative to the original image A through the guide map 500. By way of example, the second display area 520 may be moved within the first display area 510 by touch-dragging on the screen 310 or by manipulating a separately provided joystick-type switch, e.g., on the center fascia, on the dashboard, or adjacent to a power window switch on a door panel. However, the present disclosure is not limited thereto, and the user interface for the operation unit 400 may be variously configured.
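As a purely illustrative sketch of this kind of compositing (using OpenCV as an assumed image library; the function names, thumbnail width, and corner placement are not part of the disclosure), the first display area could be rendered as a desaturated thumbnail of the original image with the set area pasted back in color and outlined, and the result overlaid on the monitoring image:

```python
import cv2  # assumed image-processing library

def render_guide_map(original, x, y, w, h, map_w=320):
    """Build a guide-map thumbnail: the first display area 510 shows the original
    image A in black and white, and the second display area 520 shows the set
    area A' (x, y, w, h) in color at its relative position."""
    scale = map_w / original.shape[1]
    map_h = int(original.shape[0] * scale)
    gray = cv2.cvtColor(original, cv2.COLOR_BGR2GRAY)
    guide = cv2.cvtColor(cv2.resize(gray, (map_w, map_h)), cv2.COLOR_GRAY2BGR)
    # Paste the set area back in color and outline it.
    sx, sy, sw, sh = (int(v * scale) for v in (x, y, w, h))
    color_thumb = cv2.resize(original, (map_w, map_h))
    guide[sy:sy + sh, sx:sx + sw] = color_thumb[sy:sy + sh, sx:sx + sw]
    cv2.rectangle(guide, (sx, sy), (sx + sw, sy + sh), (0, 255, 255), 2)
    return guide

def overlay_guide_map(monitoring, guide, margin=20):
    """Synthesize the guide map onto a corner of the monitoring image."""
    out = monitoring.copy()
    gh, gw = guide.shape[:2]
    out[margin:margin + gh, margin:margin + gw] = guide
    return out
```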
In the exemplary embodiment described above, an example has been described in which the first display area 510 and the second display area 520 have different image properties. However, the present disclosure is not limited thereto, and as shown in
Therefore, the driver may check the proportion of the set area A′ occupied by the vehicle body, and may operate the operation unit 400 to increase or decrease that proportion according to the driver's preference, thereby adjusting the position of the second display area 520 as shown in
In the exemplary embodiment described above, an example has been described in which the relative positional relationship between the original image A and the set area A′ is displayed by the second display area 520, the position of which is moved within the first display area 510. However, the present disclosure is not limited thereto, and as shown in
For example, when the driver adjusts the position of the set area A′ in the horizontal direction using the operation unit 400, the guide map 500 may display a horizontal angle of view 532 of the set area A′ with respect to a horizontal angle of view 531 of the original image A as shown in
In addition, in
When the image output units 300 are respectively disposed on the left and right sides of the driver as shown in
Although an exemplary embodiment is described as using a plurality of units to perform the exemplary processes, it is understood that the exemplary processes may also be performed by one or a plurality of modules. Additionally, it is understood that the terms processor, image processor, controller, and control unit refer to a hardware device that includes a memory and a processor. The memory is configured to store the modules, and the processor is specifically configured to execute the modules to perform one or more processes described above.
Furthermore, control logic of the present disclosure may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller/control unit, or the like. Examples of computer readable media include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards, and optical data storage devices. The computer readable recording medium can also be distributed over network-coupled computer systems so that the computer readable media are stored and executed in a distributed fashion, e.g., by a telematics server or a Controller Area Network (CAN).
As described above, with the surrounding monitoring system 1 for monitoring the surroundings of the vehicle according to the present disclosure, the driver may adjust the position of the set area A′ using the guide map 500 while checking the relative positional relationship between the original image A acquired by the image acquisition unit 100 and the set area A′ corresponding to the monitoring image being output via the image output unit 300. Accordingly, the driver's convenience may be improved.
In concluding the detailed description, those skilled in the art will appreciate that many variations and modifications can be made to the exemplary embodiments without substantially departing from the principles of the present disclosure. Therefore, the disclosed exemplary embodiments of the disclosure are used in a generic and descriptive sense only and not for purposes of limitation.