This application claims the priority benefit of Korean Patent Application No. 10-2014-0070321, filed on Jun. 10, 2014 in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference.
1. Field
The present disclosure relates to an around view provision apparatus and a vehicle including the same and, more particularly, to an around view provision apparatus that is capable of reducing distortion of an around view image and a vehicle including the same.
2. Background
Around view provision apparatuses and vehicles including the same are known. However, they suffer from various disadvantages.
The embodiments will be described in detail with reference to the accompanying drawings, in which like reference numerals refer to like elements.
Exemplary embodiments of the present disclosure will be described with reference to the attached drawings.
The terms “module” and “unit,” when attached to the names of components, are used herein merely to aid understanding of the components and thus should not be considered as having specific meanings or roles. Accordingly, the terms “module” and “unit” may be used interchangeably.
Broadly, a vehicle is a device that allows a driver to move in a desired direction. A representative example of the vehicle may be a car. In order to improve convenience of a user who uses the vehicle, the vehicle may be equipped with various sensors and electronic devices. In particular, various devices to improve driving convenience of the user have been developed. For example, an image captured by a rear view camera may be provided when moving the vehicle backward or when parking the vehicle.
A vehicle as described in this specification may include a car, a motorcycle, or another appropriate type of vehicle. Hereinafter, a description will be given based on a car merely for the sake of convenience.
It should be appreciated, however, that a vehicle as described in this disclosure may include various types of transportation devices, including but not limited to, a vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, an electric vehicle having an electric motor as a power source, or the like.
Meanwhile, an around view provision apparatus as described in this specification may be an apparatus that includes a plurality of cameras and which combines a plurality of images captured by the cameras to generate an around view image. In particular, the around view provision apparatus may be a vehicle-based apparatus that provides a top view or a bird's eye view. It is one object of the present disclosure to provide an around view provision apparatus that is capable of reducing distortion of an around view image and a vehicle including the same. Hereinafter, a description will be given of various embodiments of an around view provision apparatus according to the present disclosure and a vehicle including the same.
A vehicle 200 may include wheels 103FR, 103FL, 103RL, etc., a steering wheel 150, and a plurality of around view cameras 195a, 195b, 195c, and 195d mounted on the vehicle 200.
When the vehicle moves forward at a predetermined speed or less or when the vehicle moves backward, the around view cameras 195a, 195b, 195c, and 195d may be activated to acquire images. The images acquired by the cameras may be signal-processed by an around view provision apparatus 100.
A plurality of images captured by the around view cameras 195a, 195b, 195c, and 195d may be transmitted to a processor 170 in the vehicle 200.
An around view provision apparatus 100 may include a communication unit 120, an interface unit 130, a memory 140, a processor 170, a display unit 180, an audio interface unit 185, an electric power supply unit 190, and a plurality of around view cameras 195a, 195b, 195c, and 195d, each of which is described below.
The communication unit 120 may exchange data with a mobile terminal 600 or a server 500 in a wireless fashion. In particular, the communication unit 120 may exchange data with a mobile terminal of the driver in a wireless fashion. To this end, various wireless data communication protocols, such as Bluetooth, Wi-Fi, Wi-Fi Direct, and APiX, may be used.
The communication unit 120 may receive weather information and road traffic state information, such as Transport Protocol Expert Group (TPEG) information, from the mobile terminal 600 or the server 500. On the other hand, the communication unit 120 may transmit real-time traffic information acquired by the around view provision apparatus 100 based on images to the mobile terminal 600 or the server 500. When a user gets into the vehicle, a mobile terminal 600 of the user may pair with the around view provision apparatus 100 automatically or by the user executing an application.
The interface unit 130 may receive vehicle-related data or transmit a signal processed or generated by the processor 170 to the outside. To this end, the interface unit 130 may perform data communication with an electronic control unit (ECU) 770, an audio and video navigation (AVN) apparatus 400, and a sensor unit 760 in the vehicle in a wired communication fashion or a wireless communication fashion.
The interface unit 130 may receive map information related to vehicle travel through data communication with the AVN apparatus 400. On the other hand, the interface unit 130 may receive sensor information from the ECU 770 and the sensor unit 760.
The sensor information may include at least one selected from among vehicle heading information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward movement/backward movement information, battery information, fuel information, tire information, vehicle lamp information, in-vehicle temperature information, and in-vehicle humidity information.
Of the above-specified sensor information, the vehicle heading information, the vehicle position information, the vehicle angle information, the vehicle speed information, and the vehicle tilt information, which are related to vehicle travel, may be referred to as vehicle travel information.
The memory 140 may store various data for overall operation of the around view provision apparatus 100, such as programs for processing or control of the processor 170.
The audio interface unit 185 may convert an electric signal received from the processor 170 into an audio signal and output the audio signal. To this end, the audio interface unit 185 may include a speaker. The audio interface unit 185 may output a sound corresponding to an operation of an input unit (not shown), e.g., a button. The audio input unit may detect a user's voice. To this end, the audio input unit may include a microphone. The received voice may be converted into an electric signal, which may be transmitted to the processor 170.
The processor 170 may control overall operation of each unit in the around view provision apparatus 100. In particular, the processor 170 may acquire a plurality of images from the cameras 195a, 195b, 195c, and 195d and combine the acquired images to generate an around view image.
On the other hand, the processor 170 may perform signal processing based on computer vision. For example, the processor 170 may calculate disparity for a view around the vehicle based on the acquired images or the generated around view image, detect an object in the image based on calculated disparity information, and continuously track motion of the object after detection of the object.
In particular, during detection of the object, the processor 170 may perform lane detection, adjacent vehicle detection, pedestrian detection, and road surface detection. In addition, the processor 170 may calculate the distance to the detected adjacent vehicle or the detected pedestrian.
On the other hand, the processor 170 may receive sensor information from the ECU 770 or the sensor unit 760 through the interface unit 130. The sensor information may include at least one of vehicle heading information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward movement/backward movement information, battery information, fuel information, tire information, vehicle lamp information, in-vehicle temperature information, or in-vehicle humidity information.
The display unit 180 may display the around view image generated by the processor 170. During display of the around view image, the display unit 180 may provide various user interfaces. In addition, the display unit 180 may include a touch sensor to sense a touch input to each user interface.
Meanwhile, the display unit 180 may include a cluster or a head up display (HUD) provided at the inside front of the vehicle. In a case in which the display unit 180 is the HUD, the display unit 180 may include a projection module to project an image on the front windshield glass of the vehicle 200.
The electric power supply unit 190 may supply electric power to the respective components under control of the processor 170. In particular, electric power from an in-vehicle battery may be supplied to the electric power supply unit 190.
The cameras 195a, 195b, 195c, and 195d may be cameras to provide an around view image. The cameras 195a, 195b, 195c, and 195d may be wide-angle cameras. Moreover, the around view provision apparatus may include additional cameras. For example, a camera 195e may be an indoor camera mounted in the vehicle to capture a user, specifically the driver. The processor 170 may detect the position of the driver based on an image from the indoor camera, set a region that cannot be observed through a side view mirror or a rear view mirror (e.g., a blind spot) based on the position of the driver, and control at least one of the cameras to operate in a first mode, referred to as a blind spot detection (BSD) mode, in which the camera is moved (e.g., tilted or rotated) to capture that region.
The around view provision apparatus 100 may further include an input unit 110 and an ultrasonic sensor unit 198.
The input unit 110 may include a plurality of buttons attached around the display unit 180 or a touchscreen disposed on the display unit 180. The around view provision apparatus 100 may be powered on through the buttons or the touchscreen such that the around view provision apparatus 100 can be operated. On the other hand, various input operations may be performed through the input unit 110.
The ultrasonic sensor unit 198 may include a plurality of ultrasonic sensors. In a case in which the ultrasonic sensors are mounted in the vehicle, the ultrasonic sensor unit 198 may sense an object around the vehicle based on a difference between transmitted ultrasonic waves and received ultrasonic waves.
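By way of a non-limiting illustration, the distance to a sensed object may be estimated from the round-trip time between a transmitted ultrasonic wave and its received echo. The following Python sketch shows one such time-of-flight calculation; the constant, function name, and example value are assumptions for illustration and are not part of the disclosure.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 degrees C

def ultrasonic_distance_m(round_trip_time_s: float) -> float:
    """Estimate the distance to an object from the round-trip time of an
    ultrasonic pulse; the pulse travels the distance twice, hence the
    division by two."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

# Example: an echo received about 11.7 ms after transmission corresponds
# to an object roughly 2 m away.
print(ultrasonic_distance_m(0.0117))  # ~2.0 m
```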
The processor 170 of the around view provision apparatus 100 may include an image preprocessor 410, a disparity calculator 420, a segmentation unit 432, an object detector 434, an object verification unit 436, and an object tracking unit 440.
The image preprocessor 410 may receive a plurality of images from the cameras 195a, 195b, 195c, and 195d or a generated around view image and preprocess the plurality of images or the generated around view image.
Specifically, the image preprocessor 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, camera gain control, or the like, for the plurality of images or the generated around view image. As a result, the image preprocessor 410 may acquire an image that is more vivid than the plurality of images from the cameras 195a, 195b, 195c, and 195d or the generated around view image.
The disparity calculator 420 may receive the plurality of images or the generated around view image signal-processed by the image preprocessor 410, sequentially perform stereo matching on the received images or the received around view image for a predetermined time, and acquire a disparity map based on the stereo matching. That is, the disparity calculator 420 may acquire disparity information for a view around the vehicle. The stereo matching may be performed on a per-pixel basis or a per-predetermined-block basis of the images. Meanwhile, the disparity map may express the binocular parallax information between the images as numerical values.
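By way of a non-limiting illustration, the following Python sketch performs block-based stereo matching on two sequentially acquired frames using OpenCV; the algorithm choice, file names, and parameters are assumptions for illustration, as the disclosure does not prescribe a particular matching method.

```python
import cv2

# Two sequentially acquired frames, assumed already preprocessed and
# rectified by the image preprocessor (file names are placeholders).
frame_a = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
frame_b = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

# Block matcher: numDisparities must be a multiple of 16; blockSize is the
# side of the matching window (per-block matching, as described above).
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)

# compute() returns fixed-point disparities scaled by 16.
disparity_map = matcher.compute(frame_a, frame_b).astype("float32") / 16.0
```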
The segmentation unit 432 may perform segmentation and clustering in the images based on the disparity information from the disparity calculator 420. Specifically, the segmentation unit 432 may segment at least one of the images into a background and a foreground based on the disparity information.
For example, a region in which the disparity information has a predetermined value or less in the disparity map may be calculated as a background, and the corresponding region may be excluded. As a result, a foreground may be relatively separated from the image. In another example, a region in which the disparity information has a predetermined value or more in the disparity map may be calculated as a foreground, and the corresponding region may be extracted. As a result, the foreground may be separated from the image.
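The threshold-based separation described above may be sketched as follows in Python; the disparity values and threshold are synthetic and purely illustrative.

```python
import numpy as np

# Synthetic disparity map standing in for the output of the disparity
# calculator 420 (values are illustrative).
disparity_map = np.random.rand(240, 320).astype(np.float32) * 64.0

def split_foreground(disparity: np.ndarray, threshold: float):
    """Separate a disparity map into a foreground mask (near objects, high
    disparity) and a background mask (far regions, low disparity)."""
    foreground_mask = disparity >= threshold
    return foreground_mask, ~foreground_mask

# Regions below the threshold are treated as background and excluded,
# relatively separating the foreground from the image.
fg_mask, bg_mask = split_foreground(disparity_map, threshold=8.0)
```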
As described above, the image may be segmented into the background and the foreground based on the disparity information extracted based on the image. Therefore, signal processing speed and signal processing amount may be reduced during detection of an object.
The object detector 434 may detect an object based on the image segment from the segmentation unit 432. That is, the object detector 434 may detect an object for at least one of the images based on the disparity information. For example, the object detector 434 may detect an object from a foreground separated from the image by the image segment.
Subsequently, the object verification unit 436 may classify and verify the separated object. To this end, the object verification unit 436 may use a recognition method using a neural network, a support vector machine (SVM) method, a recognition method based on AdaBoost using a Haar-like feature, a histogram of oriented gradients (HOG) method, or another appropriate technique.
On the other hand, the object verification unit 436 may compare the detected object with objects stored in the memory 140 to verify the detected object. For example, the object verification unit 436 may verify an adjacent vehicle, a lane, a road surface, a traffic sign, a dangerous zone, a tunnel, etc. located around the vehicle.
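As one concrete instance of the recognition methods named above, the following Python sketch verifies pedestrians using OpenCV's HOG descriptor with its pretrained linear SVM; the file name and detection parameters are illustrative assumptions.

```python
import cv2

# HOG descriptor combined with OpenCV's pretrained pedestrian SVM
# (a HOG + SVM recognition method, as named above).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

image = cv2.imread("around_view_frame.png")  # placeholder file name
rects, weights = hog.detectMultiScale(image, winStride=(8, 8), scale=1.05)

# Draw a box around each verified pedestrian candidate.
for (x, y, w, h) in rects:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
```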
The object tracking unit 440 may track the verified object. For example, the object tracking unit 440 may verify an object in images which are sequentially acquired, calculate motion or a motion vector of the verified object, and track movement of the object based on the calculated motion or the calculated motion vector. Consequently, the object tracking unit 440 may track an adjacent vehicle, a lane, a road surface, a traffic sign, a dangerous zone, etc. located around the vehicle.
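By way of a non-limiting illustration, a motion vector of a verified object may be computed from the centroids of its bounding boxes in two sequentially acquired images, as in the following Python sketch; the box format and example values are assumptions.

```python
import numpy as np

def motion_vector(prev_box, cur_box):
    """Compute the motion vector of a verified object from the centroids
    of its (x, y, width, height) bounding boxes in two sequential images."""
    (px, py, pw, ph), (cx, cy, cw, ch) = prev_box, cur_box
    prev_center = np.array([px + pw / 2.0, py + ph / 2.0])
    cur_center = np.array([cx + cw / 2.0, cy + ch / 2.0])
    return cur_center - prev_center

# Example: between two frames the object moved 6 px right and 2 px down.
print(motion_vector((100, 80, 40, 40), (106, 82, 40, 40)))  # [6. 2.]
```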
The object detector 434 may receive a plurality of images or a generated around view image and detect an object in the plural images or the generated around view image. In this case, the object detector 434 may detect the object directly from the images or the around view image, rather than from an image segmented based on the disparity information.
Subsequently, the object verification unit 436 may classify and verify the detected and separated object based on the image segment from the segmentation unit 432 and the object detected by the object detector 434. To this end, the object verification unit 436 may use a recognition method using a neural network, an SVM method, a recognition method based on AdaBoost using a Haar-like feature, a HOG method, or the like.
The disparity calculator 420 of the processor 170 may receive the images FR1a and FR1b signal-processed by the image preprocessor 410 and may perform stereo matching for the received images FR1a and FR1b to acquire a disparity map 520. The disparity map 520 may show a disparity between the images FR1a and FR1b as levels. When a disparity level is high, the distance to the vehicle may be calculated as being short. When a disparity level is low, on the other hand, the distance to the vehicle may be calculated as being long.
Meanwhile, when the disparity map is displayed, the disparity map may be displayed with higher brightness when the disparity level is higher and the disparity map may be displayed with lower brightness when the disparity level is lower.
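The brightness mapping described above may be sketched as follows in Python, linearly mapping the disparity range to 8-bit brightness so that a higher disparity level (a nearer object) appears brighter; the input values are synthetic.

```python
import cv2
import numpy as np

# Synthetic disparity map (placeholder for the output of the disparity
# calculator 420).
disparity_map = np.random.rand(240, 320).astype(np.float32) * 64.0

# Linearly map the disparity range to 0..255 so that a higher disparity
# level is displayed with higher brightness, as described above.
display = cv2.normalize(disparity_map, None, 0, 255, cv2.NORM_MINMAX)
cv2.imwrite("disparity_view.png", display.astype(np.uint8))
```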
The segmentation unit 432, the object detector 434, and the object verification unit 436 may respectively perform segmentation, object detection, and object verification for at least one of the images FR1a and FR1b based on the disparity map 520. Moreover, object detection and object verification for the second image FR1b may be performed using the disparity map 520. That is, object detection and object verification for first to fourth lanes 538a, 538b, 538c, and 538d, a construction zone 532, a first preceding vehicle 534, and a second preceding vehicle 536 in an image 530 may be performed.
Meanwhile, images may be continuously acquired and the object tracking unit 440 may track verified objects.
The vehicle 200 may include an electronic control apparatus 700 for vehicle control. The electronic control apparatus 700 may exchange data with the AVN apparatus 400.
The electronic control apparatus 700 may include an input unit 710, a communication unit 720, a memory 740, a lamp drive unit 751, a steering drive unit 752, a brake drive unit 753, a power source drive unit 754, a sunroof drive unit 755, a suspension drive unit 756, an air conditioning drive unit 757, a window drive unit 758, an airbag drive unit 759, a sensor unit 760, an ECU 770, a display unit 780, an audio output unit 785, an electric power supply unit 790, and a plurality of cameras 795.
Meanwhile, the ECU 770 may include a processor. Alternatively, an additional processor to signal-process images from the cameras may be provided in addition to the ECU 770. The input unit 710 may include a plurality of buttons or a touchscreen provided in the vehicle 200. Various input operations may be performed through the buttons or the touchscreen.
The communication unit 720 may exchange data with the mobile terminal 600 or the server 500 in a wireless fashion. In particular, the communication unit 720 may exchange data with a mobile terminal of the driver in a wireless fashion. To this end, various wireless data communication protocols, such as Bluetooth, Wi-Fi, Wi-Fi Direct, and APiX, may be used.
In one example, the communication unit 720 may receive weather information and road traffic state information, such as TPEG information, from the mobile terminal 600 or the server 500. When a user gets into the vehicle, a mobile terminal 600 of the user may pair with the electronic control apparatus 700 automatically or by the user executing an application.
The memory 740 may store various data for overall operation of the electronic control apparatus 700, such as programs for processing or control of the ECU 770.
The lamp drive unit 751 may control lamps provided inside and outside the vehicle to be turned on or off. In addition, the lamp drive unit 751 may control the intensity, direction, etc. of light emitted from each lamp. For example, the lamp drive unit 751 may control a direction indicating lamp, a brake lamp, etc.
The steering drive unit 752 may electronically control a steering apparatus (not shown) in the vehicle 200. Consequently, the steering drive unit 752 may change a heading of the vehicle.
The brake drive unit 753 may electronically control a brake apparatus in the vehicle 200. For example, the brake drive unit 753 may control an operation of a brake mounted at each wheel to reduce speed of the vehicle 200. In another example, the brake drive unit 753 may differently control operations of brakes mounted at left wheels and right wheels to adjust the heading of the vehicle 200 to the left or the right.
The power source drive unit 754 may electronically control a power source in the vehicle 200. For example, when the power source is an engine using fossil fuel, the power source drive unit 754 may electronically control the engine. Consequently, the power source drive unit 754 may control output torque of the engine. In another example, when the power source is an electric motor, the power source drive unit 754 may control the motor. Consequently, the power source drive unit 754 may control rotational speed and torque of the motor.
The sunroof drive unit 755 may electronically control a sunroof apparatus in the vehicle 200. For example, the sunroof drive unit 755 may control a sunroof to be opened or closed.
The suspension drive unit 756 may electronically control a suspension apparatus in the vehicle 200. For example, when a road surface is uneven, the suspension drive unit 756 may control the suspension apparatus to reduce vibration of the vehicle 200.
The air conditioning drive unit 757 may electronically control an air conditioner in the vehicle 200. For example, when the internal temperature of the vehicle is high, the air conditioning drive unit 757 may control the air conditioner to supply cool air into the vehicle.
The window drive unit 758 may electronically control a window apparatus in the vehicle 200. For example, the window drive unit 758 may control left and right side windows of the vehicle to be opened or closed.
The airbag drive unit 759 may electronically control an airbag apparatus in the vehicle 200. For example, the airbag drive unit 759 may control an airbag to deploy in a dangerous situation.
The sensor unit 760 may sense a signal related to travel of the vehicle 200. To this end, the sensor unit 760 may include a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward movement/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body tilt sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor, an in-vehicle temperature sensor, an in-vehicle humidity sensor or another appropriate type of sensor.
Consequently, the sensor unit 760 may acquire a sensing signal for vehicle heading information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward movement/backward movement information, battery information, fuel information, tire information, vehicle lamp information, in-vehicle temperature information, in-vehicle humidity information, and the like. In addition, the sensor unit 760 may further include an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a top dead center (TDC) sensor, and a crank angle sensor (CAS).
The ECU 770 may control overall operation of each unit in the electronic control apparatus 700. The ECU 770 may perform a specific operation based on an input through the input unit 710, receive and transmit a signal sensed by the sensor unit 760 to the around view provision apparatus 100, receive map information from the AVN apparatus 400, or control operations of the respective drive units 751, 752, 753, 754, and 756. In addition, the ECU 770 may receive weather information and road traffic state information, such as TPEG information, from the communication unit 720.
On the other hand, the ECU 770 may combine a plurality of images received from the plurality of cameras 795 to generate an around view image. In particular, when the vehicle moves forward at a predetermined speed or less or when the vehicle moves backward, the ECU 770 may generate an around view image. The display unit 780 may display the generated around view image. In particular, the display unit 780 may provide various user interfaces in addition to the around view image.
In order to display the around view image, etc., the display unit 780 may include a cluster or an HUD provided at the inside front of the vehicle. In a case in which the display unit 780 is the HUD, the display unit 780 may include a projection module to project an image on the front windshield glass of the vehicle 200. Meanwhile, the display unit 780 may include a touchscreen to allow input by tapping on the screen.
The audio unit 785 may convert an electric signal received from the ECU 770 into an audio signal and output the audio signal. To this end, the audio unit 785 may include a speaker. The audio unit 785 may output a sound corresponding to an operation of the input unit 710, e.g. a button. The audio unit 785 may also include a microphone to receive sound. Hence, the audio unit 785 may include an audio input unit and an audio output unit.
The electric power supply unit 790 may supply electric power to the respective components under control of the ECU 770. In particular, electric power from an in-vehicle battery may be supplied to the electric power supply unit 790.
The plurality of cameras 795 may be used to provide an around view image. To this end, the plurality of cameras 795 may include four cameras, like the around view cameras 195a, 195b, 195c, and 195d described above.
Meanwhile, when generating an around view image using images captured by the around view cameras 195a, 195b, 195c, and 195d, it may be necessary to provide an around view image having correct magnifying power without distortion.
However, a partial region of the around view image may be distorted depending upon the disposition of the around view cameras 195a, 195b, 195c, and 195d in the vehicle. The present disclosure proposes a method of reducing such distortion, which will hereinafter be described.
First, the around view cameras 195a, 195b, 195c, and 195d mounted at the vehicle 200 may capture images around the vehicle, in Step S710.
In particular, the left side view camera 195a and the right side view camera 195c may be disposed in the case surrounding the left side view mirror and the case surrounding the right side view mirror, respectively. On the other hand, the rear view camera 195b and the front view camera 195d may be disposed, for example, around the trunk switch and at the emblem or around the emblem, respectively. The images captured by the around view cameras 195a, 195b, 195c, and 195d may be transmitted to the processor 170 in the vehicle 200.
Subsequently, the processor 170 of the around view provision apparatus 100 may project the received images on a first solid projection surface to generate an around view image and may project a partial region of the around view image on a second solid projection surface, to generate a partially calibrated around view image, in Step S720. The first and second solid projection surfaces may be virtual surfaces having prescribed contours onto which the captured images may be projected.
Subsequently, the processor 170 of the around view provision apparatus 100 may control the generated around view image to be displayed on the display unit 180, in Step S730. The processor 170 may perform signal processing to a top view or a bird's eye view based on the vehicle using the images captured by the around view cameras 195a, 195b, 195c, and 195d.
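The disclosure does not prescribe a particular projection algorithm for Step S720. As a non-limiting sketch, each camera image may be warped onto a common top-view plane with a per-camera homography obtained from calibration, as in the following Python example; the homography values, file name, and canvas size are placeholders.

```python
import cv2
import numpy as np

# Hypothetical homography mapping the front camera image onto the common
# top-view ground plane; in practice this comes from camera calibration.
H_front = np.eye(3, dtype=np.float32)  # placeholder values

frame = cv2.imread("front_camera.png")  # placeholder file name
canvas_size = (600, 800)  # (width, height) of the composite, illustrative

# Warp one camera image into the top-view canvas; repeating this for all
# four cameras and blending the overlaps yields the around view image.
warped_front = cv2.warpPerspective(frame, H_front, canvas_size)
```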
In particular, the generated around view image 810 may include a vehicle image 203 and an image boundary region 815 surrounding the vehicle image 203.
The size of the image boundary region 815 may be much smaller than that of the vehicle image 203. As a result, the acquired around view image 810 displayed through the display unit 180 may appear to have unnatural perspective to a driver or a user viewing the around view image 810. For this reason, it may be necessary for the processor 170 to control the above regions to be scaled with different magnifying power using a second solid projection surface having a different contour from the first solid projection surface.
Consequently, the processor 170 may scale the lower region 907 of the conical oval solid projection surface 900 with magnifying power different from that used in scaling of the oval solid projection surface 800. That is, the processor 170 may scale up, rather than scale down, the lower region 907. As a result, it is possible to generate an around view image with reduced distortion, a partial region of which appears more realistic and more closely represents the actual objects. In other words, it may be possible to generate an around view image in which the outer region distant from the vehicle image is not distorted.
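As a rough analogue of projecting the outer region on a differently contoured surface, the following Python sketch scales up the region outside a given radius by a piecewise radial remapping; the radius, gain, and synthetic frame are assumptions, and the disclosure expresses this operation in terms of projection surfaces rather than an explicit remap.

```python
import cv2
import numpy as np

def rescale_outer_region(image, inner_radius, outer_gain):
    """Scale up the region outside inner_radius (measured from the image
    center) by sampling it with a reduced radial step, leaving the inner
    region unchanged."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.indices((h, w), dtype=np.float32)
    dx, dy = xs - cx, ys - cy
    r = np.hypot(dx, dy)
    # Outside the inner radius, sample closer to the center, which scales
    # up the outer content in the output image.
    r_src = np.where(r > inner_radius,
                     inner_radius + (r - inner_radius) / outer_gain, r)
    scale = r_src / np.maximum(r, 1e-6)
    map_x = (cx + dx * scale).astype(np.float32)
    map_y = (cy + dy * scale).astype(np.float32)
    return cv2.remap(image, map_x, map_y, cv2.INTER_LINEAR)

# Example with a synthetic frame; in the apparatus the input would be the
# around view image generated on the first solid projection surface.
frame = np.full((400, 400, 3), 128, np.uint8)
calibrated = rescale_outer_region(frame, inner_radius=120.0, outer_gain=1.5)
```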
Meanwhile, upon receiving a selection input to select a predetermined region of the around view image displayed on the display unit 180, the processor 170 may project the region of the around view image corresponding to the selection input on the second solid projection surface to generate a partially calibrated around view image.
FIG. 9C(a) shows, by way of example, that, when the around view image 810 before calibration is displayed on the display unit 180, the outer region 802 of the around view image 810 before calibration may be selected by a touch input. When the display unit 180 is a touchscreen or a touch display device to sense a touch input, the processor 170 may change scaling of the corresponding region in response to the sensed touch input.
That is, the processor 170 may control the corresponding region to be projected on the second solid projection surface. In addition, the processor 170 may project the selected region on the second solid projection surface to generate a partially calibrated around view image.
FIG. 9C(b) shows, by way of example, that the around view image 810 after calibration may be displayed on the display unit 180. As shown in FIG. 9C(b), an around view image, a desired region of which has been calibrated, may be generated and displayed. As a result, user convenience may be improved.
Meanwhile, the processor 170 may control the partially calibrated around view image to be displayed on the display unit 180. At this time, the processor 170 may control the partial region to be displayed in a highlighted fashion or an object indicating the partial region to be displayed in an overlapping fashion on the partially calibrated around view image displayed through the display unit 180.
When the region 1015 positioned at a lower height than the left side view camera 195a is projected on the first oval solid projection surface 1005, the region 1015 may be distorted. In this case, the processor 170 may project the region 1015 on a second solid projection surface 1025 to reduce such distortion.
Here, a convex region of the first solid projection surface 1005 may be different in direction from a convex region of the second solid projection surface 1025. In this way, the processor 170 may generate a partially calibrated around view image using different curvatures of the first solid projection surface and the second solid projection surface, different directions of the convex region of the first solid projection surface and the convex region of the second solid projection surface, or a combination thereof.
In a case in which the first to fourth cameras 195a, 195b, 195c, and 195d mounted at the vehicle are positioned relatively high, the size of a partial region before calibration is relatively reduced by the first solid projection surface. In order to calibrate such reduction, therefore, the processor 170 may set the size of the partial region of the around view image to be increased. As a result, distortion of the partial region of the around view image may be reduced.
Moreover, when a specific object is contained in the around view image generated based on the first solid projection surface, the processor 170 may project a region corresponding to the specific object on the second solid projection surface to generate a partially calibrated around view image.
Moreover, the processor 170 may change at least one of a size or a position of at least a partial region of the second solid projection surface in response to a travel direction of the vehicle, which will hereinafter be described in detail.
In this case, the processor 170 may change at least one of the size or the position of at least a partial region of the second solid projection surface in response to the travel direction of the vehicle. That is, at a first point in time T1, the processor 170 may project the respective images from the cameras 195a, 195b, 195c, and 195d on the upper region, e.g., the first region 806, of the oval solid projection surface 800, while partial regions of the respective images are projected on the lower region, e.g., the second region 907, of the conical oval solid projection surface 900.
At a second point in time T2, the vehicle may have moved in the right side rear direction. Consequently, the processor 170 may project the respective images from the cameras 195a, 195b, 195c, and 195d on a right side upper region 1206 of the oval solid projection surface 800, generated according to movement of a curved line 1205 to the right, while partial regions of the respective images are projected on a left side lower region 1217 of the conical oval solid projection surface 900. As a result, it may be possible to generate an around view image with reduced distortion corresponding to the travel direction of the vehicle.
At a third point in time T3, the processor 170 may project the respective images on the upper region, e.g., the first region 806, of the oval solid projection surface 800, while partial regions of the respective images are projected on the lower region, e.g., the second region 907, of the conical oval solid projection surface 900, in the same manner as at the first point in time T1.
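By way of a non-limiting illustration, the shift of the specially projected lower region with the travel direction may be sketched as a simple heuristic in Python; the mapping from heading to a pixel offset, and all names and values, are assumptions.

```python
def lower_region_offset_px(heading_deg: float, moving_backward: bool,
                           max_offset_px: int = 80) -> int:
    """Return a horizontal offset for the lower projection region of the
    second solid projection surface. When the vehicle moves toward the
    right side rear, the region shifts to the left side, and vice versa
    (illustrative heuristic only)."""
    turn = max(-1.0, min(1.0, heading_deg / 45.0))  # clamp to [-1, 1]
    direction = -1 if moving_backward else 1
    return int(direction * turn * max_offset_px)

# Example: reversing with the wheel turned right shifts the region left.
print(lower_region_offset_px(heading_deg=30.0, moving_backward=True))  # -53
```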
Meanwhile, the processor 170 may change at least one of a tilt or an angle of at least one of the first solid projection surface or the second solid projection surface in response to the tilt of the vehicle, which will hereinafter be described in detail.
For example, the processor 170 may select a second solid projection surface among a plurality of second solid projection surfaces based on a detected inclination of the vehicle or a road surface. The around view image may include a plurality of regions that correspond to an object having a prescribed inclination, and each of the plurality of regions of the around view image may be generated using a different solid projection surface. Accordingly, various tilts or angles of inclination of the vehicle or the road surface may be accounted for to more accurately remove the distortion caused by the inclination.
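The selection among a plurality of second solid projection surfaces may be sketched as follows in Python, picking the candidate whose compensating inclination best matches the sensed tilt; the candidate set, keys, and matching rule are assumptions for illustration.

```python
def select_surface_for_tilt(sensed_tilt_deg: float, candidates: dict):
    """Select, from candidate second solid projection surfaces keyed by the
    inclination (in degrees) they compensate, the one closest to the
    sensed tilt of the vehicle or road surface."""
    return min(candidates.items(),
               key=lambda item: abs(item[0] - sensed_tilt_deg))[1]

# Hypothetical candidate surfaces keyed by compensated inclination.
surfaces = {0.0: "flat_surface", 5.0: "uphill_surface", -5.0: "downhill_surface"}
print(select_surface_for_tilt(4.2, surfaces))  # uphill_surface
```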
The vehicle 200 may include a plurality of ultrasonic sensors 198a to 198j to sense approach of a specific object. The first to third ultrasonic sensors 198a, 198b, and 198c may be attached to the front end of the vehicle, the fourth and fifth ultrasonic sensors 198d and 198e may be attached to opposite sides of the front part of the vehicle, the sixth and seventh ultrasonic sensors 198f and 198g may be attached to opposite sides of the rear part of the vehicle, and the eighth to tenth ultrasonic sensors 198h, 198i, and 198j may be attached to the rear end of the vehicle. It should be appreciated that the present disclosure is not limited to this number and these positions of the ultrasonic sensors; more or fewer ultrasonic sensors may be positioned at various locations on the vehicle.
In one example, a specific object 1100 located at the left side rear of the vehicle may be sensed by the seventh ultrasonic sensor 198g. Upon determining through the seventh ultrasonic sensor 198g that the specific object 1100 is within a predetermined distance from the vehicle, the processor 170 may control the specific object 1100 to be projected on the second solid projection surface, not the first solid projection surface.
Here, once the specific object 1100 is detected by the ultrasonic sensor 198g to be within a prescribed distance of the vehicle, the processor 170 may process the images captured by the left side view camera 195a to reduce or remove distortion.
That is, when the region 1015 positioned at a lower height than the left side view camera 195a is projected on the first solid projection surface 1005, the region 1015 may be distorted. In this case, the processor 170 may project the region containing the specific object 1100 on the second solid projection surface, thereby generating a partially calibrated around view image in which the object appears with reduced distortion.
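Tying the ultrasonic reading to the choice of projection surface may be sketched as follows in Python; the threshold and return labels are illustrative assumptions.

```python
def choose_projection_surface(object_distance_m: float,
                              near_threshold_m: float = 1.5) -> str:
    """Select which solid projection surface the region containing a sensed
    object should be projected on: the second surface for nearby objects
    (to reduce distortion), the first surface otherwise."""
    if object_distance_m < near_threshold_m:
        return "second_solid_projection_surface"
    return "first_solid_projection_surface"

# Example: an object sensed 1.2 m to the left rear triggers reprojection
# of the corresponding region on the second solid projection surface.
print(choose_projection_surface(1.2))  # second_solid_projection_surface
```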
Meanwhile, the operation method of the around view provision apparatus and the vehicle according to the present disclosure may be implemented as code that can be written on a processor-readable recording medium and thus read by a processor provided in the around view provision apparatus or the vehicle. The processor-readable recording medium may be any type of recording device in which data may be stored in a processor-readable manner. The processor-readable recording medium may include, for example, a read only memory (ROM), a random access memory (RAM), a compact disc read only memory (CD-ROM), a magnetic tape, a floppy disc, and an optical data storage device and may be implemented in the form of a carrier wave transmitted over the Internet. The processor-readable recording medium can be distributed over a plurality of computer systems connected to a network such that processor-readable code is written thereto and executed therefrom in a decentralized manner.
As broadly described and embodied herein, an around view provision apparatus that is capable of reducing distortion of an around view image, and a vehicle including the same, are provided.
In accordance with an aspect of the present disclosure, the above and other objects may be accomplished by the provision of an around view provision apparatus including first to fourth cameras mounted at a vehicle and a processor to project images from the first to fourth cameras on a first solid projection surface so as to generate an around view image and to project a partial region of the around view image on a second solid projection surface different from the first solid projection surface so as to generate a partially calibrated around view image.
Meanwhile, the around view provision apparatus and the vehicle may each further include at least one ultrasonic sensor mounted at the vehicle, wherein, when a specific object approaches within a predetermined distance from the vehicle based on the at least one ultrasonic sensor, the processor may project a region containing the object of the around view image on the second solid projection surface to generate the partially calibrated around view image. Consequently, it may be possible to generate a more accurate around view image.
On the other hand, the processor may project a region of the around view image corresponding to a selection input on the second solid projection surface to generate the partially calibrated around view image, thereby improving user convenience.
In accordance with another aspect of the present disclosure, an around view provision apparatus may include first to fourth cameras mounted at a vehicle and a processor to generate an around view image based on images from the first to fourth cameras using a first scaling factor corresponding to a first solid shape and to generate a partially calibrated around view image based on a partial region of the around view image using a scaling factor corresponding to a second solid shape different from the first solid shape.
In accordance with another aspect of the present disclosure, a vehicle may include a steering drive unit to drive a steering apparatus, a brake drive unit to drive a brake apparatus, a power source drive unit to drive a power source, first to fourth cameras mounted at the vehicle, and a processor to project images from the first to fourth cameras on a first solid projection surface so as to generate an around view image and to project a partial region of the around view image on a second solid projection surface different from the first solid projection surface so as to generate a partially calibrated around view image.
Any reference in this specification to “one embodiment,” “an embodiment,” “example embodiment,” etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature, structure, or characteristic in connection with other ones of the embodiments.
Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.