This application is based on and claims priority under 35 U.S.C. § 119 to Japanese Patent Application 2018-063507, filed on Mar. 29, 2018, the entire contents of which are incorporated herein by reference.
This disclosure relates to a surrounding monitoring device.
JP 2017-123627A (Reference 1) discloses a technique of displaying images imaged by imaging devices installed in a vehicle to a driver or the like after performing image processing such as viewpoint conversion and synthesis.
However, in the related art, a center of an imaging range of the imaging device installed in the vehicle deviates from a center of an image displayed to the driver or the like. Accordingly, distortion occurs in the display image due to viewpoint conversion, which might bring a sense of discomfort to the driver or the like who views the display image.
As an example, a surrounding monitoring device according to an embodiment of this disclosure includes a plurality of imaging devices attached to a vehicle in order to generate a display image corresponding to one viewpoint. Centerlines of respective lenses of the plurality of imaging devices intersect at the viewpoint.
The foregoing and additional features and characteristics of this disclosure will become more apparent from the following detailed description considered with reference to the accompanying drawings, wherein:
More specifically, the imaging device 2a is a front camera which images a front of the vehicle 1, the imaging device 2b is a right side camera which images a right side of the vehicle 1, the imaging device 2c is a left side camera which images a left side of the vehicle 1, and the imaging device 2d is a back camera (rear camera) which images a back of the vehicle 1.
The installation positions and installation angles of the imaging devices 2a to 2c are determined such that centerlines 20a to 20c of lenses of the imaging devices 2a to 2c intersect at a point P. Hereinafter, the imaging devices 2a to 2c are referred to as an imaging device 2 unless particularly distinguished. The centerlines 20a to 20c are referred to as a centerline 20 unless particularly distinguished. The centerline 20 is also a centerline of an imaging range of the imaging device 2.
The point P is an imaginary viewpoint of a display image generated from images imaged by the imaging devices 2a to 2c. Hereinafter, the point P is referred to as an imaginary viewpoint P. The imaginary viewpoint P is a position that is a viewpoint of a user when the user views the display image. In this embodiment, the position of the imaginary viewpoint P is determined in advance, and the imaging devices 2a to 2c are attached to the vehicle 1 in order to generate the display image corresponding to the imaginary viewpoint P.
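The geometric constraint described above can be sketched in a few lines. The following is a minimal illustration, not part of the disclosure: the camera positions, the viewpoint coordinates, and all names are hypothetical assumptions, and the computation is reduced to two dimensions (yaw only).

```python
import math

# Minimal 2D sketch (yaw only): given each camera's mounting position on
# the vehicle and a predetermined imaginary viewpoint P, compute the yaw
# angle that makes the lens centerline pass through P. All coordinates
# and names are illustrative assumptions, not values from the disclosure.

def aim_at_viewpoint(camera_pos, viewpoint):
    """Yaw angle (radians) pointing the lens centerline at the viewpoint."""
    dx = viewpoint[0] - camera_pos[0]
    dy = viewpoint[1] - camera_pos[1]
    return math.atan2(dy, dx)

# Example layout in meters (vehicle frame, x to the right, y forward):
# front, right-side, and left-side cameras, with P near the driver seat.
P = (0.0, -1.0)
cameras = {"front": (0.0, 2.0), "right": (0.9, 0.5), "left": (-0.9, 0.5)}
angles = {name: aim_at_viewpoint(pos, P) for name, pos in cameras.items()}
# By construction, all three centerlines intersect at P.
```

Installing the cameras at angles determined this way is what allows the later magnification and synthesis steps to avoid significant viewpoint conversion.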
The display image displays a range corresponding to a visual field of the user when the user views the surroundings of the vehicle from the position of the imaginary viewpoint P. Details of the display image are described below. The user is, for example, a driver of the vehicle 1.
The image processing unit 101 performs image processing such as magnification and synthesis of a plurality of images imaged by the imaging devices 2a to 2d associated with the imaginary viewpoint P to generate the display image viewed from the imaginary viewpoint P. The display control unit 102 outputs the display image generated by the image processing unit 101 to a display device to be described below.
In this embodiment, the imaging devices 2a to 2c alone may be used as a surrounding monitoring device, or the imaging devices 2a to 2c together with the information processing device 100 may be used as a surrounding monitoring device.
Next, imaging ranges of the imaging devices 2a to 2c are described in more detail.
A diagram in a middle of
As illustrated in
As also illustrated in
Since the center of the imaging range of the imaging device 2 is located at the center of the display range of the display image, the image can be fitted to the visual field of the user 3 simply by magnifying it according to the distance L, without significant viewpoint conversion. Therefore, distortion is unlikely to occur in an object (subject) included in the imaged image. For example, in the image after magnification in
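As one hedged reading of the magnification processing described above, the image can be uniformly rescaled about its center. The sketch below uses nearest-neighbor sampling on a plain list-of-rows grayscale image; the scale factor, which in practice would be derived from the distance L, is left as a hypothetical parameter.

```python
def magnify(image, scale):
    """Uniformly rescale a grayscale image (a list of rows) about its
    center using nearest-neighbor sampling. A stand-in sketch for the
    magnification processing of the image processing unit 101; in
    practice the scale factor would be derived from the distance L
    (that derivation is not reproduced here)."""
    h, w = len(image), len(image[0])
    cy, cx = (h - 1) / 2, (w - 1) / 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Sample the source pixel that maps to (x, y) after scaling.
            sy = min(max(int(round(cy + (y - cy) / scale)), 0), h - 1)
            sx = min(max(int(round(cx + (x - cx) / scale)), 0), w - 1)
            out[y][x] = image[sy][sx]
    return out
```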
In addition to the magnification processing of the imaged image, the image processing unit 101 synthesizes the imaged images of the imaging devices 2a to 2c into a series of display images which display the left side, the front, and the right side of the vehicle 1. Since the imaging devices 2a to 2c are installed such that the centerlines 20a to 20c intersect at the imaginary viewpoint P, the image processing unit 101 does not need significant viewpoint conversion processing when the imaged images of the imaging devices 2a to 2c are synthesized. Therefore, the image processing unit 101 can reduce distortion of the display image.
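A minimal sketch of the synthesis step, under the simplifying assumption that the three magnified images already share the same height and meet exactly at their vertical boundaries (a real implementation would blend or warp near the seams):

```python
def synthesize(left, front, right):
    """Join the magnified left, front, and right camera images side by
    side into one display image (each image is a list of pixel rows).
    Assumes equal heights and exact alignment at the boundaries; this
    is an illustrative sketch, not the disclosed implementation."""
    return [l + f + r for l, f, r in zip(left, front, right)]
```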
The above magnification processing and synthesis processing are examples of the image processing, and the image processing unit 101 may further perform other image processing. The image processing unit 101 also performs image processing such as magnification on the image of the back of the vehicle 1 imaged by the imaging device 2d, which is the back camera.
Next, the display image is described.
The first display image 22 includes a front camera reflection obtained by magnifying the image imaged by the imaging device 2a, a right side camera reflection obtained by magnifying the image imaged by the imaging device 2b, and a left side camera reflection obtained by magnifying the image imaged by the imaging device 2c, and is synthesized such that the images are connected at the boundary of each reflection. The first display image 22 is also referred to as an image which displays the visual field of the user 3 when the user 3 views the surroundings of the vehicle 1 from the position of the imaginary viewpoint P. Since the shapes of persons 5a, 5b included in the first display image 22 are not distorted, the user 3 can grasp the situation of the surroundings of the vehicle 1 from the first display image 22 without a sense of discomfort.
The first display image 22 and the second display image 23 illustrated in
In contrast with the configuration of this embodiment, at an installation position of a general imaging device in the related art, the center of the imaging range of the imaging device installed in the vehicle does not coincide with the center of the image displayed to the driver or the like, so distortion might occur in the display image due to viewpoint conversion.
In contrast, the surrounding monitoring device of this embodiment includes a plurality of imaging devices 2a to 2c attached to the vehicle 1 in order to generate the first display image 22 corresponding to the imaginary viewpoint P, and the positions and angles of the plurality of imaging devices 2a to 2c are determined such that the centerlines 20a to 20c of the respective lenses of the plurality of imaging devices 2a to 2c intersect at the imaginary viewpoint P. Therefore, according to the surrounding monitoring device of this embodiment, it is possible to capture images from which the first display image 22 can be generated with little distortion.
Furthermore, the surrounding monitoring device of this embodiment synthesizes a plurality of images imaged by the plurality of imaging devices 2a to 2c to generate the first display image 22 viewed from the imaginary viewpoint P, and displays the first display image 22 on the display device 12. Therefore, according to the surrounding monitoring device of this embodiment, since the situation in a plurality of directions of the surroundings of the vehicle 1 to be confirmed by the driver (user 3) is displayed in the first display image 22 with little distortion, the situation in the plurality of directions surrounding the vehicle 1 can be displayed on one screen while reducing a sense of discomfort of the user 3 with the display image. For example, the user 3 can easily grasp the situations of the front, the right side, and the left side of the vehicle 1 imaged by the front camera and the left and right side cameras from the first display image 22.
The image processing unit 101 may perform image recognition processing of recognizing a predetermined object included in images from the imaged images of the imaging devices 2a to 2c or from the first display image 22 after image processing. The predetermined object may be determined in advance, or may be specified by the user 3 or the like. The display control unit 102 may display the recognition result from the image processing unit 101 on the display device 12. For example, the display control unit 102 superimposes and displays a figure or the like illustrating a result of recognizing the predetermined object by the image processing unit 101 on the first display image 22. As described above, distortion in the first display image 22 is small. Therefore, according to the surrounding monitoring device of this embodiment, when the recognition result of the predetermined object included in the imaged image is displayed, it is possible to reduce a sense of discomfort of the user 3 with the display of the recognition result and the occurrence of misunderstanding.
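The superimposed display of a recognition result can be pictured with a small sketch. The rectangle-drawing helper below is hypothetical and stands in for whatever figure the display control unit 102 actually renders; the image format (a list of grayscale pixel rows) is also an assumption.

```python
def overlay_box(image, top, left, bottom, right, value=255):
    """Return a copy of a grayscale image with a rectangular frame
    superimposed, marking a recognized object. A hypothetical stand-in
    for the figure the display control unit 102 superimposes on the
    first display image 22."""
    out = [row[:] for row in image]  # leave the input image untouched
    for x in range(left, right + 1):
        out[top][x] = value
        out[bottom][x] = value
    for y in range(top, bottom + 1):
        out[y][left] = value
        out[y][right] = value
    return out
```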
In addition to the imaging devices 2a to 2d, the vehicle 1 of this embodiment may include a distance measuring unit such as a sonar (sonar sensor or ultrasonic detector) or the like that emits an ultrasonic wave and captures a reflected wave thereof. The information processing device 100 or another ECU may detect an obstacle or the like based on a detection result of the surroundings of the vehicle 1 by the distance measuring unit and the imaged image, and output a warning or the like to the display device 12.
In this embodiment, the surrounding monitoring device includes the imaging devices 2a to 2c, but this disclosure is not limited thereto, and the surrounding monitoring device may include the imaging devices 2a to 2d. The position of the imaginary viewpoint P illustrated in
In the first embodiment described above, the position of the imaginary viewpoint P and the imaging directions of the imaging devices 2a to 2c are fixed at predetermined positions. In a second embodiment, imaging directions of the imaging devices 2a to 2c are variable along with movement of the imaginary viewpoint P.
The image processing unit 101 and the display control unit 102 have the same functions as those in the first embodiment.
The reception unit 103 receives an operation of instructing movement of a position of the imaginary viewpoint P by a user 3. For example, the display device 12 includes a touch panel, and the operation of instructing the movement of the position of the imaginary viewpoint P may be input from the touch panel. Further, the reception unit 103 may receive the operation of instructing the movement of the position of the imaginary viewpoint P input from an input device such as another input screen or button. The operation of instructing the movement of the position of the imaginary viewpoint P is not limited to an operation of directly specifying a position of the imaginary viewpoint P after the movement, but may also be a screen display switching operation or the like.
When the imaginary viewpoint P moves, the imaging device control unit 104 changes the orientations of the imaging devices 2a to 2c to directions in which the centerlines 20a to 20c of the imaging devices 2a to 2c intersect at the imaginary viewpoint P after the movement. The imaging device control unit 104 controls the orientations of the imaging devices 2a to 2c by, for example, transmitting a control signal to rotate the imaging devices 2a to 2c. The imaging device control unit 104 is an example of a movement control unit in this embodiment.
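The reorientation performed by the imaging device control unit 104 can be sketched as follows; the 2D yaw-only model and all coordinates are illustrative assumptions, not the disclosed control law.

```python
import math

def retarget(camera_pos, old_viewpoint, new_viewpoint):
    """Return (new_yaw, delta): the yaw that points the lens centerline
    at the moved viewpoint, and the signed rotation to apply relative to
    the old aim. A 2D yaw-only sketch with hypothetical coordinates."""
    def yaw(pos, vp):
        return math.atan2(vp[1] - pos[1], vp[0] - pos[0])
    new = yaw(camera_pos, new_viewpoint)
    # Wrap the difference into (-pi, pi] so the camera takes the short way.
    delta = (new - yaw(camera_pos, old_viewpoint) + math.pi) % (2 * math.pi) - math.pi
    return new, delta
```

Applying `retarget` to each of the imaging devices 2a to 2c after a viewpoint move restores the property that the three centerlines intersect at the moved viewpoint.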
In this way, when the imaginary viewpoint P moves, the surrounding monitoring device of this embodiment changes the orientations of the imaging devices 2a to 2c to directions in which the centerlines 20a to 20c of the imaging devices 2a to 2c intersect at the imaginary viewpoint P after the movement. Therefore, according to the surrounding monitoring device of this embodiment, in addition to the effects of the first embodiment, a display image in which the position of the imaginary viewpoint P is changed can be generated and provided to the user 3.
The movement of the position of the imaginary viewpoint P is not limited to movement due to the operation of the user 3, and may be determined by the information processing device 100 or another ECU mounted on the vehicle 1.
In the second embodiment described above, by moving the imaging devices 2a to 2c along with the movement of the imaginary viewpoint P, display images based on a plurality of imaginary viewpoints P can be displayed. In a third embodiment, positions of a plurality of imaginary viewpoints are determined in advance, and a plurality of imaging devices 2 are associated with the imaginary viewpoints respectively.
The memory unit 105 stores predetermined positions of a plurality of imaginary viewpoints P and the plurality of imaging devices 2 associated with each imaginary viewpoint P. The memory unit 105 is, for example, a memory device such as an HDD or a flash memory.
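The association held by the memory unit 105 can be pictured as a simple lookup table. The identifiers below mirror the reference numerals used in the text but are purely illustrative, as is the data structure itself.

```python
# Hypothetical contents of the memory unit 105: each imaginary viewpoint
# identifier maps to the imaging devices associated with it.
VIEWPOINT_CAMERAS = {
    "P1": ("2a", "2b", "2c"),
    "P2": ("2e", "2f", "2g"),
}

def cameras_for(viewpoint):
    """Imaging devices whose images are synthesized for the selected viewpoint."""
    return VIEWPOINT_CAMERAS[viewpoint]
```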
The reception unit 1103 receives an operation of selecting either one of the imaginary viewpoints P1, P2 by the user 3.
The image processing unit 1101 has the same function as in the first embodiment, and executes image processing such as magnification and synthesis on images imaged by the plurality of imaging devices 2 associated with the imaginary viewpoint P selected by the user 3 to generate the display image. For example, when the imaginary viewpoint P1 is selected, the image processing unit 1101 synthesizes the plurality of images imaged by the imaging devices 2a to 2c to generate the display image viewed from the imaginary viewpoint P1.
The display control unit 1102 has the same function as in the first embodiment, and displays, on the display device 12, the display image generated from the plurality of images imaged by the plurality of imaging devices 2 associated with the imaginary viewpoint P selected by the user 3.
In this way, the surrounding monitoring device of this embodiment includes the plurality of imaging devices 2a to 2c attached to the vehicle 1 in order to generate a display image corresponding to the imaginary viewpoint P1, and the plurality of imaging devices 2e to 2g attached to the vehicle 1 in order to generate another display image corresponding to the imaginary viewpoint P2. The positions and angles of the imaging devices are determined such that the centerlines 20a to 20c of the lenses of the plurality of imaging devices 2a to 2c associated with the imaginary viewpoint P1 intersect at the imaginary viewpoint P1, and the centerlines 20e to 20g of the lenses of the plurality of imaging devices 2e to 2g associated with the imaginary viewpoint P2 intersect at the imaginary viewpoint P2. Therefore, according to the surrounding monitoring device of this embodiment, in addition to the effects of the first embodiment, it is possible to generate the plurality of display images based on the positions of the different imaginary viewpoints P1, P2 without moving the imaging devices 2.
The imaginary viewpoints P may be selected by the information processing device 100 or another ECU. The positions and the number of the imaginary viewpoints P1, P2 and the number of the imaging devices 2 illustrated in
The information processing device 100, 1100, 2100 includes a control device such as a CPU, a memory device such as a ROM and a RAM, and an external memory device such as an HDD, a flash memory, and a CD drive device, and has a hardware configuration using a general computer. The image processing unit 101, 1101, the display control unit 102, 1102, the reception unit 103, 1103, and the imaging device control unit 104 of the information processing device 100, 1100, 2100 are realized by executing a program stored in the ROM by the CPU. These configurations may be realized with a hardware circuit.
As an example, a surrounding monitoring device according to an embodiment of this disclosure includes a plurality of imaging devices attached to a vehicle in order to generate a display image corresponding to one viewpoint. Centerlines of respective lenses of the plurality of imaging devices intersect at the viewpoint. Therefore, according to the surrounding monitoring device according to the embodiment, it is possible to capture images from which the display image can be generated with little distortion.
As an example, the surrounding monitoring device further includes a movement control unit that moves orientations of the plurality of imaging devices to directions in which the centerlines of the respective lenses of the plurality of imaging devices intersect at a viewpoint after movement when the viewpoint moves. Therefore, according to the surrounding monitoring device according to the embodiment, for example, it is possible to generate a display image in which the position of the imaginary viewpoint P is changed.
As an example, the surrounding monitoring device further includes a plurality of imaging devices attached to the vehicle in order to generate the display image corresponding to a first viewpoint, and a plurality of imaging devices attached to the vehicle in order to generate another display image corresponding to a second viewpoint. The centerlines of the respective lenses of the plurality of imaging devices associated with the first viewpoint intersect at the first viewpoint. The centerlines of the respective lenses of the plurality of imaging devices associated with the second viewpoint intersect at the second viewpoint. Therefore, according to the surrounding monitoring device according to the embodiment, for example, it is possible to generate the plurality of display images respectively corresponding to the plurality of viewpoints without moving the imaging devices.
As an example, the surrounding monitoring device further includes an image processing unit that synthesizes a plurality of images imaged by the plurality of imaging devices associated with the viewpoint to generate the display image viewed from the viewpoint, and a display control unit that causes the generated display image to be displayed on a display device. Therefore, according to the surrounding monitoring device according to the embodiment, for example, by displaying a situation in a plurality of directions surrounding the vehicle in the display image with little distortion, the situation in the plurality of directions can be displayed on one screen while reducing a sense of discomfort of a user with the display image.
In the surrounding monitoring device as an example, the image processing unit recognizes a predetermined object from the plurality of images imaged by the plurality of imaging devices or from the display image, and the display control unit superimposes a result of recognition of the object by the image processing unit on the display image and causes the result to be displayed on the display device. Therefore, according to the surrounding monitoring device according to the embodiment, for example, when the recognition result of the predetermined object included in the imaged images is displayed, it is possible to reduce a sense of discomfort of the user with the display of the recognition result and the occurrence of misunderstanding.
While embodiments are disclosed here by way of example, the above embodiments and modifications are merely examples and are not intended to limit the scope of this disclosure. The above embodiments and modifications can be carried out in various other modes, and various omissions, substitutions, combinations, and changes can be made without departing from the spirit of the disclosure. The configurations and shapes of the respective embodiments and modifications may also be partly replaced when carried out.
The principles, preferred embodiment and mode of operation of the present invention have been described in the foregoing specification. However, the invention which is intended to be protected is not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. Variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present invention. Accordingly, it is expressly intended that all such variations, changes and equivalents which fall within the spirit and scope of the present invention as defined in the claims, be embraced thereby.
Number | Date | Country | Kind
---|---|---|---
2018-063507 | Mar 2018 | JP | national