SURROUNDING MONITORING DEVICE

Information

  • Patent Application
  • Publication Number
    20190299857
  • Date Filed
    March 15, 2019
  • Date Published
    October 03, 2019
Abstract
A surrounding monitoring device includes: a plurality of imaging devices attached to a vehicle in order to generate a display image corresponding to one viewpoint, in which centerlines of respective lenses of the plurality of imaging devices intersect at the viewpoint.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

This application is based on and claims priority under 35 U.S.C. § 119 to Japanese Patent Application 2018-063507, filed on Mar. 29, 2018, the entire contents of which are incorporated herein by reference.


TECHNICAL FIELD

This disclosure relates to a surrounding monitoring device.


BACKGROUND DISCUSSION

JP 2017-123627A (Reference 1) discloses a technique of displaying, to a driver or the like, images captured by imaging devices installed in a vehicle after performing image processing such as viewpoint conversion and synthesis.


However, in the related art, the center of the imaging range of an imaging device installed in the vehicle deviates from the center of the image displayed to the driver or the like. The viewpoint conversion therefore introduces distortion into the display image, which may give a sense of discomfort to the driver or the like who views it.


SUMMARY

As an example, a surrounding monitoring device according to an embodiment of this disclosure includes a plurality of imaging devices attached to a vehicle in order to generate a display image corresponding to one viewpoint. Centerlines of respective lenses of the plurality of imaging devices intersect at the viewpoint.





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing and additional features and characteristics of this disclosure will become more apparent from the following detailed description considered with reference to the accompanying drawings, wherein:



FIG. 1 illustrates an example of installation positions of imaging devices according to a first embodiment;



FIG. 2 shows an example of functions of an information processing device according to the first embodiment;



FIG. 3 illustrates an example of imaging ranges of the imaging devices and a display range of a display image according to the first embodiment;



FIG. 4 illustrates an example of image processing according to the first embodiment;



FIG. 5 illustrates an example of an installation position of a display device according to the first embodiment;



FIG. 6 illustrates an example of the display image according to the first embodiment;



FIG. 7 shows an example of functions of an information processing device according to a second embodiment;



FIG. 8 illustrates an example of movement of an imaginary viewpoint according to the second embodiment;



FIG. 9 illustrates an example of installation positions of imaging devices according to a third embodiment;



FIG. 10 shows an example of functions of an information processing device according to the third embodiment;



FIG. 11 illustrates an example of installation positions of imaging devices according to the related art;



FIG. 12 illustrates an example of imaging ranges of the imaging devices and a display range of a display image according to the related art; and



FIG. 13 illustrates an example of the display image according to the related art.





DETAILED DESCRIPTION
First Embodiment


FIG. 1 illustrates an example of installation positions of imaging devices 2a to 2d according to a first embodiment. The imaging devices 2a to 2d are cameras that image the surroundings of a vehicle 1.


More specifically, the imaging device 2a is a front camera that images the area in front of the vehicle 1, the imaging device 2b is a right side camera that images the right side of the vehicle 1, the imaging device 2c is a left side camera that images the left side of the vehicle 1, and the imaging device 2d is a back camera (rear camera) that images the area behind the vehicle 1.


The installation positions and installation angles of the imaging devices 2a to 2c are determined such that centerlines 20a to 20c of the lenses of the imaging devices 2a to 2c intersect at a point P. Hereinafter, the imaging devices 2a to 2c are referred to collectively as the imaging device 2 unless they need to be distinguished, and the centerlines 20a to 20c as the centerline 20. The centerline 20 is also the centerline of the imaging range of the imaging device 2.


The point P is an imaginary viewpoint of a display image generated from the images captured by the imaging devices 2a to 2c, and is hereinafter referred to as the imaginary viewpoint P. The imaginary viewpoint P is the position from which the user appears to view the scene when viewing the display image. In this embodiment, the position of the imaginary viewpoint P is determined in advance, and the imaging devices 2a to 2c are attached to the vehicle 1 so as to generate the display image corresponding to the imaginary viewpoint P.
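The condition that the centerlines intersect at the imaginary viewpoint P can be illustrated with a small geometric calculation. The sketch below is illustrative only: the coordinate frame, the mount positions, and the helper name are assumptions of this example, not values from the application. Each camera's optical axis is chosen so that the line through the lens also contains P.

```python
import numpy as np

# Assumed vehicle frame: x forward, y left, z up, units in metres.
VIEWPOINT_P = np.array([0.0, 0.0, 1.2])      # assumed imaginary viewpoint P

CAMERA_MOUNTS = {                             # assumed mount positions
    "front (2a)": np.array([2.0, 0.0, 0.8]),
    "right (2b)": np.array([0.5, -1.0, 1.0]),
    "left (2c)":  np.array([0.5, 1.0, 1.0]),
}

def axis_through_viewpoint(mount, viewpoint):
    """Unit optical-axis direction whose line through the lens also
    contains the viewpoint, i.e. the lens centerline passes through P."""
    d = mount - viewpoint
    return d / np.linalg.norm(d)

for name, mount in CAMERA_MOUNTS.items():
    axis = axis_through_viewpoint(mount, VIEWPOINT_P)
    yaw = np.degrees(np.arctan2(axis[1], axis[0]))    # heading about z
    pitch = np.degrees(np.arcsin(axis[2]))            # elevation angle
    print(f"{name}: yaw={yaw:6.1f} deg, pitch={pitch:6.1f} deg")
```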


The display image shows a range corresponding to the visual field of the user when the user views the surroundings of the vehicle from the position of the imaginary viewpoint P. Details of the display image are described below. The user is, for example, a driver of the vehicle 1.



FIG. 2 shows an example of functions of an information processing device 100 according to this embodiment. The information processing device 100 is, for example, an electronic control unit (ECU) mounted on the vehicle 1, and manages the imaging devices 2a to 2d. The information processing device 100 of this embodiment includes an image processing unit 101 and a display control unit 102.


The image processing unit 101 performs image processing, such as magnification and synthesis, on the plurality of images captured by the imaging devices 2a to 2d associated with the imaginary viewpoint P to generate the display image viewed from the imaginary viewpoint P. The display control unit 102 outputs the display image generated by the image processing unit 101 to a display device described below.
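As a rough structural sketch of this division of labour (the class and method names below are this example's assumptions, not terms from the application), the two units might be organized as follows, with the magnification and synthesis steps themselves elaborated in later sketches.

```python
from typing import Callable, List

import numpy as np


class ImageProcessingUnit:
    """Corresponds to image processing unit 101: magnifies each captured
    image, then synthesizes the results into one display image."""

    def __init__(self,
                 magnify: Callable[[np.ndarray], np.ndarray],
                 synthesize: Callable[[List[np.ndarray]], np.ndarray]):
        self.magnify = magnify
        self.synthesize = synthesize

    def generate_display_image(self, captured: List[np.ndarray]) -> np.ndarray:
        return self.synthesize([self.magnify(img) for img in captured])


class DisplayControlUnit:
    """Corresponds to display control unit 102: outputs the generated
    display image to the display device."""

    def __init__(self, device: Callable[[np.ndarray], None]):
        self.device = device

    def show(self, display_image: np.ndarray) -> None:
        self.device(display_image)
```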


In this embodiment, the imaging devices 2a to 2c alone may serve as a surrounding monitoring device, or the imaging devices 2a to 2c together with the information processing device 100 may serve as a surrounding monitoring device.


Next, the imaging ranges of the imaging devices 2a to 2c are described in more detail. FIG. 3 illustrates an example of the imaging ranges of the imaging devices 2a to 2c and the display range of the display image according to this embodiment. The alternate long and short dash lines in FIG. 3 represent the display range of the display image shown to a user 3. “The display range of the display image” is also referred to as “the visual field of the user 3 when the user 3 is located at the imaginary viewpoint P” or “the visual field of the display image”. The hatching in FIG. 3 represents the imaging ranges of the imaging devices 2a to 2c, which are also referred to as “the visual fields of the imaging devices 2a to 2c”.


The diagram in the middle of FIG. 3 illustrates the imaging ranges of the imaging devices 2a to 2c and the display range of the display image when the vehicle 1 is viewed from above. The diagrams on the right and left sides of FIG. 3 illustrate them when the vehicle 1 is viewed from the right side and the left side, respectively. The position of the user 3 in FIG. 3 indicates the position of the imaginary viewpoint P.


As illustrated in FIG. 3, the center of the imaging range of each of the imaging devices 2a to 2c is located on an extension line of the center of the display range of the display image. For example, for imaging of the front of the vehicle 1, the center of the imaging range of the imaging device 2a, which is the front camera, is located on the extension line of the center of the display range of the display image. Likewise, for imaging of the left and right sides of the vehicle 1, the center of the imaging range of the imaging device 2b, the right side camera, or of the imaging device 2c, the left side camera, is located on the extension line of the center of the display range of the display image.



FIG. 4 illustrates an example of image processing according to this embodiment. The imaging device 2 illustrated in FIG. 4 is any one of the imaging devices 2a to 2c. The upper part of FIG. 4 illustrates the imaging range of the imaging device 2 and the display range of the display image as viewed from above and from the side.


As also illustrated in FIG. 1, there is a distance between the positions where the imaging devices 2a to 2c are installed and the imaginary viewpoint P (the position of the user 3 in FIG. 4); this distance is defined as the distance L. As illustrated in the lower part of FIG. 4, the image processing unit 101 of the information processing device 100 magnifies the captured image by an amount corresponding to the distance L so that the imaging range of the magnified image coincides with the display range of the display image (the visual field viewed by the user 3 from the imaginary viewpoint P). The imaging range of the imaging device 2′ illustrated in FIG. 4 indicates the imaging range that would result if the imaging device 2′ captured an image from the position of the imaginary viewpoint P. By magnifying the captured image, the image processing unit 101 renders a subject in the image captured by the imaging device 2 at the same size it would have in an image captured by the imaging device 2′ from the position of the imaginary viewpoint P.
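One way to make the magnification step concrete, under pinhole-camera assumptions the application does not spell out: fix a reference plane at distance D from the imaginary viewpoint P. The display range spans 2·D·tan(α/2) on that plane for a display field of view α, while a camera mounted L closer to the plane with field of view β images a span of 2·(D−L)·tan(β/2). Cropping the portion of the captured image that corresponds to the display range and rescaling it to full size performs the magnification; the enlargement factor depends on D, so the numbers below are purely illustrative.

```python
import numpy as np

def display_crop_fraction(D, L, cam_fov_deg, disp_fov_deg):
    """Fraction of the captured image width corresponding to the display
    range on a reference plane D metres from the viewpoint P, for a
    camera mounted L metres closer to that plane. Rescaling this central
    crop to the full display width performs the magnification."""
    w_display = 2.0 * D * np.tan(np.radians(disp_fov_deg) / 2.0)
    w_camera = 2.0 * (D - L) * np.tan(np.radians(cam_fov_deg) / 2.0)
    return w_display / w_camera

# Purely illustrative numbers: a wide-angle camera, a narrower display.
frac = display_crop_fraction(D=10.0, L=2.0, cam_fov_deg=120.0, disp_fov_deg=60.0)
print(f"crop fraction = {frac:.2f}, enlargement = {1.0 / frac:.2f}x")
```

With a wide-angle camera (β well above the display field of view), the crop fraction is below one and the crop must be enlarged, which is consistent with the person 5 appearing larger as the person 5′ in FIG. 4.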


Since the center of the imaging range of the imaging device 2 is located at the center of the display range of the display image, the image can be made to fit the visual field of the user 3 by magnification corresponding to the distance L alone, without significant viewpoint conversion. Distortion is therefore unlikely to occur in an object (subject) included in the captured image. For example, in the magnified image in FIG. 4, a person 5 included in the captured image becomes larger, as illustrated by a person 5′, but the shape is the same as before magnification and no distortion occurs.


In addition to the magnification processing, the image processing unit 101 synthesizes the captured images of the imaging devices 2a to 2c into a continuous display image showing the left side, the front, and the right side of the vehicle 1. Since the imaging devices 2a to 2c are installed such that the centerlines 20a to 20c intersect at the imaginary viewpoint P, the image processing unit 101 does not need significant viewpoint conversion when synthesizing the captured images of the imaging devices 2a to 2c. The image processing unit 101 can therefore reduce distortion in the display image.
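A minimal synthesis sketch, assuming the three views have already been magnified, aligned, and cropped to equal heights (a real system would warp and calibrate; none of the function names below come from the application):

```python
import numpy as np

def synthesize_panorama(left, front, right, seam=32):
    """Minimal seam blend: cross-fade `seam` pixels at each boundary,
    then join the left, front, and right views side by side."""
    def blend(a, b):
        t = np.linspace(0.0, 1.0, seam)[None, :, None]   # 0 -> 1 ramp
        mix = a[:, -seam:] * (1.0 - t) + b[:, :seam] * t
        return np.concatenate([a[:, :-seam], mix, b[:, seam:]], axis=1)
    return blend(blend(left, front), right)

# Dummy frames standing in for the magnified images of imaging devices
# 2c (left), 2a (front), and 2b (right).
h, w = 240, 320
views = [np.full((h, w, 3), v, dtype=np.float32) for v in (0.2, 0.5, 0.8)]
panorama = synthesize_panorama(*views)
print(panorama.shape)   # (240, 896, 3): three views joined at two seams
```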


The above magnification and synthesis processing are examples of the image processing, and the image processing unit 101 may perform other image processing as well. The image processing unit 101 also performs image processing such as magnification on the image of the back of the vehicle 1 captured by the imaging device 2d, the back camera.


Next, the display image is described. FIG. 5 illustrates an example of an installation position of a display device 12 according to this embodiment. As an example, the display device 12 is installed near the middle of a dashboard 14 under the windshield of the vehicle 1. The display device 12 may be placed at a position easily visible to a driver operating a steering wheel 13, but the installation position is not limited thereto. The display device 12 is, for example, a liquid crystal display.



FIG. 6 illustrates an example of the display image according to this embodiment. The display control unit 102 of the information processing device 100 displays the display image generated by the image processing of the image processing unit 101 on the display device 12. In more detail, the display control unit 102 displays an image 21 illustrating the back surface of the vehicle 1, a first display image 22 in which the captured images of the imaging devices 2a to 2c are synthesized, and a second display image 23 generated, by magnification processing or the like, from the image captured by the imaging device 2d, the back camera.


The first display image 22 includes a front camera view obtained by magnifying the image captured by the imaging device 2a, a right side camera view obtained by magnifying the image captured by the imaging device 2b, and a left side camera view obtained by magnifying the image captured by the imaging device 2c, synthesized such that the images are connected at the boundaries between views. The first display image 22 can also be described as an image showing the visual field of the user 3 when the user 3 views the surroundings of the vehicle 1 from the position of the imaginary viewpoint P. Since the shapes of persons 5a and 5b included in the first display image 22 are not distorted, the first display image 22 lets the user 3 grasp the situation around the vehicle 1 without a sense of discomfort.


The first display image 22 and the second display image 23 illustrated in FIG. 6 are examples of the display image; for example, the display control unit 102 may individually display the left side camera view, the front camera view, and the right side camera view without synthesis.


In contrast with the configuration of this embodiment, in the installation positions of general imaging devices in the related art, the center of the imaging range of an imaging device installed in a vehicle does not coincide with the center of the image displayed to the driver or the like, so distortion might occur in the display image due to viewpoint conversion. FIG. 11 illustrates an example of installation positions of imaging devices according to the related art. In the example illustrated in FIG. 11, the right side camera and the left side camera each face downward, so the centerlines of the front camera, the right side camera, and the left side camera installed in the vehicle do not intersect at one point. In the display image illustrated in FIG. 13, described below, the imaginary viewpoint is located at the back of the vehicle, but the centerline of each imaging device illustrated in FIG. 11 does not pass through the imaginary viewpoint of the display image.



FIG. 12 illustrates an example of imaging ranges of imaging devices and a display range of a display image according to the related art. In the example illustrated in FIG. 12, the center of the imaging range of each imaging device does not coincide with the center of the display range of the display image. In particular, since the right side camera and the left side camera face downward, their imaging ranges differ from the display range of the display image, and the imaging angle with respect to a subject also differs from the angle between the imaginary projection plane of the display image and the subject. To generate a display image from images captured by imaging devices with such installation positions and installation angles, an information processing device in the related art executes viewpoint conversion processing on each captured image and then synthesizes the captured images.



FIG. 13 illustrates an example of the display image according to the related art. In the example illustrated in FIG. 13, distortion occurs in the shape of a person located near a boundary between the captured images. As in this example, when the center of the imaging range of each imaging device does not coincide with the center of the display range of the display image, the degree of viewpoint conversion increases and distortion might occur in the display image. Such distortion might make it difficult for the user 3 to grasp, without a sense of discomfort, the distance to an obstacle or the like and other information on the surroundings of the vehicle.


In contrast, the surrounding monitoring device of this embodiment includes the plurality of imaging devices 2a to 2c attached to the vehicle 1 in order to generate the first display image 22 corresponding to the imaginary viewpoint P, and the positions and angles of the plurality of imaging devices 2a to 2c are determined such that the centerlines 20a to 20c of the respective lenses of the plurality of imaging devices 2a to 2c intersect at the imaginary viewpoint P. According to the surrounding monitoring device of this embodiment, it is therefore possible to capture images from which the first display image 22 can be generated with little distortion.


Furthermore, the surrounding monitoring device of this embodiment synthesizes the plurality of images captured by the plurality of imaging devices 2a to 2c to generate the first display image 22 viewed from the imaginary viewpoint P and displays the first display image 22 on the display device 12. Since the situation in the plurality of directions around the vehicle 1 to be confirmed by the driver (user 3) is shown in the first display image 22 with little distortion, the situation in those directions can be displayed on one screen while reducing the user 3's sense of discomfort with the display image. For example, the user 3 can easily grasp from the first display image 22 the situations at the front, the right side, and the left side of the vehicle 1 imaged by the front camera and the left and right side cameras.


The image processing unit 101 may perform image recognition processing to recognize a predetermined object in the captured images of the imaging devices 2a to 2c or in the first display image 22 after image processing. The predetermined object may be determined in advance or may be specified by the user 3 or the like. The display control unit 102 may display the recognition result of the image processing unit 101 on the display device 12; for example, it superimposes a figure or the like illustrating the result of recognizing the predetermined object on the first display image 22. As described above, there is little distortion in the first display image 22. According to the surrounding monitoring device of this embodiment, when the recognition result of the predetermined object included in the captured images is displayed, it is therefore possible to reduce the user 3's sense of discomfort with the display of the recognition result and the occurrence of misunderstanding.
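A hedged sketch of this overlay step using OpenCV drawing primitives; the detector itself is a placeholder, since the application does not specify a recognition algorithm:

```python
import cv2
import numpy as np

def detect_objects(image):
    """Placeholder detector; the application does not specify one.
    A real system might run a pedestrian or vehicle detector here and
    return (label, x, y, w, h) tuples in display-image coordinates."""
    return [("person", 50, 80, 40, 100)]    # dummy result for illustration

def overlay_recognition(display_image):
    """Superimpose figures illustrating the recognition result on the
    display image, as display control unit 102 is described as doing."""
    out = display_image.copy()
    for label, x, y, w, h in detect_objects(out):
        cv2.rectangle(out, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(out, label, (x, y - 6), cv2.FONT_HERSHEY_SIMPLEX,
                    0.5, (0, 255, 0), 1)
    return out

frame = np.zeros((240, 896, 3), dtype=np.uint8)  # e.g. the synthesized image 22
annotated = overlay_recognition(frame)
```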


In addition to the imaging devices 2a to 2d, the vehicle 1 of this embodiment may include a distance measuring unit, such as a sonar (sonar sensor or ultrasonic detector), that emits an ultrasonic wave and captures its reflection. The information processing device 100 or another ECU may detect an obstacle or the like based on the captured images and the detection result of the surroundings of the vehicle 1 by the distance measuring unit, and output a warning or the like to the display device 12.


In this embodiment, the surrounding monitoring device includes the imaging devices 2a to 2c, but this disclosure is not limited thereto; the surrounding monitoring device may include the imaging devices 2a to 2d. The position of the imaginary viewpoint P illustrated in FIG. 1 and elsewhere and the installation positions and installation directions of the imaging devices 2a to 2d are examples, and this disclosure is not limited thereto. The related art illustrated in FIGS. 11 to 13 is likewise an example, and this disclosure is not limited to the above configuration.


Second Embodiment

In the first embodiment described above, the position of the imaginary viewpoint P and the imaging directions of the imaging devices 2a to 2c are fixed at predetermined positions. In the second embodiment, the imaging directions of the imaging devices 2a to 2c are variable and follow movement of the imaginary viewpoint P.



FIG. 7 illustrates an example of functions of an information processing device 1100 according to this embodiment. As illustrated in FIG. 7, the information processing device 1100 includes the image processing unit 101, the display control unit 102, a reception unit 103, and an imaging device control unit 104. In this embodiment, a surrounding monitoring device includes the imaging devices 2a to 2c and the information processing device 1100.


The image processing unit 101 and the display control unit 102 have the same functions as those in the first embodiment.


The reception unit 103 receives an operation by the user 3 instructing movement of the position of the imaginary viewpoint P. For example, the display device 12 may include a touch panel, and the operation may be input from the touch panel. The reception unit 103 may also receive the operation from an input device such as another input screen or a button. The operation instructing movement of the position of the imaginary viewpoint P is not limited to an operation that directly specifies the position of the imaginary viewpoint P after the movement; it may also be, for example, a screen display switching operation.


When the imaginary viewpoint P moves, the imaging device control unit 104 turns the imaging devices 2a to 2c to orientations in which the centerlines 20a to 20c of the imaging devices 2a to 2c intersect at the imaginary viewpoint P after the movement. The imaging device control unit 104 controls the orientations of the imaging devices 2a to 2c by, for example, transmitting a control signal to rotate the imaging devices 2a to 2c. The imaging device control unit 104 is an example of a movement control unit in this embodiment.



FIG. 8 illustrates an example of movement of the imaginary viewpoint P according to this embodiment. The imaging devices 2a to 2c in this embodiment have, in addition to the functions in the first embodiment, a function of changing their imaging direction (orientation); for example, they are attached to rotatable moving mechanisms. When the reception unit 103 receives an operation moving the imaginary viewpoint P to the position of an imaginary viewpoint P′, the imaging device control unit 104 rotates the imaging devices 2a to 2c to orientations in which their centerlines 20a to 20c intersect at the imaginary viewpoint P′ after the movement. Centerlines 20a′ to 20c′ in FIG. 8 indicate the centerlines of the lenses of the imaging devices 2a to 2c after the movement.
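Reusing the geometry from the first-embodiment sketch, the re-orientation commanded by the imaging device control unit 104 can be illustrated as follows; the mount positions and the moved viewpoint P′ are assumed values, not taken from the application.

```python
import numpy as np

def reorient_cameras(mounts, new_viewpoint):
    """For each fixed mount position, return the unit optical-axis
    direction whose lens centerline passes through the moved viewpoint,
    which imaging device control unit 104 could command via rotation."""
    return {name: (pos - new_viewpoint) / np.linalg.norm(pos - new_viewpoint)
            for name, pos in mounts.items()}

mounts = {                                    # assumed mount positions
    "2a": np.array([2.0, 0.0, 0.8]),
    "2b": np.array([0.5, -1.0, 1.0]),
    "2c": np.array([0.5, 1.0, 1.0]),
}
p_prime = np.array([-0.5, 0.2, 1.2])          # assumed moved viewpoint P'
for cam, axis in reorient_cameras(mounts, p_prime).items():
    yaw = np.degrees(np.arctan2(axis[1], axis[0]))
    print(f"imaging device {cam}: new yaw = {yaw:.1f} deg")
```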


In this way, when the imaginary viewpoint P moves, the surrounding monitoring device of this embodiment turns the imaging devices 2a to 2c to orientations in which the centerlines 20a to 20c intersect at the imaginary viewpoint P after the movement. According to the surrounding monitoring device of this embodiment, in addition to the effects of the first embodiment, a display image reflecting the changed position of the imaginary viewpoint P can therefore be generated and provided to the user 3.


The movement of the position of the imaginary viewpoint P is not limited to movement caused by an operation of the user 3; it may also be determined by the information processing device 1100 or another ECU mounted on the vehicle 1.


Third Embodiment

In the second embodiment described above, display images based on a plurality of positions of the imaginary viewpoint P can be displayed by moving the imaging devices 2a to 2c along with movement of the imaginary viewpoint P. In the third embodiment, the positions of a plurality of imaginary viewpoints are determined in advance, and a plurality of imaging devices 2 are associated with each of the imaginary viewpoints.



FIG. 9 illustrates an example of installation positions of imaging devices 2a to 2g according to this embodiment. In this embodiment, in order to generate display images corresponding to a plurality of different imaginary viewpoints P1 and P2, pluralities of imaging devices 2 are associated with the imaginary viewpoints P1 and P2 respectively and installed in the vehicle 1: the imaging devices 2a to 2c are associated with the imaginary viewpoint P1, and the imaging devices 2e to 2g are associated with the imaginary viewpoint P2. In more detail, the installation positions and installation angles of the imaging devices 2a to 2c illustrated in FIG. 9 are determined such that the centerlines 20a to 20c of the lenses of the imaging devices 2a to 2c intersect at the imaginary viewpoint (first viewpoint) P1, and the installation positions and installation angles of the imaging devices 2e to 2g are determined such that the centerlines 20e to 20g of the lenses of the imaging devices 2e to 2g intersect at the imaginary viewpoint (second viewpoint) P2.



FIG. 10 illustrates an example of functions of an information processing device 2100 according to this embodiment. As illustrated in FIG. 10, the information processing device 2100 according to this embodiment includes an image processing unit 1101, a display control unit 1102, a reception unit 1103, and a memory unit 105.


The memory unit 105 stores the predetermined positions of the plurality of imaginary viewpoints P and the plurality of imaging devices 2 associated with each imaginary viewpoint P. The memory unit 105 is, for example, a memory device such as an HDD or a flash memory.


The reception unit 1103 receives an operation by the user 3 selecting one of the imaginary viewpoints P1 and P2.


The image processing unit 1101 has the functions of the first embodiment, and executes image processing such as magnification and synthesis on the images captured by the plurality of imaging devices 2 associated with the imaginary viewpoint P selected by the user 3 to generate the display image. For example, when the imaginary viewpoint P1 is selected, the image processing unit 1101 synthesizes the plurality of images captured by the imaging devices 2a to 2c to generate the display image viewed from the imaginary viewpoint P1.
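A minimal sketch of this third-embodiment flow, with assumed contents for the memory unit 105 and stand-ins for the capture and synthesis steps (none of the names or numbers below come from the application):

```python
import numpy as np

# Assumed contents of memory unit 105: each imaginary viewpoint is stored
# with its position and the imaging devices associated with it.
VIEWPOINTS = {
    "P1": {"position": np.array([0.0, 0.0, 1.2]), "cameras": ["2a", "2b", "2c"]},
    "P2": {"position": np.array([-1.5, 0.0, 1.2]), "cameras": ["2e", "2f", "2g"]},
}

def generate_for_selection(selected, capture, synthesize):
    """Look up the imaging devices associated with the selected imaginary
    viewpoint and synthesize their images into the display image."""
    entry = VIEWPOINTS[selected]
    images = [capture(cam) for cam in entry["cameras"]]
    return synthesize(images)

# Illustrative stand-ins for the real capture and synthesis steps.
display = generate_for_selection(
    "P1",
    capture=lambda cam: np.zeros((240, 320, 3), dtype=np.uint8),
    synthesize=lambda imgs: np.concatenate(imgs, axis=1),
)
print(display.shape)   # (240, 960, 3)
```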


The display control unit 1102 has the functions of the first embodiment, and displays, on the display device 12, the display image generated from the plurality of images captured by the plurality of imaging devices 2 associated with the imaginary viewpoint P selected by the user 3.


In this way, the surrounding monitoring device of this embodiment includes the plurality of imaging devices 2a to 2c attached to the vehicle 1 in order to generate a display image corresponding to the imaginary viewpoint P1, and the plurality of imaging devices 2e to 2g attached to the vehicle 1 in order to generate another display image corresponding to the imaginary viewpoint P2. The centerlines 20a to 20c of the lenses of the plurality of imaging devices 2a to 2c associated with the imaginary viewpoint P1 intersect at the imaginary viewpoint P1, and the centerlines 20e to 20g of the lenses of the plurality of imaging devices 2e to 2g associated with the imaginary viewpoint P2 intersect at the imaginary viewpoint P2. According to the surrounding monitoring device of this embodiment, in addition to the effects of the first embodiment, it is therefore possible to generate a plurality of display images based on the positions of the different imaginary viewpoints P1 and P2 without moving the imaging devices 2.


The imaginary viewpoint P may instead be selected by the information processing device 2100 or another ECU. The positions and the number of the imaginary viewpoints P1, P2 and the number of the imaging devices 2 illustrated in FIG. 9 are examples, and this disclosure is not limited thereto. In this embodiment, the reception unit 1103 receives an operation by the user 3 selecting one of the imaginary viewpoints P1 and P2, but it may also receive an operation selecting a plurality of imaginary viewpoints P. The display control unit 1102 may simultaneously display both the display image corresponding to the imaginary viewpoint P1 and the display image corresponding to the imaginary viewpoint P2 on the display device 12.


Each of the information processing devices 100, 1100, and 2100 includes a control device such as a CPU, memory devices such as a ROM and a RAM, and an external memory device such as an HDD, a flash memory, or a CD drive device, and has the hardware configuration of a general computer. The image processing units 101 and 1101, the display control units 102 and 1102, the reception units 103 and 1103, and the imaging device control unit 104 of the information processing devices 100, 1100, and 2100 are realized by the CPU executing a program stored in the ROM. These configurations may instead be realized as hardware circuits.


As an example, a surrounding monitoring device according to an embodiment of this disclosure includes a plurality of imaging devices attached to a vehicle in order to generate a display image corresponding to one viewpoint. Centerlines of respective lenses of the plurality of imaging devices intersect at the viewpoint. According to the surrounding monitoring device of the embodiment, it is therefore possible to capture images from which the display image can be generated with little distortion.


As an example, the surrounding monitoring device further includes a movement control unit that, when the viewpoint moves, turns the plurality of imaging devices to orientations in which the centerlines of the respective lenses of the plurality of imaging devices intersect at the viewpoint after the movement. According to the surrounding monitoring device of the embodiment, it is therefore possible, for example, to generate a display image reflecting a changed position of the imaginary viewpoint P.


As an example, the surrounding monitoring device further includes a plurality of imaging devices attached to the vehicle in order to generate the display image corresponding to a first viewpoint, and a plurality of imaging devices attached to the vehicle in order to generate another display image corresponding to a second viewpoint. The centerlines of the respective lenses of the plurality of imaging devices associated with the first viewpoint intersect at the first viewpoint, and the centerlines of the respective lenses of the plurality of imaging devices associated with the second viewpoint intersect at the second viewpoint. According to the surrounding monitoring device of the embodiment, it is therefore possible, for example, to generate a plurality of display images corresponding to the plurality of viewpoints without moving the imaging devices.


As an example, the surrounding monitoring device further includes an image processing unit that synthesizes a plurality of images captured by the plurality of imaging devices associated with the viewpoint to generate the display image viewed from the viewpoint, and a display control unit that causes the generated display image to be displayed on a display device. According to the surrounding monitoring device of the embodiment, by showing the situation in a plurality of directions around the vehicle in the display image with little distortion, the situation in those directions can therefore be displayed on one screen while reducing a user's sense of discomfort with the display image.


In the surrounding monitoring device as an example, the image processing unit recognizes a predetermined object from the plurality of images captured by the plurality of imaging devices or from the display image, and the display control unit superimposes the result of recognition of the object by the image processing unit on the display image and causes the result to be displayed on the display device. According to the surrounding monitoring device of the embodiment, when the recognition result of the predetermined object included in the captured images is displayed, it is therefore possible, for example, to reduce the user's sense of discomfort with the display of the recognition result and the occurrence of misunderstanding.


While embodiments disclosed here have been exemplified, the above embodiments and modifications are merely examples and are not intended to limit the scope of this disclosure. The above embodiments and modifications can be carried out in various other modes, and various omissions, substitutions, combinations, and changes can be made without departing from the spirit of the disclosure. The configurations and shapes of the respective embodiments and modifications may also be partly replaced when carried out.


The principles, preferred embodiment and mode of operation of the present invention have been described in the foregoing specification. However, the invention which is intended to be protected is not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. Variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present invention. Accordingly, it is expressly intended that all such variations, changes and equivalents which fall within the spirit and scope of the present invention as defined in the claims, be embraced thereby.

Claims
  • 1. A surrounding monitoring device comprising: a plurality of imaging devices attached to a vehicle in order to generate a display image corresponding to one viewpoint, wherein centerlines of respective lenses of the plurality of imaging devices intersect at the viewpoint.
  • 2. The surrounding monitoring device according to claim 1, further comprising: a movement control unit that moves orientations of the plurality of imaging devices to directions in which the centerlines of the respective lenses of the plurality of imaging devices intersect at the viewpoint after movement when the viewpoint moves.
  • 3. The surrounding monitoring device according to claim 1, further comprising: a plurality of imaging devices attached to the vehicle in order to generate a display image corresponding to a first viewpoint; and a plurality of imaging devices attached to the vehicle in order to generate another display image corresponding to a second viewpoint, wherein centerlines of respective lenses of the plurality of imaging devices associated with the first viewpoint intersect at the first viewpoint, and centerlines of respective lenses of the plurality of imaging devices associated with the second viewpoint intersect at the second viewpoint.
  • 4. The surrounding monitoring device according to claim 1, further comprising: an image processing unit that synthesizes a plurality of images imaged by the plurality of imaging devices associated with the viewpoint to generate the display image viewed from the viewpoint; and a display control unit that causes the generated display image to be displayed on a display device.
  • 5. The surrounding monitoring device according to claim 2, further comprising: an image processing unit that synthesizes a plurality of images imaged by the plurality of imaging devices associated with the viewpoint to generate the display image viewed from the viewpoint; and a display control unit that causes the generated display image to be displayed on a display device.
  • 6. The surrounding monitoring device according to claim 3, further comprising: an image processing unit that synthesizes a plurality of images imaged by the plurality of imaging devices associated with the viewpoint to generate the display image viewed from the viewpoint; and a display control unit that causes the generated display image to be displayed on a display device.
  • 7. The surrounding monitoring device according to claim 4, wherein the image processing unit recognizes a predetermined object from the plurality of images imaged by the plurality of imaging devices or the display image, and the display control unit superimposes a result of recognition of the object by the image processing unit on the display image and causes the result to be displayed on the display device.
Priority Claims (1)
  • Number: 2018-063507
  • Date: Mar 2018
  • Country: JP
  • Kind: national