The present technology relates to imaging systems and imaging devices, and more particularly, to an imaging system and an imaging device that are suitable for use in a case where the surroundings of a vehicle are imaged from inside the vehicle.
There has been a conventional camera device that is attached to the windshield of a vehicle from inside the vehicle, to capture an image of the view in front of the vehicle (see Patent Document 1, for example).
However, in the camera device disclosed in Patent Document 1, a space is formed between the lens of the camera module provided in the camera device and the windshield. Therefore, there is a possibility that incident light will be reflected in this space, and the reflection in the windshield will appear in captured images. On the other hand, when the lens of the camera module is brought close to the windshield so as to prevent such reflection, the imaging direction shifts upward, and the view in front of the vehicle cannot be imaged at an appropriate angle of view.
The present technology has been made in view of such circumstances, and aims to enable imaging at an appropriate angle of view while preventing reflection in a case where the surroundings of a vehicle are imaged from inside the vehicle.
An imaging system of a first aspect of the present technology includes an imaging device that includes a plurality of pixels that receive incident light entering from an object without passing through either an imaging lens or a pinhole and each output a detection signal indicating an output pixel value modulated in accordance with the incident angle of the incident light, the imaging device being mounted so that the light receiving surface faces the surface of a windshield on the inner side of a vehicle and is in contact with or in proximity to the surface of the windshield. In this imaging system, the average of the centroids of incident angle directivities indicating the directivities of the plurality of pixels with respect to the incident angle of the incident light is biased in one direction from the center of the pixel.
An imaging device of a second aspect of the present technology includes a plurality of pixels that receive incident light entering from an object without passing through either an imaging lens or a pinhole, and each output a detection signal indicating an output pixel value modulated in accordance with the incident angle of the incident light. In the imaging device, the average of the centroids of incident angle directivities indicating the directivities of the plurality of pixels with respect to the incident angle of the incident light deviates from the center of the pixel.
In the first aspect of the present technology, imaging is performed in a direction deviating from the front direction of a vehicle.
In the second aspect of the present technology, imaging is performed in a direction deviating from the front direction.
The following is a detailed description of preferred embodiments of the present technology, with reference to the accompanying drawings. Note that, in this specification and the drawings, components having substantially the same functional configurations are denoted by the same reference numerals, and repeated explanation of them will not be made.
Further, explanation will be made in the following order.
1. First embodiment
2. Second embodiment
3. Third embodiment
4. Modifications
5. Other aspects
Referring to
<Example Configuration of an In-Vehicle System 11>
The in-vehicle system 11 is a system that is provided in a vehicle, and performs control and the like on the vehicle.
The in-vehicle system 11 includes a front camera module 21, a communication unit 22, an automatic driving electronic control unit (ECU) 23, an advanced driver assistance system (ADAS) ECU 24, a steering mechanism 25, a headlight 26, a braking device 27, an engine 28, and a motor 29. The front camera module 21, the communication unit 22, the automatic driving ECU 23, the ADAS ECU 24, the steering mechanism 25, the headlight 26, the braking device 27, the engine 28, and the motor 29 are connected to one another via a bus B1 designed for controller area network (CAN) communication.
Note that, in the description below, the bus B1 will not be mentioned in a case where each component of the in-vehicle system 11 performs data transmission/reception or the like via the bus B1, for ease of explanation. For example, a case where the ADAS ECU 24 supplies data to the steering mechanism 25 via the bus B1 will be described simply as a case where the ADAS ECU 24 supplies data to the steering mechanism 25.
The front camera module 21 is installed on the vehicle's interior side of the windshield of the vehicle, performs processing such as imaging and image recognition of the view in front of the vehicle, and supplies data indicating a result of the processing to each component of the in-vehicle system 11, as will be described later. The front camera module 21 includes an imaging unit 41, a front camera ECU 42, and a micro control unit (MCU) 43.
The imaging unit 41 includes a lens-less camera (LLC) that uses neither an imaging lens nor a pinhole, as will be described later. The imaging unit 41 restores, from a detection image obtained by imaging, a restored image in which an image of an object is formed, and supplies the restored image as a sensing image obtained by sensing the view in front of the vehicle, to the front camera ECU 42, as will be described later.
The front camera ECU 42 performs an image quality adjustment process such as gain adjustment, white balance adjustment, a high dynamic range (HDR) process, and a traffic-signal flicker correction process on the sensing image supplied from the imaging unit 41, and then performs image recognition on the sensing image, for example. Note that the image quality adjustment process is not necessarily performed by the front camera ECU 42, but may be performed inside the imaging unit 41.
During the image recognition, objects such as a pedestrian, a light vehicle such as a bicycle, a vehicle, a headlight, a brake lamp, a sidewalk, a guardrail, a traffic light, a road marking such as a lane marking, and a road sign are detected, and the time to a collision with a vehicle running ahead and the like are detected, for example. The front camera ECU 42 generates a signal indicating the result of the detection by the image recognition, and supplies the signal to the automatic driving ECU 23 via the MCU 43.
The front camera ECU 42 also generates a control signal for assisting various kinds of driving on the basis of the result of the detection by the image recognition performed on the sensing image, and supplies the control signal to the ADAS ECU 24 via the MCU 43. For example, the front camera ECU 42 generates a control signal for issuing an instruction for a change in the traveling direction, deceleration, sudden braking, warning notification, or the like to avoid danger such as a collision with an object or deviation from a traveling lane (driving lane), on the basis of the result of detection of a lane marking, a curbstone, a pedestrian, or the like on a road obtained by the image recognition. The front camera ECU 42 then supplies the control signal to the ADAS ECU 24 via the MCU 43. The front camera ECU 42 also generates a control signal for issuing an instruction for switching between a low beam and a high beam or the like on the basis of the presence/absence of the headlight of an oncoming vehicle obtained by the image recognition, for example, and supplies the control signal to the ADAS ECU 24 via the MCU 43.
The MCU 43 converts the signal supplied from the front camera ECU 42 into a signal in a format for CAN communication, and outputs the signal to the bus B1. The MCU 43 also converts a signal received from the bus B1 into a signal in a format for the front camera ECU 42, and supplies the signal to the front camera ECU 42.
The communication unit 22 transmits/receives information to and from a surrounding vehicle, a portable terminal device being carried by a pedestrian, a roadside device, and an external server by various kinds of wireless communication such as vehicle-to-vehicle communication, vehicle-to-pedestrian communication, and road-to-vehicle communication. For example, the communication unit 22 performs vehicle-to-vehicle communication with a surrounding vehicle, receives surrounding vehicle information including information indicating the number of occupants and a traveling status from the surrounding vehicle, and supplies the surrounding vehicle information to the automatic driving ECU 23.
The automatic driving ECU 23 is an ECU to be used for executing an automatic driving (self-driving) function of the vehicle. For example, the automatic driving ECU 23 controls automatic driving of the vehicle on the basis of various kinds of information such as a result of object detection performed by the front camera ECU 42, positional information about the vehicle, surrounding vehicle information supplied from the communication unit 22, sensor data from various sensors provided in the vehicle, a result of detection of a vehicle speed, and the like. For example, the automatic driving ECU 23 controls the steering mechanism 25, the headlight 26, the braking device 27, the engine 28, the motor 29, and the like, to perform driving control such as changing the traveling direction, braking, accelerating, and starting, warning notification control, beam switching control, and the like.
The ADAS ECU 24 is an ECU to be used for executing an advanced driver assistance system (ADAS) function of the vehicle. The ADAS ECU 24 controls the steering mechanism 25, the headlight 26, the braking device 27, the engine 28, the motor 29, and the like on the basis of a control signal from the front camera ECU 42, for example, to control various kinds of driving assistance.
The steering mechanism 25 operates in accordance with an operation of the steering wheel by the driver or a control signal supplied from the automatic driving ECU 23 or the ADAS ECU 24, and performs control on the traveling direction of the vehicle, that is, steering angle control.
The headlight 26 operates in accordance with a control signal supplied from the automatic driving ECU 23 or the ADAS ECU 24, and illuminates the area in front of the vehicle by emitting a beam.
The braking device 27 operates in accordance with a brake operation by the driver or a control signal supplied from the automatic driving ECU 23 or the ADAS ECU 24, and stops or decelerates the vehicle.
The engine 28 is a power source of the vehicle, and is driven in accordance with a control signal supplied from the automatic driving ECU 23 or the ADAS ECU 24.
The motor 29 is a power source of the vehicle, receives power supply from a generator or a battery (not shown), and is driven in accordance with a control signal from the automatic driving ECU 23 or the ADAS ECU 24.
Note that driving the engine 28 and driving the motor 29 are switched by the automatic driving ECU 23, as appropriate, during running of the vehicle.
<Example Configuration of the Imaging Unit 41>
The imaging unit 41 includes an imaging device 121, a restoration unit 122, a control unit 123, a storage unit 124, and a communication unit 125. Further, the restoration unit 122, the control unit 123, the storage unit 124, and the communication unit 125 constitute a signal processing control unit 111 that performs signal processing, control on the imaging unit 41, and the like. Note that the imaging unit 41 does not include any imaging lens.
Further, the imaging device 121, the restoration unit 122, the control unit 123, the storage unit 124, and the communication unit 125 are connected to one another via a bus B2, and transmit/receive data and the like via the bus B2. Note that, in the description below, the bus B2 will not be mentioned in a case where each component of the imaging unit 41 performs data transmission/reception or the like via the bus B2, for ease of explanation. For example, a case where the communication unit 125 supplies data to the control unit 123 via the bus B2 will be described simply as a case where the communication unit 125 supplies data to the control unit 123.
The imaging device 121 is an imaging device in which the detection sensitivity of each pixel has an incident angle directivity, and outputs an image including a detection signal indicating a detection signal level corresponding to the amount of incident light, to the restoration unit 122 or the bus B2. The detection sensitivity of each pixel having an incident angle directivity means that the light-receiving sensitivity characteristics corresponding to the incident angle of incident light entering each pixel vary with each pixel. However, the light-receiving sensitivity characteristics of all the pixels are not necessarily completely different, and the light-receiving sensitivity characteristics of some pixels may be the same.
More specifically, the imaging device 121 may have a basic structure similar to that of a general imaging device such as a complementary metal oxide semiconductor (CMOS) image sensor, for example. However, the configuration of each of the pixels constituting the pixel array unit of the imaging device 121 differs from that of a general imaging device, and is a configuration that has an incident angle directivity, as will be described later with reference to
Here, any object can be regarded as a set of point light sources, and light is emitted from each point light source in all directions. For example, an object surface 102 of an object in the top left of
In this case, as shown in the top left of
On the other hand, since the incident angle directivities of the pixels Pa to Pc differ from one another, light beams of the same light intensity emitted from the same point light source are detected with different sensitivities in the respective pixels. As a result, light beams of the same light intensity are detected at different detection signal levels in the respective pixels. For example, the detection signal levels with respect to the light beams of the light intensity a from the point light source PA have different values in the respective pixels Pa to Pc.
Further, the detection signal level of each pixel with respect to a light beam from each point light source is determined by multiplying the light intensity of the light beam by a coefficient indicating the light-receiving sensitivity (which is the incident angle directivity) with respect to the incident angle of the light beam. For example, the detection signal level of the pixel Pa with respect to the light beam from the point light source PA is determined by multiplying the light intensity a of the light beam of the point light source PA by a coefficient indicating the incident angle directivity of the pixel Pa with respect to the incident angle of the light beam entering the pixel Pa.
Accordingly, the detection signal levels DA, DB, and DC of the pixels Pa, Pb, and Pc are expressed by Equations (1) to (3) shown below, respectively.
DA=α1×a+β1×b+γ1×c (1)
DB=α2×a+β2×b+γ2×c (2)
DC=α3×a+β3×b+γ3×c (3)
Here, the coefficient α1 is a coefficient indicating the incident angle directivity of the pixel Pa with respect to the incident angle of the light beam from the point light source PA to the pixel Pa, and is set in accordance with the incident angle. Further, α1×a indicates the detection signal level of the pixel Pa with respect to the light beam from the point light source PA.
The coefficient β1 is a coefficient indicating the incident angle directivity of the pixel Pa with respect to the incident angle of the light beam from the point light source PB to the pixel Pa, and is set in accordance with the incident angle. Further, β1×b indicates the detection signal level of the pixel Pa with respect to the light beam from the point light source PB.
The coefficient γ1 is a coefficient indicating the incident angle directivity of the pixel Pa with respect to the incident angle of the light beam from the point light source PC to the pixel Pa, and is set in accordance with the incident angle. Further, γ1×c indicates the detection signal level of the pixel Pa with respect to the light beam from the point light source PC.
As described above, the detection signal level DA of the pixel Pa is determined by the sum of products of the respective light intensities a, b, and c of the light beams from the point light sources PA, PB, and PC in the pixel Pa, and the coefficients α1, β1, and γ1 indicating the incident angle directivities depending on the respective incident angles, as shown in Equation (1).
Likewise, the detection signal level DB of the pixel Pb is determined by the sum of products of the respective light intensities a, b, and c of the light beams from the point light sources PA, PB, and PC in the pixel Pb, and the coefficients α2, β2, and γ2 indicating the incident angle directivities depending on the respective incident angles, as shown in Equation (2). Also, the detection signal level DC of the pixel Pc is determined by the sum of products of the respective light intensities a, b, and c of the light beams from the point light sources PA, PB, and PC in the pixel Pc, and the coefficients α3, β3, and γ3 indicating the incident angle directivities depending on the respective incident angles, as shown in Equation (3).
However, as shown in Equations (1) to (3), the detection signal levels DA, DB, and DC of the pixels Pa, Pb, and Pc are values in which the light intensities a, b, and c of the light beams emitted from the point light sources PA, PB, and PC are mixed. Therefore, as shown in the top right of
Meanwhile, the light intensities a to c of the light beams of the respective point light sources PA to PC are determined by creating simultaneous equations formed with Equations (1) to (3) and solving the created simultaneous equations. The pixels having the pixel values corresponding to the obtained light intensities a to c are then arranged in accordance with the layout (relative positions) of the point light sources PA to PC, so that a restored image in which an image of the object surface 102 is formed is restored as shown in the bottom right of
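For instance, with the three pixels and three point light sources above, the restoration amounts to solving a 3×3 linear system. Below is a minimal sketch in Python with NumPy under that setup; the coefficient values and detection signal levels are hypothetical, chosen only to illustrate Equations (1) to (3).

```python
import numpy as np

# Coefficient set group for one object distance: row i holds the coefficient
# set (alpha_i, beta_i, gamma_i) of Equation (i). Values are hypothetical.
A = np.array([[0.9, 0.4, 0.1],   # alpha1, beta1, gamma1
              [0.4, 0.9, 0.4],   # alpha2, beta2, gamma2
              [0.1, 0.4, 0.9]])  # alpha3, beta3, gamma3

# Detection signal levels DA, DB, and DC read from the pixels Pa, Pb, and Pc.
D = np.array([1.2, 1.5, 1.1])

# Solving the simultaneous equations recovers the light intensities a, b, c
# of the point light sources PA, PB, and PC, i.e. the restored pixel values.
a, b, c = np.linalg.solve(A, D)
print(a, b, c)
```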
In this manner, the imaging device 121, which has an incident angle directivity in each pixel without requiring an imaging lens or a pinhole, can be obtained.
In the description below, a set of coefficients (the coefficients α1, β1, and γ1, for example) for each of the equations forming the simultaneous equations will be referred to as a coefficient set. In the description below, a group formed with a plurality of coefficient sets (the coefficient set of α1, β1, and γ1, the coefficient set of α2, β2, and γ2, the coefficient set of α3, β3, and γ3, for example) corresponding to a plurality of equations included in the simultaneous equations will be referred to as a coefficient set group.
Here, if the object distance from the object surface 102 to the light receiving surface of the imaging device 121 varies, the incident angles of the light beams from the respective point light sources on the object surface 102 to the imaging device 121 vary, and therefore, a different coefficient set group is required for each object distance. Therefore, in the imaging unit 41, coefficient set groups for the respective distances (object distances) from the imaging device 121 to the object surface are prepared in advance, simultaneous equations are created by switching the coefficient set groups for each object distance, and the created simultaneous equations are solved. Thus, restored images of the object surface at various object distances can be obtained on the basis of one detection image. For example, after a detection image is captured and recorded once, the coefficient set groups are switched in accordance with the distance to the object surface, and a restored image is restored, so that a restored image of the object surface at a desired object distance can be generated.
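As a rough illustration of this switching, the sketch below keys coefficient set groups by object distance so that one recorded detection image can be restored at several distances. The distances, the pixel count N, and the synthetic coefficients are assumptions made for the example; in practice the coefficient set groups would come from calibration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64  # number of pixels = number of point light sources (assumed)

# Coefficient set groups prepared in advance, keyed by object distance in
# meters (synthetic stand-ins here).
coefficient_set_groups = {d: rng.uniform(0.0, 1.0, (N, N)) for d in (1.0, 5.0, 20.0)}

def restore(detection_signals, object_distance):
    """Switch to the coefficient set group for the given object distance
    and solve the simultaneous equations for the light intensities."""
    A = coefficient_set_groups[object_distance]
    return np.linalg.solve(A, detection_signals)

# One detection image, recorded once, restored at three object distances.
detection = rng.uniform(0.0, 1.0, N)
restored_at = {d: restore(detection, d) for d in coefficient_set_groups}
```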
Further, even on the object surface 102 at the same object distance, if the number and the layout of the point light sources to be set vary, the incident angles of the light beams from the respective point light sources to the imaging device 121 also vary. Therefore, a plurality of coefficient set groups might be required for the object surface 102 at the same object distance in some cases. Furthermore, the incident angle directivity of each pixel 121a needs to be set so that the independence of the simultaneous equations described above can be ensured.
Further, an image to be output by the imaging device 121 is an image formed with detection signals in which an image of the object is not formed as shown in the top right of
In view of this, an image formed with detection signals in which an image of the object is not formed as shown in the top right of
Note that not all the pixels need to have different incident angle directivities from one another, but some pixels may have the same incident angle directivity.
The restoration unit 122 acquires, from the storage unit 124, a coefficient set group that corresponds to the object distance, which is the distance from the imaging device 121 to the object surface 102 (the object surface corresponding to the restored image) in
The image restored from the detection image will be referred to as a restored image. However, in a case where the imaging device 121 has sensitivity only to light outside the visible wavelength band, such as ultraviolet rays, the restored image is not an image from which the object can be visually recognized as in a normal image, but it is still referred to as a restored image in this case.
Further, a restored image that is an image in which an image of the object is formed but that has not yet been subjected to color separation such as demosaicing or a synchronization process will be hereinafter referred to as a RAW image. Meanwhile, a detection image captured by the imaging device 121 is an image compliant with the array of the color filters, but is distinguished from a RAW image.
Note that the number of pixels of the imaging device 121 and the number of pixels constituting the restored image are not necessarily the same.
Further, the restoration unit 122 performs demosaicing, γ correction, white balance adjustment, conversion into a predetermined compression format, and the like, on the restored image as necessary. The restoration unit 122 then outputs the restored image to the bus B2.
The control unit 123 includes various processors, for example, to control each component of the imaging unit 41 and perform various kinds of processing.
The storage unit 124 includes one or more storage devices such as a read only memory (ROM), a random access memory (RAM), and a flash memory, and stores programs, data, and the like to be used in processes by the imaging unit 41, for example. The storage unit 124 associates coefficient set groups corresponding to the above coefficients α1 to α3, β1 to β3, and γ1 to γ3 with various object distances, and stores the coefficient set groups, for example. More specifically, the storage unit 124 stores, for each object surface 102 at each object distance, a coefficient set group including coefficients for the respective pixels 121a of the imaging device 121 with respect to the respective point light sources set on the object surface 102, for example.
The communication unit 125 communicates with the front camera ECU 42 by a predetermined communication method.
Next, a first example configuration of the imaging device 121 of the imaging unit 41 shown in
In the imaging device 121 shown in
For example, in a pixel 121a-1 and a pixel 121a-2, the ranges in which the light receiving regions of the photodiodes are shielded from light by a light shielding film 121b-1 and a light shielding film 121b-2 are different (at least the light shielding regions (positions) or the light shielding areas are different). Specifically, in the pixel 121a-1, the light shielding film 121b-1 is provided so as to shield part of the left-side portion of the light receiving region of the photodiode from light by a predetermined width. On the other hand, in the pixel 121a-2, the light shielding film 121b-2 is provided so as to shield part of the right-side portion of the light receiving region from light by a predetermined width. Note that the width by which the light shielding film 121b-1 shields the light receiving region of the photodiode from light and the width by which the light shielding film 121b-2 shields the light receiving region of the photodiode from light may be different or may be the same. Likewise, in the other pixels 121a, the light shielding films 121b are randomly disposed in the pixel array unit so as to shield a different region in the light receiving region from light for each pixel.
The top portion of
In the imaging device 121 in the top portion of
Note that, in the description below, in a case where there is no need to distinguish the pixels 121a-1 and 121a-2 from each other, the number at the end of each reference numeral will be omitted, and the pixels will be simply referred to as the pixels 121a. In the description below, the numbers and letters at the end of reference numerals may also be omitted for other components in this specification.
Further,
The pixels 121a-1 and 121a-2 further include photodiodes 121e-1 and 121e-2, respectively, as photoelectric conversion elements in the photoelectric conversion layer Z11. Furthermore, on the photodiodes 121e-1 and 121e-2, on-chip lenses 121c-1 and 121c-2, and color filters 121d-1 and 121d-2 are stacked in this order from the top.
The on-chip lenses 121c-1 and 121c-2 condense incident light onto the photodiodes 121e-1 and 121e-2.
The color filters 121d-1 and 121d-2 are optical filters that transmit light of a specific wavelength such as red, green, blue, infrared, or white, for example. Note that, in the case of white, the color filters 121d-1 and 121d-2 may be transparent filters, or may not be provided.
In the photoelectric conversion layer Z11 of the pixels 121a-1 and 121a-2, light shielding films 121g-1 to 121g-3 are formed at boundaries between the respective pixels, and prevent incident light L from entering the adjacent pixels and causing crosstalk, as shown in
Further, as shown in the top and the middle portions of
Note that, as shown in the top portion of
Further, as shown in the bottom portion of
The anode electrode of the photodiode 161 is grounded, and the cathode electrode of the photodiode 161 is connected to the gate electrode of the amplification transistor 165 via the transfer transistor 162.
The transfer transistor 162 is driven in accordance with a transfer signal TG. For example, when the transfer signal TG supplied to the gate electrode of the transfer transistor 162 switches to the high level, the transfer transistor 162 is turned on. As a result, the electric charge accumulated in the photodiode 161 is transferred to the FD unit 163 via the transfer transistor 162.
The FD unit 163 is a floating diffusion region that has a charge capacity C1 and is provided between the transfer transistor 162 and the amplification transistor 165, and temporarily accumulates the electric charge transferred from the photodiode 161 via the transfer transistor 162. The FD unit 163 is a charge detection unit that converts electric charge into voltage, and the electric charge accumulated in the FD unit 163 is converted into voltage at the amplification transistor 165.
The select transistor 164 is driven in accordance with a select signal SEL. When the select signal SEL supplied to the gate electrode of the select transistor 164 is switched to the high level, the select transistor 164 is turned on, to connect the amplification transistor 165 and the vertical signal line 167.
The amplification transistor 165 serves as the input unit for a source follower that is a readout circuit that reads out a signal obtained through photoelectric conversion performed at the photodiode 161, and outputs a detection signal (pixel signal) at the level corresponding to the electric charge accumulated in the FD unit 163, to the vertical signal line 167. That is, the amplification transistor 165 has its drain terminal connected to a power supply VDD, and its source terminal connected to the vertical signal line 167 via the select transistor 164, to form a source follower together with the current source 168 connected to one end of the vertical signal line 167. The value (output pixel value) of the detection signal is modulated in accordance with the incident angle of incident light from the object, and has characteristics (directivity) that vary with the incident angle (or has an incident angle directivity).
The reset transistor 166 is driven in accordance with a reset signal RST. For example, when the reset signal RST supplied to the gate electrode of the reset transistor 166 is switched to the high level, the electric charge accumulated in the FD unit 163 is released to the power supply VDD, so that the FD unit 163 is reset.
Note that the shape of the light shielding film 121b of each pixel 121a is not limited to the example shown in
The configuration of the imaging device 121 in
In the imaging device 121 in
Further, in the imaging device 121 in
The circuit configuration shown in the bottom portion of
With such a configuration, the electric charges accumulated in the photodiodes 121f-1 to 121f-4 are transferred to the common FD unit 163 that has a predetermined capacity and is provided in the connecting portion between the photodiodes 121f-1 to 121f-4 and the gate electrode of the amplification transistor 165. A signal corresponding to the level of the electric charge retained in the FD unit 163 is then read as a detection signal (pixel signal).
Accordingly, the electric charges accumulated in the photodiodes 121f-1 to 121f-4 can be made to selectively contribute to the output of the pixel 121a, which is the detection signal, in various combinations. That is, electric charges can be read independently from each of the photodiodes 121f-1 to 121f-4, and the photodiodes 121f-1 to 121f-4 that contribute to the output (or the degrees of contribution of the photodiodes 121f-1 to 121f-4 to the output) can be made to differ from one another. Thus, different incident angle directivities can be obtained.
For example, the electric charges in the photodiode 121f-1 and the photodiode 121f-3 are transferred to the FD unit 163, and the signals obtained by reading the respective electric charges are added, so that an incident angle directivity in the horizontal direction can be obtained. Likewise, the electric charges in the photodiode 121f-1 and the photodiode 121f-2 are transferred to the FD unit 163, and the signals obtained by reading the respective electric charges are added, so that an incident angle directivity in the vertical direction can be obtained.
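A toy numerical illustration of these combinations follows, assuming the four photodiodes are laid out two by two with 121f-1 and 121f-3 in the left column and 121f-1 and 121f-2 in the top row; the charge values are hypothetical.

```python
import numpy as np

# Charges of the four divided photodiodes of one pixel, laid out as
# [[121f-1, 121f-2], [121f-3, 121f-4]] (assumed layout; hypothetical values).
charges = np.array([[0.8, 0.3],
                    [0.7, 0.2]])

# Adding the left-column charges (121f-1 + 121f-3) yields an output whose
# level depends on the horizontal incident angle of the light.
horizontal_directivity_output = charges[0, 0] + charges[1, 0]

# Adding the top-row charges (121f-1 + 121f-2) instead yields an output
# whose level depends on the vertical incident angle.
vertical_directivity_output = charges[0, 0] + charges[0, 1]
```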
Further, a signal obtained on the basis of the electric charges selectively read out independently from the four photodiodes 121f-1 to 121f-4 is a detection signal corresponding to one pixel of a detection image.
Note that the contribution of (the electric charge in) each photodiode 121f to a detection signal depends not only on whether or not the electric charge (detection value) in each photodiode 121f is transferred to the FD unit 163, but also on whether the electric charges accumulated in the photodiodes 121f are reset before the transfer to the FD unit 163 with an electronic shutter function or the like, for example. For example, if the electric charge in a photodiode 121f is reset immediately before the transfer to the FD unit 163, the photodiode 121f does not contribute to the detection signal at all. On the other hand, if time is allowed between the resetting of the electric charge in a photodiode 121f and the transfer of the electric charge to the FD unit 163, the photodiode 121f partially contributes to the detection signal.
As described above, in the case of the imaging device 121 in
Note that, in the imaging device 121 in
Further,
For example, a photodiode is not necessarily divided into equal portions, and the dividing positions of the photodiode may vary with each pixel. Therefore, even if the photodiodes 121f at the same position among a plurality of pixels are made to contribute to outputs, for example, the incident angle directivity varies among the pixels. Also, if the number of divisions is made to vary among the pixels, for example, incident angle directivities can be set more freely. Further, both the number of divisions and the dividing positions may be made to vary among the pixels, for example.
Furthermore, both the imaging device 121 in
Note that, as for the imaging device 121 in
<Basic Characteristics and the Like of the Imaging Device 121>
Next, the basic characteristics and the like of the imaging device 121 are described with reference to
<Principles of Generating an Incident Angle Directivity>
The incident angle directivity of each pixel of the imaging device 121 is generated by the principles illustrated in
Each of the pixels in the top left portion and the top right portion of
In the pixel shown in the top left portion of
For example, in the pixel shown in the top left portion of
Meanwhile, in the pixel shown in the top right portion of
Further, in the pixel shown in the bottom left portion of
Specifically, in the pixel shown in the bottom left portion of
Likewise, in a case where two photodiodes 121f-13 and 121f-14 are included as in the pixel shown in the bottom right portion of
Note that, in each pixel shown in the top portions of
<Incident Angle Directivities in Configurations Including On-Chip Lenses>
Next, incident angle directivities in configurations including on-chip lenses 121c are described with reference to
The graph in the top portion of
Meanwhile, the pixel shown in the middle left portion of
Likewise, the pixel shown in the middle right portion of
In the pixel shown in the middle left portion of
Also, in the pixel shown in the middle right portion of
The solid-line and dashed-line waveforms shown in the top portion of
As described above, an incident angle directivity is the characteristics of the light-receiving sensitivity of each pixel depending on the incident angle θ, but it can also be said that this is the characteristics of the light-shielding level depending on the incident angle θ in each pixel in the middle portions of
Further, in the pixel shown in the bottom left portion of
Further, likewise, in the pixel shown in the bottom right portion of
Here, the centroid of the incident angle directivity of a pixel 121a is defined as follows.
The centroid of the incident angle directivity is the centroid of the distribution of the intensity of incident light that enters the light receiving surface of the pixel 121a. The light receiving surface of the pixel 121a is the light receiving surface of the photodiode 121e in each pixel 121a shown in the middle portions of
For example, the detection signal level on the ordinate axis of the graph shown in the top portion of
θg=Σ(a(θ)×θ)/Σa(θ) (4)
Further, the point at which the centroidal light beam intersects the light receiving surface of the pixel 121a is the centroid of the incident angle directivity of the pixel 121a.
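The centroid in Equation (4) can be evaluated numerically as in the sketch below, where the directivity curve a(θ) is an assumed Gaussian peaking at +8 degrees; any sampled directivity would do.

```python
import numpy as np

# Incident angles theta (degrees) and the detection signal levels a(theta)
# of one pixel at each angle; a Gaussian peaking at +8 degrees is assumed.
theta = np.linspace(-30.0, 30.0, 61)
a = np.exp(-0.5 * ((theta - 8.0) / 10.0) ** 2)

# Equation (4): the incident angle of the centroidal light beam.
theta_g = np.sum(a * theta) / np.sum(a)
print(theta_g)  # close to +8 degrees, i.e. off the pixel normal
```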
Also, as in the pixels shown in the bottom portions of
Note that, in the description below, an example case where pixels 121a that achieve incident angle directivities using the light shielding films 121b like the pixel 121a shown in
<Relationship Between Light-Shielded Region and Angle of View>
Next, the relationship between the light-shielded regions and the angles of view of pixels 121a is described with reference to
For example, a pixel 121a shielded from light by the light shielding film 121b by a width d1 from each edge of the four sides as shown in the top portion of
For example, in a case where the pixel 121a shown in
On the other hand, in a case where the pixel 121a′ in
That is, the pixel 121a having a narrow light-shielded region is a wide angle-of-view pixel suitable for imaging a wide region on the object surface 102, while the pixel 121a′ having a wide light-shielded region is a narrow angle-of-view pixel suitable for imaging a narrow region on the object surface 102. Note that the wide angle-of-view pixel and the narrow angle-of-view pixel mentioned herein are expressions for comparing both the pixels 121a and 121a′ shown in
Therefore, the pixel 121a is used to restore an image I1 shown in
Meanwhile, as shown in the bottom portion of
Note that the angle of view SQ2 is smaller than the angle of view SQ1. Therefore, in a case where an image of the angle of view SQ2 and an image of the angle of view SQ1 are to be restored with the same number of pixels, it is possible to obtain a restored image with higher image quality by restoring the image of the angle of view SQ2 than by restoring the image of the angle of view SQ1.
That is, in a case where restored images are to be obtained with the same number of pixels, a restored image with higher image quality can be obtained by restoring an image with a smaller angle of view.
For example, the right portion of
In
The principal light-shielded portion Z101 in the left portion of
Here, the openings Z111 of the respective pixels 121a are regularly arranged. Specifically, the position of the opening Z111 in the horizontal direction in each pixel 121a is the same among the pixels 121a in the same column in the vertical direction. Also, the position of the opening Z111 in the vertical direction in each pixel 121a is the same among the pixels 121a in the same row in the horizontal direction.
On the other hand, the position of the opening Z111 in each pixel 121a in the horizontal direction is shifted by a predetermined distance in accordance with the position of the pixel 121a in the horizontal direction. That is, as the position of the pixel 121a becomes closer to the right, the left side of the opening Z111 moves to a position shifted to the right by a width dx1, dx2, . . . , and dxn from the left side of the pixel 121a. The difference between the width dx1 and the width dx2, the difference between the width dx2 and the width dx3, . . . , and the difference between the width dxn−1 and the width dxn each have the value obtained by dividing the length obtained by subtracting the width of the opening Z111 from the width of the region Z102 in the horizontal direction by the number n−1 of pixels in the horizontal direction.
Also, the position of the opening Z111 in each pixel 121a in the vertical direction is shifted by a predetermined distance in accordance with the position of the pixel 121a in the vertical direction. That is, as the position of the pixel 121a becomes closer to the bottom, the top side of the opening Z111 moves to a position shifted to the bottom by a height dy1, dy2, . . . , and dym from the top side of the pixel 121a. The difference between the height dy1 and the height dy2, the difference between the height dy2 and the height dy3, . . . , and the difference between the height dym−1 and the height dym each have the value obtained by dividing the length obtained by subtracting the height of the opening Z111 from the height of the region Z102 in the vertical direction by the number m−1 of pixels in the vertical direction.
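Under this rule, the opening positions follow directly from the pixel coordinates, as the sketch below shows; the dimensions of the region Z102, the opening Z111, and the array are hypothetical.

```python
import numpy as np

# Hypothetical dimensions in arbitrary units.
region_w, region_h = 8.0, 8.0  # width and height of the region Z102
open_w, open_h = 2.0, 2.0      # width and height of the opening Z111
n, m = 5, 5                    # pixels per row and per column

# The step between consecutive widths dx_i (and heights dy_j) is
# (region size - opening size) / (number of pixels - 1).
dx = np.linspace(0.0, region_w - open_w, n)  # left edge per column, from the left of Z102
dy = np.linspace(0.0, region_h - open_h, m)  # top edge per row, from the top of Z102

def opening_position(column, row):
    """Left and top edges of the opening Z111 of the pixel at (column, row)."""
    return dx[column], dy[row]
```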
The right portion of
In
The principal light-shielded portion Z151 in the left portion of
Here, the openings Z161 of the respective pixels 121a′ are regularly arranged, like the openings Z111 of the respective pixels 121a shown in
On the other hand, the position of the opening Z161 in each pixel 121a′ in the horizontal direction is shifted by a predetermined distance in accordance with the position of the pixel 121a′ in the horizontal direction. That is, as the position of the pixel 121a′ becomes closer to the right, the left side of the opening Z161 moves to a position shifted to the right by a width dx1′, dx2′, . . . , and dxn′ from the left side of the pixel 121a′. The difference between the width dx1′ and the width dx2′, the difference between the width dx2′ and the width dx3′, . . . , and the difference between the width dxn−1′ and the width dxn′ each have the value obtained by dividing the length obtained by subtracting the width of the opening Z161 from the width of the region Z152 in the horizontal direction by the number n−1 of pixels in the horizontal direction.
Also, the position of the opening Z161 in each pixel 121a′ in the vertical direction is shifted by a predetermined distance in accordance with the position of the pixel 121a′ in the vertical direction. That is, as the position of the pixel 121a′ becomes closer to the bottom, the top side of the opening Z161 moves to a position shifted to the bottom by a height dy1′, dy2′, . . . , and dym′ from the top side of the pixel 121a′. The difference between the height dy1′ and the height dy2′, the difference between the height dy2′ and the height dy3′, . . . , and the difference between the height dym−1′ and the height dym′ each have the value obtained by dividing the length obtained by subtracting the height of the opening Z161 from the height of the region Z152 in the vertical direction by the number m−1 of pixels in the vertical direction.
Here, the length obtained by subtracting the width of the opening Z111 from the width of the region Z102 in the horizontal direction in each pixel 121a shown in
Also, the length obtained by subtracting the height of the opening Z111 from the height of the region Z102 in the vertical direction in each pixel 121a shown in
As described above, the stepwise differences in the positions in the horizontal direction and the vertical direction of the opening Z111 of the light shielding film 121b of each pixel 121a shown in
As the combination of the light-shielded region of the principal light-shielded portion and the opening region of the opening is varied as above, it becomes possible to obtain the imaging device 121 including pixels having various angles of view (or having various incident angle directivities).
Note that, in the example described above, the pixels 121a and the pixels 121a′ are separately arranged in the region ZA and the region ZB. However, this is for ease of explanation, and pixels 121a corresponding to different angles of view are preferably disposed in the same region.
For example, as shown in
In this case, if the total number of pixels 121a is X, for example, it is possible to restore a restored image using a detection image of X/4 pixels for each of the four kinds of angles of view. At this stage, four kinds of coefficient set groups that vary with the respective angles of view are used, and restored images with different angles of view from one another are restored with four different sets of simultaneous equations.
Accordingly, restored images are restored with the use of detection images obtained from the pixels suitable for imaging with the angle of view of each restored image to be restored, so that appropriate restored images corresponding to the four kinds of angles of view can be obtained.
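The sketch below restores four images from one detection image by splitting the pixels into four interleaved angle-of-view kinds; the pixel count, the interleaving, and the synthetic coefficient set groups are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
X = 256                   # total number of pixels 121a (assumed)
kinds = np.arange(X) % 4  # four angle-of-view kinds, interleaved (assumed)

detection = rng.uniform(0.0, 1.0, X)  # one detection image

# One coefficient set group per angle of view, each sized for X/4 pixels.
groups = [rng.uniform(0.0, 1.0, (X // 4, X // 4)) for _ in range(4)]

# Four sets of simultaneous equations give four restored images with
# different angles of view from the same detection image.
restored = [np.linalg.solve(groups[k], detection[kinds == k]) for k in range(4)]
```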
Further, an image having an angle of view intermediate among the four angles of view, and images having angles of view around that intermediate angle of view, may be generated by interpolation from the images with the four angles of view, and pseudo optical zoom may be achieved by seamlessly generating images having various angles of view.
Note that, in a case where an image with a wide angle of view is to be obtained as a restored image, for example, all the wide angle-of-view pixels may be used, or some of the wide angle-of-view pixels may be used. Also, in a case where an image with a narrow angle of view is to be obtained as a restored image, for example, all the narrow angle-of-view pixels may be used, or some of the narrow angle-of-view pixels may be used.
<Example Installation of the Front Camera Module 21>
Next, example installation of the front camera module 21 is described with reference to
<Example Hardware Configuration of the Front Camera Module 21>
In the front camera module 21 shown in
The LLC chip 202 is a semiconductor chip including the imaging unit 41 shown in
The signal processing chip 203 is a semiconductor chip including the front camera ECU 42 and the MCU 43 shown in
As the LLC chip 202 and the signal processing chip 203 are disposed on the same substrate 201 as described above, a flexible substrate becomes unnecessary, and unnecessary radiation is reduced.
<Method for Attaching the Front Camera Module 21>
Next, an example method for attaching the front camera module 21 is described with reference to
The front camera module 21 is detachably attached with a bracket 222 so that the surface on which the LLC chip 202 is mounted extends along the surface on the vehicle interior side of the windshield 221. With this arrangement, the light receiving surface of the imaging device 121 provided on the surface of the LLC chip 202 faces and comes into contact with or close to the surface of the windshield 221 on the vehicle interior side, and becomes substantially parallel to the surface of the windshield 221 on the vehicle interior side.
Accordingly, the space between the light receiving surface of the imaging device 121 and the windshield 221 disappears, or becomes very narrow. As a result, reflection in the windshield 221 caused by reflected incident light, and dew condensation between the light receiving surface of the imaging device 121 and the windshield 221, are prevented.
The front camera module 21 is also connected to the bus B1 of the in-vehicle system 11 via a cable 223.
Note that the front camera module 21 is preferably installed at a position that does not block the field of view of an occupant such as the driver of the vehicle. Further, as shown in
<First Embodiment of the Pixel Array Unit of the Imaging Device 121>
Next, the first embodiment of the pixel array unit of the imaging device 121 is described with reference to
The opening Aa of the light shielding film Sa of each pixel Pa is set within a rectangular opening setting region Ra indicated by a dashed line. Accordingly, the region other than the opening setting region Ra of the light shielding film Sa of each pixel Pa serves as the principal light-shielded portion of the light shielding film Sa.
The size, the shape, and the position of the opening setting region Ra are common among the respective pixels Pa. The height of the opening setting region Ra in the vertical direction is smaller than ½ of the height of the pixel Pa, and the width thereof in the horizontal direction is slightly smaller than the width of the pixel Pa. Further, the opening setting region Ra is set at the center in the horizontal direction in the pixel Pa, and at a position closer to the top in the vertical direction. Accordingly, the centroid of the opening setting region Ra is biased upward from the center of the pixel Pa.
The shape and the size of the rectangular opening Aa are common among the respective pixels Pa. Also, the opening Aa is formed within the opening setting region Ra of each pixel Pa, in accordance with a rule similar to the rule described above with reference to
Specifically, the opening Aa is located at the left end of the opening setting region Ra in each pixel Pa in the left end column in the pixel array unit, and is located at the upper end of the opening setting region Ra in each pixel Pa in the upper end row in the pixel array unit. Further, as the position of the pixel Pa becomes closer to the right, the opening Aa shifts to the right at equal intervals within the opening setting region Ra, and is located at the right end of the opening setting region Ra in each pixel Pa in the right end column in the pixel array unit. Also, as the position of the pixel Pa becomes closer to the bottom, the opening Aa shifts to the bottom at equal intervals within the opening setting region Ra, and is located at the lower end of the opening setting region Ra in each pixel Pa in the lower end row in the pixel array unit.
Accordingly, the position of the opening Aa in the horizontal direction is the same in each pixel Pa in the same column in the vertical direction. Also, the position of the opening Aa in the vertical direction is the same in each pixel Pa in the same row in the horizontal direction. Accordingly, the position of the opening Aa in each pixel Pa, which is the position at which incident light enters each pixel Pa, varies with each pixel Pa, and, as a result, the incident angle directivities of the respective pixels Pa differ from one another.
Further, the openings Aa of the respective pixels Pa cover the opening setting region Ra. That is, the region obtained by superimposing the openings Aa of the respective pixels Pa on one another is equal to the opening setting region Ra. Note that the layout pattern of the openings Aa is not limited to the above configuration, and may be any layout, as long as the region obtained by superimposing the openings Aa on one another is equal to the opening setting region Ra. For example, the openings Aa may be randomly arranged within the opening setting region Ra.
Here, the centroid of the incident angle directivity of each pixel Pa substantially coincides with the centroid of the opening Aa of each pixel Pa, and is biased upward from the center of each pixel Pa. Accordingly, the average of the centroids of the incident angle directivities of the respective pixels Pa is biased upward from the centers of the pixels Pa. That is, the average of the incident angles of centroidal light beams in the respective pixels Pa is biased downward with respect to the normal direction of the light receiving surface of the pixel array unit.
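This bias can be checked numerically. The sketch below computes the vertical centroids of the openings Aa over the rows of the array and their average offset from the pixel center; the dimensions are hypothetical but consistent with the layout described above.

```python
import numpy as np

# Hypothetical vertical geometry of a pixel Pa, in arbitrary units,
# with y measured downward from the top of the pixel.
pixel_h = 10.0                   # pixel height; the pixel center is at 5.0
region_top, region_h = 1.0, 4.0  # opening setting region Ra near the top
open_h = 1.0                     # height of the opening Aa
m = 5                            # number of rows in the pixel array

# Top edge of the opening Aa per row: shifted downward at equal intervals
# within Ra, per the layout rule of this embodiment.
top = np.linspace(region_top, region_top + region_h - open_h, m)
centroids = top + open_h / 2.0

# A negative offset means the average centroid lies above the pixel center,
# so the average centroidal light beam is inclined downward from the normal.
avg_offset = centroids.mean() - pixel_h / 2.0
print(avg_offset)  # -2.0 for these numbers
```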
Accordingly, the view in front of the vehicle can be imaged with an appropriate angle of view, even though the LLC chip 202 is installed parallel to the windshield 221, and the light receiving surface of the pixel array unit of the imaging device 121 faces upward. More specifically, as shown in
Note that the position of the opening setting region Ra, that is, the offset of the centroid of the opening setting region Ra from the center of the pixel Pa, is set in accordance with the inclination of the windshield 221, the distance to the object to be imaged, and the like. Further, the shape and the size of the opening setting region Ra are set on the basis of the angle of view with which imaging is to be performed.
Furthermore, even if the LLC chip 202 (or the light receiving surface of the imaging device 121) does not face forward, the view in front of the vehicle can be imaged, and any imaging lens is unnecessary. Accordingly, as described above with reference to
<Imaging Process by the Imaging Unit 41>
Next, an imaging process to be performed by the imaging unit 41 shown in
In step S1, the imaging device 121 images an object. As a result, a detection signal indicating the detection signal level corresponding to the amount of incident light from the object is output from each pixel 121a (pixel Pa) of the imaging device 121 having different incident angle directivities, and the imaging device 121 supplies a detection image formed with the detection signals of the respective pixels 121a to the restoration unit 122.
In step S2, the restoration unit 122 obtains coefficients to be used for image restoration. Specifically, the restoration unit 122 sets the distance to the object surface 102 to be restored, which is the object distance. Note that any method can be adopted as the method for setting the object distance. For example, the restoration unit 122 sets an object distance set by a user, or an object distance detected by various sensors as the distance to the object surface 102 to be restored.
Next, the restoration unit 122 reads, from the storage unit 124, the coefficient set group associated with the set object distance.
In step S3, the restoration unit 122 restores an image, using the detection image and the coefficients. Specifically, the restoration unit 122 creates the simultaneous equations described with reference to Equations (1) to (3) shown above, using the detection signal level of each pixel in the detection image and the coefficient set group acquired through the process in step S2. Next, the restoration unit 122 solves the created simultaneous equations, to calculate the light intensity of each point light source on the object surface 102 corresponding to the set object distance. The restoration unit 122 then arranges the pixels having the pixel values corresponding to the calculated light intensities, in accordance with the layout of the respective point light sources on the object surface 102. By doing so, the restoration unit 122 generates a restored image in which an image of the object is formed.
In step S4, the imaging unit 41 performs various kinds of processing on the restored image. For example, the restoration unit 122 performs demosaicing, γ correction, white balance adjustment, conversion into a predetermined compression format, and the like, on the restored image as necessary. The restoration unit 122 also supplies the obtained restored image as a sensing image to the front camera ECU 42 via the communication unit 125.
After that, the imaging process comes to an end.
Next, a second embodiment of the present technology is described with reference to
The second embodiment differs from the first embodiment in the light shielding pattern in the pixel array unit of the imaging device 121.
The pixel Pb is disposed in an odd-numbered column in the pixel array unit, and the pixel Pc is disposed in an even-numbered column in the pixel array unit.
The position of the opening setting region is different between the pixel Pb and the pixel Pc. Specifically, the shapes and the sizes of the opening setting region Rb of the light shielding film Sb of the pixel Pb and the opening setting region Rc of the light shielding film Sc of the pixel Pc are the same as those of the opening setting region Ra of the light shielding film Sa of the pixel Pa in
Meanwhile, the opening setting region Rb is set at a position shifted upward in the pixel Pb, compared with the opening setting region Ra. Also, the opening setting region Rc is set at a position shifted downward in the pixel Pc, compared with the opening setting region Ra. However, the centroid of the opening setting region Rc is biased upward from the center of the pixel Pc, like the centroid of the opening setting region Ra. In this manner, the position in the vertical direction in the pixel differs between the opening setting region Rb and the opening setting region Rc.
Further, the opening Ab of the pixel Pb has the same shape and size as those of the opening Aa of the pixel Pa, and is located in the opening setting region Rb according to a rule similar to the rule described above with reference to
Specifically, the opening Ab is located at the left end of the opening setting region Rb in each pixel Pb in the left end column in the pixel array unit, and is located at the upper end of the opening setting region Rb in each pixel Pb in the upper end row in the pixel array unit. Further, as the position of the pixel Pb becomes closer to the right, the opening Ab shifts to the right at equal intervals within the opening setting region Rb, and is located at the right end of the opening setting region Rb in each pixel Pb in the second column from the right in the pixel array unit. Also, as the position of the pixel Pb becomes closer to the bottom, the opening Ab shifts to the bottom at equal intervals within the opening setting region Rb, and is located at the lower end of the opening setting region Rb in each pixel Pb in the lower end row in the pixel array unit.
Accordingly, the position of the opening Ab in the horizontal direction in each pixel Pb is the same among the pixels Pb in the same column in the vertical direction. Also, the position of the opening Ab in the vertical direction in each pixel Pb is the same among the pixels Pb in the same row in the horizontal direction. Accordingly, the position of the opening Ab in each pixel Pb, which is the position at which incident light enters each pixel Pb, varies with each pixel Pb, and, as a result, the incident angle directivities of the respective pixels Pb differ from one another.
Further, the openings Ab of the respective pixels Pb cover the opening setting region Rb. That is, the region obtained by superimposing the openings Ab of the respective pixels Pb on one another is equal to the opening setting region Rb. Note that the layout pattern of the openings Ab is not limited to the above configuration, and may be any layout, as long as the region obtained by superimposing the openings Ab on one another is equal to the opening setting region Rb. For example, the openings Ab may be randomly arranged within the opening setting region Rb.
Further, the opening Ac of the pixel Pc has the same shape and size as those of the opening Aa of the pixel Pa, and is located in the opening setting region Rc according to a rule similar to the rule described above with reference to
Specifically, the opening Ac is located at the left end of the opening setting region Rc in each pixel Pc in the second column from the left in the pixel array unit, and is located at the upper end of the opening setting region Rc in each pixel Pc in the upper end row in the pixel array unit. Further, as the position of the pixel Pc becomes closer to the right, the opening Ac shifts to the right at equal intervals within the opening setting region Rc, and is located at the right end of the opening setting region Rc in each pixel Pc in the right end column in the pixel array unit. Also, as the position of the pixel Pc becomes closer to the bottom, the opening Ac shifts to the bottom at equal intervals within the opening setting region Rc, and is located at the lower end of the opening setting region Rc in each pixel Pc in the lower end row in the pixel array unit.
Accordingly, the position of the opening Ac in the horizontal direction in each pixel Pc is the same among the pixels Pc in the same column in the vertical direction. Also, the position of the opening Ac in the vertical direction in each pixel Pc is the same among the pixels Pc in the same row in the horizontal direction. Accordingly, the position of the opening Ac in each pixel Pc, which is the position at which incident light enters each pixel Pc, varies with each pixel Pc, and, as a result, the incident angle directivities of the respective pixels Pc differ from one another.
Further, the openings Ac of the respective pixels Pc cover the opening setting region Rc. That is, the region in which the openings Ac of the respective pixels Pc are overlapped on one another is equal to the opening setting region Rc. Note that the layout pattern of the openings Ac is not limited to the above configuration, and may be any layout, as long as the region in which the openings Ac are overlapped on one another is equal to the opening setting region Rc. For example, the openings Ac may be randomly arranged within the opening setting region Rc.
Here, the centroid of the incident angle directivity of each pixel Pb substantially coincides with the centroid of the opening Ab of each pixel Pb, and is biased upward from the center of each pixel Pb. Accordingly, the average of the centroids of the incident angle directivities of the respective pixels Pb is biased upward from the centers of the pixels Pb. That is, the average of the incident angles of centroidal light beams in the respective pixels Pb is biased downward with respect to the normal direction of the light receiving surface of the pixel array unit.
Also, the centroid of the incident angle directivity of each pixel Pc substantially coincides with the centroid of the opening Ac of each pixel Pc and, in most of the pixels Pc, is biased upward from the center of the pixel Pc. Accordingly, the average of the centroids of the incident angle directivities of the respective pixels Pc is biased upward from the centers of the pixels Pc. That is, the average of the incident angles of centroidal light beams in the respective pixels Pc is biased downward with respect to the normal direction of the light receiving surface of the pixel array unit.
Meanwhile, the upward offset of the opening setting region Rb from the center of each pixel Pb is larger than the upward offset of the opening setting region Rc from the center of each pixel Pc. Therefore, the average of the incident angles of centroidal light beams in the respective pixels Pb is inclined more steeply downward than the average of the incident angles of the centroidal light beams in the respective pixels Pc.
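The averaging described in the preceding three paragraphs can be made concrete with a small numerical sketch. The helper names are hypothetical, and the conversion from a centroid offset to a beam tilt assumes a simple pinhole-like geometry with a fixed gap between the light shielding film and the photodiode, which is a simplifying assumption rather than anything specified here.

```python
import numpy as np

def average_centroid_bias(opening_centroids, pixel_center=(0.0, 0.0)):
    """Average, over the pixels of one type, of each opening centroid's
    offset from the pixel center. Since the centroid of a pixel's
    incident angle directivity substantially coincides with the
    centroid of its opening, a positive y component means the average
    directivity is biased upward within the pixel."""
    offsets = np.asarray(opening_centroids, float) - np.asarray(pixel_center, float)
    return offsets.mean(axis=0)

def mean_beam_tilt_deg(bias_y, film_gap):
    """Hypothetical geometry: an opening centroid sitting bias_y above
    the photodiode center, under a shielding film a distance film_gap
    above it, favors light arriving from roughly this angle below the
    normal of the light receiving surface."""
    return float(np.degrees(np.arctan2(bias_y, film_gap)))

# The pixels Pb have the larger upward bias, hence the steeper
# downward average beam, as stated above (toy centroid values).
bias_pb = average_centroid_bias([(0.0, 0.30), (0.1, 0.25), (-0.1, 0.35)])
bias_pc = average_centroid_bias([(0.0, 0.10), (0.1, 0.05), (-0.1, 0.15)])
assert mean_beam_tilt_deg(bias_pb[1], 1.0) > mean_beam_tilt_deg(bias_pc[1], 1.0)
```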
Accordingly, the pixels Pb and the pixels Pc of the imaging device 121 enable imaging of different fields of view in the vertical direction, as shown in
With this arrangement, it is possible to intensively detect lane markings and the like on the road surface in front of and below the vehicle, using a sensing image obtained by the pixels Pb of the imaging device 121, for example. Thus, detection accuracy can be increased. Also, it is possible to intensively detect vehicles, pedestrians, obstacles, and the like in front of the vehicle, using a sensing image obtained by the pixels Pc of the imaging device 121, for example. Thus, detection accuracy can be increased.
Note that a drive unit that independently drives each pixel Pb and each pixel Pc may be provided, for example, and imaging by each pixel Pb and imaging by each pixel Pc may be performed simultaneously or individually.
Further, in a case where imaging by the pixels Pb and imaging by the pixels Pc are performed simultaneously, the restoration of a restored image from either group of pixels may be stopped when that restoration is unnecessary, for example. In a case where only lane markings and the like on the road surface are detected, for example, only a restored image from the pixels Pb may be restored, and the restoration of a restored image from the pixels Pc may be stopped. Conversely, in a case where only traffic lights, road signs, and the like are detected, for example, only a restored image from the pixels Pc may be restored, and the restoration of a restored image from the pixels Pb may be stopped. As a result, the processing to be performed by the imaging unit 41 can be reduced.
Further, in a case where imaging by the pixels Pb and imaging by the pixels Pc are performed individually, for example, imaging may be performed alternately, or imaging by one of the pixel groups may be stopped as necessary. In a case where only lane markings and the like on the road surface are detected, for example, only the imaging by the pixels Pb may be performed, and the imaging by the pixels Pc may be stopped. As a result, the processing to be performed by the imaging unit 41 can be reduced.
Note that, in this case, coefficient set groups corresponding to the angles of view of restored images, in addition to object distances, are prepared in advance, for example, and a restored image is restored with the use of the coefficient set group corresponding to the object distance and the angle of view. For example, a coefficient set group for the pixels Pb and a coefficient set group for the pixels Pc are separately prepared, and each coefficient set group is selectively used. Thus, restored images corresponding to the respective pixels can be restored.
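A hedged sketch of this bookkeeping follows: coefficient set groups are keyed by pixel group, object distance, and angle of view, and restoration is written as a least-squares solve. The dictionary layout, names, and solver choice are illustrative assumptions; the document does not prescribe a particular restoration algorithm.

```python
import numpy as np

# (pixel_group, object_distance, angle_of_view) -> coefficient matrix,
# prepared in advance from the incident angle directivities.
coefficient_sets = {}

def restore(detection_signals, pixel_group, object_distance, angle_of_view):
    """Restore a restored image from the detection signals of one pixel
    group ('Pb' or 'Pc', say), selecting the coefficient set group
    prepared for that group, object distance, and angle of view."""
    coeffs = coefficient_sets[(pixel_group, object_distance, angle_of_view)]
    restored, *_ = np.linalg.lstsq(coeffs, detection_signals, rcond=None)
    return restored
```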
Next, a third embodiment of the present technology is described with reference to
The third embodiment differs from the first embodiment and the second embodiment in the light shielding pattern in the pixel array unit of the imaging device 121.
The pixel Pd is disposed in an odd-numbered column in the pixel array unit, and the pixel Pe is disposed in an even-numbered column in the pixel array unit.
The shape and the size of the opening setting region are different between the pixel Pd and the pixel Pe. Specifically, the shape, the size, and the position of the opening setting region Rd of the light shielding film Sd of the pixel Pd are the same as those of the opening setting region Ra of the light shielding film Sa of the pixel Pa in
Furthermore, the opening Ad of the pixel Pd has the same shape and size as those of the opening Aa of the pixel Pa, and is set in the opening setting region Rd according to a rule similar to the rule described above with reference to
Specifically, the opening Ad is located at the left end of the opening setting region Rd in each pixel Pd in the left end column in the pixel array unit, and is located at the upper end of the opening setting region Rd in each pixel Pd in the upper end row in the pixel array unit. Further, as the position of the pixel Pd becomes closer to the right, the opening Ad shifts to the right at equal intervals within the opening setting region Rd, and is located at the right end of the opening setting region Rd in each pixel Pd in the second column from the right in the pixel array unit. Also, as the position of the pixel Pd becomes closer to the bottom, the opening Ad shifts to the bottom at equal intervals within the opening setting region Rd, and is located at the lower end of the opening setting region Rd in each pixel Pd in the lower end row in the pixel array unit.
Accordingly, the position of the opening Ad in the horizontal direction in each pixel Pd is the same among the pixels Pd in the same column in the vertical direction. Also, the position of the opening Ad in the vertical direction in each pixel Pd is the same among the pixels Pd in the same row in the horizontal direction. Accordingly, the position of the opening Ad in each pixel Pd, which is the position at which incident light enters each pixel Pd, varies with each pixel Pd, and, as a result, the incident angle directivities of the respective pixels Pd differ from one another.
Further, the openings Ad of the respective pixels Pd cover the opening setting region Rd. That is, the region in which the openings Ad of the respective pixels Pd are overlapped on one another is equal to the opening setting region Rd. Note that the layout pattern of the openings Ad is not limited to the above configuration, and may be any layout, as long as the region in which the openings Ad are overlapped on one another is equal to the opening setting region Rd. For example, the openings Ad may be randomly arranged within the opening setting region Rd.
Further, the opening Ae of the pixel Pe has the same shape and size as those of the opening Aa of the pixel Pa, and is located in the opening setting region Re according to a rule similar to the rule described above with reference to
Specifically, the opening Ae is located at the left end of the opening setting region Re in each pixel Pe in the second column from the left in the pixel array unit, and is located at the upper end of the opening setting region Re in each pixel Pe in the upper end row in the pixel array unit. Further, as the position of the pixel Pe becomes closer to the right, the opening Ae shifts to the right at equal intervals within the opening setting region Re, and is located at the right end of the opening setting region Re in each pixel Pe in the right end column in the pixel array unit. Also, as the position of the pixel Pe becomes closer to the bottom, the opening Ae shifts to the bottom at equal intervals within the opening setting region Re, and is located at the lower end of the opening setting region Re in each pixel Pe in the lower end row in the pixel array unit.
Accordingly, the position of the opening Ae in the horizontal direction in each pixel Pe is the same among the pixels Pe in the same column in the vertical direction. Also, the position of the opening Ae in the vertical direction in each pixel Pe is the same among the pixels Pe in the same row in the horizontal direction. Accordingly, the position of the opening Ae in each pixel Pe, which is the position at which incident light enters each pixel Pe, varies with each pixel Pe, and, as a result, the incident angle directivities of the respective pixels Pe differ from one another.
Further, the openings Ae of the respective pixels Pe cover the opening setting region Re. That is, the region in which the openings Ae of the respective pixels Pe are overlapped on one another is equal to the opening setting region Re. Note that the layout pattern of the openings Ae is not limited to the above configuration, and may be any layout, as long as the region in which the openings Ae are overlapped on one another is equal to the opening setting region Re. For example, the openings Ae may be randomly arranged within the opening setting region Re.
Here, the average of the centroids of the incident angle directivities of the respective pixels Pd substantially coincides with the average of the centroids of the incident angle directivities of the respective pixels Pa in
Accordingly, the pixels Pd and the pixels Pe of the imaging device 121 enable imaging of different fields of view in the horizontal direction, as shown in
Meanwhile, in a case where the number of pixels Pd and the number of pixels Pe are the same, it is possible to obtain a restored image with a higher image quality (a higher object resolution) by restoring an image captured with the use of the pixels Pe having a narrow angle of view than by restoring an image captured with the use of the pixels Pd having a wide angle of view, as described above. Therefore, it becomes possible to monitor a place farther ahead of the vehicle 251, on the basis of a sensing image obtained by the pixels Pe.
Note that imaging by the respective pixels Pd and imaging by the respective pixels Pe may be performed simultaneously or individually, as in the second embodiment.
Also, a coefficient set group for the pixels Pd and a coefficient set group for the pixels Pe may be separately prepared, and restored images corresponding to the respective pixels may be restored.
The following is a description of modifications of the above described embodiments of the present technology.
<Modifications Relating to Light Shielding Patterns>
Although
Also, opening setting regions having different heights may be combined, or opening setting regions having different widths and heights may be combined, for example. Further, not only the position in the vertical direction but also the position in the horizontal direction of the opening setting region may be changed, for example.
Although
Further,
<Modifications Relating to the Front Camera Module 21>
In the examples described above, the imaging unit 41 is provided on one semiconductor chip, and the front camera ECU 42 and the MCU 43 are provided on another. However, other configurations can be adopted. For example, the imaging device 121 of the imaging unit 41 may be provided on one semiconductor chip while the signal processing control unit 111 of the imaging unit 41, the front camera ECU 42, and the MCU 43 are provided on another; or the imaging device 121, the signal processing control unit 111 of the imaging unit 41, and the combination of the front camera ECU 42 and the MCU 43 may be provided on three different semiconductor chips. Alternatively, the imaging unit 41, the front camera ECU 42, and the MCU 43 may be provided on one semiconductor chip, for example.
Also, the front camera module 21 may be attached to a window of the vehicle other than the windshield 221 (a side window or a rear window, for example), to perform imaging in a direction other than the front direction of the vehicle. Further, the direction in which the centroid of the incident angle directivity is biased is not limited to an upward direction with respect to the vertical direction, but may be, for example, a downward direction with respect to the vertical direction, or a leftward or rightward direction with respect to the horizontal direction. Particularly, in a case where the front camera module 21 is attached to a side window, it is possible to image a blind spot of the vehicle by decentering the centroid of the incident angle directivity in the horizontal direction.
Further, as shown in
For example, the front camera module 21a is attached to a position similar to that of the front camera module 21 in
The present technology can also be applied in a case where the front camera module 21 is attached in contact with or in proximity to a plate-like transparent or translucent member other than the windows of a vehicle, and imaging is performed through the member in a direction different from the normal direction of the surface of the member. For example, the present technology can be applied in a case where imaging in a direction other than the front direction of a window (a downward direction or an upward direction, for example) is performed from the inside of a building through the window.
Note that the light shielding pattern in the pixel array unit is set so that the average of the centroids of the incident angle directivities of the respective pixels is biased downward from the center of each pixel in a case where imaging in an upward direction is performed, the average of the centroids of the incident angle directivities of the respective pixels is biased leftward from the center of each pixel in a case where imaging in a rightward direction is performed, and the average of the centroids of the incident angle directivities of the respective pixels is biased rightward from the center of each pixel in a case where imaging in a leftward direction is performed, for example.
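The rule in the preceding paragraph is an opposite-direction mapping between the desired imaging direction and the centroid bias, which a small (hypothetical) lookup makes explicit:

```python
# Desired imaging direction -> direction in which the average centroid
# of the incident angle directivities is biased from the pixel center.
BIAS_FOR_IMAGING_DIRECTION = {
    "downward":  "upward",     # the forward-and-downward windshield case
    "upward":    "downward",
    "rightward": "leftward",
    "leftward":  "rightward",
}
```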
<Modifications Relating to the Imaging Device 121>
For example, in each pixel 121a in
Further, a front-illuminated pixel 121a shown in
In the pixel 121a shown in
Specifically, the optical filter 902 is disposed at a predetermined distance from the light receiving surface 901A of the imaging device 901 so as to cover the entire surface of the light receiving surface 901A. Light from the object surface 102 is modulated by the optical filter 902, and then enters the light receiving surface 901A of the imaging device 901.
For example, an optical filter 902BW having a black-and-white lattice pattern shown in
The light-receiving sensitivity characteristics of the imaging device 901 with respect to light from the point light source PA are like a waveform Sa, for example. That is, shadows are formed by the black pattern portions of the optical filter 902BW, and therefore, a grayscale pattern is formed in the image on the light receiving surface 901A with respect to the light from the point light source PA. Likewise, the light-receiving sensitivity characteristics of the imaging device 901 with respect to light from the point light source PB are like a waveform Sb, for example. That is, shadows are formed by the black pattern portions of the optical filter 902BW, and therefore, a grayscale pattern is formed in the image on the light receiving surface 901A with respect to the light from the point light source PB.
Note that light from the point light source PA and light from the point light source PB have different incident angles with respect to the respective white pattern portions of the optical filter 902BW, and therefore, differences are generated in the appearance of the grayscale pattern on the light receiving surface. Accordingly, each pixel of the imaging device 901 has an incident angle directivity with respect to each point light source on the object surface 102.
Details of this method are disclosed in M. Salman Asif et al., "FlatCam: Replacing Lenses with Masks and Computation", 2015 IEEE International Conference on Computer Vision Workshop (ICCVW), 2015, pp. 663-666, for example.
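The simultaneous-equation structure behind this method (each point light source on the object surface casts its own shadow of the mask, so the detection image is a linear mixture of the scene) can be sketched numerically. The toy sizes and the random binary matrix standing in for the pattern of the optical filter 902BW are assumptions; this is not the FlatCam implementation itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scene, n_sensor = 64, 256   # point light sources / sensor pixels (toy sizes)

# Each column holds the shadow pattern one point source casts through
# the mask onto the light receiving surface 901A.
M = rng.integers(0, 2, size=(n_sensor, n_scene)).astype(float)

x_true = rng.random(n_scene)  # intensities on the object surface 102
y = M @ x_true                # detection image (the grayscale patterns above)

# Restoration: solve the linear system in the least-squares sense.
x_hat, *_ = np.linalg.lstsq(M, y, rcond=None)
assert np.allclose(x_hat, x_true, atol=1e-6)
```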
Note that an optical filter 902HW shown in
The linearly polarizing element 911A transmits only light in a predetermined polarizing direction among substantially unpolarized light beams emitted from the point light source PA. In the description below, the linearly polarizing element 911A transmits only light in a polarizing direction parallel to the drawing. Of the polarized light beams transmitted through the linearly polarizing element 911A, polarized light transmitted through the polarizing portions of the ½ wavelength plate 912 changes its polarizing direction to a direction perpendicular to the drawing, as the polarization plane is rotated. On the other hand, of the polarized light beams transmitted through the linearly polarizing element 911A, polarized light transmitted through the white pattern portions of the ½ wavelength plate 912 does not change its polarizing direction, which remains parallel to the drawing. The linearly polarizing element 911B then transmits the polarized light transmitted through the white pattern portions, but hardly transmits the polarized light transmitted through the polarizing portions. Therefore, the light amount of the polarized light transmitted through the polarizing portions becomes smaller than that of the polarized light transmitted through the white pattern portions. As a result, a grayscale pattern substantially similar to that in the case with the optical filter 902BW is formed on the light receiving surface 901A of the imaging device 901.
Further, as shown in A of
Details of this method are disclosed in JP 2016-510910 W mentioned above, for example.
The present technology can also be applied to an imaging apparatus and an imaging device that image light of a wavelength other than visible light, such as infrared light. In this case, the restored image is not an image from which the user can visually recognize the object; instead, the present technology is used to increase the quality of the restored image for an image processing apparatus or the like that can recognize the object. Note that it is difficult for a conventional imaging lens to transmit far-infrared light, and therefore, the present technology is effective in a case where imaging of far-infrared light is performed, for example. Accordingly, a restored image may be an image of far-infrared light, or may be an image of other visible or invisible light.
Further, by applying machine learning such as deep learning, for example, it is also possible to perform image recognition and the like using a detection image before restoration, without using a restored image. In this case as well, the present technology increases the quality of the detection image before restoration, and can therefore be used to increase the accuracy of image recognition based on that detection image.
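As a sketch of this idea, a recognizer can consume the detection signals directly, with no restoration step. The architecture below is hypothetical (the document specifies no model), and PyTorch is assumed merely for illustration.

```python
import torch
import torch.nn as nn

class DetectionImageClassifier(nn.Module):
    """Classifies raw detection images. No human-readable image exists
    in this pipeline; a learned first layer can absorb the fixed,
    mask-induced mixing of the scene."""
    def __init__(self, n_pixels: int, n_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_pixels, 512),
            nn.ReLU(),
            nn.Linear(512, n_classes),
        )

    def forward(self, detection_signals: torch.Tensor) -> torch.Tensor:
        return self.net(detection_signals)

# Example: logits for a batch of 8 detection images of 4096 pixels.
model = DetectionImageClassifier(n_pixels=4096, n_classes=10)
logits = model(torch.randn(8, 4096))
```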
In this case, the front camera ECU 42 in
The series of processes described above can be performed by hardware, and can also be performed by software. In a case where the series of processes is to be performed by software, the program that forms the software is installed into a computer. Here, the computer may be a computer (the control unit 123, for example) incorporated in dedicated hardware.
The program to be executed by the computer may be recorded on a recording medium as a packaged medium or the like, for example, and be then provided. Alternatively, the program can be provided via a wired or wireless transmission medium, such as a local area network, the Internet, or digital satellite broadcasting.
Note that the program to be executed by the computer may be a program for performing processes in chronological order in accordance with the sequence described in this specification, or may be a program for performing processes in parallel or performing a process when necessary, such as when there is a call.
Further, embodiments of the present technology are not limited to the above described embodiments, and various modifications may be made to them without departing from the scope of the present technology.
For example, the present technology may be embodied in a cloud computing configuration in which one function is shared among a plurality of devices via a network, and processing is performed by the devices cooperating with one another.
Further, the respective steps described with reference to the flowcharts described above may be carried out by one device or may be shared among a plurality of devices.
Furthermore, in a case where a plurality of processes is included in one step, the plurality of processes included in the one step may be performed by one device or may be shared among a plurality of devices.
Note that the present technology may also be embodied in the configurations described below.
(1)
An imaging system including
an imaging device that includes a plurality of pixels that receives incident light entering from an object after passing through neither an imaging lens nor a pinhole, and each outputs one detection signal indicating an output pixel value modulated in accordance with an incident angle of the incident light, the imaging device being mounted so that a light receiving surface faces a surface of a windshield on an inner side of a vehicle and is in contact with or in proximity to the surface of the windshield,
in which an average of centroids of incident angle directivities indicating directivities of the plurality of pixels with respect to the incident angle of the incident light is biased in one direction from a center of the pixel.
(2)
The imaging system according to (1), in which
the plurality of pixels includes a plurality of first pixels in which the incident light enters at different positions from one another in a first region in which the centroid is biased in one direction from the center of the pixel.
(3)
The imaging system according to (2), in which
each of the plurality of pixels includes:
a photoelectric conversion element; and
a light shielding film that blocks part of the incident light from entering the photoelectric conversion element, and
openings of the light shielding films of the plurality of first pixels are located at different positions from one another in the first region.
(4)
The imaging system according to (3), in which
the plurality of pixels further includes a plurality of second pixels in which centroids are biased in one direction from the center of the pixel, and openings of the light shielding films are located at different positions from one another in a second region different from the first region.
(5)
The imaging system according to (4), in which
a position in a vertical direction in the pixel is different between the first region and the second region.
(6)
The imaging system according to (5), in which
the first region and the second region have the same shape and size.
(7)
The imaging system according to any one of (4) to (6), in which
the first region and the second region have different widths in a horizontal direction.
(8)
The imaging system according to any one of (4) to (7), further including
a drive unit that drives the plurality of first pixels and the plurality of second pixels independently of each other.
(9)
The imaging system according to any one of (1) to (8), further including
a first signal processing unit that processes a detection image generated on the basis of the detection signals of the plurality of pixels.
(10)
The imaging system according to (9), in which
the first signal processing unit restores a restored image from the detection image.
(11)
The imaging system according to (10), further including
a second signal processing unit that performs image recognition on a view in front of the vehicle, on the basis of the restored image.
(12)
The imaging system according to (11), further including:
a first semiconductor chip that includes the imaging device;
a second semiconductor chip that includes the second signal processing unit; and
a substrate on which the first semiconductor chip and the second semiconductor chip are mounted.
(13)
The imaging system according to (9), in which
the first signal processing unit performs image recognition on a view in front of the vehicle, on the basis of the detection image.
(14)
The imaging system according to (13), further including:
a first semiconductor chip that includes the imaging device;
a second semiconductor chip that includes the first signal processing unit; and
a substrate on which the first semiconductor chip and the second semiconductor chip are mounted.
(15)
The imaging system according to (1), in which
the average of the centroids of the incident angle directivities of the plurality of pixels is biased in one direction from the center of the pixel, in accordance with an inclination of the windshield.
(16)
The imaging system according to (1), in which
the centroid of the incident angle directivity of each of the plurality of pixels is biased in one direction from the center of the pixel.
(17)
The imaging system according to any one of (1) to (16), in which
the average of the centroids of the incident angle directivities is biased upward with respect to a vertical direction in a state where the imaging device is attached to the vehicle.
(18)
The imaging system according to any one of (1) to (17), in which
the imaging device is detachably attached to the windshield via a bracket.
(19)
An imaging device including
a plurality of pixels that receives incident light entering from an object after passing through neither an imaging lens nor a pinhole, and each outputs a detection signal indicating an output pixel value modulated in accordance with an incident angle of the incident light,
in which an average of centroids of incident angle directivities indicating directivities of the plurality of pixels with respect to the incident angle of the incident light deviates from a center of the pixel.
(20)
The imaging device according to (19), in which
the imaging device is mounted so that a light receiving surface faces a surface of a windshield on an inner side of a vehicle and is in contact with or in proximity to the surface of the windshield, and
the average of the centroids of the incident angle directivities of the plurality of pixels with respect to the incident light is biased upward from the center of the pixel.
Note that the advantageous effects described in this specification are merely examples, and the advantageous effects of the present technology are not limited to them and may include other effects.
Priority claim: JP 2019-085074, filed April 2019 (national).
PCT filing: PCT/JP2020/016360, filed April 14, 2020 (WO).