The present disclosure relates to a distance measuring system, a light receiving module, and a method of manufacturing a bandpass filter.
In recent years, a distance measuring system has been proposed in which information regarding a distance to a target object is obtained by emitting light to the target object and receiving the reflected light (for example, see Patent Document 1). The configuration of emitting infrared light and receiving the reflected light to obtain distance information has advantages; for example, the light source is not very noticeable, and the operation can be performed in parallel with capturing a normal visible light image.
In terms of reducing disturbance that affects measurement, it is preferable to limit the wavelength range of the infrared light to be imaged as narrowly as possible. For this reason, a bandpass filter that is transparent only to a specific wavelength band is often arranged in front of an imaging element.
In order to cope with a reduction in height of housings of electronic equipment, light receiving modules and the like used in portable electronic equipment are compelled to have a configuration of an optical system with so-called pupil correction, in which a chief ray angle differs greatly between the center and the periphery of the imaging element. Band characteristics of a bandpass filter shift in a wavelength direction depending on an angle of incident light. Therefore, in order to receive target light at the center and the periphery of a light receiving unit including an imaging element and the like without any trouble, it is necessary to set a bandwidth of the bandpass filter to be wider than a normal bandwidth. This causes an influence of disturbance light to increase.
It is therefore an object of the present disclosure to provide a distance measuring system, a light receiving module, and a method of manufacturing a bandpass filter that enable setting a narrow bandwidth for the bandpass filter and reducing the influence of disturbance light.
To achieve the above-described object, a distance measuring system according to the present disclosure includes:
a light source unit that emits infrared light toward a target object;
a light receiving unit that receives the infrared light from the target object; and
an arithmetic processing unit that obtains information regarding a distance to the target object on the basis of data from the light receiving unit,
in which an optical member including a bandpass filter that is selectively transparent to infrared light in a predetermined wavelength range is arranged on a light receiving surface side of the light receiving unit, and
the bandpass filter has a concave-shaped light incident surface.
To achieve the above-described object, a light receiving module according to the present disclosure includes:
a light receiving unit that receives infrared light; and
an optical member that is arranged on a light receiving surface side of the light receiving unit and includes a bandpass filter that is selectively transparent to infrared light in a predetermined wavelength range,
in which the bandpass filter has a concave-shaped light incident surface.
To achieve the above-described object, a method of manufacturing a bandpass filter according to the present disclosure includes:
forming a bandpass filter layer on a film sheet that is transparent to at least an infrared light component and capable of plastic deformation;
placing the film sheet on which the bandpass filter layer has been formed, on a mold in which a concave portion is formed on one surface and an opening that passes through from the concave portion to another surface is formed; and
sucking air in the concave portion from the other surface through the opening.
The present disclosure will be described below with reference to the drawings on the basis of an embodiment. The present disclosure is not limited to the embodiment, and the various numerical values, materials, and the like in the embodiment are examples. In the following description, the same elements or elements having the same functions will be denoted by the same reference numerals, without redundant description. Note that the description will be made in the order below.
1. Overall description of distance measuring system and light receiving module according to present disclosure
2. First embodiment
3. First modified example
4. Second modified example
5. Third modified example
6. Fourth modified example
7. First application example
8. Second application example
9. Configuration of present disclosure
[Overall Description of Distance Measuring System and Light Receiving Module According to Present Disclosure]
As described above, a distance measuring system according to the present disclosure includes:
a light source unit that emits infrared light toward a target object;
a light receiving unit that receives the infrared light from the target object; and
an arithmetic processing unit that obtains information regarding a distance to the target object on the basis of data from the light receiving unit,
in which an optical member including a bandpass filter that is selectively transparent to infrared light in a predetermined wavelength range is arranged on a light receiving surface side of the light receiving unit, and
the bandpass filter has a concave-shaped light incident surface.
The distance measuring system according to the present disclosure may have a configuration in which
the optical member includes a lens arranged on a light incident surface side of the bandpass filter, and
an incident angle of light at a maximum image height with respect to the light incident surface of the bandpass filter is 10 degrees or less.
The distance measuring system of the present disclosure including the preferable configuration described above may have a configuration in which
a transmission band of the bandpass filter has a half-width of 50 nm or less.
The distance measuring system of the present disclosure including the various preferable configurations described above may have a configuration in which
the bandpass filter includes
a first filter that is transparent to light in a predetermined wavelength range of infrared light, and
a second filter that is non-transparent to visible light and transparent to infrared light.
In this case,
the first filter and the second filter may be stacked and formed on one side of a base material
in the configuration. Alternatively,
the first filter may be formed on one surface of a base material, and
the second filter may be formed on another surface of the base material
in the configuration.
The distance measuring system of the present disclosure including the various preferable configurations described above may have a configuration in which
the first filter is arranged on the light incident surface side, and
the second filter is arranged on a light receiving unit side.
In this case, the second filter may have a concave shape that follows the light incident surface in the configuration. Alternatively, the second filter may have a planar shape in the configuration.
Alternatively,
the second filter may be arranged on the light incident surface side, and
the first filter may be arranged on the light receiving unit side
in the configuration.
In this case, the first filter may have a concave shape that follows the light incident surface in the configuration.
The distance measuring system of the present disclosure including the various preferable configurations described above may have a configuration in which
the light source unit includes an infrared laser element or an infrared light emitting diode element.
The distance measuring system of the present disclosure including the various preferable configurations described above may have a configuration in which
the light source unit emits infrared light having a center wavelength of approximately 850 nm, approximately 905 nm, or approximately 940 nm.
The distance measuring system of the present disclosure including the various preferable configurations described above may have a configuration in which
the arithmetic processing unit obtains distance information on the basis of a time of flight of light reflected from the target object.
Alternatively,
infrared light may be emitted in a predetermined pattern to the target object, and
the arithmetic processing unit may obtain distance information on the basis of a pattern of light reflected from the target object
in the configuration.
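For the pattern-projection configuration described above, distance is commonly recovered by triangulation: the lateral shift (disparity) of each projected dot on the sensor encodes depth. The sketch below is a minimal illustration of that principle; the focal length, baseline, and disparity values are hypothetical and do not come from the present disclosure.

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Classic triangulation for a pattern-projection (structured light)
    setup: depth = focal length x baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 800 px focal length, 5 cm emitter-sensor
# baseline, 10 px observed shift of a projected dot.
print(depth_from_disparity(800.0, 0.05, 10.0))  # 4.0 (meters)
```

A nearer object produces a larger disparity, which is why the same formula with a 20 px shift yields half the depth.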
As described above, a light receiving module according to the present disclosure includes:
a light receiving unit that receives infrared light; and
an optical member that is arranged on a light receiving surface side of the light receiving unit and includes a bandpass filter that is selectively transparent to infrared light in a predetermined wavelength range,
in which the bandpass filter has a concave-shaped light incident surface.
The light receiving module according to the present disclosure may have a configuration in which
the optical member includes a lens arranged on a light incident surface side of the bandpass filter. In this case, an incident angle of light at a maximum image height with respect to the light incident surface of the bandpass filter may be 10 degrees or less in the configuration.
As described above, a method of manufacturing a bandpass filter according to the present disclosure includes:
forming a bandpass filter layer on a film sheet that is transparent to at least an infrared light component and capable of plastic deformation;
placing the film sheet on which the bandpass filter layer has been formed, on a mold in which a concave portion is formed on one surface and an opening that passes through from the concave portion to another surface is formed; and
sucking air in the concave portion from the other surface through the opening.
The method of manufacturing a bandpass filter according to the present disclosure may have a configuration in which
the film sheet, on which the bandpass filter layer has been formed, is singulated into a predetermined shape including a concave surface formed by sucking the air in the concave portion.
In the distance measuring system and the light receiving module of the present disclosure including the various preferable configurations described above, for example, a photoelectric conversion element or an imaging element such as a CMOS sensor or a CCD sensor in which pixels including various pixel transistors are arranged in a two-dimensional matrix in a row direction and a column direction may be used as the light receiving unit.
The distance measuring system of the present disclosure including the various preferable configurations described above may have a configuration in which the arithmetic processing unit that obtains information regarding the distance to the target object on the basis of data from the light receiving unit operates on the basis of physical connection by hardware, or operates on the basis of a program. The same applies to a controller that controls the entire distance measuring system, and the like.
A first embodiment relates to a distance measuring system and a light receiving module according to the present disclosure.
A distance measuring system 1 includes:
a light source unit 70 that emits infrared light toward a target object;
a light receiving unit 20 that receives the infrared light from the target object; and
an arithmetic processing unit 40 that obtains information regarding a distance to the target object on the basis of data from the light receiving unit 20.
On a light receiving surface side of the light receiving unit 20, an optical member 10 including a bandpass filter 12 that is selectively transparent to infrared light in a predetermined wavelength range is arranged. The bandpass filter 12 has a concave-shaped light incident surface. The optical member 10 includes lenses (lens group) 11 arranged on a light incident surface side of the bandpass filter 12.
The light receiving unit 20 is constituted by a CMOS sensor or the like, and a signal of the light receiving unit 20 is digitized by an analog-to-digital conversion unit 30 and sent to the arithmetic processing unit 40. These operations are controlled by a controller 50.
The light source unit 70 emits, for example, infrared light having a wavelength in a range of about 700 to 1100 nm. The light source unit 70 includes a light emitting element such as an infrared laser element or an infrared light emitting diode element. The deviation from the center wavelength is about 1 nm for the former and about 10 nm for the latter. The light source unit 70 is driven by a light source driving unit 60 controlled by the controller 50.
The wavelength of the infrared light emitted by the light source unit 70 can be appropriately selected depending on the intended use and configuration of the distance measuring system. For example, a value such as approximately 850 nm, approximately 905 nm, or approximately 940 nm can be selected as the center wavelength.
The light receiving unit 20, the analog-to-digital conversion unit 30, the arithmetic processing unit 40, the controller 50, and the light source driving unit 60 are formed on a semiconductor substrate including, for example, silicon. They may be configured as a single chip, or may be configured as a plurality of chips in accordance with their functions. This will be described with reference to
The distance measuring system 1 may be configured as a unit so as to be suitable for, for example, being built into equipment, or may be configured separately.
The basic configuration of the distance measuring system 1 has been described above. Next, in order to facilitate understanding of the present disclosure, a reference example of a configuration in which a bandpass filter has a planar light incident surface, and a problem thereof will be described.
An optical member 90 of the reference example differs from the optical member 10 illustrated in
For example, in a case where a lens is configured so as to cope with a reduction in height, the lens is compelled to have a configuration in which the chief ray angle differs greatly between a central part and a peripheral part of the light receiving unit 20.
As a result, in a case where light is incident on the central part of the light receiving unit 20 and in a case where light is incident on the peripheral part, the incident angle of light with respect to the bandpass filter 92 also changes by about 30 degrees. In a case where light is obliquely incident on the bandpass filter 92, the optical path length of the light passing through the filter increases, so that the characteristics shift toward a short wavelength side.
Thus, for example, in a case where the reception target is infrared light having a center wavelength of 905 nm, it is necessary to set the band center of the bandpass filter 92 for a chief ray angle (CRA) of 0 degrees to a wavelength longer than 905 nm. Furthermore, the bandwidth also needs to be set so as to enable transmission of 905 nm over the entire CRA range of 0 to 30 degrees. As a result, the bandwidth of the bandpass filter 92 needs to be set wider than a normal bandwidth. This causes an increase in the influence of disturbance such as inclusion of ambient light.
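The magnitude of this short-wavelength shift can be estimated with the standard first-order approximation for interference filters, in which the band center at incident angle θ is λ(θ) = λ₀√(1 − (sin θ / n_eff)²). The effective index n_eff and the 930 nm design center in the sketch below are assumed illustrative values, not parameters taken from the disclosure.

```python
import math

def shifted_center_nm(lambda0_nm: float, theta_deg: float,
                      n_eff: float = 1.7) -> float:
    """First-order blue shift of an interference filter's center
    wavelength at incident angle theta (degrees); n_eff is the
    effective index of the multilayer (an assumed value here)."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

# For a filter centered at 930 nm at normal incidence, a 30-degree
# incident angle shifts the band center by roughly 40 nm toward
# shorter wavelengths, while 10 degrees shifts it by only a few nm.
for theta in (0.0, 10.0, 30.0):
    print(theta, round(shifted_center_nm(930.0, theta), 1))
```

This asymmetry between small and large incident angles is exactly why limiting the incident angle to about 10 degrees, as in the concave-surface configuration, permits a much narrower passband.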
The reference example of the configuration in which the bandpass filter has a planar light incident surface and the problem thereof have been described above.
Subsequently, the first embodiment will be described.
As illustrated in
Thus, for example, in a case where the reception target is infrared light having a center wavelength of 905 nm, the band center of the bandpass filter 12 for a chief ray angle (CRA) of 0 degrees can be set close to 905 nm. Furthermore, even in a case where light is incident on the peripheral part of the light receiving unit 20, the amount of shift of the characteristics of the bandpass filter 12 toward the short wavelength side is small. As a result, the bandwidth of the bandpass filter 12 can be set to be narrower, and the influence of disturbance can be suppressed. With this arrangement, measurement accuracy can be improved.
According to
The bandpass filter 12 may have a configuration including a first filter that is transparent to light in a predetermined wavelength range of infrared light, and a second filter that is non-transparent to visible light and transparent to infrared light. A configuration example and a manufacturing method of the bandpass filter 12 will be described below with reference to the drawings.
An optical filter can be constituted by, for example, a multilayer film in which a high refractive index material and a low refractive index material are appropriately stacked. However, in a case where the optical filter is designed so that the wavelength band including the target light has transmission characteristics, light at wavelengths in a harmonic (integer-multiple) relationship with the target band also exhibits some transmission. Thus, the characteristics of the first filter 12A are schematically represented as illustrated in
In this example, the first filter 12A is constituted by an eleven-layer multilayer film. Silicon is used as the high refractive index material, and silicon oxide is used as the low refractive index material.
In this example, the second filter 12B is constituted by a five-layer multilayer film. Silicon is used as the high refractive index material, and silicon oxide is used as the low refractive index material.
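The way such a stack shapes the transmission spectrum can be sketched with the standard characteristic-matrix (transfer-matrix) method for thin films. The sketch below uses a simplified nine-layer Fabry-Perot style Si/SiO2 stack with assumed refractive indices; it illustrates the technique only, and is not the actual eleven-layer or five-layer design of this disclosure.

```python
import cmath
import math

# Assumed illustrative refractive indices (real, dispersionless);
# the stack below is a simplified stand-in, not the disclosed design.
N_SI, N_SIO2 = 3.6, 1.45
LAMBDA_0 = 940.0  # design wavelength, nm

def char_matrix(n, d_nm, lam_nm):
    """Characteristic 2x2 matrix of one thin film at normal incidence."""
    delta = 2.0 * math.pi * n * d_nm / lam_nm
    c, s = cmath.cos(delta), cmath.sin(delta)
    return ((c, 1j * s / n), (1j * n * s, c))

def mat_mul(a, b):
    return (
        (a[0][0] * b[0][0] + a[0][1] * b[1][0],
         a[0][0] * b[0][1] + a[0][1] * b[1][1]),
        (a[1][0] * b[0][0] + a[1][1] * b[1][0],
         a[1][0] * b[0][1] + a[1][1] * b[1][1]),
    )

def transmittance(layers, lam_nm, n_in=1.0, n_sub=1.45):
    """Transmittance of a layer stack (listed from the incident side)
    between an incident medium and a substrate."""
    m = ((1.0, 0.0), (0.0, 1.0))
    for n, d in layers:
        m = mat_mul(m, char_matrix(n, d, lam_nm))
    b = m[0][0] + m[0][1] * n_sub
    c = m[1][0] + m[1][1] * n_sub
    return 4.0 * n_in * n_sub / abs(n_in * b + c) ** 2

# Fabry-Perot style bandpass: quarter-wave Si/SiO2 mirrors around a
# half-wave SiO2 spacer -- transmissive near LAMBDA_0, reflective nearby.
qH, qL = LAMBDA_0 / (4 * N_SI), LAMBDA_0 / (4 * N_SIO2)
stack = ([(N_SI, qH), (N_SIO2, qL)] * 2
         + [(N_SIO2, 2 * qL)]
         + [(N_SIO2, qL), (N_SI, qH)] * 2)

for lam in (860.0, 940.0, 1020.0):
    print(lam, round(transmittance(stack, lam), 3))
```

With lossless layers, the transmittance is high at the design wavelength and drops sharply in the surrounding mirror stopband, which is the behavior the first filter relies on.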
A known method such as CVD, PVD, or ALD can be used to form the multilayer film, and it is preferable to select ALD, which has advantages such as high-precision film formation and good coverage.
The first filter 12A and the second filter 12B may have a configuration in which they are stacked and formed on one side of a base material. The manufacturing method will be described below.
A base material 13 constituted by a material transparent to infrared light and having a concave formed on a surface is prepared (see
Note that, in the above-described example, the second filter 12B is formed, and then the first filter 12A is formed. However, a configuration in which the two are interchanged may be adopted.
Except for a difference that a base material 13A having a concave formed on a front surface and having a convex on a corresponding back surface portion is used, this example is similar to the process flow described with reference to
In the above-described configuration, the first filter 12A and the second filter 12B are stacked, but another configuration may also be used. For example, a configuration may be adopted in which the first filter 12A is formed on one surface of a base material and the second filter 12B is formed on the other surface of the base material.
In
The base material 13A having a concave formed on the front surface and having a convex on the corresponding back surface portion is prepared (see
Note that, in the above-described example, the second filter 12B is formed, and then the first filter 12A is formed. However, a configuration in which the two are interchanged may be adopted.
Except for a difference that the base material 13 having a concave formed on a front surface and having a flat back surface is used, this example is similar to the process flow described with reference to
Note that the antireflection film 12D may be vapor-deposited on the film sheet 15A first, and then the reflective film 12C may be vapor-deposited. Furthermore, the film sheet 15A itself may be given a bandpass filter function by kneading in an absorbing material. Specifically, an absorbing material is kneaded into, or vapor-deposited on, a resin-based sheet material such as cycloolefin polymer, polyethylene terephthalate (PET), or polycarbonate to obtain a film sheet having bandpass characteristics. With this configuration, light in a wavelength band that cannot be removed by a reflective film vapor-deposited on one surface of the film sheet alone can be removed by the film sheet itself. Note that the film sheet 15A is not limited to this configuration, and a film sheet material having no bandpass characteristics may also be applied.
By using the fifth manufacturing method, a bandpass filter layer can be vapor-deposited on the planar film sheet, so that the bandpass filter layer can be vapor-deposited uniformly and the manufacturing cost can be reduced.
The light receiving unit 20 and the optical member 10 can also be configured as an integrated light receiving module. A method of manufacturing a light receiving module and the like will be described below.
A semiconductor wafer 200 on which a plurality of imaging elements is formed, a wafer-like frame 140 in which an opening corresponding to a light receiving surface is formed, and a wafer 120 on which a plurality of bandpass filters is formed are stacked (see
In some cases, the frame 140 having the opening may be replaced with an adhesive member having no opening in the configuration.
The method of manufacturing a light receiving module and the like have been described above.
As described above, the light receiving unit 20, the analog-to-digital conversion unit 30, the arithmetic processing unit 40, the controller 50, and the light source driving unit 60 illustrated in
Subsequently, acquisition of distance information will be described. In the distance measuring system 1 illustrated in
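For the ToF-based acquisition described here, the distance follows directly from the round-trip time of the emitted infrared pulse (direct ToF), or from the phase delay of an amplitude-modulated wave (indirect ToF). The sketch below uses illustrative timing and modulation values, none of which are taken from the disclosure.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dtof_distance_m(round_trip_s: float) -> float:
    """Direct ToF: distance from the round-trip time of the pulse."""
    return C * round_trip_s / 2.0

def itof_distance_m(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance from the phase delay of a wave modulated
    at mod_freq_hz (unambiguous only up to c / (2 * mod_freq_hz))."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# A reflection arriving 10 ns after emission is ~1.5 m away.
print(round(dtof_distance_m(10e-9), 2))  # 1.5
```

Suppressing disturbance light with a narrow bandpass filter directly improves both variants, since ambient photons corrupt the timing or phase estimate.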
In the first embodiment, the band of the bandpass filter can be narrowed, and the influence of disturbance light can be reduced. Thus, high-quality ranging imaging can be achieved even under external light. Furthermore, a light receiving module having excellent wavelength selectivity can be provided by setting the shape of a bandpass filter in accordance with a lens module.
The technology according to the present disclosure can be applied to a variety of products. For example, the technology according to the present disclosure may be realized as a device that is mounted on any type of mobile object such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
Each control unit includes a microcomputer that performs arithmetic processing in accordance with various programs, a storage unit that stores a program executed by the microcomputer, a parameter used for various computations, or the like, and a drive circuit that drives a device on which various controls are performed. Each control unit includes a network interface for performing communication with another control unit via the communication network 7010, and also includes a communication interface for performing wired or wireless communication with a device, sensor, or the like inside or outside a vehicle.
The drive system control unit 7100 controls operation of devices related to a drive system of the vehicle in accordance with various programs. For example, the drive system control unit 7100 functions as a device for controlling a driving force generation device for generating a driving force of the vehicle such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to wheels, a steering mechanism that regulates a steering angle of the vehicle, a braking device that generates a braking force of the vehicle, and the like. The drive system control unit 7100 may have a function as a device for controlling an antilock brake system (ABS), an electronic stability control (ESC), or the like.
The drive system control unit 7100 is connected with a vehicle state detector 7110. The vehicle state detector 7110 includes, for example, at least one of a gyro sensor that detects an angular velocity of shaft rotation of a vehicle body, an acceleration sensor that detects an acceleration of the vehicle, or a sensor for detecting an operation amount of an accelerator pedal, an operation amount of a brake pedal, a steering angle of a steering wheel, an engine speed, a wheel rotation speed, or the like. The drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detector 7110, and controls the internal combustion engine, the driving motor, an electric power steering device, a brake device, or the like.
The body system control unit 7200 controls operation of various devices mounted on the vehicle body in accordance with various programs. For example, the body system control unit 7200 functions as a device for controlling a keyless entry system, a smart key system, a power window device, or various lamps such as a head lamp, a back lamp, a brake lamp, a blinker, or a fog lamp. In this case, radio waves transmitted from a portable device that substitutes for a key or signals from various switches can be input to the body system control unit 7200. The body system control unit 7200 receives the input of these radio waves or signals, and controls a door lock device, the power window device, a lamp, and the like of the vehicle.
The battery control unit 7300 controls a secondary battery 7310 that is a power supply source of the driving motor in accordance with various programs. For example, information such as a battery temperature, a battery output voltage, or a battery remaining capacity is input to the battery control unit 7300 from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and performs temperature regulation control of the secondary battery 7310 or control of a cooling device or the like included in the battery device.
The outside-of-vehicle information detection unit 7400 detects information outside the vehicle on which the vehicle control system 7000 is mounted. For example, the outside-of-vehicle information detection unit 7400 is connected with at least one of an imaging unit 7410 or an outside-of-vehicle information detector 7420. The imaging unit 7410 includes at least one of a time of flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, or another camera. The outside-of-vehicle information detector 7420 includes, for example, at least one of an environment sensor for detecting the current weather or climate, or a surrounding information detection sensor for detecting another vehicle, an obstacle, a pedestrian, or the like in the surroundings of the vehicle on which the vehicle control system 7000 is mounted.
The environment sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects the degree of sunshine, or a snow sensor that detects snowfall. The surrounding information detection sensor may be at least one of an ultrasonic sensor, a radar device, or a LIDAR (light detection and ranging, or laser imaging detection and ranging) device. The imaging unit 7410 and the outside-of-vehicle information detector 7420 may each be disposed as an independent sensor or device, or may be disposed as an integrated device including a plurality of sensors or devices.
Here,
Note that
Outside-of-vehicle information detectors 7920, 7922, 7924, 7926, 7928, and 7930 provided at the front, rear, sides, and corners of the vehicle 7900, and the top of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The outside-of-vehicle information detectors 7920, 7926, and 7930 provided at the front nose, the rear bumper, the back door, and the top of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices. These outside-of-vehicle information detectors 7920 to 7930 are mainly used to detect a preceding vehicle, a pedestrian, an obstacle, or the like.
Returning to
Furthermore, the outside-of-vehicle information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like on the basis of the received image data. The outside-of-vehicle information detection unit 7400 may also generate a bird's-eye view image or a panoramic image by performing processing such as distortion correction or positioning on the received image data, and generating a composite image from pieces of image data captured by different imaging units 7410. The outside-of-vehicle information detection unit 7400 may perform viewpoint conversion processing using pieces of image data captured by the different imaging units 7410.
The in-vehicle information detection unit 7500 detects information inside the vehicle. The in-vehicle information detection unit 7500 is connected with, for example, a driver state detector 7510 that detects a state of a driver. The driver state detector 7510 may include a camera that captures an image of the driver, a biological sensor that detects biological information of the driver, a microphone that collects sounds in the vehicle interior, or the like. The biological sensor is provided at, for example, a seat surface, the steering wheel, or the like, and detects biological information of an occupant sitting on a seat or a driver gripping the steering wheel. On the basis of detection information input from the driver state detector 7510, the in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver, or determine whether or not the driver has fallen asleep. The in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on signals of collected sounds.
The integrated control unit 7600 controls overall operation in the vehicle control system 7000 in accordance with various programs. The integrated control unit 7600 is connected with an input unit 7800. The input unit 7800 includes a device that can be used by an occupant to perform an input operation, for example, a touch panel, a button, a microphone, a switch, a lever, or the like. Data obtained by speech recognition of speech input via the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or may be externally connected equipment such as a mobile phone or a personal digital assistant (PDA) that can be used to operate the vehicle control system 7000. The input unit 7800 may be, for example, a camera, in which case an occupant can input information by gesture. Alternatively, data to be input may be obtained by detecting a movement of a wearable appliance worn by an occupant. Moreover, the input unit 7800 may include, for example, an input control circuit that generates an input signal on the basis of information input by an occupant or the like using the input unit 7800 described above, and outputs the input signal to the integrated control unit 7600. By operating the input unit 7800, an occupant or the like inputs various types of data to the vehicle control system 7000 or gives an instruction on a processing operation.
The storage unit 7690 may include a read only memory (ROM) for storing various programs executed by a microcomputer, and a random access memory (RAM) for storing various parameters, computation results, sensor values, or the like. Furthermore, the storage unit 7690 may include a magnetic storage device such as a hard disc drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
The general-purpose communication interface 7620 is a versatile communication interface that mediates communication with a variety of types of equipment existing in an external environment 7750. The general-purpose communication interface 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM) (registered trademark), WiMAX, long term evolution (LTE), or LTE-advanced (LTE-A), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark). The general-purpose communication interface 7620 may be connected to equipment (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point. Furthermore, the general-purpose communication interface 7620 may be connected to, for example, using peer-to-peer (P2P) technology, a terminal existing near the vehicle (for example, a terminal of a driver, pedestrian, or store, or a machine type communication (MTC) terminal).
The dedicated communication interface 7630 is a communication interface that supports a communication protocol designed for use in a vehicle. The dedicated communication interface 7630 may implement, for example, a standard protocol such as wireless access in vehicle environment (WAVE), which is a combination of lower-layer IEEE 802.11p and upper-layer IEEE 1609, dedicated short range communications (DSRC), or a cellular communication protocol. The dedicated communication interface 7630 typically performs V2X communication, which is a concept that includes at least one of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, or vehicle-to-pedestrian communication.
For example, the positioning unit 7640 receives a global navigation satellite system (GNSS) signal from a GNSS satellite (for example, a global positioning system (GPS) signal from a GPS satellite), executes positioning, and generates position information including the latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may specify a current position by exchanging signals with a wireless access point, or may acquire position information from a terminal such as a mobile phone, a PHS, or a smartphone having a positioning function.
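As a non-limiting illustration of how position information of this form is obtained, many GNSS receivers report latitude, longitude, and altitude in the NMEA 0183 "GGA" sentence; decoding it can be sketched as follows. The function name is a hypothetical choice for this sketch, not part of the disclosure.

```python
def parse_gga(sentence):
    """Return (latitude_deg, longitude_deg, altitude_m) from an NMEA GGA sentence."""
    fields = sentence.split(",")
    # Latitude is encoded as ddmm.mmmm, longitude as dddmm.mmmm.
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    alt = float(fields[9])  # antenna altitude above mean sea level, in metres
    return lat, lon, alt

lat, lon, alt = parse_gga(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
```

A positioning unit such as 7640 would typically perform such decoding internally and expose only the resulting position information.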
For example, the beacon reception unit 7650 receives radio waves or electromagnetic waves transmitted from a wireless station or the like installed on a road to acquire information such as a current position, traffic congestion, suspension of traffic, or required time. Note that the function of the beacon reception unit 7650 may be included in the dedicated communication interface 7630 described above.
The in-vehicle equipment interface 7660 is a communication interface that mediates connections between the microcomputer 7610 and a variety of types of in-vehicle equipment 7760 existing inside the vehicle. The in-vehicle equipment interface 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless USB (WUSB). Furthermore, the in-vehicle equipment interface 7660 may establish a wired connection such as universal serial bus (USB), high-definition multimedia interface (HDMI) (registered trademark), or mobile high-definition link (MHL) via a connection terminal (not illustrated) (and, if necessary, a cable). The in-vehicle equipment 7760 may include, for example, at least one of mobile equipment or wearable equipment possessed by an occupant, or information equipment carried in or attached to the vehicle. Furthermore, the in-vehicle equipment 7760 may include a navigation device that searches for a route to an arbitrary destination. The in-vehicle equipment interface 7660 exchanges control signals or data signals with the in-vehicle equipment 7760.
The vehicle-mounted network interface 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The vehicle-mounted network interface 7680 transmits and receives signals and the like on the basis of a predetermined protocol supported by the communication network 7010.
On the basis of information acquired via at least one of the general-purpose communication interface 7620, the dedicated communication interface 7630, the positioning unit 7640, the beacon reception unit 7650, the in-vehicle equipment interface 7660, or the vehicle-mounted network interface 7680, the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various programs. For example, the microcomputer 7610 may compute a control target value for the driving force generation device, the steering mechanism, or the braking device on the basis of information acquired from the inside and outside of the vehicle, and output a control command to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control for the purpose of implementing functions of an advanced driver assistance system (ADAS) including collision avoidance or shock mitigation of the vehicle, follow-up traveling based on an inter-vehicle distance, vehicle speed maintaining traveling, vehicle collision warning, vehicle lane departure warning, or the like. Furthermore, the microcomputer 7610 may perform cooperative control for the purpose of automatic operation, that is, autonomous driving without the driver's operation, or the like by controlling the driving force generation device, the steering mechanism, the braking device, or the like on the basis of information acquired from the surroundings of the vehicle.
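The computation of a control target value for follow-up traveling based on an inter-vehicle distance can be illustrated, as a non-limiting sketch, by a constant time-gap follow controller. The gains, limits, and function name below are hypothetical choices for the sketch and are not taken from the disclosure.

```python
def follow_accel(gap_m, ego_speed, lead_speed,
                 time_gap=1.5, standstill=5.0,
                 k_gap=0.23, k_speed=0.74,
                 a_min=-3.5, a_max=2.0):
    """Acceleration command (m/s^2) to hold a speed-dependent inter-vehicle gap.

    gap_m: measured distance to the lead vehicle (m)
    ego_speed, lead_speed: vehicle speeds (m/s)
    """
    # Desired gap grows with ego speed (constant time-gap policy).
    desired = standstill + time_gap * ego_speed
    # Proportional correction on gap error and relative speed.
    accel = k_gap * (gap_m - desired) + k_speed * (lead_speed - ego_speed)
    # Saturate to comfortable/safe acceleration limits.
    return max(a_min, min(a_max, accel))
```

In a system of the kind described, such a value would be output as a control command to the drive system control unit 7100 rather than applied directly.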
The microcomputer 7610 may generate information regarding a three-dimensional distance between the vehicle and an object such as a structure or a person in the periphery of the vehicle and create local map information including information in the periphery of the current position of the vehicle on the basis of information acquired via at least one of the general-purpose communication interface 7620, the dedicated communication interface 7630, the positioning unit 7640, the beacon reception unit 7650, the in-vehicle equipment interface 7660, or the vehicle-mounted network interface 7680. Furthermore, the microcomputer 7610 may predict a danger such as a collision of the vehicle, approaching a pedestrian or the like, or entering a closed road on the basis of the acquired information, and generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
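One common way to predict a danger such as a collision from three-dimensional distance information is a time-to-collision (TTC) test, sketched below as a non-limiting illustration; the threshold value and function name are assumptions for the sketch.

```python
def warning_signal(distance_m, closing_speed_mps, ttc_threshold_s=2.5):
    """Return True if the predicted time to collision falls below the threshold.

    closing_speed_mps is positive when the gap to the object is shrinking.
    """
    if closing_speed_mps <= 0.0:
        return False  # gap constant or growing: no predicted collision
    return distance_m / closing_speed_mps < ttc_threshold_s
```

When the function returns True, a system of the kind described might generate a warning sound or light a warning lamp as mentioned above.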
The audio/image output unit 7670 transmits at least one of an audio output signal or an image output signal to an output device capable of visually or aurally notifying an occupant in the vehicle or the outside of the vehicle of information. In the example of
Note that, in the example illustrated in
The technology according to the present disclosure may be applied to, for example, an imaging unit of an outside-of-vehicle information detection unit among the configurations described above.
The technology according to the present disclosure can be applied to a variety of products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
In endoscopic surgery, an abdominal wall is pierced with a plurality of tubular hole-opening instruments called trocars 5025a to 5025d, instead of cutting and opening the abdominal wall. Then, a lens barrel 5003 of the endoscope 5001 and the other surgical tools 5017 are inserted into a body cavity of the patient 5071 through the trocars 5025a to 5025d. In the illustrated example, an insufflation tube 5019, an energy treatment tool 5021, and forceps 5023 are inserted into the body cavity of the patient 5071 as the other surgical tools 5017. Furthermore, the energy treatment tool 5021 is used to perform incision and exfoliation of tissue, sealing of a blood vessel, or the like by using a high-frequency current or ultrasonic vibration. However, the illustrated surgical tools 5017 are merely an example, and various surgical tools generally used in endoscopic surgery, such as tweezers, a retractor, and the like, may be used as the surgical tools 5017.
An image of a surgical site in the body cavity of the patient 5071 captured by the endoscope 5001 is displayed on a display device 5041. The operator 5067 performs a procedure such as excision of an affected part, for example, using the energy treatment tool 5021 or the forceps 5023 while viewing the image of the surgical site displayed on the display device 5041 in real time. Note that, although not illustrated, the insufflation tube 5019, the energy treatment tool 5021, and the forceps 5023 are supported by the operator 5067, an assistant, or the like during the surgery.
(Support Arm Device)
The support arm device 5027 includes an arm 5031 extending from a base portion 5029. In the illustrated example, the arm 5031 includes joints 5033a, 5033b, and 5033c, and links 5035a and 5035b, and is driven by control of an arm control device 5045. The arm 5031 supports the endoscope 5001 so as to control its position and orientation. With this arrangement, the position of the endoscope 5001 can be stably fixed.
(Endoscope)
The endoscope 5001 includes the lens barrel 5003, a region of which extending a predetermined length from its distal end is inserted into the body cavity of the patient 5071, and a camera head 5005 connected to a proximal end of the lens barrel 5003. In the illustrated example, the endoscope 5001 is configured as a so-called rigid endoscope having the lens barrel 5003 that is rigid. Alternatively, the endoscope 5001 may be configured as a so-called flexible endoscope having the lens barrel 5003 that is flexible.
The lens barrel 5003 is provided with, at the end thereof, an opening portion in which an objective lens is fitted. The endoscope 5001 is connected with a light source device 5043. Light generated by the light source device 5043 is guided to the end of the lens barrel 5003 by a light guide extending inside the lens barrel, and is emitted through the objective lens toward an observation target in the body cavity of the patient 5071. Note that the endoscope 5001 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
The camera head 5005 is provided with an optical system and an imaging element inside thereof, and light reflected from the observation target (observation light) is focused on the imaging element by the optical system. The imaging element photoelectrically converts the observation light to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to an observation image. The image signal is transmitted to a camera control unit (CCU) 5039 as raw data. Note that the camera head 5005 has a function of adjusting a magnification and a focal length by appropriately driving the optical system.
Note that the camera head 5005 may be provided with a plurality of imaging elements in order to support, for example, stereoscopic viewing (3D display) and the like. In this case, the lens barrel 5003 is provided with a plurality of relay optical systems inside thereof to guide observation light to each of the plurality of imaging elements.
(Various Devices Mounted on Cart)
The CCU 5039 is constituted by a central processing unit (CPU), a graphics processing unit (GPU), and the like, and integrally controls operations of the endoscope 5001 and the display device 5041. Specifically, the CCU 5039 performs, on an image signal received from the camera head 5005, various types of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example. The CCU 5039 provides the display device 5041 with the image signal on which image processing has been performed. Furthermore, the CCU 5039 transmits a control signal to the camera head 5005 to control its driving. The control signal may contain information regarding imaging conditions such as the magnification and the focal length.
The CCU 5039 controls the display device 5041 to display an image based on the image signal on which image processing has been performed by the CCU 5039. In a case where, for example, the endoscope 5001 supports imaging with a high resolution such as 4K (3840 horizontal pixels×2160 vertical pixels) or 8K (7680 horizontal pixels×4320 vertical pixels), and/or in a case where the endoscope 5001 supports 3D display, a display device supporting high-resolution display and/or 3D display can be used accordingly as the display device 5041. In a case where imaging with a high resolution such as 4K or 8K is supported, a display device having a size of 55 inches or more can be used as the display device 5041 to provide a more immersive feeling. Furthermore, a plurality of display devices 5041 having different resolutions and sizes may be provided depending on the intended use.
The light source device 5043 includes a light source such as a light emitting diode (LED), for example, and supplies the endoscope 5001 with emitted light at the time of imaging a surgical site.
The arm control device 5045 is constituted by a processor such as a CPU, for example, and operates in accordance with a predetermined program to control driving of the arm 5031 of the support arm device 5027 in accordance with a predetermined control method.
An input device 5047 is an input interface to the endoscopic surgery system 5000. A user can input various types of information and input instructions to the endoscopic surgery system 5000 via the input device 5047. For example, the user inputs, via the input device 5047, various types of information related to surgery, such as physical information of a patient and information regarding a surgical procedure. Furthermore, for example, the user may input, via the input device 5047, an instruction to drive the arm 5031, an instruction to change imaging conditions (the type of emitted light, the magnification and focal length, and the like) of the endoscope 5001, an instruction to drive the energy treatment tool 5021, and the like.
The type of the input device 5047 is not limited, and various known input devices may be used as the input device 5047. As the input device 5047, for example, a mouse, a keyboard, a touch panel, a switch, a foot switch 5057, and/or a lever can be applied. In a case where a touch panel is used as the input device 5047, the touch panel may be provided on a display surface of the display device 5041.
Alternatively, the input device 5047 may be a device worn by a user, such as a glasses-type wearable device or a head mounted display (HMD), for example, and various inputs are performed in accordance with a user's gesture or line of sight detected by these devices. Furthermore, the input device 5047 may include a camera capable of detecting a movement of a user, and various inputs are performed in accordance with a user's gesture or line of sight detected from a video captured by the camera. Moreover, the input device 5047 may include a microphone capable of collecting a user's voice, and various inputs are performed by speech via the microphone. As described above, since the input device 5047 has a configuration in which various types of information can be input in a non-contact manner, a user belonging to a clean area (for example, the operator 5067), in particular, can operate equipment belonging to an unclean area in a non-contact manner. Furthermore, the user can operate the equipment while holding a surgical tool in hand, and this improves convenience of the user.
A treatment tool control device 5049 controls driving of the energy treatment tool 5021 for cauterization or incision of tissue, sealing of a blood vessel, or the like. In order to inflate a body cavity of the patient 5071 for the purpose of securing a field of view of the endoscope 5001 and securing a working space for the operator, an insufflation device 5051 sends gas through the insufflation tube 5019 into the body cavity. A recorder 5053 is a device that can record various types of information related to surgery. A printer 5055 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
A particularly characteristic configuration of the endoscopic surgery system 5000 will be described below in more detail.
(Support Arm Device)
The support arm device 5027 includes the base portion 5029 as a base, and the arm 5031 extending from the base portion 5029. In the illustrated example, the arm 5031 includes the plurality of joints 5033a, 5033b, and 5033c, and the plurality of links 5035a and 5035b connected by the joint 5033b. However,
The joints 5033a to 5033c are provided with actuators, and the joints 5033a to 5033c have a configuration that enables rotation about a predetermined rotation axis by driving of the actuators. The arm control device 5045 controls the driving of the actuators, thereby controlling a rotation angle of each of the joints 5033a to 5033c, and controlling the driving of the arm 5031. With this arrangement, the position and orientation of the endoscope 5001 can be controlled. At this time, the arm control device 5045 can control the driving of the arm 5031 by various known control methods such as force control or position control.
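The relationship between the controlled rotation angles of the joints and the resulting position of the supported endoscope can be illustrated by forward kinematics. The following is a simplifying planar sketch (rotation axes assumed parallel; link lengths and function name hypothetical), not the control method of the disclosure.

```python
import math

def end_position(joint_angles, link_lengths):
    """Planar forward kinematics: end point of a serial arm.

    joint_angles: rotation of each joint relative to the previous link (rad)
    link_lengths: length of the link following each joint
    """
    x = y = 0.0
    theta = 0.0  # accumulated orientation
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y
```

An arm control device would invert such a relationship (inverse kinematics) to compute joint angles that realize a commanded position and orientation.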
For example, the position and orientation of the endoscope 5001 may be controlled by the operator 5067 performing an appropriate operation input via the input device 5047 (including the foot switch 5057), thereby causing the arm control device 5045 to appropriately control the driving of the arm 5031 in accordance with the operation input. With this control, the endoscope 5001 at an end of the arm 5031 can be moved from any position to any other position, and then fixedly supported at the position after the movement. Note that the arm 5031 may be operated by a so-called master-slave method. In this case, the arm 5031 can be remotely controlled by a user via the input device 5047 installed at a location away from an operating room.
Furthermore, in a case where the force control is applied, so-called power assist control may be performed in which the arm control device 5045 receives an external force from a user and drives the actuators of the corresponding joints 5033a to 5033c so that the arm 5031 moves smoothly in accordance with the external force. With this arrangement, when the user moves the arm 5031 while directly touching the arm 5031, the arm 5031 can be moved with a relatively light force. Thus, the endoscope 5001 can be moved more intuitively and with a simpler operation, and this improves convenience of the user.
Here, in general, the endoscope 5001 has been supported by a doctor called an endoscopist during endoscopic surgery. On the other hand, by using the support arm device 5027, the position of the endoscope 5001 can be fixed more reliably without manual operation. This makes it possible to stably obtain an image of a surgical site and smoothly perform surgery.
Note that the arm control device 5045 is not necessarily provided at the cart 5037. Furthermore, the arm control device 5045 is not necessarily one device. For example, one arm control device 5045 may be provided for each of the joints 5033a to 5033c of the arm 5031 of the support arm device 5027, and a plurality of arm control devices 5045 may cooperate with one another to control the driving of the arm 5031.
(Light Source Device)
The light source device 5043 supplies the endoscope 5001 with emitted light at the time of imaging a surgical site. The light source device 5043 is constituted by a white light source including, for example, an LED, a laser light source, or a combination thereof. At this time, in a case where the white light source includes a combination of RGB laser light sources, an output intensity and output timing of each color (each wavelength) can be controlled with high precision, and this enables white balance adjustment of a captured image at the light source device 5043. Furthermore, in this case, an image for each of R, G, and B can be captured in a time-division manner by emitting laser light from each of the RGB laser light sources to an observation target in a time-division manner, and controlling driving of the imaging element of the camera head 5005 in synchronization with the emission timing. According to this method, a color image can be obtained without providing a color filter in the imaging element.
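The time-division capture described above can be sketched as follows: each of the three synchronized exposures is a monochrome frame, and the color image is obtained by stacking them per channel. This is a non-limiting illustration; the function name is an assumption for the sketch.

```python
import numpy as np

def compose_color(frame_r, frame_g, frame_b):
    """Stack three monochrome frames, captured under R, G, and B
    illumination respectively, into one H x W x 3 color image."""
    return np.stack([frame_r, frame_g, frame_b], axis=-1)

# Usage: three frames captured in a time-division manner.
r = np.full((2, 2), 10)
g = np.full((2, 2), 20)
b = np.full((2, 2), 30)
img = compose_color(r, g, b)
```

Because each exposure is already spectrally pure, no per-pixel color filter (and hence no demosaic step) is required, which is the advantage noted above.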
Furthermore, driving of the light source device 5043 may be controlled so that the intensity of light to be output may change at a predetermined time interval. By controlling the driving of the imaging element of the camera head 5005 in synchronization with the timing of the change in the light intensity, acquiring images in a time-division manner, and generating a composite image from the images, a high dynamic range image without so-called blocked up shadows or blown out highlights can be generated.
Furthermore, the light source device 5043 may have a configuration in which light can be supplied in a predetermined wavelength band that can be used for special light observation. In special light observation, for example, by utilizing wavelength dependence of light absorption in body tissue, so-called narrow band imaging is performed in which a predetermined tissue such as a blood vessel in a mucosal surface layer is imaged with high contrast by emitting light in a band narrower than that of light emitted during normal observation (that is, white light). Alternatively, in special light observation, fluorescence observation may be performed in which an image is obtained by fluorescence generated by emitting excitation light. In fluorescence observation, for example, excitation light is emitted to body tissue and fluorescence from the body tissue is observed (autofluorescence observation), or a fluorescent image is obtained by locally injecting a reagent such as indocyanine green (ICG) into body tissue and emitting excitation light corresponding to a fluorescence wavelength of the reagent to the body tissue. The light source device 5043 may have a configuration in which narrow-band light and/or excitation light that can be used for such special light observation can be supplied.
(Camera Head and CCU)
Functions of the camera head 5005 of the endoscope 5001 and the CCU 5039 will be described in more detail with reference to
Referring to
First, the functional configuration of the camera head 5005 will be described. The lens unit 5007 is an optical system provided at a connection with the lens barrel 5003. Observation light taken in from the end of the lens barrel 5003 is guided to the camera head 5005 and is incident on the lens unit 5007. The lens unit 5007 is constituted by a combination of a plurality of lenses including a zoom lens and a focus lens. Optical characteristics of the lens unit 5007 are adjusted so that observation light may be focused on a light receiving surface of an imaging element of the imaging unit 5009. Furthermore, the zoom lens and the focus lens have a configuration in which their positions can be moved on an optical axis for adjustment of a magnification and a focus of a captured image.
The imaging unit 5009 is constituted by the imaging element, and is arranged at a stage subsequent to the lens unit 5007. Observation light that has passed through the lens unit 5007 is focused on the light receiving surface of the imaging element, and an image signal corresponding to an observation image is generated by photoelectric conversion. The image signal generated by the imaging unit 5009 is provided to the communication unit 5013.
As the imaging element included in the imaging unit 5009, for example, a complementary metal oxide semiconductor (CMOS) type image sensor that has a Bayer array and can capture color images is used. Note that, as the imaging element, an imaging element capable of capturing a high-resolution image of, for example, 4K or more may be used. In that case, an image of the surgical site can be obtained at a high resolution, which allows the operator 5067 to grasp the state of the surgical site in more detail and proceed with surgery more smoothly.
Furthermore, the imaging unit 5009 may have a configuration including a pair of imaging elements, one for acquiring a right-eye image signal and the other for acquiring a left-eye image signal, to support 3D display. The 3D display allows the operator 5067 to grasp the depth of living tissue in the surgical site more accurately. Note that, in a case where the imaging unit 5009 has a multi-plate type configuration, a plurality of the lens units 5007 is provided to support each of the imaging elements.
Furthermore, the imaging unit 5009 is not necessarily provided in the camera head 5005. For example, the imaging unit 5009 may be provided inside the lens barrel 5003 just behind the objective lens.
The driving unit 5011 is constituted by an actuator, and moves the zoom lens and the focus lens of the lens unit 5007 by a predetermined distance along the optical axis under the control of the camera head controller 5015. With this arrangement, the magnification and the focus of an image captured by the imaging unit 5009 can be appropriately adjusted.
The communication unit 5013 is constituted by a communication device for transmitting and receiving various types of information to and from the CCU 5039. The communication unit 5013 transmits an image signal obtained from the imaging unit 5009 as raw data to the CCU 5039 via the transmission cable 5065. At this time, it is preferable that the image signal be transmitted by optical communication in order to display a captured image of a surgical site with a low latency. This is because, during surgery, the operator 5067 performs surgery while observing the state of an affected part from a captured image, and it is required that a moving image of the surgical site be displayed in real time as much as possible for safer and more reliable surgery. In a case where optical communication is performed, the communication unit 5013 is provided with a photoelectric conversion module that converts an electric signal into an optical signal. An image signal is converted into an optical signal by the photoelectric conversion module, and then transmitted to the CCU 5039 via the transmission cable 5065.
Furthermore, the communication unit 5013 receives a control signal for controlling driving of the camera head 5005 from the CCU 5039. The control signal contains, for example, information for specifying a frame rate of a captured image, information for specifying an exposure value at the time of imaging, and/or information for specifying a magnification and a focus of the captured image, or other information regarding imaging conditions. The communication unit 5013 provides the received control signal to the camera head controller 5015. Note that the control signal from the CCU 5039 may also be transmitted by optical communication. In this case, the communication unit 5013 is provided with a photoelectric conversion module that converts an optical signal into an electric signal. The control signal is converted into an electric signal by the photoelectric conversion module, and then provided to the camera head controller 5015.
Note that the above-described imaging conditions such as the frame rate, the exposure value, the magnification, and the focus are automatically set by the controller 5063 of the CCU 5039 on the basis of an acquired image signal. That is, the endoscope 5001 has a so-called auto exposure (AE) function, an auto focus (AF) function, and an auto white balance (AWB) function.
The camera head controller 5015 controls the driving of the camera head 5005 on the basis of the control signal from the CCU 5039 received via the communication unit 5013. For example, the camera head controller 5015 controls driving of the imaging element of the imaging unit 5009 on the basis of information for specifying a frame rate of a captured image and/or information for specifying exposure at the time of imaging. Furthermore, for example, the camera head controller 5015 appropriately moves the zoom lens and the focus lens of the lens unit 5007 via the driving unit 5011 on the basis of information for specifying a magnification and a focus of a captured image. The camera head controller 5015 may further include a function of storing information for recognizing the lens barrel 5003 and the camera head 5005.
Note that, by arranging the configurations of the lens unit 5007, the imaging unit 5009, and the like in a hermetically sealed structure having high airtightness and waterproofness, the camera head 5005 can have resistance to autoclave sterilization.
Next, the functional configuration of the CCU 5039 will be described. The communication unit 5059 is constituted by a communication device for transmitting and receiving various types of information to and from the camera head 5005. The communication unit 5059 receives an image signal transmitted from the camera head 5005 via the transmission cable 5065. At this time, as described above, the image signal can be suitably transmitted by optical communication. In this case, to support optical communication, the communication unit 5059 is provided with a photoelectric conversion module that converts an optical signal into an electric signal. The communication unit 5059 provides the image processing unit 5061 with the image signal converted into an electric signal.
Furthermore, the communication unit 5059 transmits a control signal for controlling the driving of the camera head 5005 to the camera head 5005. The control signal may also be transmitted by optical communication.
The image processing unit 5061 performs various types of image processing on an image signal that is raw data transmitted from the camera head 5005. Examples of the image processing include various types of known signal processing such as development processing, high image quality processing (such as band emphasis processing, super-resolution processing, noise reduction (NR) processing, and/or camera shake correction processing), and/or enlargement processing (electronic zoom processing). Furthermore, the image processing unit 5061 performs detection processing on the image signal for performing AE, AF, and AWB.
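Among the development processing mentioned above, demosaic processing reconstructs a color image from the single-sensor color mosaic produced by a Bayer-array imaging element. As a minimal non-limiting sketch, a half-resolution development of an RGGB pattern can be written as follows (pattern layout and names are assumptions for the sketch):

```python
import numpy as np

def demosaic_rggb(raw):
    """Half-resolution development of an RGGB Bayer mosaic:
    each 2x2 block (R G / G B) yields one RGB pixel, greens averaged."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2].astype(float) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)
```

Practical development processing would instead interpolate to full resolution and apply color correction, but the channel-separation step is the same in principle.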
The image processing unit 5061 is constituted by a processor such as a CPU or a GPU, and the image processing and detection processing described above can be performed by the processor operating in accordance with a predetermined program. Note that, in a case where the image processing unit 5061 is constituted by a plurality of GPUs, the image processing unit 5061 appropriately divides information related to the image signal, and image processing is performed in parallel by the plurality of GPUs.
The controller 5063 performs various controls related to capturing of an image of a surgical site by the endoscope 5001 and display of the captured image. For example, the controller 5063 generates a control signal for controlling the driving of the camera head 5005. At this time, in a case where imaging conditions have been input by a user, the controller 5063 generates a control signal on the basis of the input by the user. Alternatively, in a case where the endoscope 5001 has an AE function, an AF function, and an AWB function, the controller 5063 appropriately calculates an optimal exposure value, focal length, and white balance in accordance with a result of detection processing performed by the image processing unit 5061, and generates a control signal.
Furthermore, the controller 5063 causes the display device 5041 to display an image of a surgical site on the basis of an image signal on which the image processing unit 5061 has performed image processing. At this time, the controller 5063 uses various image recognition technologies to recognize various objects in the image of the surgical site. For example, the controller 5063 can recognize a surgical tool such as forceps, a specific living body site, bleeding, mist at the time of using the energy treatment tool 5021, and the like by detecting a shape, color, and the like of an edge of an object in the image of the surgical site. When displaying the image of the surgical site on the display device 5041, the controller 5063 superimposes various types of surgery support information on the image of the surgical site using results of the recognition. By superimposing the surgery support information and presenting it to the operator 5067, surgery can be performed more safely and reliably.
The transmission cable 5065 connecting the camera head 5005 and the CCU 5039 is an electric signal cable that supports electric signal communication, an optical fiber cable that supports optical communication, or a composite cable thereof.
Here, in the illustrated example, wired communication is performed using the transmission cable 5065, but wireless communication may be performed between the camera head 5005 and the CCU 5039. In a case where wireless communication is performed between the two, the transmission cable 5065 does not need to be laid in the operating room. This may resolve a situation in which movement of medical staff in the operating room is hindered by the transmission cable 5065.
The example of the endoscopic surgery system 5000 to which the technology according to the present disclosure can be applied has been described above. Note that, although the endoscopic surgery system 5000 has been described as an example here, systems to which the technology according to the present disclosure can be applied are not limited to such an example. For example, the technology according to the present disclosure may be applied to a flexible endoscope system for inspection or a microscopic surgery system.
The technology according to the present disclosure can be applied to, for example, the camera head 5005 among the configurations described above.
[Configuration of Present Disclosure]
Note that the present disclosure may also have the following configurations.
[A1]
A distance measuring system including:
a light source unit that emits infrared light toward a target object;
a light receiving unit that receives the infrared light from the target object; and
an arithmetic processing unit that obtains information regarding a distance to the target object on the basis of data from the light receiving unit,
in which an optical member including a bandpass filter that is selectively transparent to infrared light in a predetermined wavelength range is arranged on a light receiving surface side of the light receiving unit, and
the bandpass filter has a concave-shaped light incident surface.
[A2]
The distance measuring system according to [A1], in which
the optical member includes a lens arranged on a light incident surface side of the bandpass filter, and
an incident angle of light at a maximum image height with respect to the light incident surface of the bandpass filter is 10 degrees or less.
[A3]
The distance measuring system according to [A1] or [A2], in which
a transmission band of the bandpass filter has a half-width of 50 nm or less.
[A4]
The distance measuring system according to any one of [A1] to [A3], in which
the bandpass filter includes
a first filter that is transparent to light in a predetermined wavelength range of infrared light, and
a second filter that is non-transparent to visible light and transparent to infrared light.
[A5]
The distance measuring system according to [A4], in which
the first filter and the second filter are stacked and formed on one side of a base material.
[A6]
The distance measuring system according to [A4], in which
the first filter is formed on one surface of a base material, and
the second filter is formed on another surface of the base material.
[A7]
The distance measuring system according to any one of [A4] to [A6], in which
the first filter is arranged on the light incident surface side, and
the second filter is arranged on a light receiving unit side.
[A8]
The distance measuring system according to [A7], in which
the second filter has a concave shape that conforms to the light incident surface.
[A9]
The distance measuring system according to [A7], in which
the second filter has a planar shape.
[A10]
The distance measuring system according to any one of [A4] to [A6], in which
the second filter is arranged on the light incident surface side, and
the first filter is arranged on a light receiving unit side.
[A11]
The distance measuring system according to [A10], in which
the first filter has a concave shape that conforms to the light incident surface.
[A12]
The distance measuring system according to any one of [A1] to [A11], in which
the light source unit includes an infrared laser element or an infrared light emitting diode element.
[A13]
The distance measuring system according to any one of [A1] to [A12], in which
the light source unit emits infrared light having a center wavelength of approximately 850 nm, approximately 905 nm, or approximately 940 nm.
[A14]
The distance measuring system according to any one of [A1] to [A13], in which
the arithmetic processing unit obtains distance information on the basis of a time of flight of light reflected from the target object.
[A15]
The distance measuring system according to any one of [A1] to [A13], in which
infrared light is emitted in a predetermined pattern to the target object, and
the arithmetic processing unit obtains distance information on the basis of a pattern of light reflected from the target object.
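Configurations [A14] and [A15] leave the distance computation itself abstract. As a hedged sketch only (the function names and the modulation frequency are assumptions, not part of the disclosure), the basic time-of-flight relations behind [A14] are: direct ToF measures the round-trip time of a pulse, and indirect ToF measures the phase delay of a modulated continuous wave:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def direct_tof_distance(round_trip_s: float) -> float:
    """Direct ToF: the light travels to the target and back,
    so the distance is half the round-trip path."""
    return C * round_trip_s / 2.0

def indirect_tof_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance recovered from the phase delay of a
    continuous wave modulated at mod_freq_hz (assumed parameter)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```

For example, a round-trip time of about 6.7 ns corresponds to a target roughly 1 m away; the structured-light approach of [A15] instead triangulates distance from the deformation of the projected pattern and is not sketched here.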
[B1]
A light receiving module including:
a light receiving unit that receives infrared light; and
an optical member that is arranged on a light receiving surface side of the light receiving unit and includes a bandpass filter that is selectively transparent to infrared light in a predetermined wavelength range,
in which the bandpass filter has a concave-shaped light incident surface.
[B2]
The light receiving module according to [B1], in which
the optical member includes a lens arranged on a light incident surface side of the bandpass filter.
[B3]
The light receiving module according to [B2], in which
an incident angle of light at a maximum image height with respect to the light incident surface of the bandpass filter is 10 degrees or less.
[B4]
The light receiving module according to any one of [B1] to [B3], in which
a transmission band of the bandpass filter has a half-width of 50 nm or less.
[B5]
The light receiving module according to any one of [B1] to [B4], in which
the bandpass filter includes
a first filter that is transparent to light in a predetermined wavelength range of infrared light, and
a second filter that is non-transparent to visible light and transparent to infrared light.
[B6]
The light receiving module according to [B5], in which
the first filter and the second filter are stacked and formed on one side of a base material.
[B7]
The light receiving module according to [B5], in which
the first filter is formed on one surface of a base material, and
the second filter is formed on another surface of the base material.
[B8]
The light receiving module according to any one of [B5] to [B7], in which
the first filter is arranged on the light incident surface side, and
the second filter is arranged on a light receiving unit side.
[B9]
The light receiving module according to [B8], in which
the second filter has a concave shape that conforms to the light incident surface.
[B10]
The light receiving module according to [B8], in which
the second filter has a planar shape.
[B11]
The light receiving module according to any one of [B5] to [B7], in which
the second filter is arranged on the light incident surface side, and
the first filter is arranged on a light receiving unit side.
[B12]
The light receiving module according to [B11], in which
the first filter has a concave shape that conforms to the light incident surface.
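The 10-degree incident-angle limit in [A2]/[B3] and the 50 nm half-width in [A3]/[B4] can be related through the standard blue-shift behavior of interference filters, whose center wavelength at oblique incidence follows λ(θ) = λ0·sqrt(1 − (sin θ / n_eff)²). The effective index used below is an assumed illustrative value, not one given in the disclosure:

```python
import math

def shifted_center_wavelength(lambda0_nm, theta_deg, n_eff=1.8):
    """Center wavelength of an interference bandpass filter at
    incident angle theta_deg (n_eff=1.8 is an assumed effective
    refractive index of the filter stack)."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

# At 10 degrees, a 940 nm filter blue-shifts by only a few nanometers,
# small compared with a 50 nm half-width transmission band.
shift_nm = 940.0 - shifted_center_wavelength(940.0, 10.0)
```

This is consistent with the motivation of the disclosure: keeping the incident angle small everywhere on a concave light incident surface keeps the band shift small, so the bandpass filter can be made narrow without losing target light at the periphery.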
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2018-028508 | Feb 2018 | JP | national

PCT Filing Data

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2019/006064 | 2/19/2019 | WO | 00

Related Publications

Number | Date | Country
---|---|---
20210003672 A1 | Jan 2021 | US

Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2019/001822 | Jan 2019 | US
Child | 16969465 | | US