This application is based upon and claims the benefit of priority from prior Japanese Patent Application No. 2016-178953 filed on Sep. 13, 2016 in Japan, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to object recognition methods, programs, and optical systems.
An optical machine vision system takes an image of an object using an optical system, and detects the shape and the surface profile of the object based on the image. Such a system has the advantage that it may determine the shape in a contactless manner.
However, the information that can be extracted from the image is a degenerate mixture of the shape data and the surface profile data of the object. Therefore, when the shape data is extracted, the surface profile information acts as noise. Conversely, when the surface profile data is extracted, the shape information acts as noise. Thus, in order to perform a highly accurate detection, this degeneracy needs to be resolved.
An optical system according to an embodiment includes: an irradiation device including a light source configured to emit light to an object; an imaging device including an imaging element, and configured to receive reflected light from the object, and obtain an image with respect to a first wavelength and an image with respect to a second wavelength that is different from the first wavelength; and a processing circuit configured to compare the image with respect to the first wavelength and the image with respect to the second wavelength to extract and separate a Fresnel reflection component and a scattering component included in the reflected light, and obtain at least one of a surface shape or a surface profile of the object based on the Fresnel reflection component and the scattering component.
Embodiments will now be described with reference to the accompanying drawings.
The irradiation device 20 includes a light source capable of emitting light rays with at least two wavelengths. The light rays emitted from the irradiation device 20 are visible light rays in a wavelength range of about 450 nm to about 700 nm. The types of light, however, are not limited to the foregoing, and light with electromagnetic waves, from ultraviolet light to light with microwaves, may be used.
The imaging device 30 includes an imaging element capable of taking an image with respect to light rays having at least two wavelengths. An example of the imaging element is a camera. The two wavelengths of light rays that may be used to form an image are assumed to be λ1 and λ2, with λ1 being smaller. In the first embodiment, the imaging device 30 includes a spectroscope 35 capable of separating light rays into two wavelength regions, a wavelength region Λ1 including the wavelength λ1, and a wavelength region Λ2 including the wavelength λ2. The spectroscope 35 may be disposed on a light path between the irradiation device 20 and the imaging device 30.
The processing circuit 40 processes electrical signals from the imaging device 30. The controller 50 includes a control unit 52 and a memory 54. The control unit 52 controls the irradiation device 20, the imaging device 30, and the processing circuit 40. The control unit 52 controls, for example, the irradiation device 20 with respect to the types of light rays emitted from the irradiation device 20 to the object 100, the directions of the light rays, and the irradiation timing, the imaging device 30 with respect to the imaging position (imaging angle), imaging timing, and timing for outputting the taken image, and the processing circuit 40 with respect to the image processing timing. The memory 54 stores the above control procedure. The control unit 52 controls the irradiation device 20, the imaging device 30, and the processing circuit 40 based on the control procedure stored in the memory 54.
The object 100 may be anything as long as it reflects the irradiation light. In the first embodiment, the object 100 is, for example, a fiber fabric.
The optical system 10 emits a light ray 25 to the object 100 by means of the irradiation device 20, and makes an image from a reflected light ray 27 reflected from the object 100 by means of the imaging device 30. If the fiber fabric 100 is, for example, a silk fabric, it has a macro structure called “fibroin,” and a fine structure with a diameter of several tens of nanometers, called “microfibril,” included within the macro structure. Thus, the object 100 includes a structure that is smaller than the wavelength of visible light. Such a structure causes light to be divided into a component reflected on the surface of the object 100 and a component that passes through the surface of the object 100 and is scattered and reflected by the internal structure. Generally, an object including an internal structure that is smaller than the wavelength of light causes light to be divided into a reflection component that is reflected by the surface and a scatter reflection component that is reflected and scattered by the internal structure.
The irradiation light is, for example, LED (light emitting diode) light that is substantially fully converted from excitation light by means of a fluorescent material. Such LED light generally has high color rendering properties.
The principles of the optical system according to the first embodiment will be described below.
The reflection characteristics of the object 100 are generally expressed by a bidirectional reflectance distribution function (BRDF). As shown in
R=R(λ,x,Ωi,Ωo) (1)
where λ is the wavelength of light, x is the position on the object surface, Ωi is the incident angle of the irradiation light, and Ωo is the reflection angle.
The incident angle Ωi and the reflection angle Ωo represent the solid angles of the incident light ray 25 and the reflected light ray 27, respectively, and can be expressed in a manner described below:
In a coordinate system xyz with the point of origin P located on the object 100 as shown in
The BRDF of the object 100 including an internal structure that is smaller than the wavelength λ can be expressed by the following formula (2), by adding a component of the Fresnel reflection on the surface of the object and a component of the reflection caused by the internal scattering:
RT(λ,x,Ωi,Ωo)=RFresnel(λ,x,Ωi,Ωo)+RScatter(λ,x,Ωi,Ωo) (2)
where RFresnel denotes the component of the Fresnel reflection on the surface of the object, and RScatter denotes the component of the reflection caused by the internal scattering of the object.
The Fresnel reflection component of the BRDF for regions of the same material is determined by the refractive index of the material. The refractive index of a common material does not substantially change in the visible light region in many cases. Even a material with the refractive index that is dependent on the wavelength has a wavelength region in which the refractive index is constant. This wavelength region is set to be a detection wavelength region. With respect to the detection wavelength region, the formula (2) can be rewritten as the following formula (3).
RT(λ,x,Ωi,Ωo)=RFresnel(x,Ωi,Ωo)+RScatter(λ,x,Ωi,Ωo) (3)
The first term on the right side of the formula (3), the Fresnel reflection component RFresnel, is not dependent on the wavelength. On the other hand, the second term on the right side of the formula (3), the scattering component RScatter, is dependent on the wavelength. Therefore, with respect to the two wavelengths λ1 and λ2, the following two formulas (3a) and (3b) can be obtained from the formula (3):
RT(λ1,x,Ωi,Ωo)=RFresnel(x,Ωi,Ωo)+RScatter(λ1,x,Ωi,Ωo) (3a)

RT(λ2,x,Ωi,Ωo)=RFresnel(x,Ωi,Ωo)+RScatter(λ2,x,Ωi,Ωo) (3b)
Thus, the reflectance is expressed by the Fresnel reflection component RFresnel obtained from the Fresnel reflection occurring on the object surface, and the scattering component RScatter. The dependence of the Fresnel reflection component RFresnel on the wavelength is small. If the object includes a structure having a size smaller than the wavelength of the incident light, the scattering component RScatter is significantly dependent on the wavelength.
The scattering component is generated by “microfibril,” which is a nanostructure of the fiber fabric. The regular reflection component is the Fresnel reflection component generated by surface reflection on a macro structure called “fibroin” of the fiber fabric. The Fresnel reflection is dependent on the refractive index of the material. For many materials, the dependence of the refractive index on the wavelength is small in the visible light region. Therefore, the Fresnel reflection has a small dependence on the wavelength, and is substantially constant with respect to the wavelength.
In this embodiment, the wavelength region in which the Fresnel reflection of an object is constant is defined as a detection region ΛO. The Fresnel reflection component and other components of any object can be extracted from the BRDF using the aforementioned method if the detection is performed in the detection region ΛO. This extraction is performed by the processing circuit 40.
The following formula (4) can be obtained from the difference between the formula (3a) and the formula (3b).
RScatter(λ1,x,Ωi,Ωo)=RScatter(λ2,x,Ωi,Ωo)+RT(λ1,x,Ωi,Ωo)−RT(λ2,x,Ωi,Ωo) (4)
As can be understood from the formula (4), only the scattering component can be extracted from the observable quantity RT. This extraction is also performed by the processing circuit 40.
The Fresnel reflection component can also be extracted by substituting the formula (4) into the formula (3a). Thus, the Fresnel reflection component and the scattering component can be separated from each other. The separation is also performed by the processing circuit 40.
Assuming that a typical scale of the finest structure of the object is L in the formula (4), and L has the following relationship with the wavelength λ2 of the light,
L≪λ2 (5)
substantially no scattering occurs with respect to light having a wavelength λ2. Therefore, the following formula holds:
RScatter(λ2,x,Ωi,Ωo)≅0 (6)
At this time, the following formula (7) can be obtained from the formula (4):
RScatter(λ1,x,Ωi,Ωo)=RT(λ1,x,Ωi,Ωo)−RT(λ2,x,Ωi,Ωo) (7)
Thus, the scattering component with respect to the wavelength λ1 is completely determined. Therefore, by setting the wavelength λ2 to be sufficiently large, the scattering component and the Fresnel reflection component with respect to the wavelength λ1 can be completely determined.
As described above, the scattering component and the Fresnel reflection component can be obtained by processing, by the processing circuit 40, an image taken by the imaging device 30.
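The extraction of the two components from the two single-wavelength images can be sketched in code. The following is a minimal illustration, assuming per-pixel total-reflectance images at the two wavelengths have already been registered and normalized; the function name is hypothetical.

```python
import numpy as np

def separate_components(r_t_l1, r_t_l2):
    """Separate reflected light into a scattering component and a
    Fresnel reflection component from two single-wavelength images.

    r_t_l1, r_t_l2: per-pixel total reflectance at wavelengths
    lambda1 and lambda2 (lambda1 < lambda2), as float arrays.

    Assumes, per formula (6), that the finest internal structure is
    much smaller than lambda2, so RScatter(lambda2) is about zero.
    """
    r_t_l1 = np.asarray(r_t_l1, dtype=float)
    r_t_l2 = np.asarray(r_t_l2, dtype=float)
    # Formula (7): the scattering component at lambda1 is the
    # difference of the two images.
    r_scatter_l1 = r_t_l1 - r_t_l2
    # Formula (3b) with RScatter(lambda2) ~ 0: the lambda2 image is
    # essentially the wavelength-independent Fresnel component.
    r_fresnel = r_t_l2
    return r_scatter_l1, r_fresnel
```

The subtraction is performed per pixel, so the two images must be geometrically aligned before this step.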
The Fresnel reflection component is generated by a reflection on an object surface, and thus has information on the surface shape of the object. Therefore, the shape of the object may be reconstructed from the Fresnel reflection component of the taken image. If the angle of light incident on the object is known, the reflection angle may be calculated from the Fresnel reflection component of the taken image. As a result, the normal direction on the object surface can be estimated, and thus the surface shape of the object can be reconstructed.
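The normal estimation described above can be sketched as follows: for a mirror-like (Fresnel) reflection, the surface normal bisects the incident and reflected ray directions. A minimal illustration, assuming both directions are known as 3D vectors (the helper name is hypothetical):

```python
import numpy as np

def surface_normal(incident_dir, reflected_dir):
    """Estimate the surface normal at a point from the Fresnel
    (mirror) reflection geometry: the normal is the half-vector
    between the incoming and outgoing rays.

    incident_dir: vector pointing from the light source toward the point.
    reflected_dir: vector pointing from the point toward the camera.
    """
    # Flip the incident direction so both vectors point away from
    # the surface, then take the normalized sum (the bisector).
    i = -np.asarray(incident_dir, dtype=float)
    o = np.asarray(reflected_dir, dtype=float)
    n = i / np.linalg.norm(i) + o / np.linalg.norm(o)
    return n / np.linalg.norm(n)
```

Applying this at every pixel where the Fresnel component dominates yields a normal map from which the surface shape can be reconstructed.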
The scattering component is generated by a fine internal structure near the surface of the object, the structure having a smaller scale than the wavelength of light. If the object has no such structure, the scattering component does not depend on the wavelength, and the scattering component measured by the extraction method described above becomes substantially zero. Therefore, whether there is a fine internal structure in the object may be determined from the value of the scattering component. A typical scale L of the finest internal structure of the object has the following relationship with respect to the wavelength λ of light:
L<<λ (5a)
If the scattering component is extracted by the above method, data on the fine internal structure may be extracted by the processing circuit 40 due to the dependence of the scattering component on the wavelength, the irradiation angle, or the reflection angle.
Next, irradiation light will be described. Assuming that the amount of irradiation light proportional to the spectrum (hereinafter also referred to as the spectrum) is P(λ), the pixel value O of the image taken by the imaging device 30 may be expressed as follows:

O(λ,x,Ωi,Ωo)=P(λ)R(λ,x,Ωi,Ωo) (8)
If the spectrum P(λ) is known, the BRDF of the object may be expressed as:

R(λ,x,Ωi,Ωo)=O(λ,x,Ωi,Ωo)/P(λ) (9)
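This normalization, rearranging formula (8) to recover the reflectance from pixel values, can be sketched minimally as follows, assuming the pixel values have already been linearized and the spectrum value at the imaging wavelength is known (the function name is hypothetical):

```python
import numpy as np

def reflectance_from_pixels(pixel_image, spectrum_value):
    """Formula (9): with a known irradiation spectrum P(lambda),
    the reflectance is the pixel value divided by it, R = O / P."""
    return np.asarray(pixel_image, dtype=float) / float(spectrum_value)
```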
An LED spectrum generally includes an excitation light component and a conversion light component obtained from the conversion by a fluorescent material.
P(λ)=Pexcitation(λ)+Pconversion(λ) (10)
Therefore, the shape of the spectrum may change due to variations in temperature, and the reflectance obtained from the formula (9) changes accordingly.
This means that an error may be caused by variations in temperature.
On the other hand, if the excitation light is entirely absorbed by a fluorescent material, the spectrum of the LED light only includes a conversion light component from the fluorescent material, like the high color-rendering LED spectrum shown in
P(λ)=(1+δ)Pconversion(λ) (12)
The value δ is not dependent on the wavelength. Therefore, the relative shape of the spectrum does not change. In this case, the reflectance obtained from the formula (9) is merely multiplied by the constant factor (1+δ).
Thus, variations in temperature do not affect the shape of the spectrum.
As described above, the first embodiment is capable of obtaining a scattering component and a Fresnel reflection component of light reflected from an object, using light rays with two or more wavelengths. As a result, the shape or the surface profile of an object may be accurately detected.
In the optical system 10 according to the first embodiment, the mirror reflection or the Fresnel reflection occurs on the surface of an object, and scattering occurs inside the object. However, depending on the type of the object, diffuse reflection occurs on the surface of the object. The optical system 10 according to the first embodiment may also be used for such an object. This will be described as a second embodiment. The optical system according to the second embodiment has the same structure as the optical system according to the first embodiment, but the incident direction of light emitted from the irradiation device 20 is changed.
When the diffuse reflection occurs on the surface of the object, the formulas (3a) and (3b) change as follows:
RT(λ1,x,Ωi,Ωo)=RFresnel(x,Ωi,Ωo)+RDiffuse(x,Ωi,Ωo)+RScatter(λ1,x,Ωi,Ωo) (14a)

RT(λ2,x,Ωi,Ωo)=RFresnel(x,Ωi,Ωo)+RDiffuse(x,Ωi,Ωo)+RScatter(λ2,x,Ωi,Ωo) (14b)
The difference between the formulas (14a) and (14b) is as follows:
RScatter(λ1,x,Ωi,Ωo)=RScatter(λ2,x,Ωi,Ωo)+RT(λ1,x,Ωi,Ωo)−RT(λ2,x,Ωi,Ωo) (15)
Therefore, like the first embodiment, only the scattering component may be extracted from the observable quantity RT. For example, if the scattering component with respect to the wavelength λ1 is calculated, the following formula (16) can be obtained from the formula (14a):
RFresnel(x,Ωi,Ωo)+RDiffuse(x,Ωi,Ωo)=RT(λ1,x,Ωi,Ωo)−RScatter(λ1,x,Ωi,Ωo) (16)
Since the right side of the formula (16) is known, the left side of the formula (16) can be calculated.
However, the Fresnel reflection component RFresnel and the diffuse reflection component RDiffuse in the formula (16) are degenerate. In order to deal with this, the irradiation direction or the reflection direction of light is changed by controlling the irradiation device 20 by means of the controller 50, and an image after this change is detected by the imaging device 30. Thereafter, the BRDF is obtained by the processing circuit 40 based on the detected images, and the sum of the Fresnel reflection component and the diffuse reflection component is calculated by the processing circuit 40 using the formula (16). The Fresnel reflection component RFresnel is completely described once the refractive index is given; thus, only the refractive index n is unknown. Therefore, using the Fresnel reflection formula with the refractive index n as an unknown parameter, the refractive index that best matches the actually measured values is estimated, and the degeneracy of the formula (16) is resolved.
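The one-parameter fit described above can be sketched with the standard unpolarized Fresnel reflectance formula for light entering a medium from air. The grid search below is an illustrative choice rather than the embodiment's prescribed estimator, and all names are hypothetical.

```python
import numpy as np

def fresnel_reflectance(theta_i, n):
    """Unpolarized Fresnel reflectance at incidence angle theta_i
    (radians) for light entering a medium of refractive index n
    from air (average of the s- and p-polarized reflectances)."""
    cos_i = np.cos(theta_i)
    sin_t = np.sin(theta_i) / n          # Snell's law
    cos_t = np.sqrt(1.0 - sin_t ** 2)
    r_s = ((cos_i - n * cos_t) / (cos_i + n * cos_t)) ** 2
    r_p = ((n * cos_i - cos_t) / (n * cos_i + cos_t)) ** 2
    return 0.5 * (r_s + r_p)

def estimate_refractive_index(angles, measured,
                              candidates=np.linspace(1.01, 2.5, 300)):
    """Grid-search the refractive index n whose Fresnel curve best
    matches reflectance measured at several incidence angles
    (least-squares error)."""
    angles = np.asarray(angles, dtype=float)
    errors = [np.sum((fresnel_reflectance(angles, n) - measured) ** 2)
              for n in candidates]
    return candidates[int(np.argmin(errors))]
```

A continuous optimizer could replace the grid search; the point is only that a single unknown parameter n suffices to pin down RFresnel.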
Thus, the optical system according to the second embodiment can be applied to the case where not only the Fresnel reflection but also the diffuse reflection occurs on the surface of an object.
The second embodiment is capable of obtaining a scattering component and a Fresnel reflection component of reflected light from an object using light rays with two or more wavelengths, like the first embodiment. As a result, the shape or the surface profile of an object may be accurately detected.
An object recognition method according to a third embodiment will be described with reference to
Each camera 32 is capable of separating received light into light rays with at least two wavelengths. For example, the received light may be separated into R (red), G (green), and B (blue) wavelength regions. The peak wavelengths of the R, G, and B wavelength regions are, for example, 680 nm, 550 nm, and 450 nm, respectively.
As shown in
If the irradiation angle of the irradiation light is known, the normal directions 29 of the object 100 can be calculated by the processing circuit 40 since the Fresnel reflection angle is equal to the irradiation angle. Therefore, the normal direction of a point on the object 100 to which light is emitted can be calculated. The shape of the object 100 may be obtained by calculating the normal direction at each of predefined points on the surface of the object 100. The shape of the object 100 can be reconstructed in this manner. Since calculating the normal direction at every point on the object 100 is computationally expensive and time-consuming, the surface of the object 100 may instead be divided into small mesh portions, and a normal direction may be obtained for each portion.
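The mesh-based simplification can be sketched as block-averaging a per-pixel normal map; a minimal illustration assuming the image dimensions are divisible by the patch size (the function name is hypothetical):

```python
import numpy as np

def patch_normals(normal_map, patch=8):
    """Average per-pixel unit normals over patch x patch mesh cells,
    as a cheaper alternative to keeping a normal at every point.

    normal_map: H x W x 3 array of unit normals; H and W are assumed
    divisible by `patch` in this sketch.
    """
    h, w, _ = normal_map.shape
    # Split the image into (h/patch) x (w/patch) cells and average
    # the normals inside each cell, then renormalize to unit length.
    cells = normal_map.reshape(h // patch, patch, w // patch, patch, 3)
    mean = cells.mean(axis=(1, 3))
    return mean / np.linalg.norm(mean, axis=-1, keepdims=True)
```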
When the Fresnel reflection component is calculated, the scattering component can be extracted by the processing circuit 40 using the method described in the descriptions of the first embodiment. When the scattering component is extracted, information on the nanostructure within the object 100 may be extracted. For example, whether the object 100 is a fiber fabric may be determined by comparing the extracted scattering component with the scattering component with respect to the fiber fabric. Thus, the material of the object 100 may be estimated. As a result, the hardness, the elasticity, the weight, the density, and so on of the object may be estimated.
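The comparison with a known signature such as that of a fiber fabric can be sketched as a normalized-correlation match; the vector signature representation and the threshold below are illustrative assumptions, not the embodiment's prescribed method.

```python
import numpy as np

def match_material(scatter_signature, reference_signatures, threshold=0.95):
    """Compare an extracted scattering component against reference
    signatures (e.g. a known fiber fabric) by normalized correlation.

    Returns the name of the best-matching material whose correlation
    exceeds `threshold`, or None if nothing matches well enough.
    """
    s = np.asarray(scatter_signature, dtype=float).ravel()
    s = s / np.linalg.norm(s)
    best_name, best_score = None, threshold
    for name, ref in reference_signatures.items():
        r = np.asarray(ref, dtype=float).ravel()
        r = r / np.linalg.norm(r)
        score = float(np.dot(s, r))
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Once a material is identified this way, tabulated properties such as hardness, elasticity, and density can be looked up for it.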
The object recognition method according to the third embodiment will be described with reference to
First, the control unit 52 controls the irradiation device to emit light to the object 100 (step S1 in
The processing circuit 40 extracts the scattering component based on the extracted Fresnel reflection component (step S5). The processing circuit 40 then calculates the surface shape of the object based on the extracted Fresnel reflection component (step S6), and extracts the surface profile of the object based on the extracted scattering component (step S7).
The automatic apparatus is controlled based on the surface shape or the surface profile of the object thus obtained. The above procedure is stored in the memory of the controller 50 shown in
As described above, the shape and the surface profile of the object may be determined by the object recognition method according to the third embodiment. The object recognition method according to the third embodiment is useful for automatic apparatuses that grab things.
The third embodiment is capable of obtaining a scattering component and a Fresnel reflection component of reflected light from an object using light rays with two or more wavelengths, like the first embodiment. As a result, the shape or the surface profile of an object may be accurately detected.
An object recognition method according to a fourth embodiment will be described with reference to
In the fourth embodiment, the light sources 22 emit light rays to the object 100 with time intervals. The Fresnel reflection component and the scattering component are extracted for the light ray emitted from each light source 22, using the method described in the descriptions of the first embodiment. As a result, the object shape may be obtained from the normal direction of the object 100, and the object surface profile may be reconstructed from the scattering component.
The fourth embodiment is capable of accurately detecting the shape or the surface profile of an object, like the first embodiment.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.