The present application claims foreign priority based on Japanese Patent Application No. 2018-223947, filed Nov. 29, 2018, the contents of which are incorporated herein by reference.
The present invention relates to a magnifying observation apparatus.
Generally, optical systems having high optical performance are required for microscopes to observe an observation target with high precision. In addition, microscopes that display an observation image on a display device in real time have been developed, and microscope manufacturers have competed to display an observation target accurately and in fine detail, exactly as it is.
On the other hand, the performance of image processing chips has improved to the point that they can be adopted for microscopes. In addition, microscopes are now used not only by researchers in biology but also by inspectors who observe, under magnification, products manufactured in factories.
Accordingly, market needs have developed for microscopes that differ from conventional ones pursuing high optical performance. Such microscopes are often referred to as magnifying observation apparatuses to distinguish them from conventional microscopes (see JP-A-2018-013734).
Incidentally, there is a need to observe roughness on the surfaces of products manufactured in factories. The problem here is the depth of field of a lens. When the difference in height of the roughness exceeds the depth of field, the roughness becomes blurred and the user cannot recognize it accurately. Accordingly, an object of the invention is to provide a roughness enhancement image produced through depth synthesis.
The invention provides a magnifying observation apparatus including, for example, an optical system that includes an objective lens and an imaging lens; an illumination section that illuminates, from different directions, an observation target placed in a visual field of the optical system with illumination light; an imaging section that receives light from the observation target via the optical system and generates a luminance image of the observation target; a change section that changes a focal position of the optical system along an optical axis of the optical system; a control section that controls the illumination section, the imaging section, and the change section; and a display section that displays an observation image that is an image of the observation target, in which the control section has an image generating section that obtains a plurality of first luminance images by controlling the illumination section so as to illuminate the observation target with the illumination light from a first illumination direction and controlling the change section and the imaging section so as to image the observation target in a plurality of different focal positions, obtains a plurality of second luminance images by controlling the illumination section so as to illuminate the observation target with the illumination light from a second illumination direction symmetric with the first illumination direction about the optical axis and controlling the change section and the imaging section so as to image the observation target in a plurality of different focal positions, and generates a roughness enhancement image that enhances roughness on a surface of the observation target by applying depth synthesis and roughness enhancement to the plurality of first luminance images and the plurality of second luminance images and that has a depth of field wider than that of a single luminance image obtainable by the imaging section, and in which the display section displays the roughness enhancement image as the observation image.
According to the invention, there is provided a roughness enhancement image produced through depth synthesis.
An observation section 1 has a base section 10, a stand section 20, a head section 22, and a placement table 30. The base section 10 is used to place the observation section 1 on a desk or the like without shaking the observation section 1 and forms substantially the lower half of the observation section 1. The base section 10 is provided with the placement table 30. The placement table 30 is supported by the portion of the base section 10 extending from the vicinity of the middle portion in the front-rear direction to the front, and projects upward from the base section 10. The placement table 30 is a portion on which the observation target is placed and is formed as an electric placement table in the embodiment. That is, the observation target can be supported movably in both a width direction (X-direction) and a depth direction (Y-direction) of the electric placement table and rotatably about the Z-axis extending in the up-down direction (Z-direction). The stand section 20 is swingable with respect to the base section 10. For example, the stand section 20 is swingable clockwise or counterclockwise when the observation section 1 is seen from the front. When the stand section 20 swings, the head section 22 also swings. The stand section 20 and the head section 22 are attached movably in the Z-axis direction. The head section 22 has an objective lens, an imaging lens, an illumination apparatus, an imaging element, and the like. The head section 22 illuminates the observation target placed on the placement table 30 with illumination light, detects the received amount of the reflected light or transmitted light of the illumination light from the observation target, and generates an image of the observation target. It should be noted here that details on the structure and the function of the observation section 1 are disclosed in Japanese Patent Application No. 2018-161347 filed by the same applicant as this specification.
The entire contents of the disclosure are incorporated herein by reference as a part of this specification.
A display section 2 has a display screen 2a enabling color display, such as, for example, a liquid crystal display panel or an organic EL panel, and receives electric power from the outside. The display screen 2a may have a built-in touch operation panel (an example of a reception section). Although the control section 60 is built into the display section 2 in the embodiment as an example, the present invention is not limited thereto; the control section 60 may be built into the observation section 1 or a console section 3, or the control section 60 may be an external unit separated from the display section 2, the observation section 1, and the console section 3. The display section 2 is connected to the observation section 1 via a cable 5 so that signals can be received or transmitted. Supply of electric power to the observation section 1 may be performed via the cable 5 or a power supply cable (not illustrated).
The console section 3 is connected to the control section 60. Unlike a general keyboard or mouse, the console section 3 is a special operation device capable of operating the observation section 1, inputting or selecting various types of information, selecting an image, specifying a region, or specifying a position. A mouse 4 as a pointing device is connected to the control section 60. The console section 3 and the mouse 4 may be replaced with, for example, a touch panel input device, an audio input device, or the like as long as the magnifying observation apparatus 100 can be operated. A touch panel input device may be integrated with the display section 2 and may be configured to detect an arbitrary position on the display screen displayed on the display section 2. The console section 3 and the mouse 4 are reception sections that receive an input of an arbitrary position specified by the user in an image displayed on the display section 2.
An apparatus for performing operation and control, a printer, a computer for performing various other processes, a storage device, a peripheral device, and the like can be connected to the magnifying observation apparatus 100 in addition to the apparatuses and devices described above. The connection used here may be a serial connection such as, for example, IEEE1394, RS-232x, RS-422, or USB, a parallel connection, or an electric, magnetic, or optical connection via a network such as 10BASE-T, 100BASE-TX, or 1000BASE-T. In addition to a wired connection, a wireless connection using a wireless LAN such as IEEE802.x, radio waves such as Bluetooth (registered trademark), an infrared ray, or optical communication may be used. In addition, for example, various memory cards, a magnetic disk, a magneto-optical disk, a semiconductor memory, a hard disk, or the like can be used as a storage medium used as a storage device for data exchange or storage of various settings. The magnifying observation apparatus 100 can be a magnifying observation system in which the various units, apparatuses, and devices described above are combined with each other.
As illustrated in
As illustrated in
Ixy = (i1 − i2)/(i1 + i2)   (1)
The pixel value of the roughness enhancement image is obtained by dividing the difference between a pair of pixel values by the sum of the pair of pixel values. As long as the difference between the pixel values is normalized, an expression other than expression (1) may be adopted.
The difference between the luminance value i1 and the luminance value i2, which is the numerator of expression (1), reflects the inclination (gradient) of the local position of the observation target W corresponding to the pixel. The roughness is enhanced according to the inclination of the surface of the observation target W.
The absolute value of a luminance value depends on the reflectivity of the material and the color of the observation target W. Normalization is performed by dividing the numerator by the denominator, which is the sum of the luminance value i1 and the luminance value i2, so that the enhanced roughness component does not depend on these factors.
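Expression (1) can be sketched in a few lines of Python. The function name is illustrative and the images are held as plain nested lists; a small `eps` term, an assumption not in the specification, guards against division by zero in completely dark pixels:

```python
def roughness_enhance(img1, img2, eps=1e-9):
    """Per-pixel normalized difference Ixy = (i1 - i2)/(i1 + i2).

    img1, img2: 2-D lists of luminance values captured under the first
    and second (symmetric) illumination directions.
    """
    return [[(i1 - i2) / (i1 + i2 + eps)
             for i1, i2 in zip(row1, row2)]
            for row1, row2 in zip(img1, img2)]

# A surface tilted toward the first light source reflects more of it
# (i1 > i2). Normalization cancels reflectivity: a bright and a dark
# material with the same inclination give the same value.
bright = roughness_enhance([[200.0]], [[100.0]])
dark = roughness_enhance([[20.0]], [[10.0]])
```

Here `bright[0][0]` and `dark[0][0]` are both approximately 1/3, illustrating why the normalized difference reflects inclination rather than material reflectivity.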
A coloring section 36 colorizes the roughness enhancement image by obtaining color information from a color image of the observation target W obtained by the imaging section 25. A height image generating section 35 obtains the height of the surface of the observation target W for each of the pixels of the roughness enhancement image by integrating the pixels and generates a height image in which the height is represented by the pixels. The height image generating section 35 may generate a color height image by coloring the pixels of the height image according to the height of the pixels.
As described above, in the roughness enhancement image, the color components of the image of the observation target W are removed. Lack of color components makes recognition of roughness difficult. Accordingly, the coloring section 36 performs coloring.
Although the state of the roughness of the observation target W cannot be grasped in
The roughness enhancement image can be obtained by obtaining the luminance value, which depends on the local surface inclination of the observation target W, while switching the direction of illumination.
In
In
The roughness enhancement section 34 generates a roughness enhancement image using the first depth combined image I1a and the second depth combined image I2a. The roughness enhancement section 34 generates a roughness enhancement image based on the difference in luminance between the first depth combined image I1a and the second depth combined image I2a.
It should be noted here that the observation target W for which a shadow SH is generated is used in
Roughness is grasped more easily based on a roughness enhancement image when the observation target W has minute concave and convex portions as illustrated in
Images are obtained by switching between illumination light from the first illumination direction and illumination light from the second illumination direction and a roughness enhancement image is generated by the method described above. This enhances the roughness on the surface of the observation target W.
At a point p in
As described above, a roughness enhancement image can be obtained by calculation processing that enhances a small inclination of the observation target using images having different illumination directions.
The UI section 65 displays a roughness enhancement image generated by the roughness enhancement section 34 on the display section 2. The user uses the roughness enhancement image to perform the magnifying observation of a part of the observation target W. A plurality of points of the observation target W may be subjected to magnifying observation. In this case, the user moves the placement table 30 in the X-direction or the Y-direction or rotates the placement table in the θ-direction by operating the console section 3. When a movement instruction for moving in the X-direction is input from the console section 3, the CPU 61 causes the placement table driving section 29 to move the placement table 30 in the X-direction. When a movement instruction for moving in the Y-direction is input from the console section 3, the CPU 61 causes the placement table driving section 29 to move the placement table 30 in the Y-direction. While such a movement instruction is input, the CPU 61 continuously moves the placement table 30 according to the movement instruction. When the user stops inputting a movement instruction, the CPU 61 stops the movement of the placement table 30 via the placement table driving section 29.
Since some processing time is necessary to generate a roughness enhancement image, the UI section 65 cannot display, on the display section 2, a roughness enhancement image generated by the roughness enhancement section 34 while the placement table 30 moves. Accordingly, while the placement table 30 moves, the UI section 65 displays the luminance image of the observation target W generated by the luminance image generating section 31 on the display section 2. The user can then check whether the next observation portion is present within the visual field range of the imaging section 25 by checking the luminance image displayed on the display section 2. The UI section 65 displays the luminance images of the observation target W like a moving picture on the display section 2 by updating the luminance image of the observation target W at certain time intervals. When detecting that the input of a movement instruction has stopped (the placement table 30 is stopped), the CPU 61 instructs the image processor 66 to generate a roughness enhancement image. This performs again the acquisition of luminance images, the depth synthesis, and the roughness enhancement described above, and displays a roughness enhancement image at the new observation position on the display section 2. Even when the user does not explicitly instruct the regeneration of a roughness enhancement image, the roughness enhancement image is regenerated when the placement table 30 stops. Accordingly, the user can observe a plurality of observation portions efficiently.
Pixels included in a roughness enhancement image represent roughness on the surface of the observation target W and do not include color information of the surface. Therefore, the user cannot easily grasp the relationship between the color and the roughness position of the observation portion. Accordingly, the coloring section 36 causes the imaging section 25 to image the observation target W by turning on all of the light source regions 140A to 140D of the ring illumination device 26. This causes the luminance image generating section 31 to generate a color image (luminance image) of the observation target W. The coloring section 36 obtains color information from the color image, colorizes the roughness enhancement image by mapping the color information to the roughness enhancement image, and displays the roughness enhancement image on the display section 2. This makes the user easily grasp the relationship between the color and the roughness position of the observation portion.
When the surface of the observation target W is made of metal, overexposed white pixels or underexposed black pixels may be generated in a luminance image. In such a case, roughness cannot be recognized easily in the finally generated roughness enhancement image. Accordingly, the HDR process section 32 may be adopted. The HDR process section 32 obtains a plurality of luminance sub-images having different exposure times in one focal position and generates one luminance image by applying an HDR process to the plurality of luminance sub-images. The HDR process section 32 obtains the plurality of luminance sub-images having different exposure times each time the focal position is changed and generates one luminance image by applying an HDR process to the plurality of luminance sub-images. With this, both the first luminance images I11 to I1n and the second luminance images I21 to I2n become images subjected to an HDR process. Accordingly, the depth synthesis section 33 generates the first depth image I1a by applying depth synthesis to the first luminance images I11 to I1n subjected to an HDR process and generates the second depth image I2a by applying depth synthesis to the second luminance images I21 to I2n subjected to an HDR process. In addition, the roughness enhancement section 34 generates a roughness enhancement image subjected to an HDR process by combining the first depth image I1a subjected to an HDR process and the second depth image I2a subjected to an HDR process. Accordingly, overexposed whites and underexposed blacks are not generated easily, so the user can grasp roughness on the observation target W more accurately by checking the roughness enhancement image.
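The specification does not fix the HDR algorithm of the HDR process section 32. As one illustrative sketch only, sub-images taken with different exposure times can be merged by excluding overexposed and underexposed pixels and averaging the remaining pixels after dividing each by its exposure time; the function name and the thresholds are assumptions:

```python
def hdr_merge(sub_images, exposure_times, low=5, high=250):
    """Merge luminance sub-images (2-D lists, values 0-255) captured at
    different exposure times into one radiance-like luminance image.

    Pixels near saturation (>= high) or near black (<= low) get zero
    weight; well-exposed pixels are exposure-normalized and averaged.
    """
    h, w = len(sub_images[0]), len(sub_images[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for img, t in zip(sub_images, exposure_times):
                v = img[y][x]
                if low < v < high:  # trust only well-exposed pixels
                    num += v / t
                    den += 1.0
            out[y][x] = num / den if den else 0.0
    return out

# The long exposure saturates (255) and is ignored; the short exposure
# carries the usable luminance for this pixel.
merged = hdr_merge([[[255.0]], [[100.0]]], [4.0, 1.0])
```

A real implementation would also weight pixels smoothly rather than with a hard cutoff; the hard thresholds here keep the sketch short.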
The roughness enhancement image described above does not include height information of the surface of the observation target W. Therefore, crater illusion occurs. Crater illusion is a phenomenon in which concave portions and convex portions are erroneously recognized by the observer because concave shapes cannot be distinguished from convex shapes based on an image. Accordingly, the height image generating section 35 may calculate the height of the surface of the observation target W for each of the pixels of the roughness enhancement image by integrating the pixels, generate a height image in which the height is represented by the pixels, and display the height image on the display section 2. In addition, the height image generating section 35 may convert the height data of the pixels of the height image to color information, map the color information to the roughness enhancement image, generate a roughness enhancement image colored differently according to the height, and display the roughness enhancement image on the display section 2. This makes it easy for the user to distinguish concave shapes from convex shapes on the surface of the observation target W based on the color information.
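Since each roughness enhancement value reflects the local inclination along the illumination axis, a relative height profile can be recovered by a cumulative sum (discrete integration) of those values along that axis. This is a simplified sketch of the integration described for the height image generating section 35; the scale factor relating the normalized gradient to physical height is omitted as an assumption:

```python
def integrate_height(grad_img):
    """Integrate per-pixel inclination values along each row to obtain
    a relative height image (arbitrary units, row start = 0)."""
    heights = []
    for row in grad_img:
        h, acc = [], 0.0
        for g in row:
            acc += g  # running sum approximates the line integral
            h.append(acc)
        heights.append(h)
    return heights

# A constant positive inclination integrates to a linearly rising
# surface, i.e. a ramp.
ramp = integrate_height([[0.5, 0.5, 0.5, 0.5]])
```

In practice the integration constant per row and accumulated noise must be handled, for example by anchoring each row to a common reference; the sketch shows only the core idea.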
The description of the light source regions 140A and 140C provided with reference to
The luminance image generating section 31 illuminates the observation target W with illumination light from the third illumination direction, which is different from the first illumination direction and the second illumination direction, by controlling the ring illumination device 26. The luminance image generating section 31 obtains the plurality of third luminance images by controlling the Z-direction driving section 28 and the imaging section 25 so as to image the observation target W in a plurality of different focal positions. Similarly, the luminance image generating section 31 illuminates the observation target W with illumination light from the fourth illumination direction by controlling the ring illumination device 26. The luminance image generating section 31 obtains the plurality of fourth luminance images by controlling the Z-direction driving section 28 and the imaging section 25 so as to image the observation target W in a plurality of different focal positions. The image processor 66 generates the roughness enhancement image corresponding to the illumination direction selected by the UI section 65 from the first illumination direction, the second illumination direction, the third illumination direction, and the fourth illumination direction, using the plurality of luminance images corresponding to the selected illumination direction among the plurality of first luminance images, the plurality of second luminance images, the plurality of third luminance images, and the plurality of fourth luminance images. The display section 2 displays the roughness enhancement image corresponding to the selected illumination direction. Although the first illumination direction is selected in the above embodiment, the second illumination direction may be selected. In this case, i1 is exchanged with i2 in expression (1).
When the third illumination direction is selected, the luminance value of the pixels of the third luminance image is i1 and the luminance value of the pixels of the fourth luminance image is i2 in expression (1). When the fourth illumination direction is selected, the luminance value of the pixels of the fourth luminance image is i1 and the luminance value of the pixels of the third luminance image is i2 in expression (1).
In the above embodiment, depth synthesis is executed and then roughness enhancement is executed. However, roughness enhancement may be executed before depth synthesis is executed.
The roughness enhancement section 34 generates a roughness enhancement sub-image Ix1 using the first luminance image I11 and the second luminance image I21. Similarly, the roughness enhancement section 34 generates a roughness enhancement sub-image Ix2 using the first luminance image I12 and the second luminance image I22. The roughness enhancement section 34 repeats this process in the same manner and finally generates a roughness enhancement sub-image Ixn using the first luminance image I1n and the second luminance image I2n. With this, the depth synthesis section 33 applies depth synthesis to the n roughness enhancement sub-images Ix1 to Ixn, generates a final roughness enhancement image, and displays the roughness enhancement image on the display section 2.
It should be noted here that the ring illumination device 26 may be disposed around the objective lens 23 or may be configured to illuminate the observation target W with illumination light via the objective lens 23. Other than this, the layout disclosed in US2018/024346 filed by the same applicant as this specification may be used as the layout of the ring illumination device 26. The entire contents of the disclosure are incorporated herein by reference as a part of this specification.
Even when inclined observation is performed while the head section 22 of the observation section 1 swings with respect to the placement table 30, a roughness enhancement image can be generated as in the case in which the head section 22 is orthogonal to the placement table 30. As described above, the roughness enhancement image may be further subjected to depth synthesis. At this time, the driving of the head section 22 and the placement table 30 may be controlled so that the upper surface of a workpiece W coincides with a eucentric position. The control in this case is also disclosed in Japanese Patent Application No. 2018-161347, and the entire contents of the disclosure are incorporated herein by reference as a part of this specification.
The magnification observation process when a roughness enhancement image is displayed in an image display region 2a will be described.
At S0, the CPU 61 moves the placement table 30 by controlling the placement table driving section 29 according to a movement instruction input through the mouse 4, the console section 3, or the like.
When the start of movement is detected at S1, a live image is displayed in the image display region 2a. That is, the CPU 61 controls the imaging section 25 and the ring illumination device 26 via the imaging control section 62 and the illumination control section 64 so as to image the observation target W, controls the image processor 66 so as to generate a live image (luminance image updated at certain time intervals) of the observation target W, and displays the live image on the display section 2 via the display control section 63. The illumination control section 64 turns on all of the four light source regions 140A to 140D for the live image. The user moves the placement table 30 in the X-direction, the Y-direction, the Z-direction, and the θ-direction so that the observation target W enters the visual field range of the imaging section 25 while seeing the live image.
At S2, the CPU 61 decides whether the movement of the placement table 30 is completed. The CPU 61 decides that the movement of the placement table 30 is completed, for example, when an explicit movement instruction is input or a certain time passes after a movement instruction is input through the mouse 4 or the console section 3. The CPU 61 proceeds to S3 when deciding that the movement of the placement table 30 is completed.
At S3, the CPU 61 displays the live image at the time the movement is completed in the image display region 2a. If an image of the observation target W is not displayed in the display region in the period from S4 to S10, the user cannot easily grasp the observation target W. Accordingly, the CPU 61 may display, as necessary, the image of the observation target W captured immediately before generating the roughness enhancement image in the image display region 2a while generating the roughness enhancement image.
At S4, the CPU 61 generates the plurality of first luminance images I11 to I1n having different focal positions with respect to the first illumination direction. The illumination control section 64 turns on one light source region corresponding to the first illumination direction among the light source regions 140A to 140D. The imaging control section 62 causes the imaging section 25 and the luminance image generating section 31 to obtain the first luminance images I11 to I1n while changing the focal position of the objective lens 23 little by little through the Z-direction driving section 28.
At S5, the CPU 61 generates the plurality of second luminance images I21 to I2n having different focal positions with respect to the second illumination direction. The illumination control section 64 turns on one light source region corresponding to the second illumination direction among the light source regions 140A to 140D. The imaging control section 62 causes the imaging section 25 and the luminance image generating section 31 to obtain the second luminance images I21 to I2n while changing the focal position of the objective lens 23 little by little through the Z-direction driving section 28.
At S6, the CPU 61 generates the first depth image I1a with respect to the first illumination direction by applying depth synthesis to the first luminance images I11 to I1n using the depth synthesis section 33.
At S7, the CPU 61 generates the second depth image I2a with respect to the second illumination direction by applying depth synthesis to the second luminance images I21 to I2n using the depth synthesis section 33.
At S8, the CPU 61 generates a roughness enhancement image based on the first depth image I1a and the second depth image I2a using the roughness enhancement section 34.
At S9, the CPU 61 colors the roughness enhancement image using the coloring section 36 or the height image generating section 35. This gives the color of the surface of the observation target W to the roughness enhancement image or colors the roughness enhancement image according to the height data obtained from a height image.
At S10, the CPU 61 displays the roughness enhancement image on the display section 2 through the UI section 65 and the display control section 63. That is, the roughness enhancement image is displayed instead of the live image of the observation target W.
At S11, the CPU 61 decides whether an end instruction of the magnification observation process has been input. When the end instruction has been input through the mouse 4 or the like, the CPU 61 ends the magnification observation process. When the end instruction has not been input, the CPU 61 proceeds to S12.
At S12, the CPU 61 decides whether a movement instruction of the placement table 30 has been input through the mouse 4 or the like. When the movement instruction has not been input, the CPU 61 returns to S11. When the movement instruction has been input, the CPU 61 returns to S1 and displays the live image of the observation target W on the display section 2. After that, when the movement of the placement table 30 is stopped, the CPU 61 performs the generation of the roughness enhancement image again.
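Steps S4 to S8 can be summarized in the following sketch. `depth_synthesize` and `roughness_enhance` are illustrative stand-ins for the depth synthesis section 33 and the roughness enhancement section 34, the focusing-degree maps are assumed to be given, and 1×1 images keep the example self-contained:

```python
def depth_synthesize(stack, focus):
    """S6/S7: per pixel, keep the luminance from the focal slice with
    the highest focusing degree (focus: one 2-D map per slice)."""
    n = len(stack)
    h, w = len(stack[0]), len(stack[0][0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best = max(range(n), key=lambda k: focus[k][y][x])
            out[y][x] = stack[best][y][x]
    return out

def roughness_enhance(d1, d2, eps=1e-9):
    """S8: normalized difference of expression (1), per pixel."""
    return [[(a - b) / (a + b + eps) for a, b in zip(r1, r2)]
            for r1, r2 in zip(d1, d2)]

# S4/S5: two 3-slice focal stacks of 1x1 images; slice 1 (index 1)
# is in focus at this pixel.
stack1 = [[[50.0]], [[120.0]], [[80.0]]]  # first illumination direction
stack2 = [[[30.0]], [[60.0]], [[40.0]]]   # second illumination direction
focus = [[[0.2]], [[0.9]], [[0.4]]]       # focusing degree per slice

d1 = depth_synthesize(stack1, focus)  # first depth image I1a
d2 = depth_synthesize(stack2, focus)  # second depth image I2a
r = roughness_enhance(d1, d2)         # roughness enhancement image
```

The in-focus slice contributes 120.0 and 60.0 respectively, so the enhanced value is (120 − 60)/(120 + 60) ≈ 1/3.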
As described in
The depth synthesis section 33 functions as a depth synthesis section that generates the first depth combined image by applying depth synthesis to the plurality of first luminance images and generates the second depth combined image by applying depth synthesis to the plurality of second luminance images. The roughness enhancement section 34 functions as an enhancement image generating section that generates a roughness enhancement image based on the difference in luminance between the first depth combined image and the second depth combined image.
The depth synthesis section 33 may analyze a plurality of pixels present in the same pixel position in the plurality of first luminance images, select the pixel having the highest focusing degree of the plurality of pixels as the focusing pixel in the pixel position, and thereby generate the first depth combined image including the focusing pixels in the plurality of pixel positions. It should be noted here that the focusing degree may be calculated for each pixel region including a plurality of adjacent pixels. The depth synthesis section 33 may analyze a plurality of pixels present in the same pixel position in the plurality of second luminance images, select the pixel having the highest focusing degree of the plurality of pixels as the focusing pixel in the pixel position, and thereby generate the second depth combined image including the focusing pixels in the plurality of pixel positions.
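The specification leaves the concrete focusing-degree measure open. Purely as an illustration, a common choice is local contrast, i.e. the variance of luminance over a small neighborhood around a pixel, which is high in sharply focused regions and low in defocused ones:

```python
def focusing_degree(img, y, x, r=1):
    """Local variance over a (2r+1)x(2r+1) window, clamped at the
    image border, as a simple focusing-degree measure."""
    h, w = len(img), len(img[0])
    vals = [img[j][i]
            for j in range(max(0, y - r), min(h, y + r + 1))
            for i in range(max(0, x - r), min(w, x + r + 1))]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

# A sharply focused checker pattern has much higher local variance
# than the same region when defocused (nearly uniform luminance).
sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]
blurred = [[120, 128, 120], [128, 124, 128], [120, 128, 120]]
```

Computing the measure over a window of adjacent pixels, as suggested above, makes the selection robust against single-pixel noise.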
The placement table 30 is an example of an XY stage, movable at least in the X-direction and the Y-direction, on which the observation target W is placed. The placement table driving section 29 is an example of a driving section that drives the XY stage. As described in relation to S12, the CPU 61 may also function as a detection section that detects that the XY stage has stopped after moving. When the detection section detects that the XY stage has stopped after moving, the CPU 61 and the image processor 66 may perform the generation of the roughness enhancement image again. This eliminates the need for the user to explicitly instruct the generation of a roughness enhancement image.
It should be noted here that the detection section may make the decision based on an image of the observation target W. The detection section may decide that the XY stage has stopped when the difference between the image at the previous time and the image at the current time is zero or small.
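The image-based decision mentioned above can be sketched as a mean absolute frame difference with a threshold; the function name and the threshold value are illustrative assumptions, not taken from the specification:

```python
def stage_stopped(prev_frame, cur_frame, threshold=2.0):
    """Decide that the XY stage has stopped when the mean absolute
    per-pixel difference between consecutive frames is small."""
    diff = total = 0.0
    for row_p, row_c in zip(prev_frame, cur_frame):
        for p, c in zip(row_p, row_c):
            diff += abs(p - c)
            total += 1
    return diff / total <= threshold

# A large frame-to-frame change means the stage is still moving; a
# nearly identical frame means it has stopped.
moving = stage_stopped([[10, 10], [10, 10]], [[40, 40], [40, 40]])
still = stage_stopped([[10, 10], [10, 10]], [[11, 10], [10, 10]])
```

In the apparatus, detecting the transition from moving to stopped would then trigger the regeneration of the roughness enhancement image.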
When the CPU 61 detects that the XY stage is moving, the CPU 61 may display a moving image (live image) obtained by the imaging section 25 on the display section 2. When detecting that the XY stage has stopped, the CPU 61 may cause the image processor 66 to perform the generation of the roughness enhancement image again and the display section 2 to display the roughness enhancement image.
The coloring section 36 may function as a color synthesis section that obtains color information from a color image of the observation target W obtained by the imaging section 25 and colorizes the roughness enhancement image. This makes the user easily understand the relationship between roughness and surface color.
When generating the first luminance image, the image processor 66 may obtain a plurality of luminance sub-images having different exposure times, apply an HDR process to the plurality of luminance sub-images, and thereby generate the first luminance image. Similarly, when generating the second luminance image, the image processor 66 may obtain a plurality of luminance sub-images having different exposure times, apply an HDR process to the plurality of luminance sub-images, and thereby generate the second luminance image. By applying an HDR process in this way, it is possible to obtain a roughness enhancement image produced through depth synthesis in which overexposed white portions and underexposed black portions are reduced.
The height image generating section 35 may calculate the height of the surface of the observation target at each pixel by integrating the pixel values of the roughness enhancement image, and may generate a height image in which each pixel value represents the height. In addition, the height image generating section 35 may generate a color height image by coloring each pixel in the height image according to the height it represents. This makes the crater illusion less likely to occur, so the user can distinguish concave shapes from convex shapes more accurately.
The mouse 4, the console section 3, and the CPU 61 may function as a selection section that selects the illumination direction of the illumination section. The image processor 66 may control the illumination section so as to illuminate the observation target W with illumination light from a third illumination direction different from the first illumination direction and the second illumination direction, and may control the change section and the imaging section so as to image the observation target in a plurality of different focal positions, thereby obtaining a plurality of third luminance images. The image processor 66 may control the illumination section so as to illuminate the observation target with illumination light from a fourth illumination direction symmetric with the third illumination direction about the optical axis, and may control the change section and the imaging section so as to image the observation target in a plurality of different focal positions, thereby obtaining a plurality of fourth luminance images. In addition, the image processor 66 may generate the roughness enhancement image corresponding to the illumination direction selected by the selection section from the first to fourth illumination directions, using the plurality of luminance images corresponding to the selected illumination direction from among the plurality of first luminance images, the plurality of second luminance images, the plurality of third luminance images, and the plurality of fourth luminance images. The display section 2 may display the roughness enhancement image corresponding to the illumination direction selected by the selection section.
The roughness enhancement section 34 may synthesize pairs of luminance images having the same focal position from the plurality of first luminance images and the plurality of second luminance images based on the difference in luminance, and generate a plurality of roughness enhancement sub-images in which roughness is enhanced at each of the plurality of focal positions. The depth synthesis section 33 may function as a depth combined image generating section that generates a depth-synthesized roughness enhancement image by applying depth synthesis to the plurality of roughness enhancement sub-images. As described above, either roughness enhancement or depth synthesis may be performed first.
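The ordering described above (roughness enhancement per focal position, then depth synthesis) can be sketched as the following pipeline. This is an illustrative simplification, not the claimed implementation: the pairwise difference as the roughness enhancement and the gradient-magnitude focus measure for depth synthesis are assumed, hypothetical choices.

```python
import numpy as np

def enhance_roughness(img_a, img_b):
    """Difference of a symmetric-illumination pair at one focal position:
    shading from opposite directions largely cancels surface color and
    leaves the relief (roughness) signal."""
    return img_a.astype(np.float32) - img_b.astype(np.float32)

def depth_synthesize(stack):
    """Depth synthesis: for every pixel, keep the value from the focal
    position whose local contrast (gradient magnitude) is highest,
    producing an all-in-focus composite."""
    stack = np.asarray(stack, dtype=np.float32)
    gy, gx = np.gradient(stack, axis=(1, 2))
    sharpness = gx ** 2 + gy ** 2
    best = np.argmax(sharpness, axis=0)          # sharpest layer per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]

# Example: two focal layers; the sharper (ramp) layer wins everywhere.
flat = np.zeros((4, 4))                          # defocused: no contrast
ramp = np.tile(np.arange(4.0), (4, 1))           # in focus: high contrast
composite = depth_synthesize([flat, ramp])
```

With `enhance_roughness` applied to each same-focal-position pair first, `depth_synthesize` over the resulting sub-images yields the depth-synthesized roughness enhancement image; reversing the two stages (depth synthesis per illumination direction, then one difference) is the alternative ordering the text mentions.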
The illumination section may be the ring illumination device 26 disposed around the objective lens 23. Alternatively, the illumination section may be an illumination light source that is provided in the head section 22 and illuminates the observation target W with illumination light through the objective lens 23.
Number | Date | Country | Kind |
---|---|---|---|
2018-223947 | Nov 2018 | JP | national |