IMAGE SENSING DEVICE AND IMAGE SENSING METHOD

Information

  • Publication Number
    20220103770
  • Date Filed
    August 24, 2021
  • Date Published
    March 31, 2022
Abstract
An image sensing device and an image sensing method are provided. The image sensing device is adapted to sense a target object in a situation without ambient light. The image sensing device includes a modulated light source, an image sensor, and a processing circuit. The modulated light source includes light sources. The image sensor includes sensing pixels. The image sensor is a color filter-free sensor. In one frame period, the light sources emit sensing light beams at different timings, and the sensing pixels at different timings sense reflected light beams generated by the sensing light beams reflected by the target object. The processing circuit synthesizes first images corresponding to a part of the reflected light beams to generate a complete sensing image. The processing circuit obtains depth information of the target object according to another part of the reflected light beams.
Description
BACKGROUND
Technical Field

The disclosure relates to a sensing device, and particularly relates to an image sensing device and an image sensing method.


Description of Related Art

At present, RGB-D image sensors have been applied more and more extensively. However, the general RGB-D image sensors have poor image resolution. From another perspective, if the image resolution of the general RGB-D image sensors is to be improved, the density of sensing pixels should be increased, which may lead to an increase in the relevant costs. Besides, when the RGB-D image sensors are applied in the medical field in a situation without ambient light (e.g., in surgical endoscopes), the RGB-D image sensors also encounter issues of poor image resolution and high costs. To resolve said issues, several embodiments are provided below for explanation.


SUMMARY

The disclosure provides an image sensing device and an image sensing method, which may provide images with high resolution and provide depth information of sensing results.


In an embodiment of the disclosure, an image sensing device adapted to sense a target object in a situation without ambient light is provided. The image sensing device includes a modulated light source, an image sensor, and a processing circuit. The modulated light source includes a plurality of light sources. The image sensor includes a plurality of sensing pixels. The image sensor is a color filter-free sensor. The processing circuit is coupled to the modulated light source and the image sensor and configured to run the modulated light source and the image sensor. In one frame period, the light sources emit a plurality of sensing light beams at different timings, and the sensing pixels at different timings sense a plurality of reflected light beams generated by the sensing light beams reflected by the target object. The processing circuit synthesizes a plurality of first images corresponding to a part of the reflected light beams to generate a complete sensing image. The processing circuit obtains depth information of the target object according to another part of the reflected light beams.


In an embodiment of the disclosure, an image sensing method adapted to an image sensing device running in a situation without ambient light to sense a target object is provided. The image sensing device includes a modulated light source and an image sensor. The image sensing method includes: in one frame period, emitting a plurality of sensing light beams at different timings through a plurality of light sources of the modulated light source; in the frame period, sensing a plurality of reflected light beams at different timings by a plurality of sensing pixels of the image sensor, wherein the reflected light beams are generated by the sensing light beams reflected by the target object; synthesizing a plurality of first images corresponding to a part of the reflected light beams to generate a complete sensing image and obtaining depth information of the target object according to another part of the reflected light beams.


Based on the above, the image sensing device and the image sensing method provided in one or more embodiments of the disclosure may obtain the complete sensing image of the target object with high image resolution and the depth information of the target object through the color filter-free image sensor in one frame period.


To make the above more comprehensible, several embodiments accompanied with drawings are described in detail as follows.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.



FIG. 1 schematically illustrates circuitry of an image sensing device according to an embodiment of the disclosure.



FIG. 2 is a schematic view of an image sensing scenario according to an embodiment of the disclosure.



FIG. 3A, FIG. 3B, and FIG. 3C are schematic views of a plurality of sensing images according to an embodiment of the disclosure.



FIG. 3D is a schematic view of a complete sensing image according to an embodiment of the disclosure.



FIG. 3E is a schematic view of depth information according to an embodiment of the disclosure.



FIG. 4 is a flowchart of an image sensing method according to an embodiment of the disclosure.



FIG. 5 is an operation timing diagram according to an embodiment of the disclosure.



FIG. 6 is an operation timing diagram according to another embodiment of the disclosure.





DESCRIPTION OF THE EMBODIMENTS

Reference will now be made in detail to the present preferred embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.



FIG. 1 schematically illustrates circuitry of an image sensing device according to an embodiment of the disclosure. With reference to FIG. 1, an image sensing device 100 includes a processing circuit 110, an image sensor 120, and a modulated light source 130. The processing circuit 110 is coupled to the image sensor 120 and the modulated light source 130. The processing circuit 110 may be an internal image processing unit of an image sensor or a processing unit of a terminal device external to the image sensor, which should not be construed as a limitation in the disclosure. The processing circuit 110 may, for instance, include a central processing unit (CPU), a graphics processing unit (GPU), other programmable general-purpose or special-purpose microprocessors, digital signal processors (DSP), programmable controllers, application specific integrated circuits (ASIC), programmable logic devices (PLD), other similar processing devices, or a combination of these devices. The processing circuit 110 may be configured to generate driving signals, perform signal processing operations, and perform relevant calculation functions. Note that the image sensing device 100 provided in this embodiment is adapted to perform an image sensing operation and a range finding operation. Therefore, the processing circuit 110 may perform relevant image processing operations and parameter calculations to generate sensing images, synthesize the sensing images, and obtain depth information (range parameters).


In this embodiment, the image sensor 120 may be, for instance, an RGB-D image sensor, which should not be construed as a limitation in the disclosure. The image sensor 120 may include a sensing array, and the sensing array may include N sensing pixels arranged in an array, where N is a positive integer. Note that the image sensor 120 is a color filter-free sensor. In this embodiment, the modulated light source 130 may include a plurality of light sources, and the light sources may respectively correspond to different wavelengths. The light sources may include laser light sources. The light sources may include visible light sources and an invisible light source. The visible light sources may include, for instance, a red light source, a green light source, and a blue light source, and the invisible light source is, for instance, an infrared light source. However, the type of colors or wavelengths of the visible light sources and the invisible light source should not be construed as a limitation in the disclosure. In addition, in some embodiments of the disclosure, at least a part of a plurality of sensing light beams provided by the light sources may have different polarizations. The modulated light source 130 may modulate the sensing light beams according to different sensing or lighting requirements.



FIG. 2 is a schematic view of an image sensing scenario according to an embodiment of the disclosure. With reference to FIG. 1 and FIG. 2, the image sensing device 100 provided in this embodiment is adapted to sense a target object 200 in a situation without ambient light; for instance, the image sensing device may be applied in a surgical endoscope. In this embodiment, the processing circuit 110 may control the modulated light source 130 to emit a plurality of sensing light beams SL toward the target object 200 at different timings and may control the image sensor 120 to sense a plurality of reflected light beams RL at different timings, where the reflected light beams RL are generated by the sensing light beams SL reflected via a surface of the target object 200. Note that the reflected light beams provided in this embodiment correspond to sensing images respectively sensed by the image sensor 120 at different timings. The modulated light source 130 provided in this embodiment may emit the sensing light beams SL that are modulated, and the sensing light beams SL may be visible light beams of different colors or different wavelengths, respectively. As such, the reflected light beams RL may also have different colors or different wavelengths. The processing circuit 110 may generate a plurality of sensing images according to the reflected light beams RL and then synthesize the sensing images into a complete sensing image. Here, a plurality of pixels of each of the sensing images may serve as a plurality of sub-pixels of the pixels of the complete sensing image. Therefore, the processing circuit 110 may generate the complete sensing image with high image resolution. 
In addition, one of the sensing light beams SL may be an invisible light beam, and the processing circuit 110 may apply a sensing result of one of the sensing light beams SL sensed by the image sensor 120 as a range finding result, so as to obtain the depth information of the target object 200 with high image resolution.



FIG. 3A, FIG. 3B, and FIG. 3C are schematic views of a plurality of sensing images according to an embodiment of the disclosure. FIG. 3D is a schematic view of a complete sensing image according to an embodiment of the disclosure. FIG. 3E is a schematic view of depth information according to an embodiment of the disclosure. With reference to FIG. 1 to FIG. 3C, for instance, the processing circuit 110 may run the red light source of the modulated light source 130 to emit a red light beam toward the target object 200 and run N sensing pixels (e.g., 400×400) of the image sensor 120 to perform a sensing operation, so as to generate a sensing image 301 shown in FIG. 3A. In this regard, the sensing image 301 also has N pixels, and the image resolution may be 400×400. Since the target object 200 is irradiated by the red light beam in a situation without ambient light, a plurality of pixel values of pixels 301_1 to 301_N of the sensing image 301 may be the sensing results completely corresponding to the red light beam. Then, the processing circuit 110 may run the green light source of the modulated light source 130 to emit a green light beam toward the target object 200 and run the N sensing pixels (e.g., 400×400) of the image sensor 120 to perform a sensing operation, so as to generate a sensing image 302 shown in FIG. 3B. In this regard, the sensing image 302 also has N pixels, and the image resolution may be 400×400. Since the target object 200 is irradiated by the green light beam in a situation without ambient light, a plurality of pixel values of pixels 302_1 to 302_N of the sensing image 302 may be the sensing results completely corresponding to the green light beam. Then, the processing circuit 110 may run the blue light source of the modulated light source 130 to emit a blue light beam toward the target object 200 and run the N sensing pixels (e.g., 400×400) of the image sensor 120 to perform a sensing operation, so as to generate a sensing image 303 shown in FIG. 3C. In this regard, the sensing image 303 also has N pixels, and the image resolution may be 400×400. Since the target object 200 is irradiated by the blue light beam in a situation without ambient light, a plurality of pixel values of the pixels 303_1 to 303_N of the sensing image 303 may be the sensing results completely corresponding to the blue light beam.


With reference to FIG. 3D, for instance, the processing circuit 110 may synthesize the sensing images 301 to 303 (i.e., a plurality of first images) into one complete sensing image 304 as shown in FIG. 3D. Note that the complete sensing image 304 also has N pixels, and the image resolution may be 400×400. Each of a plurality of pixels P_1 to P_N of the complete sensing image 304 may include a plurality of sub-pixels 304_1 to 304_3. In this regard, the processing circuit 110 may apply each of the pixel values of the pixels 301_1 to 301_N of the sensing image 301 as the pixel value of each sub-pixel 304_1 of the pixels P_1 to P_N of the complete sensing image 304. The processing circuit 110 may apply each of the pixel values of the pixels 302_1 to 302_N of the sensing image 302 as the pixel value of each sub-pixel 304_2 of the pixels P_1 to P_N of the complete sensing image 304. The processing circuit 110 may apply each of the pixel values of the pixels 303_1 to 303_N of the sensing image 303 as the pixel value of each sub-pixel 304_3 of the pixels P_1 to P_N of the complete sensing image 304. Therefore, the image sensing device 100 may obtain the complete sensing image 304 with high image resolution.
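The sub-pixel assembly described above can be sketched as follows. This is a minimal illustration assuming NumPy arrays as image buffers; the function name and array representation are not from the disclosure:

```python
import numpy as np

def synthesize_complete_image(red_img, green_img, blue_img):
    """Stack three full-resolution single-channel captures so that each
    capture supplies one sub-pixel (channel) of every pixel in the
    complete sensing image, as with sub-pixels 304_1 to 304_3."""
    assert red_img.shape == green_img.shape == blue_img.shape
    return np.stack([red_img, green_img, blue_img], axis=-1)

# Three 400x400 captures yield one 400x400 image with 3 sub-pixels per pixel.
r = np.full((400, 400), 10, dtype=np.uint8)
g = np.full((400, 400), 20, dtype=np.uint8)
b = np.full((400, 400), 30, dtype=np.uint8)
complete = synthesize_complete_image(r, g, b)
print(complete.shape)  # (400, 400, 3)
```

Note that no demosaicing or interpolation is involved: because each capture already covers all N sensing pixels, the stacked result keeps the full 400×400 resolution per channel.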


With reference to FIG. 3E, for instance, the processing circuit 110 may run the infrared light source of the modulated light source 130 to emit an infrared light beam toward the target object 200 and run the N sensing pixels (e.g., 400×400) of the image sensor 120 to perform a sensing operation, so as to generate depth information 305 shown in FIG. 3E. In this regard, the depth information 305 may correspond to the range finding results of the N sensing pixels, and the image resolution of the depth information 305 may be 400×400. Since the target object 200 is irradiated by the infrared light beam in a situation without ambient light, the depth information 305 may have range finding results 305_1˜305_N corresponding to the N sensing pixels. Therefore, the image sensing device 100 may obtain the depth information 305 with high image resolution.


Note that a sensing array of a general image sensor equipped with a color filter and capable of performing a range finding function may have, for instance, 400×400 sensing pixels. In this regard, in the sensing array of the general image sensor equipped with a red color filter, the number of the sensing pixels is 100×100; in the sensing array of the general image sensor equipped with a green color filter, the number of the sensing pixels is 100×100; in the sensing array of the general image sensor equipped with a blue color filter, the number of the sensing pixels is 100×100; in the sensing array of the general image sensor equipped with an infrared filter, the number of the sensing pixels is 100×100. As such, after the sensing images are synthesized, the image resolution of the complete sensing image generated by the general image sensor is 100×100, and the image resolution of the depth information generated by the general image sensor is 100×100. Namely, compared to the general image sensor provided in the above-mentioned embodiment, the image sensing device 100 provided in one or more embodiments of the disclosure may obtain the complete sensing image and the depth information with higher image resolution on the condition that the sensing pixel density of the sensing array of the image sensor remains unchanged. Moreover, compared with the general image sensor provided in the above-mentioned embodiment, the image sensing device 100 provided in one or more embodiments of the disclosure does not need to be equipped with the color filter, and thus the manufacturing cost of the image sensing device may be reduced.



FIG. 4 is a flowchart of an image sensing method according to an embodiment of the disclosure. With reference to FIG. 1 and FIG. 4, the image sensing device 100 depicted in FIG. 1 may perform following steps S410 to S440 to accomplish the image sensing function described in the previous embodiments. In step S410, in one frame period, the image sensing device 100 may emit a plurality of sensing light beams at different timings through a plurality of light sources of the modulated light source 130. In step S420, in the frame period, the image sensing device 100 may sense a plurality of reflected light beams through a plurality of sensing pixels of the image sensor at different timings, and the reflected light beams are generated by the sensing light beams reflected by the target object. In step S430, the image sensing device 100 may synthesize a plurality of first images corresponding to a part of the reflected light beams to generate a complete sensing image. In step S440, the image sensing device 100 may obtain depth information of the target object according to another part of the reflected light beams.
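Steps S410 to S440 can be sketched as a single control loop. The callback names below (`emit`, `capture`, `synthesize`, `compute_depth`) are hypothetical stand-ins for the device's drivers; only the control flow is taken from the text:

```python
def sense_frame(emit, capture, synthesize, compute_depth,
                visible=("red", "green", "blue"), invisible="ir"):
    """One frame period following steps S410 to S440 of FIG. 4.
    The four callbacks are hypothetical driver hooks."""
    images = {}
    for name in (*visible, invisible):
        emit(name)                    # S410: emit one sensing light beam
        images[name] = capture(name)  # S420: sense its reflected light beam
    complete = synthesize([images[n] for n in visible])  # S430: first images
    depth = compute_depth(images[invisible])             # S440: depth info
    return complete, depth

# Demo with trivial stand-in callbacks.
result = sense_frame(
    emit=lambda name: None,
    capture=lambda name: name.upper(),
    synthesize=lambda imgs: "+".join(imgs),
    compute_depth=lambda img: "depth:" + img,
)
print(result)  # ('RED+GREEN+BLUE', 'depth:IR')
```

The point of the loop structure is that all four captures fall inside one frame period, so the complete sensing image and the depth information describe the target object at effectively the same moment.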



FIG. 5 is an operation timing diagram according to an embodiment of the disclosure. With reference to FIG. 1, FIG. 2 to FIG. 3D, and FIG. 5, a plurality of light sources of the modulated light source 130 may include a plurality of visible light sources and an invisible light source. In this embodiment, the visible light sources include a red light source, a green light source, and a blue light source, and the invisible light source is an infrared light source. In this embodiment, sub-sensing periods VP1 to VP3 of the sensing images 301 to 303 are synchronized with the red light source, the green light source, and the blue light source. A sub-sensing period IP of the depth information 305 is synchronized with the infrared light source. In this embodiment, one frame period FP includes visible light beam sensing periods and an invisible light beam sensing period. The visible light beam sensing periods include the sub-sensing periods VP1˜VP3, and the invisible light beam sensing period includes the sub-sensing period IP. Each of the visible light sources of the modulated light source 130 is individually turned on in a corresponding sub-sensing period of the sub-sensing periods VP1˜VP3.


For instance, at a light emitting timing SL_R shown in FIG. 5, the red light source is individually turned on in the sub-sensing period VP1 of the image sensor 120, so as to allow the image sensor 120 to obtain the sensing image 301. At a light emitting timing SL_G shown in FIG. 5, the green light source is individually turned on in the sub-sensing period VP2 of the image sensor 120, so as to allow the image sensor 120 to obtain the sensing image 302. At the light emitting timing SL_B shown in FIG. 5, the blue light source is individually turned on in the sub-sensing period VP3 of the image sensor 120, so as to allow the image sensor 120 to obtain the sensing image 303.


In this embodiment, the image sensing device 100 may apply an indirect time of flight (I-ToF) sensing technology. At a light emitting timing SL_IR1 shown in FIG. 5, the infrared light source of the modulated light source 130 is synchronously turned on in the sub-sensing period IP of the image sensor 120. Here, the infrared light source emits light at intervals to sequentially emit a plurality of intense pulsed light beams, so as to allow the image sensor 120 to obtain the depth information 305. In an embodiment, the image sensing device 100 may apply a direct time of flight (D-ToF) sensing technology. At the light emitting timing SL_IR2 shown in FIG. 5, the infrared light source of the modulated light source 130 is synchronously turned on in the sub-sensing period IP of the image sensor 120; here, the infrared light source emits an intense pulsed light beam, so as to allow the image sensor 120 to obtain the depth information 305. In another embodiment, the image sensing device 100 may apply a structured light range finding technology. At a light emitting timing SL_IR3 shown in FIG. 5, the infrared light source of the modulated light source 130 is synchronously turned on in the sub-sensing period IP of the image sensor 120; here, the infrared light source continuously emits a structured light beam to enable the image sensor 120 to obtain the depth information 305.
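The two time-of-flight variants reduce to short formulas. The sketch below assumes the standard D-ToF round-trip relation and the conventional four-sample I-ToF phase estimate; neither formula is specific to the disclosure:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dtof_depth(round_trip_time_s):
    """Direct ToF: distance from the round-trip time of a single pulse."""
    return C * round_trip_time_s / 2

def itof_depth(q0, q1, q2, q3, mod_freq_hz):
    """Indirect ToF: distance from the phase shift of a modulated beam,
    estimated from four samples taken at 0, 90, 180, and 270 degrees."""
    phase = math.atan2(q3 - q1, q0 - q2) % (2 * math.pi)
    return C * phase / (4 * math.pi * mod_freq_hz)

# A 10 ns round trip corresponds to roughly 1.5 m.
print(dtof_depth(10e-9))
```

Running either formula per sensing pixel over the sub-sensing period IP yields one range finding result per pixel, which is how the depth information 305 reaches the same 400×400 resolution as the sensing images.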


In this embodiment, there is no timing overlap among the sub-sensing periods VP1˜VP3 and the sub-sensing period IP. In particular, the sensing pixels of the image sensor 120 may respectively sense the reflected light beams in the sub-sensing periods VP1˜VP3 of each visible light beam sensing period, and there is no timing overlap among the sub-sensing periods VP1˜VP3. In addition, a length of the visible light beam sensing period provided in this embodiment (e.g., the sum of the lengths of the sub-sensing periods VP1˜VP3) may be less than 1/25 second; as such, in the sensing process, a viewer subject to persistence of vision may not be affected by the switching operation of the light sources of different colors.
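The 1/25-second budget can be checked with a one-line helper; the 10 ms and 20 ms sub-period values below are illustrative and not from the disclosure:

```python
def visible_period_ok(sub_periods_s, limit_s=1 / 25):
    """Check the persistence-of-vision budget: the summed lengths of the
    visible sub-sensing periods (VP1 to VP3) must stay under 1/25 second."""
    return sum(sub_periods_s) < limit_s

# Three 10 ms sub-periods (30 ms total) fit the 40 ms budget.
print(visible_period_ok([0.010, 0.010, 0.010]))  # True
# Three 20 ms sub-periods (60 ms total) do not.
print(visible_period_ok([0.020, 0.020, 0.020]))  # False
```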



FIG. 6 is an operation timing diagram according to another embodiment of the disclosure. With reference to FIG. 1 and FIG. 6, in other embodiments of the disclosure, the image obtained by the image sensor 120 is not limited to the RGB image. In some special application scenarios, for instance, when the endoscope is applied to shoot specific human organs or tumors, the user requires the image to display a specific visualization effect. At this time, the modulated light source 130 may emit the sensing light beams (illumination light beams) of different colors, such as a yellow light beam. Therefore, at a light emitting timing SL_R′ and a light emitting timing SL_G′ shown in FIG. 6, the red light source and the green light source are simultaneously turned on in the sub-sensing period VP1 of the image sensor 120, so as to allow the image sensor 120 to obtain a sensing image whose overall pixel value corresponds to yellow. At the light emitting timing SL_G′ shown in FIG. 6, the green light source is individually turned on in the sub-sensing period VP2 of the image sensor 120, so as to allow the image sensor 120 to obtain a sensing image whose overall pixel value corresponds to green. At a light emitting timing SL_B′ in FIG. 6, the blue light source is individually turned on in the sub-sensing period VP3 of the image sensor 120, so as to allow the image sensor 120 to obtain a sensing image whose overall pixel value corresponds to blue. As such, the processing circuit 110 provided in this embodiment may generate a synthesized complete sensing image with yellow sub-pixels, green sub-pixels, and blue sub-pixels. However, the manner of turning on the light sources is not limited to those provided in the previous embodiments.
At least two of the visible light sources provided in one or more embodiments of the disclosure may be simultaneously turned on in any of the sub-sensing periods VP1, VP2, and VP3.


To sum up, the image sensing device and the image sensing method provided in one or more embodiments of the disclosure may sense the reflected light beams at different timings through the color filter-free image sensor, and the reflected light beams are generated after the sensing light beams of different colors irradiate the target object and are reflected. The image sensing device provided in one or more embodiments of the disclosure may synthesize the sensing images corresponding to the image sensing results of the reflected light beams to form a complete sensing image with high image resolution and may obtain the depth information of the target object according to the range finding result of the reflected light beams.


It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the disclosure covers modifications and variations provided they fall within the scope of the following claims and their equivalents.

Claims
  • 1. An image sensing device adapted to sense a target object in a situation without ambient light, the image sensing device comprising: a modulated light source, comprising a plurality of light sources; an image sensor, comprising a plurality of sensing pixels, wherein the image sensor is a color filter-free sensor; and a processing circuit, coupled to the modulated light source and the image sensor and configured to run the modulated light source and the image sensor, wherein in one frame period, the light sources emit a plurality of sensing light beams at different timings, and the sensing pixels at different timings sense a plurality of reflected light beams generated by the sensing light beams reflected by the target object, wherein the processing circuit synthesizes a plurality of first images corresponding to a part of the reflected light beams to generate a complete sensing image, and the processing circuit obtains depth information of the target object according to another part of the reflected light beams.
  • 2. The image sensing device according to claim 1, wherein the light sources comprise a plurality of visible light sources and an invisible light source, and a plurality of sub-sensing periods of the depth information and the first images are synchronized with the visible light sources and the invisible light source.
  • 3. The image sensing device according to claim 2, wherein the sensing light beams comprise a plurality of visible light beams and an invisible light beam, and the sensing pixels sense a plurality of first reflected light beams and a second reflected light beam respectively generated by the visible light beams and the invisible light beam reflected by the target object in different time periods, wherein the processing circuit synthesizes the first images corresponding to the first reflected light beams to generate the complete sensing image, and the processing circuit obtains the depth information of the target object according to the second reflected light beam.
  • 4. The image sensing device according to claim 2, wherein the visible light sources comprise a red light source, a green light source, and a blue light source, and the invisible light source is an infrared light source.
  • 5. The image sensing device according to claim 2, wherein the sensing pixels sense a plurality of first reflected light beams in a visible light beam sensing period in the frame period, and the sensing pixels sense the second reflected light beam in an invisible light beam sensing period in the frame period, wherein the visible light beam sensing period and the invisible light beam sensing period are not overlapped.
  • 6. The image sensing device according to claim 5, wherein the sensing pixels respectively sense the first reflected light beams in the sub-sensing periods in the visible light beam sensing period, wherein the sub-sensing periods are not overlapped.
  • 7. The image sensing device according to claim 5, wherein a length of the visible light beam sensing period is less than 1/25 second.
  • 8. The image sensing device according to claim 2, wherein each of the visible light sources is individually turned on in a corresponding sub-sensing period of the sub-sensing periods.
  • 9. The image sensing device according to claim 2, wherein at least two of the visible light sources are simultaneously turned on in one of the sub-sensing periods.
  • 10. The image sensing device according to claim 1, wherein the sensing light beams respectively correspond to different wavelengths.
  • 11. The image sensing device according to claim 1, wherein at least one part of the sensing light beams has different polarizations.
  • 12. The image sensing device according to claim 1, wherein the processing circuit performs a direct time of flight calculation according to the another part of the reflected light beams to obtain the depth information.
  • 13. The image sensing device according to claim 1, wherein the processing circuit performs an indirect time of flight calculation according to the another part of the reflected light beams to obtain the depth information.
  • 14. The image sensing device according to claim 1, wherein the another part of the reflected light beams is a structured light beam.
  • 15. An image sensing method adapted to an image sensing device running in a situation without ambient light to sense a target object, the image sensing device comprising a modulated light source and an image sensor, the image sensing method comprising: in one frame period, emitting a plurality of sensing light beams by a plurality of light sources of the modulated light source at different timings; in the frame period, sensing a plurality of reflected light beams by a plurality of sensing pixels of the image sensor at different timings, wherein the reflected light beams are generated by the sensing light beams reflected by the target object; and synthesizing a plurality of first images corresponding to a part of the reflected light beams to generate a complete sensing image, and obtaining depth information of the target object according to another part of the reflected light beams.
  • 16. The image sensing method according to claim 15, wherein the light sources comprise a plurality of visible light sources and an invisible light source, and a plurality of sensing periods of the depth information and the first images are synchronized with the visible light sources and the invisible light source.
  • 17. The image sensing method according to claim 16, wherein the sensing light beams comprise a plurality of visible light beams and an invisible light beam, and the sensing pixels sense a plurality of first reflected light beams and a second reflected light beam respectively generated by the visible light beams and the invisible light beam reflected by the target object in different time periods, wherein the steps of generating the complete sensing image and obtaining the depth information comprise: synthesizing the first images corresponding to the first reflected light beams to generate the complete sensing image; and obtaining the depth information of the target object according to the second reflected light beam.
  • 18. The image sensing method according to claim 16, wherein the sensing pixels sense the first reflected light beams in a visible light beam sensing period in the frame period, and the sensing pixels sense the second reflected light beam in an invisible light beam sensing period in the frame period, wherein the visible light beam sensing period and the invisible light beam sensing period are not overlapped.
  • 19. The image sensing method according to claim 16, wherein each of the visible light sources is individually turned on in a corresponding sub-sensing period of the sub-sensing periods.
  • 20. The image sensing method according to claim 16, wherein at least two of the visible light sources are simultaneously turned on in one of the sub-sensing periods.
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the priority benefit of U.S. provisional patent application Ser. No. 63/084,010, filed on Sep. 28, 2020. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.

Provisional Applications (1)
Number Date Country
63084010 Sep 2020 US