The present technology relates to a signal processing system, a signal processing device, and a signal processing method that are applicable to detection of a transfer direction and the like of a target.
Patent Literature 1 discloses a displacement measurement method using a speckle pattern generated through emission of laser light. According to the displacement measurement method described in Patent Literature 1, an image capturing sensor acquires a speckle pattern of a test target surface at a predetermined frame rate. Next, cross-correlation computation is performed on two speckle patterns acquired at a predetermined time interval. A transfer distance and a transfer speed of the test target surface are measured on the basis of a result of the cross-correlation computation. Note that, a partial region for the measurement is appropriately set with respect to a light reception surface of the image capturing sensor, and computation is performed by using speckle patterns obtained from the partial region. This makes it possible to improve accuracy of the measurement (see paragraphs [0034] to [0088], FIG. 9, FIG. 10, and the like of Patent Literature 1).
As described above, technologies capable of accurately detecting a displacement of a target have been desired.
In view of the circumstances as described above, a purpose of the present technology is to provide the signal processing system, the signal processing device, and the signal processing method that are capable of accurately detecting a displacement of a target.
In order to achieve the above-mentioned purpose, a signal processing system according to an embodiment of the present technology includes: an illumination section; an image capturing section; a parameter control section; and an acquisition section.
The illumination section emits laser light to a target.
The image capturing section captures an image of the target irradiated with the laser light.
The parameter control section changes a parameter related to image capturing of at least one of the illumination section or the image capturing section within exposure time of the image capturing section.
The acquisition section acquires movement information related to movement of the target on the basis of an image signal of the target whose image is captured by the image capturing section.
In this signal processing system, a parameter related to image capturing of at least one of the illumination section or the image capturing section is changed within the exposure time of the image capturing section. Next, movement information of the target is generated on the basis of the image signal of the target whose image has been captured by the image capturing section. This makes it possible to detect a movement direction and the like of the target on the basis of the image signal obtained through one-time image capturing, for example. As a result, it is possible to accurately detect a displacement of the target.
The acquisition section may acquire the movement information on the basis of information related to a speckle generated through emission of the laser light to the target, the information related to a speckle being included in the image signal.
This makes it possible to accurately detect a displacement of the target.
The image capturing section may include an image sensor that generates the image signal. In this case, the parameter control section may change at least one of intensity of the laser light or gain of the image sensor within the exposure time.
This makes it possible to accurately detect a movement direction and the like of the target on the basis of luminance information or the like included in the image signal.
The parameter control section may change the intensity of the laser light in a manner that change in the intensity of the laser light within the exposure time is asymmetric change based on intermediate time of the exposure time.
This makes it possible to accurately detect a movement direction and the like of the target on the basis of the luminance information or the like included in the image signal.
The parameter control section may change the intensity of the laser light in a manner that intensity of the laser light obtained at an emission start timing of the laser light within the exposure time is different from intensity of the laser light obtained at an emission end timing of the laser light.
This makes it possible to accurately detect an orientation of movement and the like of the target on the basis of the luminance information or the like included in the image signal.
The parameter control section may change the intensity of the laser light in a manner that the intensity of the laser light increases or decreases within the exposure time.
This makes it possible to accurately detect an orientation of movement and the like of the target.
The parameter control section may be capable of controlling emission time from an emission start timing of the laser light to an emission end timing of the laser light within the exposure time.
For example, it is possible to obtain high detection accuracy by controlling emission time in accordance with a movement speed or the like of the target.
The parameter control section may control the emission time in a manner that the emission time is shorter than the exposure time.
This makes it possible to obtain high detection accuracy even in the case where a movement speed of the target is fast.
The acquisition section may acquire the movement information including a relative orientation of movement and a relative movement direction of the target based on an image capturing position of the image capturing section.
The present technology makes it possible to accurately detect a relative orientation of movement and a relative movement direction of the target.
The acquisition section may acquire information including a relative orientation of movement and a relative movement direction of the image capturing section with respect to the target.
The present technology makes it possible to accurately detect a relative orientation of movement and a relative movement direction of the own device including the image capturing section, for example.
The acquisition section may acquire the movement information by comparing a plurality of pixel signals included in the image signal of the target with each other.
For example, it is possible to accurately detect a movement direction and the like of the target on the basis of luminance information included in each pixel signal.
The illumination section may include at least one of a semiconductor laser, a gas laser, a solid-state laser, or a liquid laser.
It is possible to accurately detect a movement direction and the like of the target even in the case of using various kinds of laser light sources.
The image sensor may be a charge coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
It is possible to accurately detect a movement direction and the like of the target even in the case of using various kinds of image sensors.
The signal processing system may be configured as an endoscope or a microscope.
It is possible to accurately detect a displacement of a living tissue or the like in a test or the like using an endoscope or a microscope.
The target may be a living tissue.
The present technology makes it possible to accurately detect a blood flow, a transfer of an organ, and the like.
A signal processing device according to an embodiment of the present technology includes: a parameter control section; and an acquisition section.
The parameter control section changes a parameter related to image capturing of at least one of an illumination section or an image capturing section within exposure time of the image capturing section that captures an image of a target irradiated with laser light emitted from the illumination section.
The acquisition section acquires movement information related to movement of the target on the basis of an image signal of the target whose image is captured by the image capturing section.
A signal processing method according to an embodiment of the present technology is executed by a computer system and includes changing a parameter related to image capturing of at least one of an illumination section or an image capturing section within exposure time of the image capturing section that captures an image of a target irradiated with laser light emitted from the illumination section.
Movement information related to movement of the target is acquired on the basis of an image signal of the target whose image is captured by the image capturing section.
As described above, according to the present technology, it is possible to accurately detect a displacement of a target. Note that, the effects described herein are not necessarily limited and may be any of the effects described in the present disclosure.
Hereinafter, embodiments of the present technology will be described with reference to the drawings.
[Configuration of Signal Processing System]
The illumination unit 10 emits illumination light to a subject (target) M, which is an image capturing target. As illustrated in
The image capturing unit 20 captures an image of the subject M irradiated with the laser light L. As illustrated in
As the image sensor, it is possible to use a charge coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor, for example. Of course, it is possible to use another type of image sensor. The lens system 22 forms an image of the subject M irradiated with the laser light L, on the image sensor of the camera 21. A specific configuration of the lens system 22 is not limited. Note that,
For example, the signal processing device 30 includes hardware that is necessary for configuring a computer such as a CPU, ROM, RAM, and an HDD. The signal processing method according to the present technology is executed when the CPU loads a program into the RAM and executes the program. The program relates to the present technology and is recorded in the ROM or the like in advance. For example, the signal processing device 30 can be implemented by any computer such as a personal computer (PC). Of course, it is also possible to use hardware such as a GPU, an FPGA, or an ASIC.
As illustrated in
The image capturing control section 31 is capable of controlling respective operations related to image capturing of the laser light source 11 and the camera 21. For example, the image capturing control section 31 generates a synchronization signal such as a clock signal, synchronizes the laser light source 11 and the camera 21 on the basis of the synchronization signal, and exerts control. It is also possible to exert synchronization control of the laser light source 11 and the camera 21 by using the clock signal or the like generated in the illumination unit 10 or the image capturing unit 20 as the synchronization signal.
In addition, the image capturing control section 31 is capable of controlling respective image capturing parameters related to image capturing of the illumination unit 10 and the image capturing unit 20. The image capturing parameters related to the image capturing include any parameter related to image capturing of the subject M. For example, the image capturing parameter of the illumination unit 10 includes any parameters such as intensity, color, emission time, and the like of the laser light L. The image capturing parameter of the image capturing unit 20 includes any parameters such as exposure time, gain of the image sensor, a focal length, a focus position, an angle of view, and an f-number. In this embodiment, the image capturing control section 31 functions as the parameter control section.
As illustrated in
At the exposure end time T2, the amount of the laser light L is approximately zero, but the amount of the laser light L is instantaneously increased before the next exposure start time T1. Therefore, as illustrated in
For example, the amount of current to be applied to the laser light source 11 may be controlled as control of the illumination intensity. For example, by reducing the amount of current applied to the laser light source 11 from the exposure start time T1 to the exposure end time T2, it is possible to control the illumination intensity as illustrated in
In addition, it is possible to use an optical element such as an optical filter as an element of the image capturing control section 31, and control the illumination intensity. For example, an ND filter or the like is disposed in an optical path of the laser light L emitted from the laser light source 11. By appropriately rotating the ND filter or the like while maintaining constant intensity of the laser light L, it is possible to control the intensity of the laser light L. As described above, by appropriately synchronizing and controlling the laser light source 11 and the optical element, it is possible to control the illumination intensity as illustrated in
The movement information generation section 32 generates movement information related to movement of the subject M on the basis of the image signal of the subject M whose image is captured by the image capturing unit 20. In this embodiment, the movement information including an orientation of the movement and a movement direction of the subject M is generated. The orientation of movement and the movement direction of the subject M typically correspond to an orientation of transfer and a transfer direction of the subject M.
Hereinafter, description will be given using an example in which the subject M transfers in a predetermined direction. Accordingly, the movement direction, the orientation of movement, and the movement information are sometimes referred to as a transfer direction, an orientation of transfer, and transfer information, respectively.
Note that, even in the case where a portion of the subject M moves, it is possible to detect a movement direction and the like of the portion by using the present technology. In other words, the movement of the subject M is not limited to transfer of the subject M. The movement of the subject M includes any movement such as the movement of the portion of the subject M and the like. In this embodiment, the movement information generation section 32 functions as the acquisition section.
[Generation of Movement Information]
A speckle image 25 illustrated in
In the case where the subject M transfers, the speckle pattern SP transfers while the pattern is substantially maintained as illustrated in
However, the above-described method includes the following problems. Although the speckle pattern SP is maintained up to a certain transfer amount, the speckle pattern SP often changes in the case of a large transfer amount. In addition, even in the case of a slight transfer, sometimes the speckle pattern SP may change into a speckle pattern SP with a small correlation, depending on an image capturing condition or an illumination condition such as image capturing with high magnification or image capturing with a short image capturing distance. This makes it difficult to detect a displacement of the subject M even when calculating the correlation between the speckle pattern SP before the transfer and the speckle pattern SP after the transfer.
As illustrated in
When the speckle S transfers, light from the speckle S is incident on the pixels P2 to P5 arrayed along the transfer direction. Therefore, the five pixels P1 to P5 respectively generate pixel signals corresponding to light reception amounts of the incident light.
As illustrated in
The speckle S transfers within the exposure time. Therefore, a bar-like image is formed in a manner that a bright point trails along the transfer direction. In the bar-like image, gradation appears along the orientation of the transfer. The bar-like image representing the transfer of the speckle S is treated as a transfer image 27 of the speckle S. In this way, the gradational transfer image 27 is generated in this embodiment.
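The formation of the gradational trail can be sketched as follows. This is an illustrative simulation only, not part of the embodiment: the number of pixels, the number of time steps, and the linear intensity ramp are all assumptions. A single speckle crosses five pixels within one exposure while the illumination intensity decreases linearly, so each pixel accumulates less light than the one before it along the transfer direction.

```python
N_PIXELS = 5   # pixels P1..P5 arrayed along the transfer direction (assumed)
N_STEPS = 100  # time steps within one exposure (assumed)

signals = [0.0] * N_PIXELS
for t in range(N_STEPS):
    intensity = 1.0 - t / N_STEPS    # illumination ramps down within the exposure
    pixel = t * N_PIXELS // N_STEPS  # pixel the speckle is over at time t
    signals[pixel] += intensity      # only that pixel accumulates the light

# The trail is brightest where the speckle sat at the start of the exposure,
# so the accumulated signals decrease monotonically along the transfer direction.
assert all(signals[i] > signals[i + 1] for i in range(N_PIXELS - 1))
```

Because the accumulated values fall monotonically toward the front of the transfer, comparing neighboring pixel signals reveals both the transfer direction and its orientation.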
As illustrated in
As described above, in this embodiment, the intensity of the laser light L is changed as an image capturing parameter of the illumination unit 10 in a manner that pixel signals output from respective pixels are changed as the speckle S transfers when the image signal of the speckle image 25 of one frame is generated. This makes it possible to accurately detect a displacement of the subject M on the basis of the luminance information or the like included in the image signal, for example.
Here, a process of changing respective image capturing parameters within the exposure time in a manner that pixel signals output from respective pixels are changed as the speckle S transfers when the image signal of the speckle image 25 of one frame is generated is parameter control for displacement detection. The image capturing control section 31 is also referred to as a block that exerts the parameter control for the displacement detection.
Displacement detection performed in the case where parameter control for displacement detection illustrated in
As illustrated in
As illustrated in
As illustrated in
In the case where the luminance difference A is smaller (No in Step 201), it is determined that the subject M transfers in the up-down direction, and it is determined whether slope of the luminance difference A is positive or negative (Step 202). The slope of the luminance difference A is slope of luminance values based on the luminance value of the target pixel PI. The slope of the luminance difference A is positive in the case where the luminance value of the pixel PD disposed immediately below the target pixel PI is larger than the luminance value of the target pixel PI. The slope of the luminance difference A is negative in the case where the luminance value of the pixel PD disposed immediately below the target pixel PI is smaller than the luminance value of the target pixel PI.
In this embodiment, the illumination intensity decreases as the subject transfers. Therefore, pixels closer to the front side of the orientation of transfer have smaller luminance values. In the case where the slope of the luminance difference A is positive (Yes in Step 202), it is determined that the subject has transferred upward (Step 203). In the case where the slope of the luminance difference A is negative (No in Step 202), it is determined that the subject has transferred downward (Step 204).
In the case where the luminance difference B is smaller (Yes in Step 201), it is determined that the subject M has transferred in the left-right direction, and it is determined whether slope of the luminance difference B is positive or negative (Step 205). In the case where the slope of the luminance difference B is positive (Yes in Step 205), it is determined that the subject has transferred to the left (Step 206). In the case where the slope of the luminance difference B is negative (No in Step 205), it is determined that the subject has transferred to the right (Step 207).
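The decision flow of Steps 201 to 207 can be sketched as follows. The function name, its arguments, and the choice of the lower and right-hand neighbors are illustrative assumptions; only the branching logic follows the steps described above.

```python
def detect_direction(center, below, right):
    """Classify the transfer at one pixel, following Steps 201 to 207.

    center: luminance of the target pixel.
    below:  luminance of the pixel immediately below it.
    right:  luminance of the pixel immediately to its right.

    Along the gradational trail, adjacent luminances differ only slightly,
    so the smaller of the two differences selects the transfer direction
    (Step 201).  The sign of that difference then selects the orientation:
    the illumination decreases during the exposure, so luminance falls
    toward the front of the transfer (Steps 202-207).
    """
    diff_a = below - center  # luminance difference A (up-down direction)
    diff_b = right - center  # luminance difference B (left-right direction)
    if abs(diff_a) < abs(diff_b):                 # A smaller: up-down transfer
        return "up" if diff_a > 0 else "down"     # Steps 202-204
    else:                                         # B smaller: left-right transfer
        return "left" if diff_b > 0 else "right"  # Steps 205-207
```

Running this decision for every pixel yields a per-pixel vote, which Step 102 then aggregates.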
In Step 102, an orientation of transfer or a transfer direction of the subject M is detected on the basis of the orientations of transfer and the transfer directions detected in the respective pixels. Typically, as a result of displacement detection performed on the respective pixels, the most common orientation of transfer and the most common transfer direction are detected as the orientation of transfer and the transfer direction of the subject M. The method of detecting an orientation and a transfer direction of the subject M is not limited to the method of statistically determining results of displacement detection of the respective pixels. It is possible to use any algorithm.
The displacement detection of the respective pixels illustrated in
The present inventors have performed the following simulation by using speckle images generated under predetermined conditions. Specifically, a composite image has been generated by synthesizing the following five images.
First image: a speckle image.
Second image: an image obtained by reducing luminance values of respective pixels of the first speckle image at a predetermined rate and shifting the pixels displayed with the luminance values to the right by one pixel in the horizontal direction.
Third image: an image obtained by reducing the luminance values of the respective pixels of the second speckle image at the same rate and shifting the pixels displayed with the luminance values to the right by one pixel in the horizontal direction.
Fourth image: an image obtained by reducing the luminance values of the respective pixels of the third speckle image at the same rate and shifting the pixels displayed with the luminance values to the right by one pixel in the horizontal direction.
Fifth image: an image obtained by reducing the luminance values of the respective pixels of the fourth speckle image at the same rate and shifting the pixels displayed with the luminance values to the right by one pixel in the horizontal direction.
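The construction of the composite image can be sketched as follows. This is a minimal stand-in, not the inventors' actual simulation: the image size, attenuation rate, and random speckle values are assumptions, and the one-pixel rightward shift simply zero-fills the new left column.

```python
import random

random.seed(0)
W = 8                                                     # assumed image size
speckle = [[random.random() for _ in range(W)] for _ in range(W)]
RATE = 0.8                                                # assumed attenuation rate

# First image is the speckle image itself; each following image attenuates
# the previous one at the same rate and shifts it right by one pixel.
composite = [[0.0] * W for _ in range(W)]
image = [row[:] for row in speckle]
for _ in range(5):
    for y in range(W):
        for x in range(W):
            composite[y][x] += image[y][x]
    # Attenuate and shift one pixel to the right (new left column becomes zero).
    image = [[0.0] + [RATE * v for v in row[:-1]] for row in image]
```

The resulting composite accumulates progressively dimmer, rightward-shifted copies, reproducing the gradational trail of a rightward transfer under decreasing illumination.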
The composite image is an image similar to a speckle image of one frame obtained in the case where the parameter control for displacement detection illustrated in
The algorithm illustrated in
Note that, it is difficult to detect the orientation of transfer of the subject M when using the image signal of the speckle image acquired in the comparative example illustrated in
It is possible to acquire an image signal of one frame including the gradational transfer image 27 when the parameter control for displacement detection is exerted as described above. This makes it possible to accurately generate transfer information including an orientation of transfer and a transfer direction of the subject M by comparing a plurality of pixel signals included in an image signal. As a result, it is possible to accurately detect a displacement of the subject M.
Emission time of the laser light L may be controlled as illustrated in
In the example illustrated in
In addition, an amount of illumination light at the emission start timing T3 and an amount of illumination light at the emission end timing T4 are substantially equal to the illumination intensity at the exposure start time T1 and the illumination intensity at the exposure end time T2 illustrated in
For example, in the case where a transfer speed of the subject M is fast, the transfer image 27 extending along the transfer direction has a long length. In addition, the gradation has low contrast because the light reception amounts of the respective pixels decrease. As a result, there is a possibility that the accuracy of movement detection based on the gradational transfer image 27 decreases.
As illustrated in
As illustrated in
For example, in the case where a transfer speed of the subject M is slow, a long emission time is favorable because the transfer image 27 extending along the transfer direction has a short length. With regard to the gradation, a difference in illumination intensity between the start of emission and the end of emission is favorably large.
In the case where the intensity is offset as illustrated in
In the case where information related to the transfer speed of the subject M is obtained in advance, it is possible to automatically set the length of the emission time and the slope of the illumination intensity on the basis of the information. For example, sometimes an operator who observes the subject M may input information related to a transfer speed of the observation target.
Alternatively, it is also possible to estimate the transfer speed of the subject M on the basis of a captured speckle image, and control the length of the emission time and the like in accordance with the estimated transfer speed. For example, the speckle image has lower contrast as the transfer speed of the speckle pattern increases. It is possible to calculate speckle contrast (SC) of the respective pixels and estimate the transfer speed of the subject M on the basis of the calculated values of the speckle contrast.
For example, a pixel block of a predetermined size is set for calculating the SC. Examples of the pixel block include a 3×3 pixel block, a 7×7 pixel block, and a 13×13 pixel block. The pixel block is set around an SC calculation target pixel, and the SC is calculated by using the following formula and luminance values of respective pixels included in the pixel block.
SC = (standard deviation of luminance values) / (average of luminance values)
For example, in the case where an average or the like of the SC of the pixels is small, it is possible to estimate that the subject M has a fast transfer speed. In the case where the average or the like of the SC of the pixels is large, it is possible to estimate that the subject M has a slow transfer speed. It is possible to improve accuracy of movement information of the subject M by controlling the length of the emission time and the like on the basis of such estimation results.
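The SC computation above can be sketched as follows. The function name and the two sample blocks are illustrative assumptions; only the formula (standard deviation divided by average) comes from the description above.

```python
import statistics

def speckle_contrast(block):
    """SC = standard deviation of luminance values / average of luminance values,
    computed over one pixel block (e.g. a 3x3, 7x7, or 13x13 block)."""
    values = [v for row in block for v in row]
    return statistics.pstdev(values) / statistics.fmean(values)

# A high-contrast block (slow transfer) versus a washed-out block (fast
# transfer): lower SC suggests a faster transfer speed of the subject.
slow_block = [[10, 200, 10], [200, 10, 200], [10, 200, 10]]
fast_block = [[100, 110, 105], [108, 102, 106], [104, 109, 101]]
assert speckle_contrast(slow_block) > speckle_contrast(fast_block)
```

Averaging the SC over many blocks then gives the image-level value used to estimate whether the subject M transfers quickly or slowly.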
Note that, it is also possible to estimate the transfer speed of the subject M on the basis of the length of the transfer image 27 or the like. It is possible to use a value obtained by dividing the length of the transfer image 27 by the exposure time, as an estimated value of the transfer speed. Alternatively, it is also possible to directly control the length of the emission time and the like on the basis of the length of the transfer image 27. For example, in the case where the length of the transfer image 27 is longer than a predetermined threshold, the emission time is controlled in a manner that the emission time is shortened.
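The length-based estimation and the threshold control of the emission time can be sketched as follows. The pixel pitch, the threshold, and the halving factor are illustrative assumptions, not values from the embodiment.

```python
PIXEL_PITCH_MM = 0.005    # assumed pixel pitch in millimeters
LENGTH_THRESHOLD_PX = 20  # assumed trail-length threshold in pixels

def estimated_speed(trail_length_px, exposure_s):
    """Transfer speed estimated as the length of the transfer image
    divided by the exposure time (here converted to mm/s)."""
    return trail_length_px * PIXEL_PITCH_MM / exposure_s

def next_emission_time(trail_length_px, emission_s):
    """Shorten the emission time when the trail exceeds the threshold,
    so that the gradational transfer image stays compact."""
    if trail_length_px > LENGTH_THRESHOLD_PX:
        return emission_s * 0.5  # assumed shortening factor
    return emission_s
```

A controller would feed the measured trail length of each frame into these functions to adapt the emission time for the next frame.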
As the parameter control for displacement detection, exposure time of the image capturing unit 20 may be controlled on the basis of the transfer speed or the like of the subject M. For example, in the case where the transfer speed of the subject M is fast, the exposure time may be shortened. This makes it possible to improve detection accuracy of movement of the subject M. It is sufficient to appropriately control parameters in a manner that the gradational transfer image 27 appropriate for the displacement detection is generated.
As the movement information, it is possible to acquire the transfer speed generated on the basis of the length of the transfer image 27 and the SC of the pixels.
As illustrated in
For example, as illustrated in
In addition, the intensity of the laser light L may be changed in a manner that illumination intensity obtained at the emission start timing T3 of the laser light L within the exposure time is different from illumination intensity obtained at the emission end timing T4 of the laser light L. This makes it possible to determine an orientation of transfer on the basis of luminance values at both ends of the transfer image 27. For example, as illustrated in
As illustrated in
As illustrated in
Note that, it is also possible to detect an oblique transfer direction even in the case of using detection results only in the horizontal direction (left-right direction) and the perpendicular direction (up-down direction) as illustrated in
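One way such per-axis results could be combined into an oblique direction is sketched below. The function, its arguments, and the use of signed per-axis magnitudes are assumptions for illustration; the embodiment does not prescribe a specific combination formula.

```python
import math

def oblique_direction(right_component, up_component):
    """Combine per-axis detection results into an oblique transfer angle.

    right_component and up_component are hypothetical signed magnitudes of
    the transfer detected along the left-right and up-down directions (for
    example, pixel-vote counts signed by the detected orientation).  The
    angle is measured counter-clockwise from the rightward direction,
    in degrees.
    """
    return math.degrees(math.atan2(up_component, right_component))
```

For instance, equal rightward and upward components would indicate a transfer at roughly 45 degrees.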
As illustrated in
As illustrated in
This makes it possible to detect an orientation of transfer and a transfer direction of the subject M even in the case where the speckle image includes large speckles S. For example, in the example illustrated in
In addition, the present technology is not limited to the case of statistically determining detection results of the respective pixels (respective pixel blocks). It is also possible to detect a plurality of gradational transfer images 27 appearing in the speckle image. For example, in the case where the same result is obtained through the flowchart illustrated in
In an example illustrated in
As described above, it is also possible to generate the gradational transfer image 27 in the single speckle image by controlling camera gain. As a result, it is possible to accurately detect an orientation of movement and a movement direction of the subject M.
Note that, it is also possible to control both the illumination intensity and the camera gain as the parameter control for displacement detection. For example, the illumination intensity and the camera gain are changed in conjunction with each other within one exposure time in a manner that the gradational transfer image 27 is generated. As described above, it is also possible to control the plurality of image capturing parameters in conjunction with each other. Of course, it is also possible to control a parameter other than the illumination intensity or the camera gain as the parameter control for displacement detection.
As described above, in the signal processing system 100 according to this embodiment, an image capturing parameter of at least one of the illumination unit 10 or the image capturing unit 20 is changed within exposure time of the image capturing unit 20. In addition, movement information of the subject M is generated on the basis of an image signal of the subject M whose image has been captured by the image capturing unit 20. This makes it possible to detect a movement direction and the like of the subject M on the basis of the image signal obtained through one-time image capturing, for example. As a result, it is possible to accurately detect a displacement of the subject M.
For example, as described with reference to
In this embodiment, it is possible to detect a displacement of the subject M with very high accuracy on the basis of the gradational transfer images 27 included in the speckle image of one frame. Therefore, it is possible to detect the displacement of the subject M in real time without acquiring a plurality of speckle images.
Note that, the movement information related to movement of the subject M includes information related to movement relative to the signal processing system 100. In other words, the present technology makes it possible to accurately detect a displacement of the subject M relative to the signal processing system 100, that is, a relative displacement of the subject M occurred when the signal processing system 100 moves, for example. For example, it is also possible to acquire movement information including a relative orientation of movement and a relative movement direction of the subject M based on an image capturing position of the image capturing unit 20.
In addition, the present technology makes it possible to detect movement of the signal processing system 100 relative to the subject M. For example, it is also possible to acquire information including an orientation of movement and a movement direction of the image capturing unit 20 relative to the subject M. For example, it is assumed that a device or the like includes the illumination unit 10, the image capturing unit 20, and the signal processing device 30 illustrated in
For example, the present technology is applicable to an endoscope, an optical microscope, and the like that are used in a medical/biological fields. In other words, the signal processing system 100 may be configured as the endoscope or the microscope.
In this case, examples of the subject M include a living tissue such as a cell, a tissue, or an organ of a living body. When using the present technology, it is possible to accurately detect a displacement of the living tissue. Of course, it is also possible to detect a displacement of a portion of the living tissue. For example, by performing the processes illustrated in
For example, it is possible to detect a displacement of a partial tissue, a partial cell, or the like in an organ. Alternatively, the present technology makes it possible to detect a blood flow in a blood vessel. For example, it is possible to detect a direction, an orientation, a blood flow rate, and the like of the blood flow. Note that, change in the speckle patterns SP is large because the blood is liquid. However, it is possible to detect the blood flow by shortening exposure time, illumination time, output time, etc. and adjusting other image capturing environments and illumination environments.
In addition, the present technology is applicable to detection of various displacements in other fields. For example, the present technology is applicable to devices and systems in various fields such as a printer device, a substrate conveyance device, a self-propelled robot device, a mouse, or a drone.
The present technology is not limited to the above-described embodiments. Various other embodiments are possible.
The example in which the laser light source is used as the illumination unit has been described above. Note that, the present technology is also applicable to the case of using another coherent light source capable of emitting coherent light.
Note that, it is possible for the illumination unit to also function as the signal processing device and generate the movement information of the subject M. Alternatively, the illumination unit, the image capturing unit, and the signal processing device may be integrated.
In addition, the signal processing method and the program according to the present technology may be executed when a computer operated by the operator or the like and another computer capable of communicating via a network operate in conjunction with each other, and this makes it possible to configure the signal processing system according to the present technology.
That is, the signal processing method and the program according to the present technology may be executed not only in a computer system configured by a single computer but also in a computer system in which a plurality of computers operates in conjunction with one another. It should be noted that, in the present disclosure, the system means an aggregate of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are housed in the same casing. Therefore, a plurality of devices housed in separate casings and connected to one another via a network, and a single device having a plurality of modules housed in a single casing, are both systems.
The execution of the signal processing method and the program according to the present technology by the computer system includes, for example, both a case where a single computer controls the image capturing parameters for displacement detection, acquires the movement information of the subject, and the like, and a case where those processes are executed by different computers. Further, the execution of the respective processes by a predetermined computer includes causing another computer to execute some or all of those processes and acquiring results thereof.
That is, the signal processing method and the program according to the present technology are also applicable to a cloud computing configuration in which one function is shared and cooperatively processed by a plurality of devices via a network.
At least two of the feature parts according to the present technology described above can be combined. That is, the various feature parts described in the embodiments may be arbitrarily combined irrespective of the embodiments. Further, the various effects described above are merely illustrative and not limitative, and other effects may be exerted.
Note that, the present technology may also be configured as below.
(1) A signal processing system including:
an illumination section that emits laser light to a target;
an image capturing section that captures an image of the target irradiated with the laser light;
a parameter control section that changes a parameter related to image capturing of at least one of the illumination section or the image capturing section within exposure time of the image capturing section; and
an acquisition section that acquires movement information related to movement of the target on the basis of an image signal of the target whose image is captured by the image capturing section.
(2) The signal processing system according to (1),
in which the acquisition section acquires the movement information on the basis of information related to a speckle generated through emission of the laser light to the target, the information related to a speckle being included in the image signal.
(3) The signal processing system according to (1) or (2), in which
the image capturing section includes an image sensor that generates the image signal, and
the parameter control section changes at least one of intensity of the laser light or gain of the image sensor within the exposure time.
(4) The signal processing system according to (3),
in which the parameter control section changes the intensity of the laser light in a manner that change in the intensity of the laser light within the exposure time is asymmetric change based on intermediate time of the exposure time.
(5) The signal processing system according to (3) or (4),
in which the parameter control section changes the intensity of the laser light in a manner that intensity of the laser light obtained at an emission start timing of the laser light within the exposure time is different from intensity of the laser light obtained at an emission end timing of the laser light.
(6) The signal processing system according to any one of (3) to (5),
in which the parameter control section changes the intensity of the laser light in a manner that the intensity of the laser light increases or decreases within the exposure time.
(7) The signal processing system according to any one of (3) to (6),
in which the parameter control section is capable of controlling emission time from an emission start timing of the laser light to an emission end timing of the laser light within the exposure time.
(8) The signal processing system according to (7),
in which the parameter control section controls the emission time in a manner that the emission time is shorter than the exposure time.
(9) The signal processing system according to any one of (1) to (8),
in which the acquisition section acquires the movement information including a relative orientation of movement and a relative movement direction of the target based on an image capturing position of the image capturing section.
(10) The signal processing system according to any one of (1) to (9),
in which the acquisition section acquires information including a relative orientation of movement and a relative movement direction of the image capturing section with respect to the target.
(11) The signal processing system according to any one of (1) to (10),
in which the acquisition section acquires the movement information by comparing a plurality of pixel signals included in the image signal of the target with each other.
(12) The signal processing system according to any one of (1) to (11),
in which the illumination section includes at least one of a semiconductor laser, a gas laser, a solid-state laser, or a liquid laser.
(13) The signal processing system according to any one of (1) to (12),
in which the image sensor is a charge coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
(14) The signal processing system according to any one of (1) to (13),
in which the signal processing system is configured as an endoscope or a microscope.
(15) The signal processing system according to any one of (1) to (14),
in which the target is a living tissue.
(16) A signal processing device including:
a parameter control section that changes a parameter related to image capturing of at least one of an illumination section or an image capturing section within exposure time of the image capturing section that captures an image of a target irradiated with laser light emitted from the illumination section; and
an acquisition section that acquires movement information related to movement of the target on the basis of an image signal of the target whose image is captured by the image capturing section.
(17) A signal processing method that is executed by a computer system, the signal processing method including:
changing a parameter related to image capturing of at least one of an illumination section or an image capturing section within exposure time of the image capturing section that captures an image of a target irradiated with laser light emitted from the illumination section; and
acquiring movement information related to movement of the target on the basis of an image signal of the target whose image is captured by the image capturing section.
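The parameter control recited in items (4) to (8) above can be visualized with a small numerical sketch using hypothetical values, which is not a limitation of the configurations: the laser emits for only part of the exposure time, and its intensity ramps linearly so that the intensity at the emission start timing differs from the intensity at the emission end timing, making the change asymmetric about the intermediate time of the exposure.

```python
def intensity_profile(exposure_s, emission_s, start_level, end_level, steps=8):
    """Sample an asymmetric linear laser-intensity ramp within one exposure.
    Emission lasts only emission_s (<= exposure_s, cf. item (8)); intensity
    goes from start_level to end_level, so the start and end intensities
    differ (cf. items (4) and (5))."""
    assert emission_s <= exposure_s
    dt = emission_s / (steps - 1)
    return [(i * dt, start_level + (end_level - start_level) * i / (steps - 1))
            for i in range(steps)]

# Hypothetical numbers: 10 ms exposure, laser on for the first 6 ms,
# ramping from 20% to 100% of maximum power.
profile = intensity_profile(exposure_s=0.010, emission_s=0.006,
                            start_level=0.2, end_level=1.0)
print(profile[0][1], profile[-1][1])  # 0.2 1.0
```

Such an asymmetric profile weights the light integrated by the image sensor toward one end of the exposure, which is what allows the orientation of movement, and not merely its direction, to be recovered from a single captured frame.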
Number | Date | Country | Kind |
---|---|---|---|
2017-083365 | Apr 2017 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/005942 | 2/20/2018 | WO | 00 |