The present disclosure relates to a light source control device, a light source control method, a program, and a surgical system, and more specifically, to a light source control device, a light source control method, a program, and a surgical system that enable observation with a more appropriate exposure state.
In recent years, surgery using an endoscope has been performed instead of a traditional laparotomy in a medical field. Furthermore, it has been known that it is difficult to set an appropriate exposure state in observation with the endoscope, and a technology to uniformly irradiate a subject has been developed.
For example, Patent Document 1 discloses an endoscope apparatus which changes an irradiation light quantity and a light distribution pattern according to distance information and adjusts uneven light distribution by gain correction. In addition, Patent Document 2 discloses an endoscope apparatus which reduces the uneven light distribution (shading) by a plurality of light sources.
Patent Document 1: Japanese Patent Application Laid-Open No. 2015-8785
Patent Document 2: Japanese Patent Application Laid-Open No. 2012-245349
However, even if appropriate exposure can be realized by the technologies disclosed in Patent Documents 1 and 2 described above, reflection components are not considered in these technologies. Therefore, it has been difficult to obtain an image suitable for observation due to specular reflection.
The present disclosure has been made in view of such a situation, and enables observation with a more appropriate exposure state.
A light source control device according to one aspect of the present disclosure includes a reflection direction estimation unit which estimates a reflection direction in which irradiation light is reflected on a surface of a subject in a living body and a control unit which relatively controls a direction of a light source for emitting the irradiation light relative to a direction in which the subject is observed on the basis of the reflection direction of the irradiation light estimated by the reflection direction estimation unit and the direction in which the subject is observed.
A light source control method or a program according to one aspect of the present disclosure includes estimating a reflection direction in which irradiation light is reflected on a surface of a subject in a living body and relatively controlling a direction of a light source for emitting the irradiation light relative to a direction in which the subject is observed on the basis of the reflection direction of the estimated irradiation light and the direction in which the subject is observed.
A surgical system according to one aspect of the present disclosure includes a light source which irradiates an operated portion in a living body with irradiation light from a predetermined direction, an imaging unit which images the operated portion in the living body, a reflection direction estimation unit which estimates a reflection direction in which irradiation light is reflected on a surface of the operated portion, and a control unit which relatively controls a direction of the light source relative to an optical axial direction of the imaging unit on the basis of the reflection direction of the irradiation light estimated by the reflection direction estimation unit and the optical axial direction of the imaging unit.
In one aspect of the present disclosure, a reflection direction of irradiation light reflected on a surface of a subject in a living body is estimated, and a direction of a light source for emitting the irradiation light is relatively controlled relative to a direction in which the subject is observed on the basis of the reflection direction and the direction in which the subject is observed.
According to one aspect of the present disclosure, observation with a more appropriate exposure state can be realized.
A specific embodiment to which the present technology has been applied will be described below in detail with reference to the drawings.
<Exemplary Configuration of Endoscopic Surgical System>
In recent years, endoscopic surgery has been performed instead of a traditional laparotomy in a medical field. For example, in a case of abdominal surgery, an endoscopic surgical system 11 arranged in an operation room as illustrated in the drawings is used.
In the operation room where such endoscopic surgery is performed, a cart 24 on which devices for the endoscopic surgery are mounted, a patient bed 23 on which a patient lies, a foot switch 25, and the like are arranged. On the cart 24, for example, devices such as a camera control unit (CCU) 15, a light source device 16, a treatment tool device 17, a pneumoperitoneum device 18, a display 19, a recorder 20, and a printer 21 are mounted as medical devices.
An image signal of the affected part 26 imaged through an observation optical system of the camera head portion 12 is transmitted to the CCU 15 via a camera cable, and the signal is processed in the CCU 15. After that, the signal is output to the display 19, and an endoscopic image of the affected part 26 is displayed. The CCU 15 may be connected to the camera head portion 12 via the camera cable, or may be wirelessly connected to the camera head portion 12.
The light source device 16 is connected to the camera head portion 12 via a light guide cable and can irradiate the affected part 26 by switching among light beams having various wavelengths. The treatment tool device 17 is, for example, a high-frequency output device which outputs a high-frequency current to the energy treatment tool 13, which cuts the affected part 26 by using electric heat.
The pneumoperitoneum device 18 includes an air supply and suction unit and supplies air to, for example, an abdominal region in the patient's body. The foot switch 25 controls the CCU 15, the treatment tool device 17, and the like, using a foot operation by the operator, the assistant, or the like as a trigger signal.
In the endoscopic surgical system 11 configured in this way, an operation using the energy treatment tool 13 and the forceps 14 is imaged by the camera head portion 12, and signal processing is performed on the image signal in the CCU 15. The affected part 26 can then be observed in the image on which the signal processing has been performed.
The CPU 52 and the GPU boards 53-1 and 53-2 perform various types of processing by executing related software, for example. The CPU 52 includes a processor, and each of the GPU boards 53-1 and 53-2 includes a graphics processing unit (GPU) and a dynamic random access memory (DRAM).
The memory 54 stores various data such as data corresponding to the input image signal from the camera head portion 12 and data corresponding to the output image signal to the display 19, for example. The CPU 52 controls reading and writing of these data from and to the memory 54.
The CPU 52 divides the image data stored in the memory 54 according to the data and the processing capabilities and processing contents of the GPU boards 53-1 and 53-2. Then, each GPU of the GPU boards 53-1 and 53-2 performs predetermined processing on the divided and supplied data and outputs the processing result to the CPU 52.
For example, the input output (IO) controller 55 controls transmission of signals between the CPU 52 and the recording medium 56 and between the CPU 52 and the interface 57.
The recording medium 56 functions as a storage unit (not shown), and stores various data such as image data and various applications. Here, as the recording medium 56, for example, a solid state drive (SSD) or the like is exemplified. Furthermore, the recording medium 56 may be detachable from the CCU 15.
As the interface 57, for example, a universal serial bus (USB) terminal, a processing circuit, a local area network (LAN) terminal, a transmission/reception circuit, and the like are exemplified.
Note that the hardware configuration of the CCU 15 is not limited to the configuration described above.
By using the endoscopic surgical system 11, a surgical technique that suppresses invasiveness, a major disadvantage of surgical operations, can be realized. On the other hand, a universal problem in endoscopic observation is the difficulty of setting the exposure.
The difficulty of setting the exposure in endoscopic observation is illustrated in the drawings.
Therefore, the technologies disclosed in Patent Documents 1 and 2 have been conventionally developed to obtain appropriate exposure. However, although these technologies can change the light quantity and the light distribution, reflection components are not considered. Therefore, there is a possibility of capturing an image which has an appropriate illuminance but is not suitable for observation.
That is, for structural reasons of traditional endoscopes, specular reflection components are easily generated in observation with a medical endoscope.
For example, as illustrated in the drawings, a traditional endoscope 61 has a light source and an imaging unit arranged close to each other at its distal end.
In the endoscope 61 having such a configuration, the direction θ1 of the irradiation light from the light source and the direction θ2 of the light incident on the imaging unit are substantially parallel (θ1≈θ2). Therefore, specular reflection toward the imaging unit occurs on the surface of the subject at a high frequency and prevents observation. In addition, the specular reflection places a large strain on the eyes of the observer in some cases. Therefore, to provide an appropriate observation environment, it is necessary to consider the specular reflection components. However, as described above, the technologies disclosed in Patent Documents 1 and 2 only control the light source to obtain an appropriate illuminance and suppress unevenness of the illumination (shading), and the specular reflection is not considered. Therefore, depending on the result of the control, the specular reflection may even be increased.
Here, the reflection of the irradiation light on the surface of a subject will be described with reference to the Phong reflection model.
For example, when irradiation light is incident on a point on the surface of a subject as illustrated in the drawings, an intensity Id of a diffuse reflection component of the reflected light is expressed by the following formula (1) by using an intensity Ii of the incident light, a diffuse reflectance Kd (constant), and an angle of reflection α.
[Formula 1]
Id=Ii×Kd×cos α (1)
As indicated in Formula (1), the intensity Id of the diffuse reflection component varies according to the reflection direction (angle of reflection α).
Similarly, an intensity S of a specular reflection component of the reflected light is expressed by the following formula (2) by using the intensity Ii of the incident light, a specular reflectance W (constant), a highlight coefficient n, and an angle γ formed by a line-of-sight direction vector V and a reflection direction vector R.
[Formula 2]
S=Ii×W×cosⁿγ (2)
As indicated in Formula (2), the intensity S of the specular reflection light decreases as the angle γ formed by the line-of-sight direction vector V and the reflection direction vector R increases.
Furthermore, in the Phong reflection model, a shade I at this point is expressed by the following formula (3) by using an intensity Ia and a reflectance Ka of environment light.
[Formula 3]
I=Id+S+Ia×Ka (3)
Here, as indicated in Formula (1), since the intensity Id of the diffuse reflection component changes according to the reflection direction, to calculate an appropriate illuminance, it is necessary to use not only the distance information but also the reflection direction.
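For concreteness, the following Python sketch evaluates formulas (1) to (3) for a single surface point. It is only an illustrative sketch: the function, the vector arguments, and the example values are introduced here and are not part of the disclosure.

```python
import numpy as np

def normalize(v):
    """Return the unit vector of v."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def phong_shade(light_dir, normal, view_dir, Ii, Kd, W, n, Ia, Ka):
    """Evaluate formulas (1) to (3) of the Phong reflection model.

    light_dir: direction from the surface point toward the light source
    normal:    surface normal
    view_dir:  direction from the surface point toward the imaging unit
    """
    L, N, V = normalize(light_dir), normalize(normal), normalize(view_dir)
    # Formula (1): diffuse component Id = Ii x Kd x cos(alpha), where alpha is
    # the angle between the light direction and the surface normal.
    cos_alpha = max(float(np.dot(L, N)), 0.0)
    Id = Ii * Kd * cos_alpha
    # Reflection direction vector R of the irradiation light about the normal.
    R = normalize(2.0 * np.dot(N, L) * N - L)
    # Formula (2): specular component S = Ii x W x cos(gamma)**n, where gamma
    # is the angle between the line-of-sight vector V and the reflection vector R.
    cos_gamma = max(float(np.dot(R, V)), 0.0)
    S = Ii * W * cos_gamma ** n
    # Formula (3): total shade I = Id + S + Ia x Ka (environment light term).
    I = Id + S + Ia * Ka
    return Id, S, I

# Illustration: light almost parallel to the viewing direction, as in a typical
# endoscope, gives a strong specular component on a camera-facing surface.
print(phong_shade([0.1, 0.0, 1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 1.0],
                  Ii=1.0, Kd=0.6, W=0.4, n=20, Ia=0.1, Ka=0.2))
```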
Therefore, the endoscopic surgical system 11 can provide an image suitable for observation by using the distance information from the light source to the subject and a light source whose direction can be changed.
As illustrated in the drawings, a rigid endoscope 71 includes a lens barrel 72, and an observation window 73 and irradiation windows 74-1 to 74-4 are provided on the front end surface of the lens barrel 72.
The observation window 73 is arranged at the center of the front end surface of the lens barrel 72. The light which is reflected by the subject and enters the observation window 73 is transmitted to the imaging unit (for example, the imaging unit 83 described later).
The irradiation windows 74-1 to 74-4 are arranged at intervals of substantially 90° as viewed from the front side of the lens barrel 72, and the irradiation light can be emitted through each of the irradiation windows 74-1 to 74-4. Then, as described later, the light source direction of the irradiation light is controlled by adjusting the intensity of the irradiation light applied to the subject through each of the irradiation windows 74-1 to 74-4.
The rigid endoscope 71 is formed in this way, and in the endoscopic surgical system 11, the light source direction of the irradiation light applied to the subject through the irradiation windows 74-1 to 74-4 is controlled.
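How the relative intensities of the four irradiation windows realize a particular light source direction is not detailed above, so the following Python sketch is purely hypothetical: it assumes the windows sit at 0°, 90°, 180°, and 270° around the lens barrel and weights each window by a clipped cosine of its angular distance from the desired direction.

```python
import numpy as np

# Assumed angular positions of the irradiation windows 74-1 to 74-4 as viewed
# from the front side of the lens barrel (an illustrative assumption).
WINDOW_ANGLES_DEG = [0.0, 90.0, 180.0, 270.0]

def window_intensities(target_angle_deg, total_power=1.0):
    """Hypothetical mapping from a desired in-plane light source direction to
    the relative intensities of the four irradiation windows."""
    target = np.deg2rad(target_angle_deg)
    weights = np.array([max(np.cos(np.deg2rad(a) - target), 0.0)
                        for a in WINDOW_ANGLES_DEG])
    # Windows facing away from the desired direction are switched off, and the
    # remaining intensities are scaled to the available total power.
    return total_power * weights / weights.sum()

# Example: steer the effective light source direction toward 45 degrees,
# i.e. split the power between windows 74-1 and 74-2.
print(window_intensities(45.0))
```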
As illustrated in the drawings, a light source control system 81 includes a light source unit 82, an imaging unit 83, a signal processing unit 84, and a light source control unit 85.
The light source unit 82 has a plurality of light sources which generate the irradiation light emitted from each of the irradiation windows 74-1 to 74-4 described above, and outputs light source information indicating the current light source direction to the light source control unit 85.
The imaging unit 83 images the subject by using the light entering through the observation window 73 described above, and supplies the obtained image signal to the signal processing unit 84.
The signal processing unit 84 performs signal processing on the image signal supplied from the imaging unit 83, and causes the display 19 to display the image on which the signal processing has been performed.
The light source control unit 85 includes a light source information acquisition unit 91, a distance information acquisition unit 92, a subject inclination estimation unit 93, a reflection direction estimation unit 94, and a light distribution control unit 95, and controls the light source unit 82.
The light source information acquisition unit 91 obtains light source information output from the light source unit 82 and supplies the current light source direction obtained from the light source information to the reflection direction estimation unit 94.
The distance information acquisition unit 92 obtains the distance information from the imaging unit 83 to the subject and supplies the distance indicated by the distance information to the subject inclination estimation unit 93. For example, as described later, the distance information acquisition unit 92 can obtain the distance on the basis of stereoscopic images imaged by the imaging unit 83.
A spatial coordinate (two-dimensional coordinate) of the image imaged by the imaging unit 83 is supplied from the imaging unit 83 to the subject inclination estimation unit 93. Then, the subject inclination estimation unit 93 performs processing of estimating a three-dimensional inclination of the subject by using the spatial coordinate and the distance from the light source to the subject supplied from the distance information acquisition unit 92, and supplies a subject vector obtained as a result of the processing to the reflection direction estimation unit 94. Furthermore, a method of estimating the inclination of the subject by the subject inclination estimation unit 93 will be described later.
The reflection direction estimation unit 94 performs processing of obtaining the reflection direction of the subject for each pixel or region by using the subject vector supplied from the subject inclination estimation unit 93 and the current direction of the light source supplied from the light source information acquisition unit 91, and supplies a reflection vector obtained as a result of the processing to the light distribution control unit 95.
On the basis of the reflection vector supplied from the reflection direction estimation unit 94, the light distribution control unit 95 estimates a diffuse reflection intensity and obtains the light source direction with which the subject illuminance becomes uniform as viewed from the direction in which the subject is observed (that is, the optical axial direction of the imaging unit 83), and controls the light source unit 82 in accordance with the obtained light source direction, as described later.
The light source control system 81 is formed in this way, and can irradiate the subject with the irradiation light in the optimal light source direction on the basis of the reflection direction of the irradiation light on the surface of the subject and the direction in which the subject is observed. With this configuration, the imaging unit 83 can image the subject in a more appropriate exposure state.
With reference to the drawings, a method by which the distance information acquisition unit 92 obtains the distance from the imaging unit 83 to the subject by using stereoscopic images will be described.
For example, as illustrated in the drawings, the imaging unit 83 includes an imaging unit 83L and an imaging unit 83R which are arranged at an imaging interval T and image the subject as a stereoscopic image.
Then, as illustrated in the drawings, when a point P in the real world is imaged at a coordinate xL in the image of the imaging unit 83L and at a coordinate xR in the image of the imaging unit 83R, a parallax d is expressed by the following formula (4).
[Formula 4]
d=xL−xR (4)
Furthermore, a relationship between a distance (depth) Z from the imaging units 83L and 83R to the point P in the real world and the imaging interval T between an imaging center OL of the imaging unit 83L and an imaging center OR of the imaging unit 83R is expressed by the following formula (5) by using the parallax d and a focal distance f.
[Formula 5]
d:f=T:Z (5)
Therefore, according to Formulas (4) and (5), the distance Z from the imaging units 83L and 83R to the point P in the real world is expressed by the following formula (6).
[Formula 6]
Z=f×T/d (6)
On the basis of such a stereoscopic image, the distance information acquisition unit 92 can obtain the distance from the imaging unit 83 to the subject. Then, the subject inclination estimation unit 93 can estimate the inclination of the subject with respect to the optical axial direction (depth direction) of the imaging unit 83 by using the distance from the imaging unit 83 to the subject.
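A minimal sketch of this triangulation, combining formulas (4) to (6); the function name and the example values are illustrative assumptions.

```python
def depth_from_stereo(x_left, x_right, focal_length, baseline):
    """Depth of a point from a rectified stereoscopic image pair.

    Formula (4): parallax d = xL - xR.
    Formulas (5) and (6): d : f = T : Z, hence Z = f * T / d.
    """
    d = x_left - x_right  # parallax in pixels
    if d <= 0:
        raise ValueError("parallax must be positive for a point in front of the cameras")
    return focal_length * baseline / d  # depth Z, in the units of the baseline T

# Illustrative values: focal length of 1000 pixels, 5 mm baseline, 25-pixel parallax.
print(depth_from_stereo(x_left=640.0, x_right=615.0, focal_length=1000.0, baseline=5.0))
```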
With reference to the drawings, a method of estimating the inclination of the subject by the subject inclination estimation unit 93 will be described.
For example, as illustrated in A of the drawings, the subject inclination estimation unit 93 specifies two target points or two regions on the imaging plane and obtains an angle θ1 of the XZ plane and an angle θ2 of the YZ plane, which indicate the inclination of the subject, from the spatial coordinates and the distances of those points (formula (7)).
In this way, the subject inclination estimation unit 93 can obtain the inclination of the subject, and the reflection direction estimation unit 94 can estimate the reflection direction from the inclination of the subject.
That is, the reflection direction estimation unit 94 estimates the specular reflection intensity S and the diffuse reflection intensity Id for each pixel or region of which the inclination is obtained by the subject inclination estimation unit 93.
For example, the reflection direction estimation unit 94 can estimate the diffuse reflection intensity Id on the basis of a reflection component estimation mathematical model such as the Phong reflection model indicated in Formula (1), the inclination of the subject (angle θ1 of XZ plane and angle θ2 of YZ plane) obtained by Formula (7), and the light source direction supplied from the light source information acquisition unit 91.
Furthermore, similarly to the diffuse reflection intensity Id, the reflection direction estimation unit 94 can estimate the intensity S of the specular reflection light on the basis of the mathematical model indicated in Formula (2), the inclination of the subject (angle θ1 of XZ plane and angle θ2 of YZ plane) obtained by Formula (7), and the light source direction supplied from the light source information acquisition unit 91.
At this time, the reflection direction estimation unit 94 can obtain, for example, the intensity Ii of incident light from the light source unit 82. Furthermore, for the diffuse reflectance Kd, the specular reflectance W, and the highlight coefficient n used in Formulas (1) and (2), arbitrary constants can be used, or a reflection coefficient unique for each subject may be used by identifying the subject by using image recognition and the like. In addition, the reflection model used at this time is not limited to the Phong reflection model described above as long as the reflection intensity can be estimated, and other various models can be used.
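How the subject vector and the reflection vector are represented is not specified above; the following sketch assumes the optical axis of the imaging unit 83 as the Z axis, builds a surface normal from the inclination angles θ1 and θ2, and mirrors the light source direction about that normal to obtain the reflection vector. The construction is one plausible interpretation, not the disclosed implementation.

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def surface_normal(theta1, theta2):
    """Assumed construction of a surface normal from the inclination angles:
    theta1 is the slope in the XZ plane and theta2 the slope in the YZ plane
    (radians), with the optical axis of the imaging unit along +Z."""
    tangent_x = np.array([np.cos(theta1), 0.0, np.sin(theta1)])
    tangent_y = np.array([0.0, np.cos(theta2), np.sin(theta2)])
    return normalize(np.cross(tangent_x, tangent_y))

def reflection_vector(light_dir, normal):
    """Mirror the unit direction toward the light source about the normal."""
    L, N = normalize(light_dir), normalize(normal)
    return 2.0 * np.dot(N, L) * N - L

# A surface tilted by 30 degrees in the XZ plane, lit along the optical axis.
N = surface_normal(np.deg2rad(30.0), 0.0)
print(N, reflection_vector([0.0, 0.0, 1.0], N))
```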
Next, a method by which the light distribution control unit 95 obtains the light source direction with which the subject illuminance viewed from the imaging unit 83 becomes uniform will be described with reference to the drawings.
In this way, the light distribution control unit 95 estimates the light source direction so that the subject illuminance viewed from the imaging unit 83 becomes uniform, whereby appropriate subject illumination can be realized.
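A minimal sketch of this selection, assuming that per-pixel normals and a set of candidate unit light source directions are already available (the names, the toy normals, and the candidates are illustrative only):

```python
import numpy as np

def diffuse_map(normals, light_dir, Ii=1.0, Kd=1.0):
    """Per-pixel diffuse intensity Id = Ii x Kd x cos(alpha), as in formula (1)."""
    L = light_dir / np.linalg.norm(light_dir)
    cos_alpha = np.clip(normals @ L, 0.0, None)  # (num_pixels, 3) @ (3,) -> (num_pixels,)
    return Ii * Kd * cos_alpha

def most_uniform_direction(normals, candidate_dirs):
    """Return the candidate light source direction that minimizes the dispersion
    (variance) of the diffuse intensities over the pixels."""
    variances = [np.var(diffuse_map(normals, d)) for d in candidate_dirs]
    return candidate_dirs[int(np.argmin(variances))]

# Toy example: two regions tilted in opposite directions; a frontal light
# illuminates both equally and therefore wins over an oblique light.
normals = np.array([[0.3, 0.0, 0.954], [-0.3, 0.0, 0.954]])
candidates = [np.array([0.0, 0.0, 1.0]), np.array([0.5, 0.0, 0.866])]
print(most_uniform_direction(normals, candidates))
```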
With reference to the drawings, a method by which the light distribution control unit 95 obtains the light source direction which suppresses the specular reflection will be described.
For example, the light distribution control unit 95 obtains a cumulative frequency for each angle of reflection from the reflection vectors obtained by the reflection direction estimation unit 94 and obtains the light source vector which minimizes the frequency of occurrence of the specular reflection. A three-dimensional plot of such cumulative frequencies is illustrated in the drawings.
In this way, the light distribution control unit 95 estimates the light source direction which minimizes the specular reflection toward the imaging unit 83. Accordingly, appropriate subject illumination can be realized.
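A minimal sketch of this selection, under the assumption that a pixel is counted as specular when its reflection vector is nearly aligned with the viewing direction; the alignment threshold and the toy data are illustrative, not values from the disclosure.

```python
import numpy as np

def specular_hit_count(normals, light_dir, view_dir, cos_threshold=0.95):
    """Count pixels whose reflection vector nearly coincides with the viewing
    direction, i.e. pixels likely to show specular reflection."""
    L = light_dir / np.linalg.norm(light_dir)
    V = view_dir / np.linalg.norm(view_dir)
    n_dot_l = normals @ L
    R = 2.0 * n_dot_l[:, None] * normals - L      # per-pixel reflection vectors
    R /= np.linalg.norm(R, axis=1, keepdims=True)
    return int(np.sum(R @ V > cos_threshold))

def least_specular_direction(normals, candidate_dirs, view_dir):
    counts = [specular_hit_count(normals, d, view_dir) for d in candidate_dirs]
    return candidate_dirs[int(np.argmin(counts))]

# With the light parallel to the optical axis, a camera-facing surface reflects
# straight back into the imaging unit; an oblique light avoids this.
normals = np.tile([0.0, 0.0, 1.0], (100, 1))
candidates = [np.array([0.0, 0.0, 1.0]), np.array([0.6, 0.0, 0.8])]
print(least_specular_direction(normals, candidates, view_dir=np.array([0.0, 0.0, 1.0])))
```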
As described above, the light distribution control unit 95 determines the light source direction on the basis of the uniformity of the subject illuminance and the frequency of the specular reflection.
Note that when the light distribution control unit 95 determines the light source direction, the evaluations of the uniformity of the subject illuminance and of the frequency of the specular reflection may not be compatible. In this case, the light distribution control unit 95 can determine, as an appropriate direction, the light source direction which has the highest comprehensive evaluation regarding these two points, or can determine the light source direction by weighting one of the two evaluations.
Next, light source control processing executed by the light source control unit 85 will be described with reference to a flowchart.
For example, when an image is imaged by the imaging unit 83, the processing is started. In step S11, the distance information acquisition unit 92 obtains the distance from the imaging unit 83 to the subject, for example, on the basis of stereoscopic images as described above, and supplies the distance to the subject inclination estimation unit 93.
In step S12, on the basis of the distance supplied from the distance information acquisition unit 92 in step S11 and the spatial coordinate of the image imaged by the imaging unit 83, the subject inclination estimation unit 93 estimates the inclination of the subject as described above and supplies the subject vector obtained as a result of the estimation to the reflection direction estimation unit 94.
In step S13, the reflection direction estimation unit 94 estimates the reflection direction of the illumination light on the subject on the basis of the subject vector supplied from the subject inclination estimation unit 93 in step S12. Then, the reflection direction estimation unit 94 supplies the reflection vector indicating the reflection direction of the illumination light on the subject to the light distribution control unit 95.
In step S14, on the basis of the reflection vector supplied from the reflection direction estimation unit 94 in step S13, the light distribution control unit 95 obtains the optimal light source direction so as to make the subject illuminance uniform or to minimize the specular reflection.
In step S15, the light distribution control unit 95 supplies the optimal light source direction obtained in step S14 to the light source unit 82, and controls the light distribution so that the subject is irradiated with the irradiation light from the irradiation windows 74-1 to 74-4 in that light source direction.
After the processing in step S15, the light source control processing is terminated. Then, the imaging unit 83 images an image of the subject irradiated with the illumination light with the light distribution controlled as described above, and the processing is in a standby state until a next image is supplied from the imaging unit 83.
As described above, since the light source control unit 85 can obtain the optimal light source direction by estimating the reflection components on the surface of the subject, the subject can be observed in a more appropriate exposure state. That is, by eliminating the nonuniformity of the subject illuminance or decreasing the frequency of the specular reflection, the light source control unit 85 can improve visibility when observation is performed using the image imaged by the imaging unit 83.
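The flow of steps S11 to S15 can be summarized as in the following sketch. The stand-in computations (a single stereo sample, one placeholder normal, and a simplified evaluation in step S14) are illustrative simplifications of the units described above, not the disclosed processing.

```python
import numpy as np

def light_source_control_step(stereo_sample, current_light_dir, view_dir, candidates):
    """One simplified pass through steps S11 to S15."""
    # Step S11: distance acquisition from one stereoscopic sample (xL, xR, f, T).
    x_left, x_right, f, baseline = stereo_sample
    distance = f * baseline / (x_left - x_right)

    # Step S12: subject inclination; a single camera-facing normal stands in
    # for the per-pixel estimation by the subject inclination estimation unit 93.
    normal = np.array([0.0, 0.0, 1.0])

    # Step S13: reflection direction of the illumination light on the subject.
    L = current_light_dir / np.linalg.norm(current_light_dir)
    reflection = 2.0 * np.dot(normal, L) * normal - L

    # Step S14: choose the candidate direction whose reflection points least
    # toward the imaging unit (a stand-in for the full evaluation performed by
    # the light distribution control unit 95).
    def specular_score(d):
        d = d / np.linalg.norm(d)
        r = 2.0 * np.dot(normal, d) * normal - d
        return float(np.dot(r, view_dir))
    best_direction = min(candidates, key=specular_score)

    # Step S15: the chosen direction would be supplied to the light source unit 82.
    return distance, reflection, best_direction

print(light_source_control_step(
    (640.0, 615.0, 1000.0, 5.0),
    current_light_dir=np.array([0.0, 0.0, 1.0]),
    view_dir=np.array([0.0, 0.0, 1.0]),
    candidates=[np.array([0.0, 0.0, 1.0]), np.array([0.5, 0.0, 0.866])],
))
```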
Next, the processing in which the subject inclination estimation unit 93 estimates the inclination of the subject will be described with reference to a flowchart.
In step S21, the subject inclination estimation unit 93 obtains the spatial coordinate of the image imaged by the imaging unit 83 and the distance information supplied from the distance information acquisition unit 92.
In step S22, the subject inclination estimation unit 93 obtains the target coordinates (x1, y1, z1) and (x2, y2, z2) which specify predetermined two points or two regions on the imaging plane.
In step S23, the subject inclination estimation unit 93 calculates the angle θ1 of the XZ plane and the angle θ2 of the YZ plane by using Formula (7) as described above.
As described above, the subject inclination estimation unit 93 can estimate the inclination of the subject on the basis of the image and the distance information.
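Since Formula (7) itself is not reproduced above, the arctangent form in the following sketch is an assumption consistent with obtaining an angle of the XZ plane and an angle of the YZ plane from the two target coordinates.

```python
import numpy as np

def estimate_inclination(p1, p2):
    """Estimate the subject inclination from two target coordinates
    (x1, y1, z1) and (x2, y2, z2), where z is the distance (depth).

    The arctangent form below is an assumed reading of Formula (7)."""
    x1, y1, z1 = p1
    x2, y2, z2 = p2
    theta1 = np.arctan2(z2 - z1, x2 - x1)  # inclination angle of the XZ plane
    theta2 = np.arctan2(z2 - z1, y2 - y1)  # inclination angle of the YZ plane
    return theta1, theta2

# Two points 10 mm apart in x and y whose depth differs by 10 mm:
# a 45-degree inclination in both the XZ and the YZ planes.
print(np.rad2deg(estimate_inclination((0.0, 0.0, 50.0), (10.0, 10.0, 60.0))))
```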
Here, in the processing of estimating the appropriate light source direction by the light distribution control unit 95, it is desirable that the light source direction satisfy three conditions: the dispersion of the diffuse reflectance is small, the specular reflectance is low, and the direction is close to the current light source direction. For example, when an endoscopic image is observed as a moving image during a surgical operation or the like, a state in which the impression of the subject (operated portion) changes drastically and frequently according to changes in the illumination direction should be avoided, and it is desirable that the light source direction change moderately in the time and space directions. Therefore, the above three conditions are required for observing a moving image.
On the other hand, in a case where a still image is captured, the optimal illumination condition at the imaging timing is desired, and it is not necessary to satisfy all three conditions described above. For example, in this case, a new light source direction may be determined only under the conditions that the dispersion of the diffuse reflectance is small and the specular reflectance is low.
Furthermore, when the light distribution control unit 95 estimates an appropriate light source direction, the three conditions described above are independent factors. Therefore, candidate light source directions can be evaluated on these conditions, and the direction having the highest evaluation can be determined as a new light source direction. An example of such processing will be described with reference to a flowchart.
In step S31, the light distribution control unit 95 performs initialization, sets a variable i in the X axis direction so that the X-axis light source direction is the minimum in a variable range n (i=X−n), and sets a variable j in the Y axis direction so that the Y-axis light source direction is the minimum in a variable range n (j=Y−n).
In step S32, the light distribution control unit 95 evaluates the candidate light source direction (i, j) according to the following formula (8) by using a dispersion value Diff of the diffuse reflection component, a specular reflectance Spec, and a distance weight W, and updates the maximum evaluation value D. Furthermore, in Formula (8), the dispersion value Diff of the diffuse reflection component, the specular reflectance Spec, and the distance weight W each have values from zero to one.
[Formula 8]
D=Max((1−Diff[i,j])×(1−Spec[i,j])×W[i,j], D) (8)
In step S33, the light distribution control unit 95 determines whether the current X-axis light source direction is the maximum (X+n) in the variable range n.
In step S33, in a case where the light distribution control unit 95 has determined that the current X-axis light source direction is not the maximum in the variable range n, the procedure proceeds to step S34, and the light distribution control unit 95 increments the variable i in the X axis direction (i=i+1). The procedure returns to step S32, and similar processing is repeated after that.
On the other hand, in step S33, in a case where the light distribution control unit 95 has determined that the current X-axis light source direction is the maximum in the variable range n, the procedure proceeds to step S35. That is, in this case, with the current Y-axis light source direction, the illumination light in all the X-axis light source directions from the minimum to the maximum in the variable range is irradiated.
In step S35, the light distribution control unit 95 determines whether the current Y-axis light source direction is the maximum (Y+n) in the variable range n.
In step S35, in a case where the light distribution control unit 95 has determined that the current Y-axis light source direction is not the maximum in the variable range n, the procedure proceeds to step S36, and the light distribution control unit 95 increments the variable j in the Y axis direction (j=j+1). After that the procedure returns to step S32, and similar processing is repeated.
On the other hand, in step S35, in a case where the light distribution control unit 95 has determined that the current Y-axis light source direction is the maximum in the variable range n, the processing is terminated. With this processing, the illumination light in all the X-axis light source directions and Y-axis light source directions in the variable range from the minimum to the maximum is irradiated, and the light source direction obtained in the final step S32 is the optimal light source direction.
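A direct transcription of this search into Python, assuming that the dispersion value Diff, the specular reflectance Spec, and the weight W have already been evaluated for every candidate direction and stored as (2n+1)×(2n+1) arrays indexed relative to the current light source direction (the random toy maps are for illustration only):

```python
import numpy as np

def search_light_direction(diff, spec, weight, n):
    """Grid search over candidate light source directions (i, j) in the variable
    range n, keeping the maximum evaluation value D of formula (8):
        D = Max((1 - Diff[i, j]) x (1 - Spec[i, j]) x W[i, j], D)."""
    best_d, best_ij = -np.inf, (0, 0)
    for i in range(-n, n + 1):          # X-axis light source direction offsets
        for j in range(-n, n + 1):      # Y-axis light source direction offsets
            score = ((1.0 - diff[i + n, j + n])
                     * (1.0 - spec[i + n, j + n])
                     * weight[i + n, j + n])
            if score > best_d:          # update the maximum evaluation value D
                best_d, best_ij = score, (i, j)
    return best_ij, best_d

# Toy maps for n = 2 (values in [0, 1], for illustration only).
n = 2
rng = np.random.default_rng(0)
shape = (2 * n + 1, 2 * n + 1)
diff, spec, weight = (rng.uniform(0.0, 1.0, shape) for _ in range(3))
print(search_light_direction(diff, spec, weight, n))
```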
Furthermore, in such processing, the distance weight W can be determined on the basis of the relative angle between the current light source direction and the new light source direction (Min=0, Max=n), with the current light source direction as a reference. For example, the weight W is preferably designed by using a normal distribution (formula (9)).
Here, as the weighting coefficient σ in Formula (9), a constant may be used, or a variable adapted to the distance from the imaging unit 83 to the subject may be used. In a case where the weighting coefficient σ is a variable, it is preferable to determine it appropriately from the distance information. For example, if the distance from the imaging unit 83 to the subject is long, a slight change in the light source direction largely moves the illuminated range, and therefore the weighting coefficient σ is set to be small; if the distance from the imaging unit 83 to the subject is short, the weighting coefficient σ is set to be large. In this way, more excellent observation can be performed.
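Formula (9) is not reproduced above, so the following sketch assumes an unnormalized Gaussian (normal distribution) over the relative angle; the mapping from the subject distance to the weighting coefficient σ is likewise an illustrative assumption.

```python
import numpy as np

def distance_weight(relative_angle_deg, subject_distance, near=20.0, far=100.0):
    """Weight W for formula (8), designed with a normal distribution over the
    relative angle between the current and the candidate light source direction."""
    # Far subject: a slight change of direction moves the illuminated range a
    # lot, so use a small sigma (prefer directions near the current one).
    # Near subject: use a large sigma (allow larger changes of direction).
    t = np.clip((subject_distance - near) / (far - near), 0.0, 1.0)
    sigma = np.interp(t, [0.0, 1.0], [30.0, 5.0])  # degrees, illustrative values
    return float(np.exp(-(relative_angle_deg ** 2) / (2.0 * sigma ** 2)))

# The same 10-degree change of direction is penalized more for a distant subject.
print(distance_weight(10.0, subject_distance=30.0),
      distance_weight(10.0, subject_distance=90.0))
```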
As described above, in the endoscopic surgical system 11, by controlling the light source direction to decrease the dispersion of the diffuse reflectance or decrease the specular reflectance, for example, generation of an extremely bright portion in the image is avoided, and the observation can be performed with a more appropriate exposure state.
Note that, in the embodiment, the method of controlling the light source direction while the optical axis of the imaging unit 83 is fixed has been described. However, it is sufficient to relatively adjust the optical axis of the imaging unit 83 and the light source direction; for example, the optical axis of the imaging unit 83 may be controlled while the light source direction is fixed. Furthermore, the method of controlling the light source direction is not limited to the method of controlling the irradiation intensity of the irradiation light emitted from each of the irradiation windows 74-1 to 74-4 described above.
Furthermore, in addition to the endoscopic surgical system 11, for example, the present technology can be applied to various devices used to perform observation under an illumination environment similar to that of the endoscope.
Note that the processing described above with reference to the flowcharts does not necessarily have to be executed in time series in the order described in the flowcharts, and it includes processing which is executed in parallel or individually (for example, parallel processing or processing by object). Furthermore, the program may be executed by a single CPU or may be processed in a distributed manner by a plurality of CPUs. In addition, herein, a system means an entire apparatus including a plurality of devices.
Furthermore, the above-mentioned series of processing (information processing method) can be performed by hardware or by software. In a case where the series of processing is executed by software, a program included in the software is installed from a program recording medium, on which the program is recorded, to a computer incorporated in dedicated hardware or, for example, a general-purpose personal computer which can perform various functions by installing various programs.
In a computer, a central processing unit (CPU) 101, a read only memory (ROM) 102, and a random access memory (RAM) 103 are connected to each other with a bus 104.
In addition, an input/output interface 105 is connected to the bus 104. The input/output interface 105 is connected to an input unit 106 including a keyboard, a mouse, a microphone, and the like, an output unit 107 including a display, a speaker, and the like, a storage unit 108 including a hard disk, a nonvolatile memory, and the like, a communication unit 109 including a network interface, and a drive 110 which drives a removable medium 111 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
In the computer configured as above, the CPU 101 loads, for example, the program stored in the storage unit 108 to the RAM 103 via the input/output interface 105 and the bus 104 and executes the program so that the above-mentioned series of processing is performed.
The program executed by the computer (CPU 101) is provided, for example, by recording the program to the removable medium 111 which is a package medium such as a magnetic disk (including flexible disk), an optical disk (compact disc-read only memory (CD-ROM), a digital versatile disc (DVD), and the like), a magneto-optical disk, or a semiconductor memory or via a wired or wireless transmission medium such as a local area network, the Internet, or a digital satellite broadcast.
Then, the program can be installed to the storage unit 108 via the input/output interface 105 by mounting the removable medium 111 in the drive 110. Furthermore, the program can be received by the communication unit 109 via the wired or wireless transmission medium and installed to the storage unit 108. In addition, the program can be previously installed to the ROM 102 and the storage unit 108.
Note that, the present technology can have the configuration below.
(1) A light source control device including:
a reflection direction estimation unit configured to estimate a reflection direction in which irradiation light is reflected on a surface of a subject in a living body; and
a control unit configured to relatively control a direction of a light source for emitting the irradiation light relative to a direction in which the subject is observed on the basis of the reflection direction of the irradiation light estimated by the reflection direction estimation unit and the direction in which the subject is observed.
(2) The light source control device according to (1), in which
on the basis of the reflection direction, the control unit controls the direction of the light source so that an illuminance of the subject viewed from the direction in which the subject is observed becomes uniform.
(3) The light source control device according to (1), in which
on the basis of the reflection direction, the control unit controls the direction of the light source so that specular reflection in the direction in which the subject is observed is suppressed.
(4) The light source control device according to any one of (1) to (3), further including:
a subject inclination estimation unit configured to estimate an inclination of the subject relative to the direction in which the subject is observed, in which
the reflection direction estimation unit estimates the reflection direction on the basis of the inclination of the subject estimated by the subject inclination estimation unit.
(5) The light source control device according to (4), further including:
a distance information acquisition unit configured to obtain a distance from an imaging unit for imaging the subject to the subject, in which
the subject inclination estimation unit estimates the inclination of the subject on the basis of a spatial coordinate of an image imaged by the imaging unit and the distance obtained by the distance information acquisition unit.
(6) The light source control device according to (5), in which
the reflection direction estimation unit estimates the inclination of the subject for each pixel or region of the image imaged by the imaging unit.
(7) The light source control device according to any one of (1) to (6), in which
the subject in the living body is imaged by an endoscope.
(8) The light source control device according to (5), in which
the distance information acquisition unit obtains the distance by using stereoscopic images of the subject in the living body imaged by an endoscope.
(9) A light source control method including:
estimating a reflection direction in which irradiation light is reflected on a surface of a subject in a living body; and
relatively controlling a direction of a light source for emitting the irradiation light relative to a direction in which the subject is observed on the basis of the estimated reflection direction of the irradiation light and the direction in which the subject is observed.
(10) A program for causing a computer to execute steps including:
estimating a reflection direction in which irradiation light is reflected on a surface of a subject in a living body; and
relatively controlling a direction of a light source for emitting the irradiation light relative to a direction in which the subject is observed on the basis of the estimated reflection direction of the irradiation light and the direction in which the subject is observed.
(11) A surgical system including:
a light source configured to irradiate an operated portion in a living body with irradiation light from a predetermined direction;
an imaging unit configured to image the operated portion in the living body;
a reflection direction estimation unit configured to estimate a reflection direction in which the irradiation light is reflected on a surface of the operated portion; and
a control unit configured to relatively control a direction of the light source for emitting the irradiation light relative to an optical axial direction of the imaging unit on the basis of the reflection direction of the irradiation light estimated by the reflection direction estimation unit and the optical axial direction of the imaging unit.
In addition, the present embodiment is not limited to the embodiment described above and can be variously changed without departing from the scope of the present disclosure.
Number | Date | Country | Kind |
---|---|---|---|
2015-199670 | Oct 2015 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2016/077939 | 9/23/2016 | WO | 00 |