1. Field of the Invention
The present disclosure relates to at least one imaging apparatus configured to perform imaging of an object, at least one control method for the imaging apparatus, at least one lighting system configured to illuminate an object with illumination light, at least one control method for the lighting system, and a storage medium storing a program for causing a computer to execute the at least one control method.
2. Description of the Related Art
In recent years, research into, and practical application of, pathological imaging, in which a lesion is visualized and imaged by various methods, have been carried out.
For example, with fluorescence imaging or the like, it is becoming possible, mainly in the field of animal experiments, to visually check for the presence or absence of tumor cells without a large-scale apparatus. Japanese Patent No. 3482440 and Japanese Patent Laid-Open No. 2006-180926 disclose recent developments in this field.
In medical practice too, in recent years, a pathology examination may be performed not only as a conventional biopsy diagnosis but also in mid-course of a surgical operation. For example, in an operation for resecting tumor cells, an examination such as X-ray CT, MRI, or PET is performed in advance, and the operation is prepared by identifying a resection range beforehand.
However, in actuality, various determinations and alterations related to an operation strategy are made depending on the situation of a resected part during the operation. At such times, a pathological diagnosis of the resected part may be urgently needed, and an intraoperative rapid pathological diagnosis (intraoperative rapid cell diagnosis) is actually conducted.
In current intraoperative pathological diagnosis, the part corresponding to the observation objective is partially resected, the resected part is visualized under a predetermined separate environment by a technique such as fluorescence imaging, and the urgent pathological diagnosis is carried out on the basis of the visualized part.
As described above, the intraoperative pathology examination is needed in order to carry out an appropriate operation. It is therefore preferable to create an environment where the pathological imaging related to the pathology examination or the like can be conducted on site during the operation without interrupting the operative procedure.
The following advantages are attained when the pathological imaging and the diagnosis can be conducted in real time on site during the operation.
Given the above-described advantages, it is important to make it possible to perform the pathological imaging on site during the operation without interrupting the various works.
However, in general, since the pathological imaging for the pathology examination, represented by fluorescence imaging or the like, involves low luminance, the pathological imaging is performed in an environment where external light is excluded. Hereinafter, imaging that requires such an environment is referred to as “low luminance imaging”.
Japanese Patent No. 3482440 described above likewise states that the fluorescence imaging is premised on an environment where external light is excluded.
On the other hand, the illumination of an operation room is, by its nature, set to be bright, and in particular, the illumination of the operative field is set to be extremely bright. Whereas the brightness of a sick bay is normally 100 lx to 200 lx, the brightness of the operation room is in the region of 1,000 lx, and the brightness of the operative field is in the region of 20,000 lx.
In intraoperative pathological imaging, the operative field is mainly the object, but as described above, it is difficult to perform the low luminance imaging under such bright illumination even when an optical filter is used. In addition, the optical filter may not be effective in some cases depending on the relationship between the wavelength to be imaged and the wavelength of the external light.
On the other hand, in a case where the pathological imaging is prioritized, an option exists of temporarily turning off the illumination of the operation room as described in Japanese Patent Laid-Open No. 2006-180926. However, the various works are interrupted, or the patient is put into a risky situation in darkness, so this method is not considered appropriate either.
Furthermore, in a case where the low luminance imaging is performed, it is necessary to increase the amount of light exposure to a needed level while the external light is excluded.
To achieve this, a method with which the exposure time is extended, a diaphragm is opened, the sensitivity of an image pickup element is improved, or the like is also conceivable. In addition, in the case of the fluorescence imaging, a method of increasing the intensity of excited light instead of flash illumination light is conceivable.
However, for example, in a case where the fluorescence imaging is performed during the operation, it is difficult to perform the low luminance imaging of the moving object when faced with one or more of the following aspects or condition(s).
Under the above-described condition(s), it is difficult to perform the low luminance imaging of the moving object. That is, up to now, it has been difficult to obtain an appropriate object image of a moving object while works are conducted on the moving object.
The present disclosure provides at least one system in which an appropriate object image can be obtained without disturbing works of an operator who performs the works on an object, such as a moving object.
At least one imaging apparatus that performs imaging of an object in an environment irradiated with illumination light by a lighting system according to an aspect of the present disclosure includes: an input unit configured to input an instruction for performing the imaging; an identification unit configured to identify a timing in an extinction period of the illumination light at or during which an operator who performs a work on the object does not recognize an extinction of the illumination light corresponding to a timing after the instruction for performing the imaging is input from the input unit; and an image pickup unit configured to perform the imaging of the object to pick up an object image at the timing in the extinction period identified by the identification unit.
At least one lighting system that is communicable with an imaging apparatus configured to perform imaging of an object and irradiates the object with illumination light having a first light intensity according to another aspect of the present disclosure includes: a communication unit configured to perform a communication with the imaging apparatus and receive extinction instruction information related to a timing in an extinction period of the illumination light from the imaging apparatus; and a control unit configured to control an extinction of the illumination light on a basis of the extinction instruction information and perform a control to emit illumination light having a second light intensity higher than the first light intensity in at least one of a period immediately before and a period immediately after the extinction period of the illumination light on a basis of a predetermined rule.
Furthermore, an imaging apparatus according to another aspect of the present disclosure includes: an image pickup unit configured to perform imaging of an object and obtain an image; a timing control unit configured to control a timing related to a first imaging performed by the image pickup unit in a state in which the object is irradiated with a first illumination light and a timing related to a second imaging performed by the image pickup unit in a state in which the object is irradiated with a second illumination light that is different from the first illumination light or a state in which the object is irradiated with neither the first illumination light nor the second illumination light; a first obtaining unit configured to obtain a plurality of first images by performing the first imaging a plurality of times by the image pickup unit; a second obtaining unit configured to obtain a plurality of second images by performing the second imaging a plurality of times by the image pickup unit; a detection unit configured to detect motion information related to a motion of the object between the plurality of first images obtained by the first obtaining unit; and a generation unit configured to perform a first processing on the plurality of second images obtained by the second obtaining unit on a basis of the motion information detected by the detection unit and perform a second processing on the plurality of second images on which the first processing has been performed to generate one output image.
According to other aspects of the present disclosure, one or more additional imaging apparatuses, one or more additional imaging systems, one or more control methods therefor, and one or more storage media are discussed herein. Further features of the present disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, with reference to the drawings, exemplary embodiments of the present disclosure will be described.
First, a first exemplary embodiment of the present disclosure will be described.
As illustrated in
The lighting system 110-1 is configured to be communicable with the imaging apparatus 120-1, which performs the imaging of the object H, and irradiates the object H with illumination light (first illumination light) 141. Herein, the lighting system 110-1 is set to irradiate the object H with the first illumination light 141 having a first light intensity in a normal case. The luminance of the first illumination light 141 is considerably higher than that of normal illumination light, and the luminance is particularly high in the operative field. In addition, the first illumination light 141 is light that satisfies the conditions of shadowless light which are necessary for the operation room.
The lighting system 110-1 is also provided with a communication unit 111 configured to perform a communication with the imaging apparatus 120-1 and receive extinction instruction information related to a timing in an extinction period of the first illumination light 141 (including information of the extinction period of the first illumination light 141) and the like from the imaging apparatus 120-1.
The imaging apparatus 120-1 performs the imaging of the object H in the environment irradiated with the first illumination light 141 by the lighting system 110-1. Specifically, the imaging apparatus 120-1 according to the present exemplary embodiment identifies a timing, within the extinction period of the first illumination light 141 that follows the input of the instruction for performing the imaging, at which the operator O who performs the work on the object H does not recognize the extinction of the first illumination light 141, and performs the imaging of the object H at this extinction timing to pick up an object image. The imaging apparatus 120-1 is assumed to have a shape like a general single-lens reflex camera or a compact camera, but it does not necessarily need to take such a shape. In addition, for example, the imaging apparatus 120-1 can perform the fluorescence imaging.
The imaging apparatus 120-1 includes an image pickup unit 121, which includes an image pickup optical system such as a lens and a diaphragm and an image pickup element, a light emitting unit 122, and a communication unit 123.
The image pickup unit 121 performs the imaging of the object H to pick up an object image based on light 143 from the object H.
When the image pickup unit 121 performs the imaging of the object H, the light emitting unit 122 emits, to the object H for exposure, second illumination light 142 that is different from the first illumination light 141 and that accords with the purpose of the imaging. For example, in a case where the purpose of the imaging is the fluorescence imaging, the second illumination light 142 is light for exciting a predetermined fluorescent material. In a case where the purpose of the imaging is normal flash imaging, for example, the second illumination light 142 is flash light.
The communication unit 123 performs a communication with the lighting system 110-1 and the display apparatus 130. The communication unit 123 transmits, for example, a wireless signal (infrared signal) 144 such as the extinction instruction information related to the timing in the extinction period of the first illumination light 141 (including the information of the extinction period of the first illumination light 141) to the lighting system 110-1. The communication unit 123 also transmits, for example, the object image picked up by the image pickup unit 121 and the like to the display apparatus 130.
The display apparatus 130 performs a communication with the imaging apparatus 120-1 (the communication unit 123) and performs processing of receiving the object image picked up by the image pickup unit 121 and the like and displaying this object image and the like.
Next, internal configurations of the lighting system 110-1 and the imaging apparatus 120-1 illustrated in
As illustrated in
The CPU 211 controls the operation of the lighting system 110-1 in an overall manner by using, for example, the program or data stored in the ROM 213 or the external memory 214.
The RAM 212 is provided with an area for temporarily storing the program or data loaded from the ROM 213 or the external memory 214 and is also provided with a work area used for the CPU 211 to perform the various processings.
The ROM 213 stores the program in which no change is needed, the information such as various parameters, and the like.
The external memory 214 stores, for example, an operating system (OS) and the program executed by the CPU 211 and further stores the information already given in the descriptions in the present exemplary embodiment and the like. It is noted that, according to the exemplary embodiment, the external memory 214 stores the program for executing the processing according to the exemplary embodiment of the present disclosure, but for example, a mode in which the program is stored in the ROM 213 may also be applied to one or more embodiments of the present disclosure.
The light emitting unit 215 emits the first illumination light 141 on the basis of the control by the CPU 211.
The input device 216 is constituted, for example, by a switch, a button, or the like (including a power supply switch) installed in the lighting system 110-1.
The communication I/F 217 governs transmission and reception of various information and the like which are performed between the lighting system 110-1 and an external apparatus G (in the present example, the imaging apparatus 120-1).
Herein, the communication unit 111 illustrated in
As illustrated in
The CPU 221 controls the operation of the imaging apparatus 120-1 in an overall manner by using, for example, the program or data stored in the ROM 223 or the external memory 224.
The RAM 222 is provided with an area for temporarily storing the program or data loaded from the ROM 223 or the external memory 224 and is also provided with a work area used for the CPU 221 to perform the various processings.
The ROM 223 stores the program in which no change is needed, the information such as various parameters, and the like.
The external memory 224 stores, for example, an operating system (OS) and the program executed by the CPU 221 and further stores the information already given in the descriptions in the present exemplary embodiment and the like. It is noted that, according to the exemplary embodiment, the program for executing the processing according to the exemplary embodiment of the present invention is stored in the external memory 224, but for example, a mode in which the program is stored in the ROM 223 may also be applied to one or more embodiments of the present disclosure.
The image pickup unit 225 performs the imaging of the object H to pick up an object image based on the light 143 from the object H. Specifically, the image pickup unit 225 includes an image pickup optical system 2251, such as a lens and a diaphragm, for guiding the light 143 from the object H to an internal image pickup element 2252, and the image pickup element 2252, which picks up the object image based on the light 143 guided via the image pickup optical system 2251.
The light emitting unit 226 emits the second illumination light 142 on the basis of the control by the CPU 221.
The input device 227 is constituted, for example, by a switch, a button, or the like installed in the imaging apparatus 120-1. The input device 227 is used, for example, by the user to perform various instructions to the imaging apparatus 120-1 to input the instructions to the CPU 221 or the like.
The communication I/F 228 governs transmission and reception of various information and the like which are performed between the imaging apparatus 120-1 and the external apparatus G (in the present example, the lighting system 110-1 and the display apparatus 130).
Herein, the image pickup unit 121 illustrated in
Hereinafter, the exposure performed on the basis of the first illumination light 141 is referred to as first exposure, and the exposure performed on the basis of the second illumination light 142 is referred to as second exposure.
According to the present exemplary embodiment, the low luminance imaging based on the second illumination light 142 or the like is performed both without disturbing the illumination by the first illumination light 141, which supports the surgical work of the surgeon corresponding to the operator O, and without disturbance by the first illumination light 141 itself.
Herein, the disturbance of the illumination by the first illumination light 141 refers to a state in which the work is interrupted, even momentarily, because an extinction occurs over a time period long enough for the operator O under the illumination to recognize it, or because flickering occurs to a level at which the operator O is bothered and loses concentration.
The disturbance by the first illumination light 141 refers to a state in which the first illumination light 141 becomes disturbance noise when the low luminance imaging or the like is performed, so that the intended imaging cannot be achieved.
According to the present exemplary embodiment, the illumination by the first illumination light 141 is turned off only in an extinction period short enough that the operator O, as a human being, does not recognize the extinction, and the imaging such as the low luminance imaging is performed in the extinction period. At this time, in the case of the first illumination light 141 that turns on and off at a predetermined period (that turns off at least for a time related to the above-described extinction period), the imaging such as the low luminance imaging is performed in the extinction period at a gap between the light emissions of the first illumination light 141.
In addition, according to the present exemplary embodiment, the lighting system 110-1 is assumed to be a device capable of high-speed response. Up to now, a lighting system of this type has often used a halogen lamp or the like and has not necessarily been capable of high-speed response. In recent years, however, LEDs and organic electroluminescence have been put to practical use as illumination devices, and lighting systems that use these devices can respond at high speed. In general, the reasons for demanding these devices are mainly high light emission efficiency (that is, low heat generation), high luminance, long life, wide wavelength characteristic options, and the like, but the present exemplary embodiment focuses on their high-speed responsiveness.
Next, a processing procedure of a control method by the imaging system 100 illustrated in
In
The sequence on the first stage from the top in
The sequence on the second stage from the top in
First, when the half-press of the shutter button of the input device 227 occurs at the time T0, the CPU 221 of the imaging apparatus 120-1 identifies a timing at which the operator O who performs the work on the object H does not recognize the extinction of the first illumination light 141 in the extinction period of the first illumination light 141 from a surrounding situation or the like. The CPU 221 that performs the processing of identifying the timing in the extinction period of the first illumination light 141 constitutes an identification unit.
Subsequently, the CPU 221 of the imaging apparatus 120-1 sets the extinction instruction information related to the identified timing in the extinction period (including the information of the extinction period). Specifically, in the example illustrated in
When the full press of the shutter button of the input device 227 is performed, the CPU 221 of the imaging apparatus 120-1 detects this state. Then, the communication I/F 228 of the imaging apparatus 120-1 (the communication unit 123) transmits, to the lighting system 110-1, the trigger information indicating that the shutter button has been fully pressed.
The sequence on the third stage from the top in
The sequence on the fourth stage from the top in
The sequence on the fifth stage from the top in
The sequence on the sixth stage from the top in
In the case of the example illustrated in
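The half-press/full-press sequence described above can be summarized in code form. The following Python sketch is purely illustrative: the classes, function names, and message fields are hypothetical placeholders for the communication unit 123, the identification unit, and the image pickup and light emitting units, and the timing figures are examples within the ranges discussed later.

```python
import time

class LightingLink:
    """Hypothetical stand-in for the wireless link (communication unit 123)."""
    def send(self, message: dict) -> None:
        print("to lighting system:", message)

class Camera:
    """Hypothetical stand-in for the image pickup and light emitting units."""
    def expose(self, duration_ms: float, emit_second_light: bool) -> None:
        print(f"exposing for {duration_ms} ms (second light: {emit_second_light})")

def identify_extinction_timing() -> tuple[float, float]:
    # Placeholder for the identification unit: choose a window the operator
    # will not perceive (this description uses roughly 10-30 msec).
    return 5.0, 20.0  # (delay after trigger, extinction length), in msec

def on_half_press(link: LightingLink) -> tuple[float, float]:
    delay_ms, duration_ms = identify_extinction_timing()
    # Send the extinction instruction information in advance (time T0).
    link.send({"type": "extinction_instruction",
               "delay_ms": delay_ms, "duration_ms": duration_ms})
    return delay_ms, duration_ms

def on_full_press(link: LightingLink, cam: Camera,
                  delay_ms: float, duration_ms: float) -> None:
    link.send({"type": "trigger"})   # lighting executes the scheduled extinction
    time.sleep(delay_ms / 1000.0)    # wait until the light has actually gone out
    cam.expose(duration_ms, emit_second_light=True)

link, cam = LightingLink(), Camera()
on_full_press(link, cam, *on_half_press(link))
```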
Next, a visual characteristic of a human being in the extinction period Δt3 of the instantaneous extinction illustrated in
Herein, according to the present exemplary embodiment, the extinction period Δt3 of the instantaneous extinction illustrated in
For this reason, an upper limit is set on the length of the extinction period Δt3, and a light emission intensity correction that increases the light intensity in at least one of the period immediately before and the period immediately after the instantaneous extinction is also performed.
The first to third stages from the top in
In general, human eyes have an integration characteristic in a time direction, and this can be represented as Expression (1) below.
B(T) = ∫_{t=T}^{t=T+τ} gamma(L(t)) dt   (1)
In Expression (1), B(T) denotes the brightness perceived at a time T. L(t) denotes the light intensity of the illumination light (the first illumination light 141) at the time t. The function gamma( ) denotes a perceptual (visual) sensitivity characteristic with respect to the light intensity. τ denotes the integration time constant and is 20 msec to 50 msec, although relatively large individual differences exist.
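Expression (1) can be checked numerically. Below is a minimal sketch assuming an illustrative gamma( ) curve, τ = 30 msec, a 20 msec extinction, and 200% emphasis windows; these specific functions and figures are assumptions chosen within the ranges quoted in this description, not prescribed values.

```python
import numpy as np

def gamma(L):
    # Illustrative perceptual sensitivity curve (an assumption); the
    # description only requires some monotonic sensitivity gamma().
    return np.power(L, 0.45)

def perceived_brightness(L, dt_ms, tau_ms=30.0):
    """Discretization of Expression (1):
    B(T) = integral of gamma(L(t)) dt over [T, T + tau]."""
    window = int(tau_ms / dt_ms)
    return np.convolve(gamma(L), np.ones(window), mode="valid") * dt_ms

dt = 1.0                                  # 1 msec resolution
t = np.arange(0.0, 200.0, dt)             # 200 msec of illumination
L_plain = np.ones_like(t)
L_plain[(t >= 90) & (t < 110)] = 0.0      # extinction period Δt3 = 20 msec
L_boost = L_plain.copy()
L_boost[(t >= 70) & (t < 90)] = 2.0       # 200% emphasis immediately before
L_boost[(t >= 110) & (t < 130)] = 2.0     # 200% emphasis immediately after

for name, L in (("no emphasis", L_plain), ("with emphasis", L_boost)):
    B = perceived_brightness(L, dt)
    print(f"{name}: perceived dip = {1 - B.min() / B[0]:.0%}")
```

With the emphasis applied, the dip in B(T) shrinks, which is precisely the effect the light emission intensity correction aims at.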
The fourth to sixth stages from the top in
As illustrated in
According to the present exemplary embodiment, the maximum emphasis amount in the periods immediately before and after the extinction is set to 200%, and the extinction period Δt3 is set to 10 msec to 30 msec.
As illustrated in
Next, a case where the plural exposures are performed will be described.
In the descriptions by using
On the other hand, many LED illuminations and organic electroluminescence illuminations are in actuality driven in an impulse manner. In view of the above, from
The frequency of the first illumination light 141 in the case of the impulse driving is approximately several hundreds of hertz to 1 kilohertz. It is noted that a pulse train of
In
For example, in the case of the low luminance imaging, some exposure time is still needed even with the help of improved sensitivity of the image pickup element 2252, opening of the diaphragm, which is one of the components of the image pickup optical system 2251, and the like. However, according to the present exemplary embodiment, the extinction period Δt3 has an upper limit in order to avoid flickering or the like, so a single exposure time cannot be extended sufficiently.
In this case, as illustrated in
As illustrated in
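How many extinction periods such a divided exposure requires can be estimated with simple arithmetic. In the sketch below, the 2 msec settling loss per gap and the other figures are illustrative assumptions:

```python
import math

def gaps_needed(required_exposure_ms: float, gap_ms: float,
                settle_ms: float = 2.0) -> int:
    """Number of instantaneous-extinction gaps needed to accumulate the
    required total exposure, when each gap of gap_ms loses settle_ms to
    the light actually switching off and back on (illustrative value)."""
    usable = gap_ms - settle_ms
    if usable <= 0:
        raise ValueError("gap too short to expose in")
    return math.ceil(required_exposure_ms / usable)

# Example: 100 ms of total exposure split across 20 ms extinction periods.
print(gaps_needed(100.0, 20.0))  # -> 6 exposures spread over 6 gaps
```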
According to the present exemplary embodiment, it is possible to obtain the appropriate object image without disturbing the operator O (surgeon) who performs the work (surgery) on the object H. Accordingly, for example, the pathological diagnosis or the like can be performed in the actual time on site on the basis of the object image during the surgery, and it is possible to carry out the appropriate surgery.
Next, a second exemplary embodiment of the present disclosure will be described.
The second exemplary embodiment corresponds to a mode in which the light emitting unit 122 of the imaging apparatus 120-1 illustrated in
In
As described above, the second exemplary embodiment corresponds to a mode in which the second light emission (for example, the excited light in the fluorescence imaging) is not needed for the second exposure. That is, the second exemplary embodiment corresponds to a mode in which the imaging is performed on the basis of a light emission principle in which the object emits light by itself by way of some form of energy.
Other than the absence of the light emitting unit in the imaging apparatus 120-1, the second exemplary embodiment has a configuration and an operation similar to those of the first exemplary embodiment.
Next, a third exemplary embodiment of the present disclosure will be described.
The third exemplary embodiment is devised to propose conditions for performing the low luminance imaging or the like when the instantaneous extinction occurs in the first light emission by the first illumination light 141.
According to the first and second exemplary embodiments described above, the extinction instruction information related to the timing in the extinction period Δt3 of the instantaneous extinction is transmitted from the imaging apparatus 120-1 to the lighting system 110-1 to instruct the instantaneous extinction.
On the other hand, in a case where the first light emission by the first illumination light 141 is impulse light emission, a non-illumination timing in a gap between the impulse light emissions can in some cases be used as a pre-existing instantaneous extinction (a timing in the extinction period Δt3). The third exemplary embodiment utilizes this.
As illustrated in
Differences from the imaging system 100 illustrated in
The light receiving unit 124 is configured to receive the first illumination light 141.
Next, internal configurations of the lighting system 110-3 and the imaging apparatus 120-3 illustrated in
As illustrated in
As illustrated in
At this time, the communication I/F 228 governs transmission and reception of various information and the like which are performed between the imaging apparatus 120-3 and the display apparatus 130 corresponding to the external apparatus G. That is, the exemplary embodiment is different from the first exemplary embodiment and the like in that the communication I/F 228 does not perform a communication with the lighting system 110-3 corresponding to an external apparatus.
The light receiving unit 229 is configured to receive the first illumination light 141. The light receiving unit 124 illustrated in
According to the present exemplary embodiment, the first illumination light 141 is received by the light receiving unit 229 of the imaging apparatus 120-3 (the light receiving unit 124), and the CPU 221 of the imaging apparatus 120-3 evaluates the waveform of the first illumination light 141 received by the light receiving unit 229. Then, the CPU 221 of the imaging apparatus 120-3 determines whether or not an imaging timing can be set in a gap between the illuminations of the first illumination light 141 in accordance with this evaluation result. Then, in a case where the imaging timing can be set in the gap between the illuminations of the first illumination light 141, the CPU 221 of the imaging apparatus 120-3 sets the gap imaging timing (at this time, for example, the timing in the extinction period Δt3 of the first illumination light 141 is identified). Then, the image pickup unit 225 of the imaging apparatus 120-3 performs the exposure in accordance with the set gap imaging timing and the imaging of the object image of the object H on the basis of the control by the CPU 221 of the imaging apparatus 120-3.
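The evaluation of the received waveform can be sketched as follows: sample the output of the light receiving unit, find the off-gaps between impulse emissions, and check whether gaps long enough for the intended exposure recur. The threshold, sampling step, and waveform below are illustrative assumptions, not values prescribed by the embodiment.

```python
import numpy as np

def off_gaps(samples, dt_ms, thresh):
    """Return (start_ms, length_ms) of each run in which the received
    first illumination light stays below thresh (effectively off).
    A gap still open at the end of the buffer is ignored."""
    gaps, start = [], None
    for i, v in enumerate(samples):
        if v < thresh and start is None:
            start = i
        elif v >= thresh and start is not None:
            gaps.append((start * dt_ms, (i - start) * dt_ms))
            start = None
    return gaps

def gap_imaging_possible(samples, dt_ms, thresh, min_exposure_ms):
    """Decide whether an imaging timing can be set in a gap between the
    impulse emissions: require at least two sufficiently long gaps so
    that the pattern can be treated as recurring."""
    usable = [g for g in off_gaps(samples, dt_ms, thresh)
              if g[1] >= min_exposure_ms]
    return len(usable) >= 2, usable

# Illustrative 1 kHz impulse illumination: 0.4 ms on, 0.6 ms off.
dt = 0.05                                   # sampling step in msec
t = np.arange(0.0, 10.0, dt)
wave = ((t % 1.0) < 0.4).astype(float)
ok, usable = gap_imaging_possible(wave, dt, thresh=0.5, min_exposure_ms=0.5)
print(ok, usable[:2])
```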
A lighting pattern 1 illustrated in
A lighting pattern 2 illustrated in
A lighting pattern 3 illustrated in
A lighting pattern 4 illustrated in
A lighting pattern 5 illustrated in
Next, a fourth exemplary embodiment of the present disclosure will be described.
A schematic configuration of the imaging system according to the fourth exemplary embodiment is obtained by combining the lighting system 110-1 illustrated in
When the imaging system according to the fourth exemplary embodiment adopts the above-described configuration, the following processing can be realized. That is, the processing according to the first exemplary embodiment is executed only when it is determined that the gap imaging according to the third exemplary embodiment cannot be performed.
First, in step S101, the CPU 221 of the imaging apparatus 120-3 determines whether or not the half-press of the shutter button of the input device 227 is detected. As a result of this determination, in a case where the half-press of the shutter button of the input device 227 is not detected (S101/NO), the flow stands by in step S101 until the half-press of the shutter button of the input device 227 is detected.
On the other hand, as a result of the determination in step S101, in a case where the half-press of the shutter button of the input device 227 is detected (S101/YES), the flow advances to step S102.
When the flow advances to step S102, once the light receiving unit 229 of the imaging apparatus 120-3 receives the first illumination light 141, the CPU 221 of the imaging apparatus 120-3 detects this reception and evaluates the first illumination light 141. The contents of this evaluation are as described in the third exemplary embodiment.
Subsequently, in step S103, the CPU 221 of the imaging apparatus 120-3 determines whether or not the gap imaging described in the third exemplary embodiment can be performed on the basis of the result of the evaluation in step S102.
As a result of the determination in step S103, in a case where the gap imaging can be performed (S103/YES), the flow advances to step S104.
When the flow advances to step S104, the CPU 221 of the imaging apparatus 120-3 performs the setting of the gap imaging timing described according to the third exemplary embodiment.
Subsequently, in step S105, the CPU 221 of the imaging apparatus 120-3 determines whether or not the full press of the shutter button of the input device 227 is detected. As a result of this determination, in a case where the full press of the shutter button of the input device 227 is not detected (S105/NO), the process stands by in step S105 until the full press of the shutter button of the input device 227 is detected.
On the other hand, as a result of the determination in step S105, in a case where the full press of the shutter button of the input device 227 is detected (S105/YES), the flow advances to step S106.
When the flow advances to step S106, the light emitting unit 226 and the image pickup unit 225 of the imaging apparatus 120-3 perform the gap imaging (the second light emission and exposure) of the object H, on the basis of the control by the CPU 221 of the imaging apparatus 120-3, at the gap imaging timing set in step S104.
When the processing in step S106 is ended, the processing of the flow chart in
As a result of the determination in step S103, in a case where it is not possible to perform the gap imaging (S103/NO), the flow advances to step S107.
When the flow advances to step S107, the CPU 221 of the imaging apparatus 120-3 performs the control to transmit the extinction instruction information described in the first exemplary embodiment to the lighting system 110-1 via the communication I/F 228.
Subsequently, in step S108, the CPU 221 of the imaging apparatus 120-3 determines whether or not the full press of the shutter button of the input device 227 is detected. As a result of this determination, in a case where the full press of the shutter button of the input device 227 is not detected (S108/NO), the process stands by in step S108 until the full press of the shutter button of the input device 227 is detected.
On the other hand, as a result of the determination in step S108, in a case where the full press of the shutter button of the input device 227 is detected (S108/YES), the flow advances to step S109.
When the flow advances to step S109, the CPU 221 of the imaging apparatus 120-3 performs the control to transmit the trigger information described in the first exemplary embodiment to the lighting system 110-1 via the communication I/F 228.
Subsequently, in step S110, the light emitting unit 226 and the image pickup unit 225 of the imaging apparatus 120-3 perform the instantaneous extinction imaging (the second light emission and exposure) described in the first exemplary embodiment on the object H on the basis of the control by the CPU 221 of the imaging apparatus 120-3.
When the processing in step S110 is ended, the processing of the flow chart in
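The flow of steps S101 to S110 can be condensed into the following sketch, which records each step as a string so that both branches of the flowchart can be traced; real hardware calls would replace the recorded strings.

```python
def shutter_sequence(gap_possible: bool) -> list[str]:
    """Condensed sketch of steps S101-S110: prefer the gap imaging of the
    third exemplary embodiment; fall back to instantaneous-extinction
    imaging of the first exemplary embodiment when no usable gap exists."""
    steps = ["S101: half-press detected"]
    steps.append("S102: receive and evaluate first illumination light")
    if gap_possible:                                   # S103 YES
        steps.append("S104: set gap imaging timing")
        steps.append("S105: full press detected")
        steps.append("S106: gap imaging (2nd emission + exposure)")
    else:                                              # S103 NO
        steps.append("S107: transmit extinction instruction")
        steps.append("S108: full press detected")
        steps.append("S109: transmit trigger")
        steps.append("S110: instantaneous extinction imaging")
    return steps

print("\n".join(shutter_sequence(gap_possible=False)))
```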
Next, a fifth exemplary embodiment of the present disclosure will be described.
A configuration of the present exemplary embodiment is the same as the configuration of the first exemplary embodiment illustrated in
Although the configuration of the present exemplary embodiment is the same as that of the first exemplary embodiment, operation timings of the imaging apparatus 120-1 and the lighting system 110-1 are different.
A processing procedure of a control method by the imaging system 100 of
In
The sequence on the first stage from the top in
The sequence on the second stage from the top in
First, when the half-press of the shutter button of the input device 227 occurs at the time T0, the CPU 221 of the imaging apparatus 120-1 identifies, from a surrounding situation or the like, a timing in the periodic extinction period corresponding to a period during which the first illumination light 141 is periodically turned off and on. Herein, in the example illustrated in
Subsequently, the CPU 221 of the imaging apparatus 120-1 sets the extinction instruction information (including information of the periodic extinction period) related to the identified timing in the periodic extinction period. Specifically, in the example illustrated in
When the full press of the shutter button of the input device 227 is performed, the CPU 221 of the imaging apparatus 120-1 detects this state. Then, the communication I/F 228 of the imaging apparatus 120-1 (the communication unit 123) transmits, to the lighting system 110-1, the trigger information indicating that the shutter button has been fully pressed.
The sequence on the third stage from the top in
The sequence on the fourth stage from the top in
The sequence on the fifth stage from the top in
The sequence on the sixth stage from the top in
Herein, as one of the characteristics of the present exemplary embodiment, in the above-described periodic extinction for periodically turning the light off and on, the length of each instantaneous extinction (single extinction) is suppressed to such an extent that it is not recognized as flickering or the like, within the range of human visual responsiveness. Specifically, for example, the length of the instantaneous extinction is suppressed to such an extent that the operator O does not recognize the extinction of the first illumination light 141. The length of this instantaneous extinction is approximately 10 msec to 20 msec, although individual differences exist.
In the descriptions using
On the other hand, in actuality, a large number of the LED illuminations and the organic electroluminescence illuminations are driven in an impulse manner. In view of the above, from
The frequency of the first illumination light 141 in the case of the impulse driving is approximately several hundreds of hertz to 1 kilohertz in general. It is noted that pulse trains in
It is noted that, in
The sequence on the first stage from the top in
In a period immediately before and a period immediately after each instantaneous extinction, the CPU 211 of the lighting system 110-1 performs control to irradiate the first illumination light 141 at a light intensity higher than the normal light intensity of the periodic-extinction illumination, so that the flickering or the like accompanying the instantaneous extinction is not visually perceived.
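The lighting-side control can be sketched as a waveform generator: the light is switched off for each extinction window, and the intensity is raised immediately before and after each window. All numeric values below are illustrative assumptions; the boost duration here is deliberately chosen so that the emitted-light integral per period is preserved.

```python
import numpy as np

def periodic_extinction_waveform(total_ms=200.0, dt_ms=0.5, period_ms=50.0,
                                 off_ms=15.0, boost_ms=7.5, boost=2.0):
    """Illustrative light-intensity waveform for the periodic extinction:
    the light is off for off_ms in each period, and the intensity is
    raised to `boost` (the higher second light intensity) for boost_ms
    immediately before and after each extinction window."""
    t = np.arange(0.0, total_ms, dt_ms)
    level = np.ones_like(t)                 # normal (first) light intensity = 1
    phase = t % period_ms
    level[phase < off_ms] = 0.0                                      # extinction
    level[(phase >= off_ms) & (phase < off_ms + boost_ms)] = boost   # just after
    level[phase >= period_ms - boost_ms] = boost                     # just before
    return t, level

t, level = periodic_extinction_waveform()
# boost_ms is chosen here so the emitted-light integral is preserved:
# 15 ms lost at level 1 == 2 x 7.5 ms gained at level (2 - 1).
print(round(float(level.mean()), 3))        # -> 1.0
```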
The sequence on the second stage from the top in
The sequence on the third stage from the top in
Exposure conditions appropriate to each are set for the first imaging by the first exposure and the second imaging by the second exposure.
Specifically, according to the present exemplary embodiment, a condition is set for the first exposure such that, for example, every exposure creates an image having a sufficient level. With regard to the second exposure, on the other hand, a single exposure is not sufficient; herein, a condition is set such that, for example, an image having a sufficient level is created by six exposures.
According to the present exemplary embodiment, the CPU 221 of the imaging apparatus 120-1 controls timings related to the first imaging, timings related to the second imaging, and the like. The CPU 221 of the imaging apparatus 120-1 which performs this timing control constitutes a timing control unit (a timing control unit 1221 illustrated in
Specifically, as illustrated in
Subsequently, the CPU 221 of the imaging apparatus 120-1 obtains the plurality of first images obtained by performing the first imaging plural times (the illumination images S0 to S6 illustrated in
The CPU 221 of the imaging apparatus 120-1 also obtains the plurality of second images (the observation images K1 to K6 illustrated in
Subsequently, the CPU 221 of the imaging apparatus 120-1 detects motion information related to a motion of the object H between the plurality of obtained first images (between the illumination images). The CPU 221 of the imaging apparatus 120-1 which performs the processing of detecting this motion information constitutes a motion information detecting unit (a motion information detecting unit 1224 illustrated in
Subsequently, the CPU 221 of the imaging apparatus 120-1 performs first processing on the plurality of obtained second images on the basis of the detected motion information and performs second processing on the plurality of second images on which the first processing has been performed to generate a single output image. The CPU 221 of the imaging apparatus 120-1 which performs the processing of generating this output image constitutes an output image generating unit (an output image generating unit 1225 illustrated in
Next, the detection processing for the motion information by the CPU 221 of the imaging apparatus 120-1 will be described.
Herein, a motion vector map determined by a transition from an image I to an image J is defined as Mv[I_J]. Hereinafter, descriptions will be given by using
Herein,
As illustrated in
Herein, processing of applying the motion vector map Mv[I_J] determined by the images I and J to an image X and generating an image Y (that is, motion compensation is performed) is represented by the following Expression (2).
Y = Mv[I_J](X)   (2)
Next, descriptions will be given by using images of specific pictures.
According to the present exemplary embodiment, a case is supposed where the first imaging (the first exposure) for obtaining the illumination image S and the second imaging (the second exposure) for obtaining the observation image K are performed at the same period while being shifted from each other by exactly half the period.
As illustrated in
Herein, a transform from the illumination image S0 to the illumination image S1 is represented by a function Mv[S0_S1]( ). Similarly, a transform from the illumination image S1 to the illumination image S2 is represented by a function Mv[S1_S2] ( ). Herein, according to the present exemplary embodiment, as described above, the exposure timing of the observation image K1 is supposed to be at the midpoint between the exposure timing of the illumination image S0 and the exposure timing of the illumination image S1. Similarly, the exposure timing of the observation image K2 is supposed to be at the midpoint between the exposure timing of the illumination image S1 and the exposure timing of the illumination image S2.
Herein, although the exposure conditions are different from each other, the illumination image S and the observation image K are obtained by imaging the same object H from the same direction at the same field angle. For this reason, the illumination image S and the observation image K are meant to have a series of motions on the same line while a time difference is taken into account.
As described above, the exposure timing of the observation image K1 is at the midpoint between the exposure timing of the illumination image S0 and the exposure timing of the illumination image S1. For this reason, the motion of the object H from the observation image K1 to the observation image K2 is substantially the same as a motion obtained by overlapping the transforms with each other in which the motions of the object H of the previous and next illumination images S are respectively halved.
That is, the transform from the observation image K1 to the observation image K2 can be represented as follows.
Therefore, when an image obtained by estimating an image at the time point of the observation image K2 from the observation image K1 (an image obtained by moving the observation image K1 to the time point of the observation image K2 on the basis of the motion information) is set as Kv1_2, this can be represented by the following Expression (3).
Kv1_2 = Mv[S1_S2]/2(Mv[S0_S1]/2(K1))   (3)
By generalizing this, for example, in a case where the exposure timing of the observation image K is a timing having a ratio of a:b (a+b=1, a>0, b>0) with respect to the period of the exposure timing of the illumination image S, Expression (3) can be rewritten into the following Expression (4).
Kv1_2 = Mv[S1_S2]*a(Mv[S0_S1]*b(K1))   (4)
According to the present exemplary embodiment, Expression (3) corresponds to a case where a=0.5 and b=0.5 are set in Expression (4).
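Expression (4) can be implemented directly once a motion vector map is represented as a per-pixel displacement field. The sketch below uses dense displacement arrays and nearest-neighbor warping for brevity; an actual implementation would obtain the maps by block matching or optical flow and use interpolating warps. The helper names are assumptions, not part of the disclosed apparatus.

```python
import numpy as np

def warp(image, flow, scale=1.0):
    """Apply a motion vector map Mv[...] scaled by `scale` to `image`.
    `flow` has shape (H, W, 2) holding per-pixel (dy, dx) displacements
    from the source frame to the destination frame."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys - scale * flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - scale * flow[..., 1]).astype(int), 0, w - 1)
    return image[src_y, src_x]

def estimate_k2_from_k1(k1, mv_s0_s1, mv_s1_s2, a=0.5, b=0.5):
    """Expression (4): Kv1_2 = Mv[S1_S2]*a( Mv[S0_S1]*b( K1 ) ).
    With a = b = 0.5 this reduces to Expression (3)."""
    return warp(warp(k1, mv_s0_s1, scale=b), mv_s1_s2, scale=a)

# Toy example: a bright square moving 2 px right between illumination images.
k1 = np.zeros((8, 8)); k1[2:4, 1:3] = 1.0
flow = np.zeros((8, 8, 2)); flow[..., 1] = 2.0   # uniform 2 px shift in x
kv1_2 = estimate_k2_from_k1(k1, flow, flow)
print(np.argwhere(kv1_2 > 0))                    # square moved to columns 3..4
```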
Even in a case where the partial movement occurs in the object H in the above-described manner, Expression (3) and Expression (4) described above are established similarly as described in
Herein, with the same rule as the image Kv1_2 described above, with regard to each of the observation images K1, K2, K3, K4, and K5, images obtained by estimating images at the time point of the observation image K6 (images obtained by moving the observation images K1, K2, K3, K4, and K5 to the time point of the observation image K6 on the basis of the motion information) are respectively set as images Kv1_6, Kv2_6, Kv3_6, Kv4_6, and Kv5_6.
The images Kv1_6, Kv2_6, Kv3_6, Kv4_6, and Kv5_6 can be respectively represented by the following Expressions (5-1) to (5-5) on the basis of the above-described concept.
Kv1_6 = Mv[S5_S6]/2(Mv[S4_S5](Mv[S3_S4](Mv[S2_S3](Mv[S1_S2](Mv[S0_S1]/2(K1))))))   (5-1)
Kv2_6 = Mv[S5_S6]/2(Mv[S4_S5](Mv[S3_S4](Mv[S2_S3](Mv[S1_S2]/2(K2)))))   (5-2)
Kv3_6 = Mv[S5_S6]/2(Mv[S4_S5](Mv[S3_S4](Mv[S2_S3]/2(K3))))   (5-3)
Kv4_6 = Mv[S5_S6]/2(Mv[S4_S5](Mv[S3_S4]/2(K4)))   (5-4)
Kv5_6 = Mv[S5_S6]/2(Mv[S4_S5]/2(K5))   (5-5)
Kv6_6 = K6   (5-6)
The processing performed in Expressions (5-1) to (5-5) is so-called positioning processing. Since Kv6_6 is simply the observation image K6, Expression (5-6) obviously holds. That is, the processing represented in Expressions (5-1) to (5-6) is equivalent to the first processing performed on the plurality of observation images (the second images) on the basis of the detected motion information. With this first processing, positioning is performed by correcting the translational movement or the rotational movement either across the entire image of the object H or for each image region of the object H.
Furthermore, addition processing is performed on the images Kv1_6 to Kv6_6 given in Expressions (5-1) to (5-6), as in the following Expression (6), to generate one output image K_sum. In a case where a situation occurs in which a pixel value of the output image K_sum is saturated or the like, averaging processing is instead performed on the images Kv1_6 to Kv6_6, as in the following Expression (7), as needed, to generate one output image K_ave. The processing indicated by Expression (6) or Expression (7) is equivalent to the second processing performed on the plurality of observation images on which the above-described first processing has been performed when the single output image is generated.
K_sum = Kv1_6 + Kv2_6 + Kv3_6 + Kv4_6 + Kv5_6 + Kv6_6   (6)
K_ave = (Kv1_6 + Kv2_6 + Kv3_6 + Kv4_6 + Kv5_6 + Kv6_6)/6   (7)
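Expressions (5-1) to (7) amount to warping every observation image to the time point of K6 and then summing or averaging. A compact sketch, reusing the hypothetical warp() helper from the sketch after Expression (4):

```python
import numpy as np

def align_to_k6(obs, flows):
    """First processing: warp each observation image K1..K6 to the time
    point of K6 per Expressions (5-1) to (5-6). obs = [K1, ..., K6];
    flows[j] = Mv[Sj_S(j+1)] for j = 0..5 (per-pixel fields as above)."""
    aligned = []
    for i, k in enumerate(obs, start=1):   # K_i lies midway between S(i-1) and S(i)
        img = k
        if i < 6:
            img = warp(img, flows[i - 1], scale=0.5)   # half step out of K_i's gap
            for j in range(i, 5):
                img = warp(img, flows[j], scale=1.0)   # full intermediate steps
            img = warp(img, flows[5], scale=0.5)       # half step into K6's gap
        aligned.append(img)
    return aligned

def combine(aligned, average=False):
    """Second processing: Expression (6) (sum) or Expression (7) (average)."""
    k_sum = np.sum(aligned, axis=0)
    return k_sum / len(aligned) if average else k_sum
```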
Herein, when compared with a single exposure lasting the total exposure time of the observation images K1 to K6 with the object H fixed, the output image K_sum or the like indicated in Expression (6) is by no means inferior in terms of luminance or S/N and is by far superior in terms of moving-image blurring and irregular deviation.
According to the present exemplary embodiment, it is possible to obtain an appropriate object image (output image) of the object H in motion. In addition, according to the present exemplary embodiment, since the control is performed such that the timing for emitting the first illumination light 141 related to the first imaging does not overlap with the timing for emitting the second illumination light 142 related to the second imaging (or the second imaging timing), it is possible to obtain the appropriate object image (output image) of the object H in motion while the work is carried out by the operator O on the object H in motion. Accordingly, the pathological diagnosis based on the object image (output image) can be performed in real time during the operation on the object H, for example, and the appropriate operation can be carried out.
Next, a sixth exemplary embodiment of the present disclosure will be described.
According to the sixth exemplary embodiment, instead of generating a single output image from exposures performed a particular number of times as in the processing described in the fifth exemplary embodiment, similar processing is performed sequentially and continuously on the images obtained by successive exposures.
The imaging system according to the sixth exemplary embodiment is similar, for example, to the imaging system 100 according to the first exemplary embodiment illustrated in
According to the sixth exemplary embodiment too, similarly as in the fifth exemplary embodiment, the first light emission by the first illumination light 141 occurs at a predetermined period, and the second light emission by the second illumination light 142 is performed in the extinction period of this first light emission. That is, the first imaging based on the first exposure is performed in the first light emission period, and the second imaging based on the second exposure is performed in the second light emission period.
Hereinafter, the processing procedure of the control method by the imaging apparatus according to the present exemplary embodiment will be described.
According to the present exemplary embodiment, with regard to the second images imaged at a predetermined frame rate, an output image is generated once every m1 frames by positioning the m2 most recent frames. Herein, m1 and m2 are integers. That is, when the frame rate of the second images is set as F0, the frame rate of the output images is F0/m1, and each output image is obtained by referencing and positioning the m2 most recent frames. According to the present exemplary embodiment, the frame rate F0 is not particularly determined, but a case where m1 = 5 and m2 = 10 are set will hereinafter be described as an example.
The sequence on the first stage from the top in
The sequence on the second stage from the top in
The sequence on the third stage from the top in
The sequence on the fourth stage from the top in
Hereinafter, an example of processing of generating the output image D2 illustrated in
The output image D2 is obtained in the following manner. That is, images obtained by virtually moving the observation images K12, K13, K14, K15, K16, K17, K18, K19, K20, and K21 illustrated in
Herein, the images obtained by moving the observation images K12 to K21 as described above to the imaging timing of the illumination image S21 are represented as follows.
K12_21s, K13_21s, K14_21s, K15_21s, K16_21s, K17_21s, K18_21s, K19_21s, K20_21s, K21_21s
Then, the total sum or the average of these images corresponds to the output image D2.
Herein, the process in which the first observation image K12 is transformed into K12_21s can be decomposed as follows.
K12 → K12_12s → K12s_13s → K13s_14s → … → K19s_20s → K20s_21s
Herein, only the first transform is performed in such a manner that each component of the motion vector map determined by the illumination images S having adjacent sequential orders is halved. After that, all the transforms are based on the full motion vector map determined by the mutually adjacent illumination images S. For any one of the observation images K12 to K21, the transform is based on half of the motion vector map only for the first time immediately after the imaging, as described above, and thereafter the transforms are based on the full motion vector map.
Next, a function configuration of the imaging apparatus according to the present exemplary embodiment will be described.
As illustrated in
The image pickup unit 1210 is configured to pick up an image based on the light 143 from the object H by performing the imaging of the object H. The image pickup unit 1210 is constituted, for example, by the image pickup unit 225 illustrated in
The control and processing unit 1220 controls the operation by the imaging apparatus according to the present exemplary embodiment in an overall manner and also performs various processings. The control and processing unit 1220 is configured, for example, while the CPU 221 illustrated in
Specifically, the control and processing unit 1220 includes the timing control unit 1221, the illumination image obtaining unit 1222, the observation image obtaining unit 1223, the motion information detecting unit 1224, and the output image generating unit 1225.
The timing control unit 1221 controls the timing related to the first imaging performed in the image pickup unit 1210 and the timing related to the second imaging performed in the image pickup unit 1210.
The illumination image obtaining unit 1222 obtains the plurality of illumination images (S11 to S21 in
The observation image obtaining unit 1223 obtains the plurality of observation images (K11 to K22 in
The motion information detecting unit 1224 detects the motion information related to the motion of the object H between the plurality of illumination images obtained in the illumination image obtaining unit 1222.
First, the output image generating unit 1225 performs first processing on the plurality of observation images obtained by the observation image obtaining unit 1223 on the basis of the motion information detected by the motion information detecting unit 1224. Herein, the first processing corresponds to positioning processing that corrects the translational movement or the rotational movement with respect to the plurality of observation images, either across the entire image of the object H or for each image region of the object H. Subsequently, the output image generating unit 1225 performs second processing on the plurality of observation images on which the first processing has been performed to generate a single output image. Herein, the second processing corresponds to addition processing or averaging processing of the plurality of observation images on which the first processing has been performed.
The control and processing unit 1220 performs calculation processing while information is temporarily stored in the calculating FM 1230 via the bus and also transfers a result of the calculation processing to each block via the bus.
The calculating FM 1230 stores the information such as the result of the calculation processing by the control and processing unit 1220 and is used when the control and processing unit 1220 performs the calculation processing.
The FM_a1 (1231) to the FM_a3 (1233) and the FM_0 (1234) to the FM_10 (1244) are so-called frame memories.
Next, the calculation processing using the various frame memories by the control and processing unit 1220 will be described.
An upper stage in
An image stored in the frame memory FM_0 in
Each time i is incremented, the images stored in the frame memories FM_1 to FM_9 are subjected to the transform based on the motion vector map determined by the illumination images S and are moved to the frame memory FM on their immediate right. Herein, at the transition i=21 → i=22, this transform (the latter transform) is represented by Y = Mv[S20_S21](X).
In
According to the present exemplary embodiment, the output image is output once every five input frames (imaged frames). That is, within a range illustrated in
Herein, for example, the output image D2 is obtained by adding to one another the images obtained by transforming the observation images K12 to K21 to the timing of the illumination image S21.
At the time point of i=21, the illumination images S20 and S19 are respectively stored in the frame memories FM_a1 and FM_a2, and the motion vector map Mv[S19_S20] calculated on the basis of these illumination images is stored in the frame memory FM_a3.
At the time point of i=22, the illumination images S21 and S20 are respectively stored in the frame memories FM_a1 and FM_a2, and the motion vector map Mv[S20_S21] calculated on the basis of these illumination images is stored in the frame memory FM_a3.
The images stored in the frame memories FM_1 to FM_9 at the time point of i=21 are transformed by the motion vector map Mv[S20_S21] at the time point of i=22 and moved to the frame memories FM_2 to FM_10.
In addition, the image stored in the frame memory FM_0 at the time point of i=21 is transformed by the motion vector map Mv[S20_S21]/2 at the time point of i=22 and stored in the frame memory FM_1.
Furthermore, since the time point of i=22 is the timing for outputting the output image D2, all the images stored in the frame memories FM_1 to FM_10 are added to one another (may be averaged when needed) to generate and output the output image D2.
In this manner, the output image generating unit 1225 first sequentially performs positioning of the observation images K imaged at the frame rate F0, up to m2 times, on the basis of the motion information obtained from the illumination images S. On that basis, the output image generating unit 1225 adds (or averages) the m2 frames to one another once every m1 frames (that is, at the frame rate F0/m1) to generate the output image D. With this configuration, the images imaged at the frame rate F0 are output as moving images whose frame rate is converted to 1/m1.
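The frame-memory pipeline can be sketched as a shift register of warped partial results, following the FM_0 to FM_10 scheme. estimate_flow() is a placeholder for the motion-vector computation, warp() is the hypothetical helper from the earlier sketches, and the flat toy frames at the end merely exercise the pipeline.

```python
import numpy as np
from collections import deque

def estimate_flow(prev_s, s):
    # Placeholder for block matching / optical flow between successive
    # illumination images; a zero field is returned so the sketch runs.
    return np.zeros(prev_s.shape + (2,))

def streaming_pipeline(illum_frames, obs_frames, m1=5, m2=10):
    """Streaming analogue of the FM_0..FM_10 scheme: FM_0 holds the newest,
    unwarped observation image; on each new illumination image, FM_0
    advances by half the new motion vector map while every stored image
    advances by the full map; every m1-th frame the m2 stored images are
    summed into one output image (averaging may replace the sum)."""
    fm0, chain = None, deque(maxlen=m2)      # chain plays the role of FM_1..FM_m2
    prev_s, outputs = None, []
    for i, (s, k) in enumerate(zip(illum_frames, obs_frames)):
        if prev_s is not None:
            mv = estimate_flow(prev_s, s)                    # Mv[S(i-1)_S(i)]
            chain = deque((warp(img, mv, 1.0) for img in chain), maxlen=m2)
            if fm0 is not None:
                chain.appendleft(warp(fm0, mv, 0.5))         # first (half) transform
        fm0, prev_s = k, s
        if i % m1 == m1 - 1 and len(chain) == m2:
            outputs.append(np.sum(list(chain), axis=0))
    return outputs

# Toy run with flat frames just to exercise the pipeline:
frames = [np.full((8, 8), 0.1) for _ in range(30)]
print(len(streaming_pipeline(frames, frames)))   # -> 4 output images from 30 inputs
```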
The respective frames of the output moving images are images obtained by positioning and adding the immediately preceding m2 input images or the like. Accordingly, the output images hardly have dynamic deterioration such as motion blurring or multiply-layered images, and improvements in S/N and level are achieved, so that the low luminance imaging or the like of the moving object H is realized.
Embodiments of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) of the present disclosure, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present inventions have been described with reference to exemplary embodiments, it is to be understood that the inventions are not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2014-178481, filed Sep. 2, 2014, and Japanese Patent Application No. 2014-178483, filed Sep. 2, 2014, which applications are hereby incorporated by reference herein in their entireties.
Number      | Date     | Country | Kind
2014-178481 | Sep 2014 | JP      | national
2014-178483 | Sep 2014 | JP      | national