This application is based upon and claims the benefit of priority from prior Japanese Patent Application No. 2021-119597, filed Jul. 20, 2021, the entire contents of which are incorporated herein by this reference.
The disclosure of the present specification relates to a cell aggregate internal prediction method, a computer readable medium, and an image processing device.
A technique for stably supplying a large number of cells while maintaining their quality at a certain level or higher is essential to promote drug discovery and regenerative medicine using pluripotent stem cells. Thus, in recent years, suspension culture, which can culture a larger number of cells at a time than monolayer culture, has been gaining attention.
Unlike the monolayer culture, in which cells are cultured in a plane, the suspension culture produces a cell aggregate by three-dimensionally culturing cells. The cells in the cell aggregate act by interacting with surrounding cells and the like in the same manner as in vivo. Thus, for example, when evaluating drug efficacy, using the cell aggregate cultured in the suspension culture makes it possible to perform accurate evaluation under conditions closer to those in vivo than using the cells cultured in the monolayer culture. Such a technique related to the drug efficacy evaluation is described in, for example, JP 2015-181348 A.
An internal prediction method according to an aspect of the present invention includes acquiring an image of a cell aggregate, calculating a feature amount related to a shape of the cell aggregate on the basis of the image, and outputting structure information related to an internal structure of the cell aggregate on the basis of the feature amount.
A non-transitory computer readable medium according to an aspect of the present invention stores an internal prediction program of a cell aggregate, in which the program causes a computer to execute processes of acquiring an image of the cell aggregate, calculating a feature amount related to a shape of the cell aggregate on the basis of the image, and outputting structure information related to an internal structure of the cell aggregate on the basis of the feature amount.
An image processing device according to an aspect of the present invention includes an acquisition portion that acquires an image of a cell aggregate, a calculation portion that calculates a feature amount related to a shape of the cell aggregate on the basis of the image, and an output portion that outputs structure information related to an internal structure of the cell aggregate on the basis of the feature amount.
The present invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings.
It is difficult to observe the internal structure of a cell aggregate from the outside during cell culture, and the observable region is limited to a part of the cell aggregate. Thus, compared with monolayer culture, in which all of the cells can be observed, it is difficult in suspension culture to determine whether the cell culture is proceeding properly.
Considering such circumstances, an embodiment of the present invention will be described hereinafter.
The system 1 includes a microscope system 10, a server device 40, and a plurality of client devices (a client device 50, a client device 60, and a client device 70), which are communicatively connected to one another through a network.
Note that the type of the network connecting the devices is not particularly limited. For example, the network may be a public network such as the Internet, a dedicated network, or a LAN (local area network). The connection between the devices may be wired or wireless.
The microscope system 10 includes a microscope 20 which captures an image of the cell aggregate and a control device 30 which controls the microscope 20. The control device 30 controls the microscope 20, so that the microscope 20 captures an image of the cell aggregate taken out from a culture environment, and, further, the control device 30 sends the image of the cell aggregate thus generated to the server device 40.
The microscope 20 is only required to include an imaging function for capturing an image of the cell aggregate.
Examples of an observation method in which the microscope 20 is used include a bright field observation method and a phase contrast observation method. However, as described below, the microscope 20 is only required to acquire an image in which at least a contour of the cell aggregate can be recognized, and thus any observation method other than the above may be used.
The server device 40 is an image processing device which executes internal prediction processing described below on the basis of the image of the cell aggregate. The server device 40 acquires the image of the cell aggregate generated by the microscope system 10, predicts the internal structure of the cell aggregate on the basis of the image, and outputs a prediction result. More specifically, the server device 40 predicts the internal structure on the basis of a shape feature of the cell aggregate appearing in the image of the cell aggregate.
The client devices (the client device 50, the client device 60, and the client device 70) acquire the prediction result outputted by the server device 40 in response to a request from a user and display it on a display device. Thus, each client device is only required to include at least an input device which receives the request from the user, the display device which displays the prediction result, and a communication device which communicates with the server device 40. Note that the control device 30 may operate as a client device and output the prediction result on the display device (a display portion) included in the control device 30. That is, the control device 30 may output the prediction result by displaying the prediction result by itself.
Note that, the client device may be, for example, a desktop computer such as the client device 50, a tablet computer such as the client device 60, or a laptop computer such as the client device 70. Further, it may be a smartphone, a cellular phone, or the like. Further, each client device may be a dedicated terminal for a specific user or a shared terminal shared by multiple users.
According to the system 1 configured as described above, the user can easily recognize the internal structure of the cell aggregate by confirming the prediction result displayed on the client device. Thus, when the cell aggregates are periodically sampled to acquire the images of the cell aggregates during the cell culture, it becomes possible to detect abnormality of the cell culture at an early stage and efficiently culture the cells without performing useless culture.
Further, the system 1 predicts the internal structure of the cell aggregate from the shape feature of the cell aggregate. Thus, a high-performance device for visualizing in detail the inside of the three-dimensionally grown cell aggregate is not necessarily required, and many existing microscope systems can be used as the imaging device. Further, since an image quality sufficient to extract the contour is all that is required, the imaging time can be shortened. This makes it possible to obtain the prediction result in a short time and allows the user to recognize the culture state without delay.
The acquisition portion 41 acquires, for example, the image of the cell aggregate generated by the microscope system 10. The acquisition portion 41 desirably acquires two or more images of the cell aggregate captured from mutually different directions. Using the images captured from the different directions facilitates recognition of the whole shape of the cell aggregate in the calculation portion 42 described below as compared with the case of using only the image captured from one direction. Further, the acquisition portion 41 further desirably acquires a plurality of the images obtained by imaging mutually different surfaces of the cell aggregate in each imaging direction. Acquiring the plurality of the images captured in the same direction makes it possible to select the image suitable for recognizing the shape of the cell aggregate in each imaging direction. This further facilitates the recognition of the whole shape of the cell aggregate in the calculation portion 42. Further, the different directions are desirably directions that intersect with each other. Using the intersecting directions makes it possible to obtain the images of the cell aggregate captured at different angles with respect to the gravity direction. For example, the acquisition portion 41 may acquire a plurality of first images D1 obtained by imaging the mutually different surfaces of the cell aggregate from a vertical direction (a first direction) and a plurality of second images D2 obtained by imaging the mutually different surfaces of the cell aggregate from a horizontal direction (a second direction). Note that, for example, the images of the cell aggregate may be previously stored in the storage portion 47 of the server device 40, and the acquisition portion 41 may read the images from the storage portion 47.
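By way of illustration only, the following Python sketch shows one way the acquisition portion 41 could read such image stacks from storage. The directory layout, the file format, and the use of OpenCV are assumptions and are not specified in the embodiment.

```python
from pathlib import Path
from typing import List

import cv2          # OpenCV, assumed available for image I/O
import numpy as np


def load_image_stack(directory: Path) -> List[np.ndarray]:
    """Load all grayscale images in a directory, sorted by filename.

    Each file is assumed to hold one focal plane of the same cell
    aggregate captured from a single imaging direction.
    """
    stack = []
    for path in sorted(directory.glob("*.tif")):
        image = cv2.imread(str(path), cv2.IMREAD_GRAYSCALE)
        if image is not None:
            stack.append(image)
    return stack


# Hypothetical directory layout: one folder per imaging direction.
first_images = load_image_stack(Path("aggregate_CM1/vertical"))     # plurality of first images D1
second_images = load_image_stack(Path("aggregate_CM1/horizontal"))  # plurality of second images D2
```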
The calculation portion 42 calculates the feature amount related to the shape of the cell aggregate on the basis of the images acquired by the acquisition portion 41. The calculation portion 42 may include, for example, a contour extraction portion 43, an image selection portion 44, and a feature amount calculation portion 45.
The contour extraction portion 43 specifies a contour of the cell aggregate on the basis of the images acquired by the acquisition portion 41. A method for extracting and specifying the contour is not particularly limited. Any existing extraction method can be adopted. In a case where the plurality of the images are acquired by the acquisition portion 41, the contour extraction portion 43 desirably specifies the contour of the cell aggregate in each of the images thus acquired. For example, in a case where the plurality of the first images and the plurality of the second images are acquired by the acquisition portion 41, the contour extraction portion 43 desirably specifies the contour of the cell aggregate in each of the plurality of the first images and specifies the contour of the cell aggregate in each of the plurality of the second images.
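A minimal contour-extraction sketch is shown below. Otsu thresholding followed by selection of the largest external contour is only one of the existing extraction methods mentioned above, and the preprocessing details (blurring, threshold polarity) are assumptions that depend on the observation method used.

```python
import cv2
import numpy as np


def extract_aggregate_contour(image: np.ndarray) -> np.ndarray:
    """Return the contour of the cell aggregate as an (N, 2) point array.

    Pipeline: Gaussian blur, Otsu threshold, keep the largest external
    contour (assumed to belong to the aggregate). Depending on how the
    aggregate contrasts with the background, THRESH_BINARY_INV may be
    needed instead of THRESH_BINARY.
    """
    blurred = cv2.GaussianBlur(image, (5, 5), 0)
    _, mask = cv2.threshold(blurred, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    largest = max(contours, key=cv2.contourArea)
    return largest.reshape(-1, 2)
```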
The image selection portion 44 selects the image to be used for the feature amount calculation on the basis of the contour specified by the contour extraction portion 43. The image selection portion 44 desirably selects the image in each imaging direction and, further, desirably selects the image having the maximum contour among the images in the same imaging direction. That is, it is desirable to select the image having the maximum contour in each imaging direction. For example, in a case where the plurality of the first images and the plurality of the second images are acquired by the acquisition portion 41, the image selection portion 44 selects a third image on the basis of a plurality of the contours corresponding to the plurality of the first images and selects a fourth image on the basis of a plurality of the contours corresponding to the plurality of the second images. The image selection portion 44 desirably selects the image having the maximum contour as the third image among the plurality of the first images and selects the image having the maximum contour as the fourth image among the plurality of the second images. Note that the image selection portion 44 may select the image, for example, by defining the contour in which a partitioned region has the maximum area as the maximum contour.
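Assuming the contour sketch above has been applied to every image in a stack, the selection rule (the contour enclosing the maximum area is the maximum contour) could look like the following sketch; the variable names `first_contours` and `second_contours` are hypothetical.

```python
import cv2
import numpy as np


def select_representative_image(images, contours):
    """Select the image whose contour encloses the maximum area.

    `images` and `contours` are parallel lists for one imaging direction;
    the contour partitioning the largest region is treated as the maximum
    contour, following the selection rule described above.
    """
    areas = [cv2.contourArea(c.reshape(-1, 1, 2).astype(np.float32))
             for c in contours]
    best = int(np.argmax(areas))
    return images[best], contours[best]


# Assuming the stacks and contour function from the previous sketches:
first_contours = [extract_aggregate_contour(img) for img in first_images]
second_contours = [extract_aggregate_contour(img) for img in second_images]

# e.g., the third image selected from the first (vertical) stack and the
# fourth image selected from the second (horizontal) stack
third_image, third_contour = select_representative_image(first_images, first_contours)
fourth_image, fourth_contour = select_representative_image(second_images, second_contours)
```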
The feature amount calculation portion 45 calculates the feature amount related to the shape of the cell aggregate on the basis of the image selected by the image selection portion 44. The feature amount calculation portion 45 desirably calculates the feature amount in each imaging direction and thus desirably calculates the feature amount in each image selected by the image selection portion 44. For example, in a case where the image selection portion 44 selects the third image from the plurality of the first images and the fourth image from the plurality of the second images, the feature amount calculation portion 45 desirably calculates the feature amount on the basis of each of the third image and the fourth image.
The feature amount calculated by the feature amount calculation portion 45 is a feature amount related to the shape of the cell aggregate that is recognizable from the contour of the cell aggregate. The feature amount calculated by the feature amount calculation portion 45 desirably includes at least one of a feature amount related to unevenness on the surface of the cell aggregate (hereinafter referred to as a first feature amount) and a feature amount related to deviation from the ideal shape of the cell aggregate (hereinafter referred to as a second feature amount). Note that the ideal shape of the cell aggregate is, for example, a spherical shape, and the ideal shape as it appears in the image is, for example, a circular shape.
The output portion 46 outputs the structure information related to the internal structure of the cell aggregate on the basis of the feature amount calculated by the feature amount calculation portion 45. The output portion 46 desirably refers to a database in which the structure information related to the internal structure of the cell aggregate is associated with the feature amount. For example, the output portion 46 desirably acquires the structure information related to the internal structure of the cell aggregate from the database constructed in the storage portion 47 using the feature amount calculated by the feature amount calculation portion 45 and outputs the structure information thus acquired. That is, the storage portion 47 stores the feature amount and the structure information in association with each other. Note that the database may be constructed in a device different from the server device 40.
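The embodiment only states that the feature amount and the structure information are stored in association with each other; the matching rule is not specified. The sketch below assumes a simple nearest-neighbor lookup over feature vectors, with the record contents given as placeholders.

```python
import numpy as np

# Hypothetical database: each record pairs a feature vector
# (first feature amount, second feature amount) with structure information
# collected in advance by detailed observation (e.g., OCT tomograms).
DATABASE = [
    {"features": np.array([0.02, 0.95]), "structure_info": "model_image_normal.png"},
    {"features": np.array([0.10, 0.70]), "structure_info": "model_image_collapsed.png"},
]


def lookup_structure_info(feature_vector: np.ndarray) -> str:
    """Return the structure information whose stored feature vector is
    closest (Euclidean distance) to the calculated feature vector."""
    distances = [np.linalg.norm(record["features"] - feature_vector)
                 for record in DATABASE]
    return DATABASE[int(np.argmin(distances))]["structure_info"]


# Example call with placeholder feature values.
structure_info = lookup_structure_info(np.array([0.03, 0.93]))
```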
Information related to the internal structure of the cell aggregates is collected in advance by observing a large number of cell aggregates in detail and is recorded in the database as the structure information. The structure information recorded in the database may be, for example, information generated on the basis of a tomographic image of the cell aggregate acquired by optical coherence tomography (OCT), information generated on the basis of a tomographic image of the cell aggregate acquired by a fluorescence observation method, or information generated on the basis of an image obtained by actually cutting the cell aggregate and imaging the resulting cross section. The information described above may be an image itself obtained by imaging the cell aggregate or a model image, generated from the image, showing a distribution of cells in the cell aggregate, as long as the information is associated with the feature amount.
The server device 40 configured as described above executes the internal prediction processing described below. When the quality of the cell aggregate deteriorates due to weakening of the cells or the like, the bonds between the cells are also weakened, and the whole shape of the cell aggregate starts to collapse. Thus, when the cell aggregate is not normal, the shape of the cell aggregate deviates from the ideal shape, and, further, the unevenness on the surface becomes evident. By quantifying the shape of the cell aggregate as the feature amount, the server device 40 can detect a slight difference in the shape of the cell aggregate that is hardly recognizable to the human eye. Then, by referring to the database on the basis of the shape thus detected, the server device 40 can predict the internal structure of the cell aggregate with high accuracy. Thus, according to the server device 40 and the internal prediction method performed by the server device 40 described above, it becomes possible to easily recognize the internal structure of the cell aggregate from the image of the cell aggregate and to detect abnormality of the cell culture at an early stage.
Below, a case where the images of the cell aggregate as a prediction object captured by the microscope system 10 are stored in advance in the server device 40 will be described as an example. In this example, the images stored in the server device 40 include the plurality of the first images D1 obtained by imaging the cell aggregate from the vertical direction and the plurality of the second images D2 obtained by imaging the cell aggregate from, for example, the horizontal direction. Further, the plurality of the first images D1 are the images of the cell aggregate corresponding to mutually different focal planes and the plurality of the second images D2 are also the images of the cell aggregate corresponding to mutually different focal planes.
The server device 40 executes a predetermined program and starts the internal prediction processing shown in
Upon receiving the request from the control device 30, the server device 40 first acquires images of a cell aggregate CM1 as a prediction object (Step S10). In this step, the acquisition portion 41 acquires the plurality of the first images D1 and the plurality of the second images D2 from the storage portion 47. The plurality of the first images D1 are, for example, as shown in
The images acquired in the Step S10 are not limited to the images acquired from the vertical direction and the horizontal direction. However, including the images captured from the vertical direction and the horizontal direction means including images captured in a direction (the horizontal direction) strongly affected by gravity and images captured in a direction (the vertical direction) less affected by gravity, thereby providing an advantage that the degree of deterioration of the cell aggregate can be easily recognized. Note that the images acquired in the Step S10 may include images captured from three or more directions or images captured from two opposite directions.
When the images are acquired, the server device 40 extracts the contour of the cell aggregate on the basis of the images thus acquired (Step S20). In this step, the contour extraction portion 43 extracts the contour of the cell aggregate from each image acquired in the Step S10.
Further, the server device 40 selects the image used for the feature amount calculation on the basis of the contour extracted in the Step S20 (Step S30). In this step, the image selection portion 44 specifies the maximum contour in each imaging direction and selects the image having the maximum contour. That is, as shown in
Subsequently, the server device 40 executes the feature amount calculation processing shown in
After calculating the approximate curve, the feature amount calculation portion 45 calculates the first feature amount on the basis of the contour L1 and the approximate curve L2 calculated in the Step S41 (Step S42). In this step, for example, as shown in
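By way of illustration, one plausible formulation of the Step S42 calculation is sketched below: the approximate curve L2 is obtained by a moving-average smoothing of the contour L1, and the first feature amount is the mean distance between the two. Both the smoothing method and the deviation metric are assumptions, since the embodiment does not fix them.

```python
import numpy as np


def first_feature_amount(contour_pts: np.ndarray, window: int = 15) -> float:
    """Unevenness measure: mean distance between the contour (L1) and a
    smoothed approximate curve of the contour (L2).

    The smoothing (moving average along the contour with circular padding)
    and the use of the mean deviation are illustrative assumptions.
    """
    n = len(contour_pts)
    kernel = np.ones(window) / window
    # Smooth x and y coordinates; circular padding keeps the curve closed.
    smooth = np.column_stack([
        np.convolve(np.concatenate([contour_pts[:, i]] * 3),
                    kernel, mode="same")[n:2 * n]
        for i in range(2)
    ])
    deviations = np.linalg.norm(contour_pts - smooth, axis=1)
    return float(deviations.mean())
```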
Further, the feature amount calculation portion 45 calculates the second feature amount related to deviation from the ideal shape of the cell aggregate on the basis of the contour L1 (Step S43). In this step, for example, as shown in
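A common way to quantify roundness from a closed contour is 4πA/P², which equals 1 for a perfect circle and decreases as the shape deviates from the ideal circular shape. The choice of this particular formula is an assumption; the embodiment only names the roundness as an example of the second feature amount.

```python
import cv2
import numpy as np


def second_feature_amount(contour_pts: np.ndarray) -> float:
    """Roundness of the contour: 4 * pi * area / perimeter**2.

    Equals 1.0 for a perfect circle; smaller values indicate a larger
    deviation from the ideal (circular) shape.
    """
    pts = contour_pts.reshape(-1, 1, 2).astype(np.float32)
    area = cv2.contourArea(pts)
    perimeter = cv2.arcLength(pts, closed=True)
    return float(4.0 * np.pi * area / perimeter ** 2) if perimeter > 0 else 0.0
```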
The calculation methods of the first feature amount and the second feature amount are not limited to the above examples. For example, the second feature amount is only required to indicate a deviation degree from the ideal shape and thus may be calculated on the basis of the approximate curve instead of the roundness. For example, in a case where the approximate curve is expressed using an ellipse equation, an ellipticity may be calculated as the second feature amount instead of the roundness.
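As one possible illustration of the ellipse-based alternative, the sketch below fits an ellipse to the contour points and uses the minor-to-major axis ratio as the ellipticity. Whether the fit is applied to the contour or to the approximate curve, and the exact definition of the ellipticity, are assumptions not fixed by the embodiment.

```python
import cv2
import numpy as np


def ellipticity(contour_pts: np.ndarray) -> float:
    """Alternative second feature amount: minor/major axis ratio of an
    ellipse fitted to the contour (1.0 for a circle, smaller as the
    aggregate flattens). Requires at least five contour points."""
    pts = contour_pts.reshape(-1, 1, 2).astype(np.float32)
    (_, _), (axis_a, axis_b), _ = cv2.fitEllipse(pts)
    major, minor = max(axis_a, axis_b), min(axis_a, axis_b)
    return float(minor / major) if major > 0 else 0.0
```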
Both the first feature amount and the second feature amount are suitable parameters for detecting abnormality of the cell aggregate. Predicting the internal structure from these parameters makes it possible to find abnormality of the cell aggregate at an early stage. Specifically, when the second feature amount indicating the deviation from the ideal shape, such as the roundness, is used, it becomes possible to quantitatively recognize a state where the shape of the cell aggregate collapses under the influence of gravity or the like, for example, as seen in a cell aggregate CM2 shown in
After ending the feature amount calculation processing, the server device 40 outputs the structure information related to the internal structure of the cell aggregate on the basis of the feature amount thus calculated (Step S50). In this step, the output portion 46 acquires the structure information associated with the first feature amount calculated in the Step S42 and the second feature amount calculated in the Step S43 by referring to a database DB1 constructed in the storage portion 47. As shown in
After the output portion 46 outputs the structure information acquired from the storage portion 47 to the control device 30, the server device 40 ends the internal prediction processing shown in
As described above, the server device 40 outputs the prediction result of the internal structure of the cell aggregate by executing the internal prediction processing shown in
Note that a display method of the prediction result is not particularly limited.
Even if the combination of the feature amount at the cross sections having the maximum contour is the same, the whole shape of the cell aggregate may greatly vary depending on the positional relationship between the cross sections. Thus, constructing a database by collecting the structure information in each combination of the feature amount and the intersection position of the cross sections makes it possible to predict the internal structure of the cell aggregate with higher accuracy. Thus, in the Step S50 in
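A minimal sketch of such an extended lookup is shown below. How the intersection position of the cross sections is encoded (here a single normalized offset) and how the combined key is compared (concatenation followed by nearest-neighbor matching) are both assumptions.

```python
import numpy as np


def lookup_with_intersection(features_vertical, features_horizontal,
                             intersection_position, database):
    """Nearest-neighbor lookup over a key that combines the feature amounts
    of the two selected cross sections with their intersection position.

    Each database record is assumed to hold 'features_vertical',
    'features_horizontal', 'intersection_position', and 'structure_info'.
    """
    query = np.concatenate([features_vertical, features_horizontal,
                            [intersection_position]])
    keys = [np.concatenate([r["features_vertical"], r["features_horizontal"],
                            [r["intersection_position"]]]) for r in database]
    best = int(np.argmin([np.linalg.norm(k - query) for k in keys]))
    return database[best]["structure_info"]
```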
The server device 40 executes a predetermined program and starts the internal prediction processing shown in
After receiving the request from the control device 30, the server device 40 first acquires the image of the cell aggregate as a prediction object (Step S110) and extracts the contour of the cell aggregate on the basis of the image thus acquired (Step S120). The processing in the Step S110 and the Step S120 is the same as that in the Step S10 and the Step S20 shown in
After extracting the contour, the server device 40 displays the contour on the display device. In this step, the server device 40 may display an image obtained by, for example, as shown in
Further, the server device 40 determines the presence or absence of a correction instruction (Step S140), and, if the correction instruction is inputted (Step S140: YES), the contour of the cell aggregate is updated in accordance with the correction instruction (Step S150). For example, the server device 40 may receive the correction of the contour L1 extracted in the Step S120 when a correction button shown in
Note that a cell C1 and a cell C2 shown in
Subsequently, the server device 40 selects the image to be used for the feature amount calculation (Step S160), executes the feature amount calculation processing on the basis of the image thus selected (Step S170), and outputs the structure information related to the internal structure of the cell aggregate on the basis of the feature amount thus calculated (Step S180). The processing from the Step S160 to the Step S180 is the same as that from the Step S30 to the Step S50 shown in
As described above, even in the case where the server device 40 executes the internal prediction processing shown in
Further, in the present embodiment, the contour of the cell aggregate recognized by the server device 40 can be manually corrected by the user. Incorporating the user's judgement into the contour extraction makes it possible to perform the contour extraction with higher accuracy. This in turn makes it possible to calculate the feature amount of the cell aggregate, which is calculated on the basis of the contour, with higher accuracy, so that an improvement in the prediction accuracy of the internal structure can be expected.
The above description shows an example of predicting the internal structure of the cell aggregate on the basis of the image of the cell aggregate at a certain time point. However, for example, in a case where the internal structure is repeatedly predicted by continuously observing the same cell aggregate, as shown in
For example, in a case where the cell aggregate CM1 at the previous prediction time point is grown to a cell aggregate CM4 at the present prediction time point, when a model image IM5 is displayed as the present prediction result as shown in
The processor 101 may be, for example, a single processor, a multiprocessor, or a multicore processor. The processor 101 reads and executes a program stored in the storage device 103 and thereby operates as the acquisition portion 41, the calculation portion 42, and the output portion 46 described above. Note that the processor 101 is an example of an electric circuit.
The memory 102 is, for example, a semiconductor memory and may include a RAM region and a ROM region. The storage device 103 is, for example, a hard disk, a semiconductor memory such as a flash memory, or an external storage device.
The reading device 104 accesses a removable storage medium 105, for example, in accordance with an instruction of the processor 101. The removable storage medium 105 can be achieved by, for example, a semiconductor device, a medium to and from which information is inputted and outputted by magnetic action, or a medium to and from which information is inputted and outputted by optical action. Note that the semiconductor device is, for example, a USB (universal serial bus) memory. Further, the medium to and from which information is inputted and outputted by magnetic action is, for example, a magnetic disk. The medium to and from which information is inputted and outputted by optical action is, for example, a CD (compact disc)-ROM, a DVD (digital versatile disk), or a Blu-ray disc (Blu-ray is a registered trademark).
The communication interface 106 communicates with other devices, for example, in accordance with the instruction of the processor 101. The input/output interface 107 is an interface, for example, between an input/output device and the computer 100. The input device is, for example, a device which receives an instruction from the user such as a keyboard, a mouse, or a touch panel. The output device is, for example, a display device such as a display or a sound device such as a speaker.
The storage portion 47 described above may include, for example, the memory 102, the storage device 103, and the removable storage medium 105. Further, the acquisition portion 41 and the output portion 46 described above may include at least one of the input/output interface 107 and the communication interface 106.
The program executed by the processor 101 is provided to the computer 100, for example, in the following forms.
(1) Previously installed in the storage device 103
(2) Provided by the removable storage medium 105
(3) Provided from a server such as a program server
Note that the hardware configuration of the computer 100 for achieving the server device 40 described with reference to
The above-described embodiments illustrate specific examples in order to facilitate understanding of the invention, and the present invention is not limited to these embodiments. Variations obtained by modifying the above-described embodiments and alternatives to the above-described embodiments can be included. That is, in each embodiment, the constituents can be modified without departing from the spirit and scope thereof. In addition, a new embodiment can be implemented by appropriately combining a plurality of constituents disclosed in one or more embodiments. In addition, some constituents may be deleted from the constituents illustrated in the respective embodiments, or some components may be added to the constituents illustrated in the embodiments. Furthermore, the processing procedures described in each embodiment may be performed in a different order as long as there is no contradiction. That is, the internal prediction method of the cell aggregate, the program, the image processing device, and the system of the present invention can be variously modified and changed without departing from the scope of the invention defined by the claims.
The above embodiment describes the digital microscope including the digital camera 23 as an example. However, the imaging device for generating the image of the cell aggregate may be, for example, a laser scanning type microscope. Further, the imaging device for generating the image of the cell aggregate is not limited to the microscope, and other imaging devices may be used for generating the image of the cell aggregate.
The above embodiment describes the example where the internal structure of the cell aggregate is predicted by calculating the feature amount from each of the contours of the cell aggregate appearing in the tomographic images captured from the different directions and then quantifying the feature of the three-dimensional shape of the cell aggregate using the feature amounts calculated from the two-dimensional images. However, the internal structure of the cell aggregate may be predicted by generating a three-dimensional image of the cell aggregate, calculating the feature amount from the three-dimensional image, and then quantifying the feature of the three-dimensional shape of the cell aggregate. Also in this case, the feature amount desirably includes at least one of the first feature amount related to the unevenness on the surface of the cell aggregate and the second feature amount related to the deviation from the ideal shape of the cell aggregate, and more desirably includes both of them.
The above embodiment describes the three-dimensional model image and the tomographic image as an example of the structure information associated with the feature amount stored in the database. However, other pieces of information may be stored in the database. For example, information other than the image, such as the cell number (the live cell number, the dead cell number, etc.), a cell density, and the presence or absence, the size, and a ratio of voids present in the cell aggregate, may be included as the structure information. Further, the database may include, for example, information related to the quality of the cell aggregate, such as normal/abnormal and presence/absence of tumor, in addition to the structure information. Further, the database may include annotation information attached by the user.
The above embodiment describes an example where information on the cell aggregate at a certain time point is stored in the database as the structure information of the cell aggregate. However, the database may include information related to a change over time in a cell aggregate which has been continuously observed. The server device 40 may determine whether a change in the cell aggregate, specified by comparing the prediction results obtained at different timings, is normal or abnormal by referring to the information related to the change over time in the database. For example, the server device 40 may determine whether the change is normal or abnormal on the basis of a proliferation rate, a proliferation number, or the like of the cells during a predetermined period of time.
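For example, a change between two prediction time points could be checked against an expected range as in the sketch below. The per-hour rate definition and the range values are placeholders standing in for the change-over-time information recorded in the database.

```python
def is_growth_abnormal(previous_count: int, current_count: int,
                       elapsed_hours: float,
                       expected_rate_range=(0.01, 0.05)) -> bool:
    """Flag an abnormal change by comparing the per-hour proliferation rate
    between two prediction time points against an expected range.

    The expected range is a placeholder; in practice it would be derived
    from the change-over-time information stored in the database.
    """
    rate = (current_count - previous_count) / (previous_count * elapsed_hours)
    low, high = expected_rate_range
    return not (low <= rate <= high)
```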
In the above embodiment, the model image and the tomographic image are displayed as the internal prediction result. However, the image is not necessarily displayed, and other pieces of information may be displayed. For example, the number of the cells present in the cell aggregate, a cell proliferation rate or cell proliferation number during a predetermined period of time (e.g., a time period from the previous prediction to the present prediction), the quality information of the cell aggregate (normal/abnormal, the presence/absence of tumor, a survival rate (the live cell number/the total cell number)), and the like may be displayed.
The above embodiment describes an example where the server device 40 performs the internal prediction processing. However, the internal prediction processing may be performed in the microscope system 10 having captured the image of the cell aggregate. More specifically, the control device 30 may execute the internal prediction processing on the basis of the image generated by the microscope 20. Further, the internal prediction processing may be performed by a device different from the device in which the database is constructed. For example, the control device 30 may execute the internal prediction processing by referring to the database constructed in the server device 40.
As used herein, terms such as “first” and “second” that modify a noun do not limit the quantity or order of the elements represented by the noun. These terms are used only to distinguish between two or more elements. Therefore, the identification of the “first” and “second” elements does not mean that the “first” element precedes the “second” element. The identification of the “first” and “second” elements does not deny the existence of the “third” element.