The present disclosure relates to a microscopic device and an image processing device.
At present, a sample plate for cell counting or analysis usually includes one or more sample cells, which accommodate cell samples. The count, concentration, size, and other parameters of the cell samples are obtained through a microscopic imaging system and an image analysis system. However, the detection of these parameters depends on the observation capability of the microscope or the microscopic imaging system. The most fundamental and important feature parameter is the magnification, yet the actual magnification of many microscopic devices on the market is inconsistent with their nominal values.
At present, the magnification of a microscope is mainly calibrated using a micrometer technique or other similar techniques, in which the actual magnification of the microscope is calibrated through the cooperation of an eyepiece micrometer and an objective micrometer. For example, the actual magnification of the microscope is determined by comparing the known size of the spacing in a scale with the size obtained in the microscopic imaging system. However, these techniques have certain defects. When the eyepiece or objective is replaced, the actual magnification of the microscope needs to be recalibrated, and at this time, the cell samples need to be removed, which makes the operation repetitive and troublesome. The original visual field for observation also needs to be located again, which may cause the cell samples in the sample cell to move and affect the observation result.
Therefore, it is very significant to measure the magnification of the microscope or microscopic imaging system. However, the existing cell counting plates do not allow accurate calibration of the magnification of the microscope, resulting in inconsistency between the detected parameters and the actual parameters and affecting the accuracy of the final result.
One aspect of the present disclosure may provide a microscopic device, comprising: an optical imaging device configured to generate an optical image of a sample, wherein the optical image includes a scaling pattern for determining a magnification of the microscopic device; an image sensor configured to generate a digital image based on the optical image; and a transmission device configured to transmit the digital image.
In some embodiments, the microscopic device may further comprise an inputting device, configured to input pattern information related to the scaling pattern.
In some embodiments, the transmission device may further be configured to transmit the pattern information.
In some embodiments, the pattern information may include at least one of: a size of the scaling pattern; an identifier of the scaling pattern; or a direction of the scaling pattern.
In some embodiments, the inputting device may further be configured to input first sample information related to the sample.
In some embodiments, the first sample information may include a type of the sample.
Another aspect of the present disclosure may provide an image processing device, comprising: a receiving device configured to receive a digital image generated by a microscopic device, wherein the digital image includes a scaling pattern; a storage device configured to store the digital image; and a processor configured to determine a magnification of the microscopic device based on the scaling pattern.
In some embodiments, the receiving device may further be configured to receive pattern information related to the scaling pattern.
In some embodiments, the processor may further be configured to determine the magnification of the microscopic device based on the pattern information.
In some embodiments, the processor may further be configured to determine a direction of the sample plate where the sample is located based on the pattern information.
In some embodiments, the processor may further be configured to classify the digital image based on the pattern information.
In some embodiments, the processor may further be configured to generate a first image based on the pattern information, and integrate the first image into the digital image.
In some embodiments, the receiving device may further be configured to receive first sample information related to the sample.
In some embodiments, the processor may further be configured to classify the digital image based on the first sample information.
In some embodiments, the processor may further be configured to generate a second image based on the first sample information, and integrate the second image into the digital image.
In some embodiments, the processor may further be configured to find, based on a type of the digital image, other images in the storage device that are of the same type as the digital image, wherein the image processing device further comprises a transmission device configured to transmit the images that are of the same type as the digital image.
In some embodiments, the processor may further be configured to analyze the digital image based on the scaling pattern to obtain second sample information related to the sample in the digital image.
In some embodiments, the processor may further be configured to classify the digital image based on the second sample information.
In some embodiments, the processor may further be configured to find, based on the type of the digital image, other images in the storage device that are of the same type as the digital image, and the image processing device may further comprise a transmission device configured to transmit the images that are of the same type as the digital image.
In some embodiments, the second sample information may include at least one of: a diameter of the sample; a value of major axis and a value of minor axis of the sample; a size of a visual field of photographing; or a concentration of the sample.
Another aspect of the present disclosure may provide a method for determining a magnification of a microscopic device, comprising: obtaining an image captured by the microscopic device, wherein the image includes a scaling pattern on a sample plate; determining a size of an image of the scaling pattern generated on an image sensor; and determining the magnification of the microscopic device based on a size of the scaling pattern on the sample plate and the size of the image of the scaling pattern generated on the image sensor.
In some embodiments, the size of the image of the scaling pattern generated on the image sensor may be determined based on spacing between adjacent pixels of the image sensor of the microscopic device.
In some embodiments, the scaling pattern may include a plurality of line segments, and the method may further comprise: determining a plurality of values of magnification of the microscopic device based on a length of each of the plurality of line segments; and designating an average value of the plurality of values of magnification as the magnification of the microscopic device.
In some embodiments, the scaling pattern may include a plurality of line segments, and the method may further comprise determining the magnification of the microscopic device based on a sum of lengths of the plurality of line segments.
Another aspect of the present disclosure may provide a microscopic analysis system, comprising: the microscopic device according to the above-mentioned embodiments; and the image processing device according to the above-mentioned embodiments.
In some embodiments, the microscopic analysis system may further comprise a mobile device, configured to receive and display the digital image from the microscopic device or the image processing device.
In some embodiments, the mobile device may further be configured to receive second sample information.
In some embodiments, the mobile device may further be configured to transmit a photographing parameter to the microscopic device; and the microscopic device may perform a photographing operation based on the photographing parameter.
Another aspect of the present disclosure may provide a microscopic analysis system, comprising:
a cloud server,
a microscopic device; and
a mobile device,
wherein the microscopic device includes:
an optical imaging device configured to photograph a sample on a sample plate to generate an optical image of the sample, wherein the optical image includes a scaling pattern for determining a magnification of the microscopic device;
an image sensor configured to generate a digital image based on the optical image;
a transmission device configured to transmit the digital image; and
a receiving device configured to receive information from the cloud server and the mobile device,
wherein the mobile device is configured to:
receive the digital image from the cloud server and the microscopic device;
transmit a photographing parameter to the cloud server and the microscopic device; and
transmit the digital image from the microscopic device to the cloud server,
wherein the cloud server is configured to:
receive the digital image from the microscopic device and the mobile device;
receive the photographing parameter from the mobile device and forward the photographing parameter to the microscopic device;
transmit the digital image from the microscopic device to the mobile device; and
analyze the digital image and transmit an analysis result to the mobile device.
In some embodiments, the cloud server may further be configured to transmit an update of an application to the microscopic device and the mobile device.
In some embodiments, the sample plate may include a sample cell for accommodating a sample, and the scaling pattern for determining the magnification of the microscopic device.
Other features of the present disclosure and advantages thereof may become apparent from the following detailed description of embodiments of the present disclosure with reference to the drawings.
The drawings forming a portion of the present disclosure describe embodiments of the present disclosure and are used together with the present disclosure to explain the principles of the present disclosure.
Referring to the drawings, the present disclosure may be more clearly understood according to the following detailed description, wherein:
It should be noted that in some embodiments described below, the same reference number is used across different drawings to indicate the same part or a part with the same function, and the repeated description thereof is omitted. In the present disclosure, similar labels and letters are used to indicate similar items. Therefore, once an item is defined in one drawing, it may not need to be further discussed in subsequent drawings.
To facilitate understanding, the position, size, range, or the like of each structure shown in the drawings may not indicate the actual position, size, or range in some cases. Therefore, the present disclosure is not limited to the positions, dimensions, ranges, or the like disclosed in the drawings.
The embodiments of the present disclosure are described in detail with reference to the drawings. It should be noted that unless stated otherwise or obvious from the context, the relative arrangement of components and steps, numerical expressions, and values set forth in these embodiments do not limit the scope of the present disclosure.
The following description of at least one embodiment is merely illustrative in nature and is in no way intended to limit the present disclosure and its application or use.
The techniques, methods, and devices known to those of ordinary skill in the art may not be discussed in detail, but where appropriate, such techniques, methods, and devices should be considered as a part of the present disclosure.
In all the examples shown and discussed in the present disclosure, any specific value should be interpreted as merely for example, and not as a limitation. Therefore, other examples of the embodiments may have different values.
As shown in
The digital image captured by the image sensor 101 may be stored in the memory 102, and the processor 103 may read and process the digital image in the memory 102.
In the embodiments shown in
For example, as shown in
K=D·sqrt[(x2−x1)²+(y2−y1)²]  (1)
where D represents the spacing between adjacent pixels in the image sensor 101, that is, the distance from the center of a pixel to the center of an adjacent pixel along the X direction or the Y direction, and (x1, y1) and (x2, y2) are the pixel coordinates of the two ends of the line segment in the digital image. The function “sqrt” represents determining a square root.
Then, a magnification M of the optical imaging device 104 of the microscope device 100 may be determined based on the following formula (2):
M=K/L (2)
In the above method, an actual magnification of the microscopic device 100 may be accurately obtained.
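For illustration only, formulas (1) and (2) may be sketched in Python as follows; the pixel pitch D, the endpoint coordinates, and the segment length L are example values, not values from the present disclosure:

```python
import math

def image_size(d, x1, y1, x2, y2):
    # Formula (1): size K of the segment's image on the image sensor,
    # where d is the spacing between adjacent pixels (pixel pitch) and
    # (x1, y1), (x2, y2) are the endpoint pixel coordinates.
    return d * math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

def magnification(k, l):
    # Formula (2): M = K / L, where L is the known physical length of
    # the line segment on the sample plate.
    return k / l

# Example values: 3.45 um pixel pitch, a horizontal segment imaged over
# 580 pixels, and a 200 um line segment on the sample plate.
k = image_size(3.45e-6, 100, 200, 680, 200)
m = magnification(k, 200e-6)
print(round(m, 3))  # 10.005
```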
In the above embodiment, the pixel array of the image sensor 101 may be a rectangular array, and the spacing of the adjacent pixels may be the same in the X direction and the Y direction. In other embodiments according to the present disclosure, the spacing between the adjacent pixels of the image sensor 101 may be different in the X direction and the Y direction. For example, if the spacing between adjacent pixels along the X direction is D1 and the spacing between adjacent pixels along the Y direction is D2, the size K of the image generated on the image sensor 101 by the line segment on the sample plate may be determined based on the following formula (3):
K=sqrt[(D1)²·(x2−x1)²+(D2)²·(y2−y1)²]  (3)
Then, based on formula (2), the actual magnification M of the microscopic device 100 may be accurately obtained.
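Under the same illustrative assumptions, formula (3) for pixel pitches that differ between the X and Y directions may be sketched as:

```python
import math

def image_size_xy(d1, d2, x1, y1, x2, y2):
    # Formula (3): size K of the segment's image when the pixel pitch is
    # d1 along the X direction and d2 along the Y direction.
    return math.sqrt((d1 * (x2 - x1)) ** 2 + (d2 * (y2 - y1)) ** 2)

# Example values: 3.0 um pitch in X, 4.0 um pitch in Y, and a segment
# spanning 300 pixels in X and 400 pixels in Y.
k = image_size_xy(3.0e-6, 4.0e-6, 0, 0, 300, 400)
m = k / 200e-6  # formula (2) applied, assuming a 200 um segment
```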
The above operation of determining the actual magnification may be performed by, for example, the processor 103 of the microscopic device 100. For example, the spacing between adjacent pixels of the image sensor 101 may be stored in the memory 102 in advance. When the image sensor 101 generates a digital image of the sample plate 200, the digital image may be stored in the memory 102.
Then, the processor 103 may read the digital image from the memory 102 and recognize the scaling pattern in the digital image. Next, the processor 103 may read the spacing between adjacent pixels of the image sensor 101 from the memory 102 and determine the actual magnification M of the microscopic device 100 based on the above formulas (1)-(3).
In addition, in some embodiments according to the present disclosure, for the sample plate 200 with a plurality of sample cells 201, the magnification M may also be determined using the following method.
For the sample plate 200 shown in
M=(M1+M2+M3)/3 (4)
That is, an average value of the magnifications M1, M2, and M3 is designated as the actual magnification M of the microscopic device. In this way, the calculation error may be reduced and the accuracy of the magnification may be further improved.
In addition, in other embodiments according to the present disclosure, a sum K′ of the lengths of the images of the line segments of each scaling pattern 202 may be determined based on the following formula (5):
K′=D·sqrt[(x2−x1)²+(y2−y1)²]+D·sqrt[(x3−x4)²+(y3−y4)²]+D·sqrt[(x5−x6)²+(y5−y6)²]  (5)
where (x1, y1) and (x2, y2) are the coordinates of both ends of the line segment of the scaling pattern of a first sample cell 201 in the upper of
Then, the actual magnification M of the microscopic device may be determined based on a formula (6):
M=K′/(3L) (6)
In this way, the calculation error may be reduced and the accuracy of the magnification may be further improved.
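The two multi-segment strategies above, averaging the per-segment magnifications (formula (4)) and dividing the sum of the image lengths by the total nominal length (formulas (5) and (6)), may be sketched as follows; the pixel pitch and endpoint coordinates are example values only:

```python
import math

D = 3.45e-6   # pixel pitch (example value)
L = 200e-6    # nominal physical length of each line segment on the plate
# Endpoint pixel coordinates of the three segments (example values).
segments = [((100, 50), (680, 50)),
            ((100, 300), (682, 300)),
            ((100, 550), (678, 550))]

# Image length of each segment on the sensor, per formula (1).
lengths = [D * math.hypot(x2 - x1, y2 - y1)
           for (x1, y1), (x2, y2) in segments]

# Formula (4): average of the per-segment magnifications M1, M2, M3.
m_avg = sum(k / L for k in lengths) / len(lengths)

# Formulas (5) and (6): summed image length K' divided by 3L.
m_sum = sum(lengths) / (len(segments) * L)

print(round(m_avg, 3), round(m_sum, 3))
```

Note that when every segment has the same nominal length L, formulas (4) and (6) coincide algebraically; formula (6) simply performs the division once on the summed lengths.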
The above embodiments briefly describe how to determine the actual magnification of the microscopic device 100 based on the scaling pattern on the sample plate. It should be understood that the present disclosure is not limited to the above methods. Under the instruction and enlightenment of the present disclosure, those skilled in the art may also adopt other ways to determine the actual magnification of the microscopic device 100 based on the scaling pattern.
In the embodiments shown in
For example, based on the digital image generated by the image sensor 101, the processor 103 may obtain a coordinate (x1′, y1′) of a point on a tick mark in the scaling pattern 302, and a coordinate (x2′, y2′) of the intersection point between an adjacent tick mark and a line that passes through the point (x1′, y1′) and is perpendicular to the tick mark.
Using a method similar to the method described above, the actual magnification of the microscopic device 100 may be determined.
In the example shown in
Using the scaling pattern 502 on the sample plate 500 of
In addition, the scaling pattern 502 on the sample plate 500 of
The sample plate according to the present disclosure and how to obtain the magnification of the microscopic device based on the scaling pattern on the sample plate are described above. It should be understood that the present disclosure may not be limited to the above embodiments.
For example,
In addition, in some embodiments according to the present disclosure, the orientation of the sample plate and the plurality of sample cells may also be determined according to the scaling pattern on the sample plate. For example, the sample plate 300 shown in
When observing and photographing samples in the plurality of sample cells 301 through the microscopic device 100, it may be impossible to observe and photograph all the samples in the plurality of sample cells 301 at the same time due to the limited visual field and other reasons. Therefore, it is necessary to move the sample table 105 so that the sample plate moves within the visual field, in order to observe and photograph different sample cells 301 on the sample plate 300 or different portions of the same sample cell 301.
As shown in
As shown in
Similarly, in the second mark 1004, based on the spacing between the tick marks, a missing tick mark may represent 0, and then the serial number of the sample plate 1000 may be determined as 1101.
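The decoding described above, in which a tick mark present at an equally spaced slot encodes 1 and a missing tick mark encodes 0, may be sketched as follows; the slot pitch, tolerance, and measured positions are hypothetical example values:

```python
def decode_serial(observed, pitch, n_slots, tol=0.25):
    # Decode a binary serial number from a row of tick marks: each of
    # n_slots equally spaced slots holds one bit, a present tick mark
    # encodes 1 and a missing one encodes 0. `observed` holds measured
    # tick positions; `pitch` is the slot spacing. This scheme is an
    # illustrative assumption, not a specification.
    bits = []
    for i in range(n_slots):
        expected = i * pitch
        present = any(abs(p - expected) <= tol * pitch for p in observed)
        bits.append('1' if present else '0')
    return ''.join(bits)

# Four slots at a 10-pixel pitch; the tick at slot 2 is missing.
print(decode_serial([0.0, 10.1, 30.2], pitch=10.0, n_slots=4))  # '1101'
```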
In addition, in some embodiments of the present disclosure, other ways may also be used to represent a number 0 or 1. For example, in a set of tick marks as shown in
It should be understood that under the instruction and enlightenment of the present disclosure, those skilled in the art may combine the first mark, the second mark, the third mark, and the tick mark in other ways as a scaling pattern.
As shown in
Then, a digital image of the samples may be generated by the image sensor 101 (operation 1202). The optical image generated by the optical imaging device 104 of the microscopic device 100 may be received by the image sensor 101 and the digital image may be generated. The digital image may be stored in the memory 102.
Then, the processor 103 may read the digital image from the memory 102 and perform various processes (operation 1203). For example, as described above, based on the scaling pattern in the digital image, the magnification of the microscopic device 100 may be determined; the extension direction and arrangement direction of the sample cells may be determined; the serial number of the sample plate may be determined; or the serial number of the sample cell may be determined, or the like.
For example, the sample plate with the scaling pattern may be arranged on the sample table 1405, and the optical imaging device 1404 may generate an optical image of samples. In addition, the scaling pattern in the sample plate may also be included in the optical image. As described above, the scaling pattern may be configured to determine the magnification of the microscopic device, determine the direction of the sample plate, identify the sample plate or sample cell, or the like. The image sensor 1401 may convert an optical image into a digital image.
In addition, a user of the microscopic device 1400 may input information about the scaling pattern (i.e., pattern information) by the inputting device 1410. The pattern information may include size information of the scaling pattern. For example, when the scaling pattern is the scaling pattern 202 shown in
In addition, the pattern information may include an identifier of the scaling pattern. For example, the user may input a unique serial number of the scaling pattern by the inputting device 1410. The serial number of the scaling pattern may be stored in a database in association with relevant information of the scaling pattern. Based on the serial number, the scaling pattern corresponding to the serial number and the relevant information of the scaling pattern, such as size, spacing, etc., may be found in the database.
In some illustrative embodiments, a type of the sample plate may also be determined by the unique serial number of the scaling pattern. For example, in the database, the serial number of the scaling pattern may also be stored in association with relevant parameters of the sample plate. For example, the relevant parameters of the sample plate may include: the type of the sample plate, the numbers of sample cells, the depth of sample cells, or the like.
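Such a lookup may be sketched as follows, using a hypothetical in-memory stand-in for the database, with made-up serial numbers and parameter names:

```python
# Hypothetical database keyed by the unique serial number of the
# scaling pattern; field names and values are illustrative only.
PATTERN_DB = {
    "SP-0001": {
        "segment_length_um": 200.0,   # size of the scaling pattern
        "tick_spacing_um": 50.0,
        "plate_type": "4-cell counting plate",
        "num_cells": 4,
        "cell_depth_um": 100.0,
    },
}

def lookup_pattern(serial):
    # Return the scaling-pattern record and associated sample-plate
    # parameters for a serial number, or None if the serial is unknown.
    return PATTERN_DB.get(serial)

info = lookup_pattern("SP-0001")
print(info["plate_type"])  # 4-cell counting plate
```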
In addition, the user may also input information about the sample (first sample information) by the inputting device 1410. For example, the information may include the type of sample. In this way, based on the information about the sample, the type of the samples such as red blood cells, yeast, algae, etc., may be determined.
The transmission device 1411 may transmit the digital image to other devices. For example, the transmission device 1411 may transmit the digital image to a remote server, such as a cloud server or the like. The server may include an image processing device to further process the received image.
In addition, an interaction between the microscopic device 1900 and the mobile device may be carried out directly or indirectly through, for example, a cloud server, which may not be limited by the present disclosure.
The receiving device 1501 may receive a digital image captured by the above-mentioned microscopic device according to the present disclosure. A scaling pattern may be included in the digital image.
In addition, in some embodiments according to the present disclosure, the receiving device 1501 may also receive pattern information and/or first sample information related to the digital image.
The storage device 1502 may store the digital image and various information received by the receiving device 1501.
The processor 1503 may obtain the digital image and various information from the storage device 1502 and process the digital image. For example, the processor 1503 may determine the magnification of the microscopic device based on the scaling pattern in the digital image.
The transmission device 1504 may transmit the digital image and information related to the digital image (e.g., the magnification, etc.) to, for example, a mobile device or the like.
In some embodiments according to the present disclosure, since specifications of the sample plates may be different, the parameters such as the length and/or spacing of the tick marks of the scaling pattern on the sample plates with different specifications may be different. In this case, the processor 1503 may also need to further combine the pattern information related to the digital image to determine the magnification of the microscopic device.
In addition, in some embodiments according to the present disclosure, the processor 1503 may also determine the direction of the sample plate where the sample is located based on the pattern information. For example, when the sample plate is the sample plate 900 shown in
Further, in some embodiments according to the present disclosure, the processor 1503 may classify digital images based on pattern information. For example, the digital images may be classified based on the size of the scaling pattern, and digital images with the same size of the scaling pattern may be divided into a group. When browsing the group of digital images later, the digital images may be scaled to make the size of the scaling pattern the same, so that users may intuitively observe and compare the relative size of the samples in each digital image.
Further, in some embodiments according to the present disclosure, the processor 1503 may generate a first image from the pattern information and integrate the first image into the digital image. For example, as shown in
In addition, the processor 1503 may classify the digital images based on the first sample information. For example, when the first sample information includes the type of the samples, the processor 1503 may determine that the digital images of samples of the same type belong to the same group.
In addition, in some embodiments according to the present disclosure, the first sample information may also include other information such as production date, photographing date, source, copyright information, or the like. The processor 1503 may also divide the digital images with the same date into a group.
In some embodiments according to the present disclosure, the processor 1503 may generate a second image based on the first sample information and integrate the second image into the digital image. As shown in
In some embodiments according to the present disclosure, a client may find and download the digital image from the server. For example, the client may enter a keyword, such as yeast cells, and the keyword may be transmitted to a remote server. The server may find digital images of yeast cells in all sample information in the database based on the keyword, and provide a list or thumbnails of these digital images to the client. The client may select and download a selected digital image and related information of the selected digital image based on the information provided by the server. Further, in some embodiments according to the present disclosure, the client may pay a fee to the server to obtain a license to use the selected digital image. After the server determines that the client has obtained the license, the selected digital image may be transmitted to the client.
Further, in some embodiments according to the present disclosure, the processor 1503 may analyze the digital image based on the scaling pattern to obtain second sample information related to the sample. The second sample information may include, for example, diameters of the samples, values of major axis and values of minor axis of the samples, a size of the visual field, a concentration of the samples and other information.
For example,
Further, in some embodiments according to the present disclosure, the processor 1503 may determine a plurality of yeast cells by the image recognition operation and determine the diameter of each of the plurality of the yeast cells based on the scaling pattern. Then, the processor 1503 may store an average value of the diameters of the plurality of yeast cells in the second sample information as a diameter of the plurality of yeast cells.
For example, as shown in
In addition, in some embodiments according to the present disclosure, the processor 1503 may identify 393 yeast cells from the digital image and determine the diameter of each of the 393 yeast cells, so that the average value of the diameters of the 393 yeast cells may also be obtained and stored in the second sample information.
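The diameter determination above may be sketched as follows: a diameter measured in pixels is converted to a physical diameter using the pixel pitch and the calibrated magnification, and the per-cell diameters are then averaged. All numeric values are examples only:

```python
def physical_diameter_um(diameter_px, pixel_pitch_um, magnification):
    # Convert a diameter measured in pixels into a physical diameter on
    # the sample plate: pixels -> size on the sensor (via the pixel
    # pitch), then sensor size -> object size (divide by magnification).
    return diameter_px * pixel_pitch_um / magnification

# Example: three recognized cells measuring 14.2, 15.0, and 15.8 pixels
# across, a 3.45 um pixel pitch, and a calibrated magnification of 10.
diams_um = [physical_diameter_um(d, 3.45, 10.0) for d in (14.2, 15.0, 15.8)]
mean_um = sum(diams_um) / len(diams_um)
print(round(mean_um, 3))  # 5.175
```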
In addition, the processor 1503 may also determine the size of a visual field of photographing and the concentration of the samples from the digital image. As shown in
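For illustration, once the visual field dimensions and the sample-cell depth are known (for example, from the scaling pattern and the plate parameters), the concentration may be estimated as the cell count divided by the observed volume. The field size, depth, and count below are example values only:

```python
def concentration_per_ml(cell_count, width_um, height_um, depth_um):
    # Cells per milliliter: count / (field area x cell depth), with
    # 1 mL = 1 cm^3 = 1e12 cubic micrometers.
    volume_um3 = width_um * height_um * depth_um
    return cell_count * 1e12 / volume_um3

# Example: 393 cells counted in a 1000 um x 1000 um field, 100 um deep.
c = concentration_per_ml(393, 1000.0, 1000.0, 100.0)
print(f"{c:.3g} cells/mL")  # 3.93e+06 cells/mL
```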
In addition, the processor 1503 may determine the value of the major axis and the value of the minor axis of the sample from the digital image. For example,
According to the embodiment of the present disclosure, a microscopic analysis system is also provided, including the microscopic device, the image processing device, the mobile device, etc. It should be understood that the image processing device of the present disclosure may be a single server or multiple servers, or a cloud server. As shown in
Therefore, in the microscopic analysis system of the present disclosure, the microscopic device may include: an optical imaging device configured to photograph a sample on a sample plate to generate an optical image of the sample, wherein the optical image may include a scaling pattern for determining the magnification of the microscopic device; an image sensor configured to generate a digital image based on the optical image; a transmission device configured to transmit the digital image; and a receiving device configured to receive information from the cloud server and the mobile device.
The mobile device may be configured to receive the digital image from the cloud server and the microscopic device; transmit a photographing parameter to the cloud server and the microscopic device; and transmit the digital image from the microscopic device to the cloud server.
The cloud server may be configured to receive the digital image from the microscopic device and the mobile device; receive the photographing parameter from the mobile device and forward the photographing parameter to the microscopic device; transmit the digital image from the microscopic device to the mobile device; and analyze the digital image and transmit an analysis result to the mobile device.
In addition, in some embodiments according to the present disclosure, the following technical schemes may also be adopted:
1. A microscopic device, comprising:
an optical imaging device configured to generate an optical image of a sample, wherein the optical image includes a scaling pattern for determining a magnification of the microscopic device;
an image sensor configured to generate a digital image based on the optical image; and
a transmission device configured to transmit the digital image.
2. The microscopic device of 1, wherein the optical imaging device is further configured to photograph the sample on a sample plate to generate the optical image, the sample plate including the scaling pattern.
3. The microscopic device of 1, further comprising:
an inputting device, configured to input pattern information related to the scaling pattern.
4. The microscopic device of 3, wherein the transmission device is further configured to transmit the pattern information.
5. The microscopic device of 4, wherein the pattern information includes at least one of:
a size of the scaling pattern;
an identifier of the scaling pattern; or
a direction of the scaling pattern.
6. The microscopic device of 3, wherein the inputting device is further configured to input first sample information related to the sample.
7. The microscopic device of 6, wherein the first sample information includes a type of the sample.
8. An image processing device, comprising:
a receiving device configured to receive a digital image generated by a microscopic device, wherein the digital image includes a scaling pattern;
a storage device configured to store the digital image; and
a processor configured to determine a magnification of the microscopic device based on the scaling pattern.
9. The image processing device of 8, wherein the microscopic device photographs a sample on a sample plate to generate the digital image, the sample plate including the scaling pattern.
10. The image processing device of 8, wherein the receiving device is further configured to receive pattern information related to the scaling pattern.
11. The image processing device of 10, wherein the processor is further configured to determine the magnification of the microscopic device based on the pattern information.
12. The image processing device of 10, wherein the processor is further configured to determine a direction of the sample plate where the sample is located based on the pattern information.
13. The image processing device of 10, wherein the processor is further configured to classify the digital image based on the pattern information.
14. The image processing device of 10, wherein the processor is further configured to generate a first image based on the pattern information, and integrate the first image into the digital image.
15. The image processing device of 10, wherein the receiving device is further configured to receive first sample information related to the sample.
16. The image processing device of 15, wherein the processor is further configured to classify the digital image based on the first sample information.
17. The image processing device of 15, wherein the processor is further configured to generate a second image based on the first sample information, and integrate the second image into the digital image.
18. The image processing device of 13 or 16, wherein
the processor is further configured to find, based on a type of the digital image, other images in the storage device that are of the same type as the digital image, wherein the image processing device further comprises a transmission device configured to transmit the other images of the same type as the digital image.
19. The image processing device of 8, wherein the processor is further configured to analyze the digital image based on the scaling pattern to obtain second sample information related to the sample in the digital image.
20. The image processing device of 19, wherein the processor is further configured to classify the digital image based on the second sample information.
21. The image processing device of 20, wherein the processor is further configured to find, based on a type of the digital image, other images in the storage device that are of the same type as the digital image,
the image processing device further comprising:
a transmission device configured to transmit the other images of the same type as the digital image.
22. The image processing device of 19, wherein the second sample information includes at least one of:
a diameter of the sample;
a value of a major axis and a value of a minor axis of the sample;
a size of a visual field of photographing; or
a concentration of the sample.
23. A method for determining a magnification of a microscopic device, comprising:
obtaining an image captured by the microscopic device, wherein the image includes a scaling pattern on a sample plate;
determining a size of an image of the scaling pattern generated on an image sensor; and
determining the magnification of the microscopic device based on a size of the scaling pattern on the sample plate and the size of the image of the scaling pattern generated on the image sensor.
24. The method of 23, wherein the size of the image of the scaling pattern generated on the image sensor is determined based on spacing between adjacent pixels of the image sensor of the microscopic device.
25. The method of 23, wherein the scaling pattern includes a plurality of line segments, the method further comprising:
determining a plurality of values of magnification of the microscopic device based on a length of each of the plurality of line segments; and
designating an average value of the plurality of values of magnification as the magnification of the microscopic device.
26. The method of 23, wherein the scaling pattern includes a plurality of line segments, the method further comprising:
determining the magnification of the microscopic device based on a sum of lengths of the plurality of line segments.
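The computation recited in the method claims 23-26 can be sketched as follows. This is a minimal illustration only, not a description of any particular embodiment: the function names, the pixel-pitch value, and the representation of a line segment as a (known length, pixel count) pair are all assumptions introduced for the example.

```python
# Illustrative sketch of claims 23-26: magnification from a scaling pattern.
# All names and values are hypothetical, not taken from the disclosure.

def image_size_on_sensor(num_pixels: int, pixel_pitch_um: float) -> float:
    """Claim 24: the size of the scaling pattern's image on the image sensor,
    determined from the spacing between adjacent pixels (pixel pitch)."""
    return num_pixels * pixel_pitch_um

def magnification(pattern_size_um: float, num_pixels: int,
                  pixel_pitch_um: float) -> float:
    """Claim 23: magnification = size of the image on the sensor divided by
    the known size of the scaling pattern on the sample plate."""
    return image_size_on_sensor(num_pixels, pixel_pitch_um) / pattern_size_um

def magnification_average(segments, pixel_pitch_um: float) -> float:
    """Claim 25: one magnification value per line segment of the pattern,
    then the average of those values.
    `segments` is a list of (known_length_um, pixels_spanned) pairs."""
    values = [magnification(length_um, px, pixel_pitch_um)
              for length_um, px in segments]
    return sum(values) / len(values)

def magnification_from_sum(segments, pixel_pitch_um: float) -> float:
    """Claim 26: magnification from the sum of the segment lengths,
    i.e. total imaged length over total actual length."""
    total_px = sum(px for _, px in segments)
    total_um = sum(length_um for length_um, _ in segments)
    return magnification(total_um, total_px, pixel_pitch_um)
```

For example, with a (hypothetical) pixel pitch of 3.45 µm and two 100 µm segments spanning 290 and 310 pixels, the per-segment values are 10.005 and 10.695, and both the averaging approach of claim 25 and the summing approach of claim 26 yield 10.35.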
27. A microscopic analysis system, comprising:
the microscopic device of any one of 1-7; and
the image processing device of any one of 8-22.
28. The microscopic analysis system of 27, further comprising:
a mobile device, configured to receive and display the digital image from the microscopic device or the image processing device.
29. The microscopic analysis system of 28, wherein the mobile device is further configured to receive second sample information.
30. The microscopic analysis system of 28, wherein the mobile device is further configured to transmit a photographing parameter to the microscopic device; and the microscopic device performs a photographing operation based on the photographing parameter.
31. A microscopic analysis system, comprising:
a cloud server,
a microscopic device; and
a mobile device,
wherein the microscopic device includes:
an optical imaging device configured to photograph a sample on a sample plate to generate an optical image of the sample, wherein the optical image includes a scaling pattern for determining a magnification of the microscopic device;
an image sensor configured to generate a digital image based on the optical image;
a transmission device configured to transmit the digital image; and
a receiving device configured to receive information from the cloud server and the mobile device,
wherein the mobile device is configured to:
receive the digital image from the cloud server and the microscopic device;
transmit a photographing parameter to the cloud server and the microscopic device; and
transmit the digital image from the microscopic device to the cloud server,
wherein the cloud server is configured to:
receive the digital image from the microscopic device and the mobile device;
receive the photographing parameter from the mobile device and forward the photographing parameter to the microscopic device;
transmit the digital image from the microscopic device to the mobile device; and
analyze the digital image and transmit an analysis result to the mobile device.
32. The microscopic analysis system of 31, wherein the cloud server is further configured to transmit an update of an application to the microscopic device and the mobile device.
33. The microscopic analysis system of 31, wherein the sample plate includes the scaling pattern for determining the magnification.
The terms “front”, “back”, “top”, “bottom”, “over”, and “under” in the present disclosure and claims, if present, are used for descriptive purposes and do not necessarily describe invariant relative positions. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments of the disclosure described herein are, for example, capable of operating in orientations other than those shown or otherwise described herein.
As used herein, the term “exemplary” means “serving as an example, instance, or illustration” rather than as a “model” to be exactly reproduced. Any implementation illustratively described herein is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, the present disclosure is not to be bound by any expressed or implied theory presented in the preceding technical field, background, summary, or detailed description.
As used herein, the term “substantially” is meant to encompass any minor variation due to design or manufacturing imperfections, tolerances of devices or elements, environmental influences, and/or other factors. The term “substantially” also allows for differences from a perfect or ideal situation due to parasitic effects, noise, and other practical considerations that may exist in an actual implementation.
The above description may indicate elements or nodes or features that are “connected” or “coupled” together. As used herein, unless expressly stated otherwise, “connected” means that one element/node/feature is electrically, mechanically, logically, or otherwise directly connected to (or in direct communication with) another element/node/feature. Similarly, unless expressly stated otherwise, “coupled” means that one element/node/feature may be mechanically, electrically, logically, or otherwise linked with another element/node/feature, directly or indirectly, such that interaction is allowed, even though the two features may not be directly connected. That is, “coupled” is intended to encompass both direct and indirect coupling of elements or other features, including connections that utilize one or more intervening elements.
In addition, certain terms may also be used in the following description for reference purposes only, and are thus not intended to be limiting. For example, the terms “first,” “second,” and other such numerical terms referring to structures or elements do not imply a sequence or order unless the context clearly dictates otherwise.
It should also be understood that the word “including/comprising” is used herein to indicate the presence of the indicated features, integers, steps, operations, units, and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, units and/or components and/or combinations thereof.
In the present disclosure, the term “providing” is used in a broad sense to encompass all manners of obtaining an object, thus “providing something” includes, but is not limited to, “purchasing,” “preparing/manufacturing,” “arranging/setting,” “installing/assembling,” and/or “ordering” the objects, etc.
Those skilled in the art should appreciate that the boundaries between the operations described above are merely illustrative. Multiple operations may be combined into a single operation, a single operation may be distributed among additional operations, and operations may be performed at least partially overlapping in time. Furthermore, alternative embodiments may include multiple instances of a particular operation, and the order of operations may be changed in other various embodiments. Other modifications, changes, and substitutions are equally possible. Accordingly, the present disclosure and drawings are to be regarded in an illustrative rather than a restrictive sense.
Although some specific embodiments of the present disclosure have been described in detail by way of examples, those skilled in the art should understand that the above examples are for illustration only, and not for the purpose of limiting the scope of the present disclosure. The various embodiments disclosed herein may be combined in any combination without departing from the spirit and scope of the present disclosure. It will also be understood by those skilled in the art that various modifications may be made to the embodiments without departing from the scope and spirit of the present disclosure. The scope of the present disclosure is defined by the claims.
Number | Date | Country | Kind
---|---|---|---
201911270215.4 | Dec 2019 | CN | national
This application is a continuation of International Application No. PCT/CN2020/134569, filed on Dec. 8, 2020, which designates the United States of America and claims priority to Chinese Patent Application No. CN201911270215.4, filed on Dec. 11, 2019, the contents of each of which are incorporated herein by reference in their entirety.
 | Number | Date | Country
---|---|---|---
Parent | PCT/CN2020/134569 | Dec 2020 | US
Child | 17806481 | | US