The present disclosure relates to robotic microinjection of eggs and embryos such as insect eggs and embryos.
Malaria continues to impose hardship on much of the world, causing illness and death. As is well known, malaria is transmitted by mosquitoes, and regions with climates conducive to producing large mosquito populations encounter higher rates of malaria.
A variety of measures are being undertaken to alleviate the disease, ranging from efforts to reduce standing water and the use of mosquito nets to larvicides and insecticides. Malaria vaccines are even being developed. Use of transgenic mosquitoes offers a potentially promising approach to vector control. Transgenic mosquitoes are mosquitoes that have been genetically modified, for example, by injecting a particularly selected gene into mosquito eggs using a pipette. Transgenic mosquitoes have been produced, for example, that cause female offspring of the transgenic mosquitoes to die before reaching full maturity. Such transgenic mosquitoes have been released in locations in Brazil, the Cayman Islands, Panama, India, and the United States. Transgenic mosquito solutions are directed to specific target species of mosquitoes. Research continues in developing transgenic mosquitoes for mosquito vector control. Such solutions may help reduce the incidence of malaria as well as other mosquito-borne diseases such as dengue fever, West Nile virus, Zika, and yellow fever. Techniques for improving and expediting research endeavors in genetically modified mosquitoes may thus be beneficial.
The present disclosure relates generally to apparatus and methods for more efficiently micro-positioning a micropipette with respect to an egg and/or injecting fluid into a plurality of eggs using a micropipette. Various approaches described herein comprise use of automated and/or robotic systems to direct a micropipette to, and inject the micropipette into, eggs mounted on a sample holder. For example, various designs disclosed herein comprise a micro-positioning system configured to insert a pipette into a plurality of eggs, wherein the micro-positioning system comprises:
Some designs disclosed herein comprise a micro-positioning system configured to insert a pipette into a plurality of elongate eggs, the elongate eggs having ends separated by a length and having sides separated by a width, the length longer than the width, wherein the micro-positioning system comprises:
Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Neither this summary nor the following detailed description purports to define or limit the scope of the inventive subject matter.
One method of microinjection into eggs 10 such as mosquito eggs is to line the eggs up in a row 12 along the edge 14 of a microscope slide 16 as illustrated in
Other configurations are possible. For example, the eggs 10 need not be on a microscope slide 16. Instead of the microscope slide 16, a cover slip could be used. Moreover, in various implementations, the eggs 10 are supported by a platform 18 other than a microscope slide 16 or cover slip. Also, although two translation stages 24, 26 are shown, more translation stages may be included. Additionally, although the translation stages 24, 26 are shown mounted to the pipette holder 22 and pipette 20, one or more of the translation stages may be mounted to the sample platform 18. Furthermore, both the sample platform 18 and the pipette holder 22 may be mounted to translation stages, and the translation stage(s) mounted to the platform may move along the same or different directions as the translation stage(s) mounted to and configured to move the pipette holder 22 and pipette 20.
As shown in
In the example shown in
In various implementations, a camera 40 such as an overhead camera (shown in
In the example shown in
In various implementations, however, the third translation stage 32 comprises an electronically controllable translation stage. This third translation stage 32 may be configured to receive electrical signals from electronics such as a processor that controls the movement of the translation stage. As described herein, in some implementations, artificial intelligence may be employed to control the translation stage 32. For example, object detection and/or object segmentation may be applied to images of the eggs 10 to determine the location of the eggs and/or of an end of an egg, and the translation stages 24, 26 may be moved so as to position the pipette tip 28 with respect to the egg. The translation stages 24, 26 may, for example, move the pipette 20 and the pipette tip 28 so as to inject the tip 28 of the pipette 20 into the egg 10, for example, into the end of the egg. A minimal sketch of this positioning step follows.
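By way of a hedged illustration only, the following minimal Python sketch shows how detected pixel coordinates of an egg end and of the pipette tip 28 might be converted into lateral and longitudinal stage moves. The Stage class, its move_um method, and the microns-per-pixel calibration constant are hypothetical stand-ins, not elements of the disclosure.

```python
# Illustrative sketch only: converting detected pixel coordinates into
# stage moves. The Stage class, its move_um method, and the camera
# calibration constant are assumptions, not part of the disclosed system.

UM_PER_PIXEL = 1.5  # assumed calibration: microns of travel per image pixel

class Stage:
    """Stand-in for an electronically controllable translation stage."""
    def __init__(self, name: str):
        self.name = name
        self.position_um = 0.0

    def move_um(self, delta_um: float) -> None:
        # A real driver would command a stepper or servo motor here.
        self.position_um += delta_um

def align_pipette_to_egg(egg_end_px, pipette_tip_px,
                         lateral: Stage, longitudinal: Stage) -> None:
    """Move the stages so the pipette tip lands on the detected egg end."""
    dx_px = egg_end_px[0] - pipette_tip_px[0]
    dy_px = egg_end_px[1] - pipette_tip_px[1]
    lateral.move_um(dx_px * UM_PER_PIXEL)
    longitudinal.move_um(dy_px * UM_PER_PIXEL)

# Example usage with coordinates that object detection might return:
lateral, longitudinal = Stage("lateral"), Stage("longitudinal")
align_pipette_to_egg((412, 310), (388, 305), lateral, longitudinal)
```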
The motor 34 is shown in
The pipette 20 is also shown in
As discussed above, additional translation stages may be employed and may be configured to move the pipette 20 and/or the eggs 10. As illustrated in
The rotation stage 50 may comprise an electronically controlled rotation stage that may be driven by a processor. As illustrated in
As discussed above, electronics provide signals to the one or more translation stages 24, 26, 32 to move the translation stages such that the pipette 20 is injected into the egg 10. These electronics may comprise one or more microprocessors 60 such as shown in
In some implementations, any one of the drivers 61, 62, 63 for the translation stages may be integrated in the respective translation stage 24, 26, 32. Similarly, the driver 66 for the rotation stage 50 may be integrated in the rotation stage. Memory as well as a storage device may be in communication with the processor 60 and may store one or more software modules. One or more displays, such as a top view display 70 and/or a side view display 68, or a single display that displays both the top view and the side view, may be included. Such top and side views may be provided by the overhead camera 40 and the side camera 41, respectively. A sketch of how a processor might command such a driver follows.
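The disclosure does not prescribe a particular driver interface, so the following is only a minimal sketch, under stated assumptions, of a processor pulsing a step/direction stepper driver for one translation stage. The pin numbers, the steps-per-micron calibration, and the write_pin stub are hypothetical; real hardware I/O (GPIO or a motion-controller board) would replace the stub.

```python
# Hedged sketch of a processor pulsing a step/direction stepper driver
# for one translation stage. Pin numbers, the steps-per-micron value,
# and write_pin are assumptions; real hardware I/O would replace them.
import time

STEP_PIN, DIR_PIN = 17, 27   # assumed pin assignments
STEPS_PER_UM = 0.8           # assumed leadscrew/microstep calibration

def write_pin(pin: int, level: bool) -> None:
    """Stub for a hardware pin write; replace with real GPIO output."""
    pass

def move_stage_um(delta_um: float, pulse_s: float = 0.0005) -> None:
    """Translate the stage by delta_um microns via step pulses."""
    write_pin(DIR_PIN, delta_um >= 0)             # set travel direction
    for _ in range(round(abs(delta_um) * STEPS_PER_UM)):
        write_pin(STEP_PIN, True)                 # rising edge = one step
        time.sleep(pulse_s)
        write_pin(STEP_PIN, False)
        time.sleep(pulse_s)
```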
As discussed herein, images of the eggs and pipette can be captured by the camera(s) 40, 41 and processed by the electronics, e.g., the one or more processors 60. For example, computer vision and/or machine learning may be employed to detect eggs 10 in the images. Features of the detected egg in the image, such as the tip, center, or side, may be identified. Edges of the egg may be detected. Bounding boxes and/or masks may be created based on the edges of the eggs. Data associated with the detected egg, such as the location of the egg, the locations of features on the egg (e.g., the tip, center, centroid, or any of these), the perimeter, the location of bounding box points or masks tracing the outline of the egg, or any combination of these, may be determined and/or used.
Similarly, in various implementations, computer vision and/or machine learning may be employed to identify the pipette 20 in the images. Features of the detected pipette 20 in the image, such as the tip, center, or side, may be identified. Edges of the pipette 20 may be detected. Bounding boxes and/or masks may be created based on the edges of the pipette 20. Data associated with the detected pipette 20, such as the location of the pipette, the locations of features on the pipette (e.g., the tip of the pipette, the center, centroid, or any of these), the perimeter of the pipette, the location of bounding box points or masks tracing the outline of the pipette, or any combination of these, may be determined and/or used. A sketch of extracting such features appears below.
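As a minimal sketch of such feature extraction, assuming OpenCV is available and that a simple Otsu threshold suffices to separate objects from background (a trained segmentation model could supply the binary mask instead), the following derives a centroid, bounding box, and the two ends of an elongate object such as an egg or pipette:

```python
# Sketch, assuming OpenCV: derive centroid, bounding box, and the two
# ends of an elongate object (egg or pipette) from a thresholded image.
import cv2
import numpy as np

def detect_features(gray):
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] == 0:
            continue
        centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
        bbox = cv2.boundingRect(c)  # (x, y, w, h)
        # Ends of an elongate object lie along the major axis of the
        # fitted rectangle (angle conventions vary across OpenCV versions).
        (cx, cy), (w, h), angle = cv2.minAreaRect(c)
        theta = np.deg2rad(angle if w >= h else angle + 90.0)
        half = max(w, h) / 2.0
        ends = ((cx + half * np.cos(theta), cy + half * np.sin(theta)),
                (cx - half * np.cos(theta), cy - half * np.sin(theta)))
        results.append({"centroid": centroid, "bbox": bbox, "ends": ends})
    return results
```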
The process shown in
While the term pipette is employed herein, this pipette 20 may comprise a micropipette. This micropipette 20 may, for example, be 0.1 mm, 0.2 mm, 0.5 mm, 1 mm, 2 mm, or 3 mm wide at its widest part, or have a width in any range formed by any of these values, or larger or smaller. The tip 28 of the pipette or micropipette 20 may be much smaller, for example, 1 micron, 5 microns, 10 microns, 20 microns, 50 microns, 100 microns, 120 microns, 150 microns, 200 microns, 250 microns, 300 microns, 400 microns, 500 microns, or in any range formed by any of these values, or larger or smaller.
The eggs 10 may be small. The eggs 10 may have a largest dimension, e.g., a length, of, for example, 1 micron, 5 microns, 10 microns, 20 microns, 50 microns, 100 microns, 120 microns, 150 microns, 200 microns, 250 microns, 300 microns, 400 microns, 500 microns, 600 microns, 700 microns, 800 microns, 900 microns, 1000 microns, 1200 microns, 1300 microns, 1400 microns, 1500 microns, 1600 microns, 1700 microns, 1800 microns, 1900 microns, 2 mm, 3 mm, 4 mm, 5 mm, or in any range formed by any of these values, or larger or smaller. The eggs 10 may have a smaller dimension, e.g., a width, of, for example, 1 micron, 5 microns, 10 microns, 20 microns, 50 microns, 100 microns, 120 microns, 150 microns, 200 microns, 250 microns, 300 microns, 400 microns, 500 microns, 600 microns, 700 microns, 800 microns, 900 microns, 1000 microns, 1200 microns, 1300 microns, 1400 microns, 1500 microns, 1600 microns, 1700 microns, 1800 microns, 1900 microns, 2 mm, 3 mm, 4 mm, 5 mm, or in any range formed by any of these values, or larger or smaller.
In various implementations, the eggs 10 are not spheres. In various cases, the eggs 10 are elongate, having a length different from their width. The eggs may, for example, have an aspect ratio of length to width of 1.1, 1.2, 1.3, 1.4, 1.5, 2, 2.5, 3, 3.5, 4, 4.5, 5, 6, 7, 8, or any range formed by any of these values, or larger or smaller. The eggs may also have complex shapes other than simple geometric shapes and/or may be irregularly shaped. A check for such elongation is sketched below.
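A hedged sketch of classifying a detected contour as elongate by the aspect ratios discussed above, using OpenCV's fitted minimum-area rectangle; the cutoff is an assumed parameter, not a value taken from the disclosure:

```python
# Hedged sketch: classifying a detected contour as elongate using the
# aspect ratios discussed above; the cutoff is an assumed parameter.
import cv2

def is_elongate(contour, min_aspect: float = 1.5) -> bool:
    (_, _), (w, h), _ = cv2.minAreaRect(contour)
    length, width = max(w, h), max(min(w, h), 1e-6)  # avoid divide-by-zero
    return (length / width) >= min_aspect
```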
In various cases, the eggs comprise mosquito eggs but need not be so limited. The eggs may comprise other types of insect eggs. The eggs, however, need not be limited to insect eggs. The micro-positioning systems and microinjection systems also need not be limited to eggs but can be used for other biological objects, for example, objects having such small sizes and/or such shapes. However, the micro-positioning systems and microinjection systems need not be limited to such biological objects as these.
While the pipette 20 is shown as being configured to penetrate eggs 10 on the top of the shaft 30, in different designs, the pipette can be configured to penetrate eggs on the bottom of the shaft or anywhere else on the shaft, for example, on a side of the shaft. One or more of the translation stages 24, 26, 32 may need to be oriented differently, for example, if the pipette penetrates eggs on the side of the shaft 30. Likewise, the translation stages 24, 26, 32 need not be oriented as they are shown in
Additionally, the pipette holder may comprise different configurations. For example, the pipette holder may comprise a rod or other elongate member, and the pipette may be mounted on the rod (e.g., on the top, bottom, or side of the rod or other elongate member). The pipette may be mounted in an opening or hole in the holder, such as an opening or hole in the face of the holder.
In some implementations, the pipette may be mounted in a groove on the holder, such as on the top, bottom, or side of the holder.
In some designs, the holder is configured to support a plurality of pipettes. Moreover, in various implementations, the holder or a portion of the holder to which the pipettes are secured is configured to move, e.g., rotate, such that the pipette that was previously used to penetrate one or more eggs can be switched out for another pipette. In some designs, for example, the pipette holder is configured to rotate an array of pipettes such that the pipette that is used to penetrate the egg is switched out. A configuration similar to a revolver may be used. In some implementations, for example, the pipette below the holder is used to penetrate the egg. The holder can be configured to rotate an array of pipettes arranged in a ring so that another pipette is moved to the location below the holder to penetrate the egg. Such a configuration may be useful to address potential clogging of pipettes after penetration into the egg(s).
As discussed above, the microinjection system may be configured to detect objects such as eggs and pipettes. The detection may be accomplished using a variety of techniques, including sensors such as cameras as discussed herein.
In some embodiments, objects present in images captured by the one or more cameras may be detected using computer vision techniques. For example, as disclosed herein, the system's microscope camera 40 (or side view camera 41) may be configured to image the eggs and the pipette, and the microinjection system may be configured to perform image analysis on the images to determine the presence of objects such as eggs and the pipette in the images. The microinjection system may analyze the images acquired by the microscope camera to perform scene reconstruction, event detection, video tracking, object recognition, object segmentation, object pose estimation, learning, indexing, motion estimation, or image restoration, etc. As other examples, the micro-positioning and/or microinjection system may be configured to perform object recognition and/or segmentation to determine the presence and location of eggs and/or one or more pipettes in the camera's field of view. One or more computer vision algorithms may be used to perform these tasks. Non-limiting examples of computer vision algorithms include: scale-invariant feature transform (SIFT), speeded up robust features (SURF), oriented FAST and rotated BRIEF (ORB), binary robust invariant scalable keypoints (BRISK), fast retina keypoint (FREAK), the Viola-Jones algorithm, the Eigenfaces approach, the Lucas-Kanade algorithm, the Horn-Schunck algorithm, the mean-shift algorithm, visual simultaneous localization and mapping (vSLAM) techniques, a sequential Bayesian estimator (e.g., Kalman filter, extended Kalman filter, etc.), bundle adjustment, adaptive thresholding (and other thresholding techniques), Iterative Closest Point (ICP), Semi-Global Matching (SGM), Semi-Global Block Matching (SGBM), feature point histograms, various machine learning algorithms (such as, e.g., support vector machines, the k-nearest neighbors algorithm, Naive Bayes, neural networks (including convolutional or deep neural networks), or other supervised/unsupervised models, etc.), and so forth.
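As a sketch of one listed technique, adaptive thresholding, applied to a grayscale microscope frame; the blur kernel, block size, and offset are assumed values that would be tuned for the actual optics and illumination:

```python
# Sketch of one listed technique, adaptive thresholding, applied to a
# grayscale microscope frame; the parameters are assumed values.
import cv2

def segment_frame(gray_frame):
    blurred = cv2.GaussianBlur(gray_frame, (5, 5), 0)   # suppress noise
    # Dark objects on a bright background -> inverted binary output;
    # block size 31 and offset 5 are assumptions to be tuned.
    return cv2.adaptiveThreshold(blurred, 255,
                                 cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                 cv2.THRESH_BINARY_INV, 31, 5)
```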
A variety of machine learning algorithms may be used to learn to identify the presence of objects in the image. Once trained, the machine learning algorithms may be stored by the system. Some examples of machine learning algorithms may include supervised or unsupervised machine learning algorithms, including regression algorithms (such as, for example, Ordinary Least Squares Regression), instance-based algorithms (such as, for example, Learning Vector Quantization), decision tree algorithms (such as, for example, classification and regression trees), Bayesian algorithms (such as, for example, Naive Bayes), clustering algorithms (such as, for example, k-means clustering), association rule learning algorithms (such as, for example, Apriori algorithms), artificial neural network algorithms (such as, for example, Perceptron), deep learning algorithms (such as, for example, Deep Boltzmann Machine, or deep neural network), dimensionality reduction algorithms (such as, for example, Principal Component Analysis), ensemble algorithms (such as, for example, Stacked Generalization), and/or other machine learning algorithms. In some embodiments, individual models may be customized for individual data sets. For example, the system may generate or store a base model. The base model may be used as a starting point to generate additional models specific to a data type (e.g., a particular user), a data set (e.g., a set of additional images obtained), conditional situations, or other variations. In some embodiments, the system may be configured to utilize a plurality of techniques to generate models for analysis of the aggregated data. Other techniques may include using pre-defined thresholds or data values.
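As a minimal sketch of the supervised route, the following trains a support vector machine, one of the algorithms listed above, to classify fixed-size image patches as egg versus background; the patch size, the raw-intensity features, and the training data are assumptions for illustration:

```python
# Minimal sketch: a support vector machine (one of the listed algorithms)
# classifying image patches as egg vs. background. Patch size,
# normalization, and training data are assumptions.
import numpy as np
from sklearn.svm import SVC

PATCH = 32  # assumed patch edge length in pixels

def train_patch_classifier(patches, labels):
    """patches: (n, PATCH, PATCH) array; labels: 1 = egg, 0 = other."""
    X = np.asarray(patches, np.float32).reshape(len(patches), -1) / 255.0
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(X, labels)
    return clf

def egg_score(clf, patch):
    """Return the model's likelihood that the patch contains an egg."""
    x = np.asarray(patch, np.float32).reshape(1, -1) / 255.0
    return clf.predict_proba(x)[0, 1]
```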
The criteria for detecting an object may include one or more threshold conditions. If the analysis of the data acquired by the camera(s) indicates that a threshold condition is passed, the system may provide a signal indicating detection of the presence of the object in the image. The threshold condition may involve a quantitative and/or qualitative measure. For example, the threshold condition may include a score or a percentage associated with the likelihood of the object being present. The system may compare the score with a threshold score. If the score is higher than the threshold, the system may detect the presence of the object. In some other embodiments, the system may signal the presence of the object if the score is lower than the threshold.
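A sketch of such a threshold condition, covering both the higher-than and lower-than cases described above; the cutoff is an assumed value that would be calibrated against labeled images:

```python
# Sketch of the threshold condition described above; the cutoff is an
# assumed value to be calibrated against labeled images.
DETECTION_THRESHOLD = 0.8  # assumed score cutoff

def object_detected(score: float,
                    threshold: float = DETECTION_THRESHOLD,
                    higher_means_present: bool = True) -> bool:
    """Signal detection when the score passes the threshold condition."""
    return score >= threshold if higher_means_present else score <= threshold
```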
In some embodiments, the threshold conditions, the machine learning algorithms, or the computer vision algorithms may be specialized for a specific context.
Some additional nonlimiting examples of embodiments discussed above are provided below. These should not be read as limiting the breadth of the disclosure in any way.
Example 1: A micro-positioning system configured to insert a pipette into a plurality of eggs, said micro-positioning system comprising:
Example 2: The microinjection system of Example 1, wherein said pipette holder comprises a hole or groove configured to receive said pipette.
Example 3: The microinjection system of Example 1 or 2, wherein said pipette is configured to be supported on said electronically controlled lateral translation stage such that said pipette is moved laterally.
Example 4: The microinjection system of any of the examples above, wherein said pipette is configured to be supported on said electronically controlled longitudinal translation stage such that said pipette is moved longitudinally.
Example 5: The microinjection system of any of the examples above, wherein said pipette is configured to be supported on said vertical translation stage such that said pipette is moved vertically.
Example 6: The microinjection system of any of the examples above, wherein said shaft is configured to be supported on said electronically controlled lateral translation stage such that said shaft is moved laterally.
Example 7: The microinjection system of any of the examples above, wherein said shaft is configured to be supported on said electronically controlled longitudinal translation stage such that said shaft is moved longitudinally.
Example 8: The microinjection system of any of the examples above, wherein said shaft is configured to be supported on said vertical translation stage such that said shaft is moved vertically.
Example 9: The microinjection system of any of the examples above, wherein said vertical translation stage comprises an electronically controlled vertical translation stage configured to translate in the vertical direction so as to move said pipette holder vertically with respect to said shaft.
Example 10: The microinjection system of any of the examples above, wherein said motor comprises a stepper motor or a servo motor.
Example 11: The microinjection system of any of the examples above, further comprising at least one camera including microscope optics configured to image said pipette and said eggs.
Example 12: The microinjection system of Example 11, wherein said at least one camera comprises a camera located vertically above said shaft to view a top of said pipette.
Example 13: The microinjection system of Example 11 or 12, wherein said at least one camera comprises a side camera configured to view a side of said pipette.
Example 14: The microinjection system of any of Examples 11-13, wherein said electronics are configured to receive images from said at least one camera of said eggs and said pipette and to drive at least said lateral and longitudinal translation stages and said motor to position said pipette with respect to said egg such that said pipette penetrates said egg based on said images from said camera.
Example 15: The microinjection system of any of the examples above, further comprising at least one electronically controlled rotation stage configured to rotate said shaft with respect to said pipette holder.
Example 16: The microinjection system of Example 15, wherein said one electronically controlled rotation stage is configured to rotate about an axis orthogonal to the length of the shaft.
Example 17: The microinjection system of Example 15 or 16, wherein said one electronically controlled rotation stage is configured to rotate in a plane parallel to said longitudinal and lateral directions.
Example 18: The microinjection system of any of Examples 15-17, wherein said electronics are configured to control said electronically controlled rotation stage.
Example 19: The microinjection system of any of the examples above, wherein said pipette holder is configured to hold a plurality of pipettes.
Example 20: The microinjection system of Example 19, wherein said pipette holder can move to change the pipette among the plurality of pipettes that is injected into the egg.
Example 21: The microinjection system of Example 19 or 20, wherein said pipette holder can rotate to change the pipette among the plurality of pipettes that is injected into the egg.
Example 22: A micro-positioning system configured to insert a pipette into a plurality of elongate eggs, said elongate eggs having ends and sides, said ends separated by a length and said sides separated by a width, said length longer than said width, said micro-positioning system comprising:
Example 23: The microinjection system of Example 22, wherein said platform comprises a shaft mounted on a motor.
Example 24: The microinjection system of Example 22 or 23, wherein said electronics are configured to drive at least said lateral and longitudinal translation stages to position said pipette with respect to one of said ends of said elongate egg such that said pipette penetrates said egg through said end of said elongate egg based on said images from said camera.
Example 25: The microinjection system of any of Examples 22-24, wherein said electronics are configured to identify said eggs in said images.
Example 26: The microinjection system of any of Examples 22-25, wherein said electronics are configured to identify said pipette in said images.
Example 27: The microinjection system of any of Examples 22-26, wherein said electronics are configured to drive said lateral and longitudinal translation stages by distances such that said pipette penetrates said elongate egg through the end of said egg.
Example 28: The microinjection system of any of Examples 22-26, wherein said electronics are configured to drive said lateral and longitudinal translation stages by distances such that said pipette penetrates said elongate egg through a side of said egg.
Example 29: A method of inserting a pipette into a plurality of elongate eggs, said elongate eggs having ends and sides, said ends separated by a length and said sides separated by a width, said length longer than said width, said method comprising:
Example 30: The method of Example 29, further comprising performing object detection of said eggs and said pipette to determine said signal sent to said lateral and longitudinal translation stages.
Example 31: The method of any of Examples 29 or 30, further comprising performing object segmentation of said eggs and said pipette and moving said pipette with respect to said elongate eggs based on information obtained from said object segmentation.
Example 32: The method of any of Examples 29-31, wherein said eggs are moved to cause said pipette to be injected into said eggs.
Example 33: The method of any of Examples 29-31, wherein said pipette is moved to cause said pipette to be injected into said eggs.
Example 34: A method of inserting a pipette into a plurality of elongate eggs, said elongate eggs having ends and sides, said ends separated by a length and said sides separated by a width, said length longer than said width, said method comprising:
Example 35: The method of Example 34, further comprising performing object segmentation of said eggs and said pipette and moving said pipette with respect to said elongate eggs based on information obtained from said object segmentation.
Example 36: The method of any of Examples 34 or 35, wherein said eggs are moved to cause said pipette to be injected into said eggs.
Example 37: The method of any of Examples 34 or 35, wherein said pipette is moved to cause said pipette to be injected into said eggs.
Each of the processes, methods, and algorithms described herein and/or depicted in the attached figures may be embodied in, and fully or partially automated by, code modules executed by one or more physical computing systems, hardware computer processors, application-specific circuitry, and/or electronic hardware configured to execute specific and particular computer instructions. For example, computing systems can include general purpose computers (e.g., servers) programmed with specific computer instructions or special purpose computers, special purpose circuitry, and so forth. A code module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language. In some implementations, particular operations and methods may be performed by circuitry that is specific to a given function.
Further, certain implementations of the functionality of the present disclosure are sufficiently mathematically, computationally, or technically complex that application-specific hardware or one or more physical computing devices (utilizing appropriate specialized executable instructions) may be necessary to perform the functionality, for example, due to the volume or complexity of the calculations involved or to provide results substantially in real-time. For example, animations or video may include many frames, with each frame having millions of pixels, and specifically programmed computer hardware is necessary to process the video data to provide a desired image processing task or application in a commercially reasonable amount of time.
Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical disc, volatile or non-volatile storage, combinations of the same and/or the like. The methods and modules (or data) may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory, tangible computer storage or may be communicated via a computer-readable transmission medium.
Any processes, blocks, states, steps, or functionalities in flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing code modules, segments, or portions of code which include one or more executable instructions for implementing specific functions (e.g., logical or arithmetical) or steps in the process. The various processes, blocks, states, steps, or functionalities can be combined, rearranged, added to, deleted from, modified, or otherwise changed from the illustrative examples provided herein. In some embodiments, additional or different computing systems or code modules may perform some or all of the functionalities described herein. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate, for example, in serial, in parallel, or in some other manner. Tasks or events may be added to or removed from the disclosed example embodiments. Moreover, the separation of various system components in the implementations described herein is for illustrative purposes and should not be understood as requiring such separation in all implementations. It should be understood that the described program components, methods, and systems can generally be integrated together in a single computer product or packaged into multiple computer products. Many implementation variations are possible.
The processes, methods, and systems may be implemented in a network (or distributed) computing environment. Network environments include enterprise-wide computer networks, intranets, local area networks (LAN), wide area networks (WAN), personal area networks (PAN), cloud computing networks, crowd-sourced computing networks, the Internet, and the World Wide Web. The network may be a wired or a wireless network or any other type of communication network.
The systems and methods of the disclosure each have several innovative aspects, no single one of which is solely responsible or required for the desirable attributes disclosed herein. The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein.
Certain features that are described in this specification in the context of separate implementations also can be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also can be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination. No single feature or group of features is necessary or indispensable to each and every embodiment.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. In addition, the articles “a,” “an,” and “the” as used in this application and the appended claims are to be construed to mean “one or more” or “at least one” unless specified otherwise.
As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: A, B, or C” is intended to cover: A, B, C, A and B, A and C, B and C, and A, B, and C. Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be at least one of X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.
Similarly, while operations may be depicted in the drawings in a particular order, it is to be recognized that such operations need not be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the drawings may schematically depict one more example processes in the form of a flowchart. However, other operations that are not depicted can be incorporated in the example methods and processes that are schematically illustrated. For example, one or more additional operations can be performed before, after, simultaneously, or between any of the illustrated operations. Additionally, the operations may be rearranged or reordered in other implementations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.
This application claims the priority benefit of U.S. Patent Prov. App. 63/314,302, titled ROBOTIC MICROINJECTION, filed Feb. 25, 2022, which is hereby incorporated herein by reference in its entirety.