This invention relates to stent design and, more particularly, to 3D modeling tools, systems, and software for designing stents.
Accurate representations of anatomical structures may allow for customization in medical treatment. For example, inner cavities of anatomical tubular structures, also known as lumens, are present in humans and other organisms. Lumens may be hollow, such as airways, or filled with another substance, such as a blood vessel filled with blood or a bone filled with bone marrow. Changes in an anatomical lumen may require medical intervention. For example, when an airway narrows or closes, a medical professional may insert a stent into the lumen to correct a medical condition. Since anatomical lumens have non-uniform shapes and sizes, a customized medical device (e.g., a stent) to correct a lumen-based medical condition would enhance medical treatment.
In accordance with one embodiment, a method for designing a stent provides a user interface configured to display a 3D lumen model. The method receives a user input from the user interface indicating a selection of a point of the 3D lumen model. The method determines a 2D cursor position on the user interface corresponding to the selection. The method translates the 2D cursor position to a 3D lumen model position. The method determines a center point of the 3D lumen model based on a proximity to the 3D lumen model position. The method determines a diameter for a volume-defining object based on the center point. The method positions a center of the volume-defining object at the center point.
In some embodiments, the method forms a stent surface within the 3D lumen model based on a position of the volume-defining object.
Determining the diameter for the volume-defining object may include determining a diameter of a cross-section of the 3D lumen model through the center point.
Translating the 2D cursor position to a 3D lumen model position may include forming a ray based on a position of a camera view and the 2D cursor position; and determining a point of a lumen surface intersected by the ray.
Determining the diameter for the volume-defining object may include displaying a cross-section of the 3D lumen model. The cross-section may include a representation of the center point, a representation of the shortest and longest diameters of the cross-section, a representation of a cross-section of a stent, and a representation of a diameter of the stent. The cross-section is configured to receive a stent adjustment from a user.
In some embodiments, the method forms a 3D stent model including a stent surface using a position and diameter of the volume-defining object.
In some embodiments, the method translates the 3D stent model into a sliced object; determines an image slice intersecting the 3D stent model; overlays the sliced object onto the image slice; and displays the overlaid image slice.
In accordance with another embodiment, a stent design system has a display configured to output a user interface; a user input device configured to control a 2D cursor position on the user interface; a processing device; and a memory device configured to store a set of instructions. The stent design system receives a user input from the user interface indicating a selection of a point of a 3D lumen model; determines the 2D cursor position on the user interface corresponding to the selection; translates the 2D cursor position to a 3D lumen model position; determines a center point of the 3D lumen model based on a proximity to the 3D lumen model position; determines a diameter for a volume-defining object based on the center point; and positions a center of the volume-defining object at the center point.
Illustrative embodiments of the invention are implemented as a computer program product having a computer usable medium with computer readable program code thereon. The computer readable code may be read and utilized by a computer system in accordance with conventional processes.
Those skilled in the art should more fully appreciate advantages of various embodiments of the invention from the following “Description of Illustrative Embodiments,” discussed with reference to the drawings summarized immediately below.
In illustrative embodiments, a stent design system is configured to generate a 3D model of an anatomical tubular structure including a lumen. The stent design system may use machine learning and/or a recursive center point adjustment process to determine center points or centerlines of lumen cross-sections. The stent design system may provide a user interface for designing a stent or other medical device to be inserted into the lumen. The user interface may display multiple views, such as perspective views of the 3D model, cross-sectional views of the 3D model, or 2D CT scan images with stent design overlays. Using the user interface, the user may place multiple objects, such as a sphere, that correspond to the size (e.g., diameter) and position of a desired stent. The stent design system may automatically position the objects at a center point of the lumen or along centerlines of the lumen cross-sections, as indicated by the user. The stent design system may also automatically size the objects based on the diameter of the lumen at the cross-section where the center point is located. The stent design system may use the sizing and location of the objects to generate a customized stent model. Generating a stent model based on selected locations and diameters may be performed by a number of methods, such as the methods described in International Publication No. 2021/007570 entitled “System And Method For Model-Based Stent Design And Placement.” Details of illustrative embodiments are discussed below.
The stent design system 105 is configured to receive 2D anatomical images from a data structure 103. The 2D anatomical images may be generated by a computed tomography (CT) machine 101, or another imaging system. The stent design system 105 is also configured to receive a user input 107 from a user. The user input 107 may be configured to adjust an anatomical model, a centerline, or a stent model, among other things. The stent design system 105 is configured to output a stent design, such as in the form of a stent design file 109.
The process 200 begins at operation 201 where the stent design system determines a 3D model (or 3D mesh) of a lumen and a cross-sectional guideline. Determining the 3D model may include receiving the 3D model or generating the 3D model based on 2D images, such as CT scan images, among other things. Determining the cross-sectional guideline may include receiving the guideline or generating the guideline based on the 3D model. In some embodiments, the guideline is the centerline derived from the process 600 in
The process 200 proceeds to operation 203 where the stent design system determines a cross-section of the lumen using the cross-sectional guideline. The cross-sectional guideline may be configured to be perpendicular to preferred cross-sections of the modeled lumen down its entire length. Therefore, the stent design system determines a cross-section of the lumen by determining a plane perpendicular to the cross-sectional guideline. In certain embodiments, the stent design system may determine perpendicularity based on an averaged rate of change of the cross-sectional guideline.
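A minimal sketch of estimating such an averaged rate of change of the guideline, assuming the guideline is sampled as a list of 3D points (the window size and the central-difference scheme are illustrative assumptions, not specified above):

```python
import math

def guideline_tangent(guideline, i, window=2):
    # Estimate the direction of the guideline at index i using a
    # central difference averaged over `window` neighboring points.
    a = guideline[max(0, i - window)]
    b = guideline[min(len(guideline) - 1, i + window)]
    d = [bb - aa for aa, bb in zip(a, b)]
    n = math.sqrt(sum(c * c for c in d))
    return [c / n for c in d]

# The cross-section plane through a guideline point is then the plane
# whose normal is the tangent returned above.
```

Averaging over a window, rather than using adjacent points only, damps the jitter introduced by voxel-scale noise in the guideline.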
The process 200 proceeds to operation 205 where the stent design system determines an initial center point. For example, the stent design system may use the cross-sectional guideline and the cross-section. The initial center point may be the point where the plane of the cross-section intersects the cross-sectional guideline. In other embodiments, the user may select or adjust the initial center point before the process 200 proceeds to operation 207.
The process 200 proceeds to operation 207 where the stent design system determines points along the outer surface of the lumen using the initial center point. Determining the lumen outer surface points may include casting rays from the initial center point to the outer surface of the lumen within the cross-section in a circular or radial pattern. The points at which the rays intersect the outer surface of the lumen may be the lumen outer surface points. The number of rays may be selected such that the lumen outer surface points are spaced 1-1.25 mm apart in the largest cross-sections of an airway. The number of rays may also be based on the voxel size of the 3D model. For example, the voxels of the 3D model may fall within a range of about 0.7×0.7×0.5 mm to 1×1×1.5 mm, or may have a voxel size of about 1×1×1.25 mm. The number of rays cast may correspond to a spacing between lumen outer surface points of approximately one voxel of the 3D model. By using points along the outer surface of the lumen rather than casting equal-length rays, the process 200 is able to determine a center point of a wide range of non-circular cross-sections.
The process 200 proceeds to operation 209 where the stent design system determines a new center point using the lumen outer surface points. The new center point may be the centroid of the lumen outer surface points. For example, the stent design system may use the following formula to determine the new center point, where N is the total number of points and Pi is the i-th lumen surface point located at coordinates (xi, yi, zi):

Cnew = (1/N) Σ(i=1 to N) Pi, i.e., (xc, yc, zc) = ((1/N) Σ xi, (1/N) Σ yi, (1/N) Σ zi)
The process 200 then proceeds to operation 211 where the stent design system determines an inter-center point distance between the two most recent center points. At conditional 213, the inter-center point distance is compared to a distance threshold. If the inter-center point distance is greater than the distance threshold, the process 200 returns to operation 207 and repeats operations 207-213 until the inter-center point distance becomes less than the distance threshold. When the inter-center point distance is less than the distance threshold, the process 200 proceeds to operation 215, where the process 200 determines a centerline using the new center point. Among other things, determining a centerline may include repeating operation 203 through conditional 213 until the process determines center points for multiple cross-sections. The determined center points may then be connected to form a centerline.
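The loop of operations 207 through conditional 213 can be sketched as follows. This is a simplified 2D illustration in which `surface_fn` stands in for the ray/surface intersection of operation 207; the function name, ray count, and tolerance are illustrative assumptions:

```python
import math

def refine_center(initial, surface_fn, n_rays=64, tol=1e-4, max_iter=100):
    """Recursively move a center point to the centroid of ray/surface
    intersection points until the inter-center point distance (the
    step between successive center points) falls below tol."""
    cx, cy = initial
    for _ in range(max_iter):
        # Operation 207: cast rays radially and collect surface points.
        pts = [surface_fn((cx, cy), 2 * math.pi * i / n_rays)
               for i in range(n_rays)]
        # Operation 209: new center point is the centroid of the points.
        nx = sum(p[0] for p in pts) / n_rays
        ny = sum(p[1] for p in pts) / n_rays
        # Operations 211/213: compare the inter-center point distance
        # to the distance threshold.
        if math.hypot(nx - cx, ny - cy) < tol:
            return (nx, ny)
        cx, cy = nx, ny
    return (cx, cy)
```

For a circular cross-section entered off-center, each iteration roughly halves the offset from the true center, so the loop converges geometrically.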
It should be appreciated that the process 200 may be used to determine a single center point, or may be repeated at regular intervals along the cross-sectional guideline to generate a centroid-based centerline for the 3D lumen model. Among other things, the process 200 may be repeated every millimeter along the cross-sectional guideline to generate a centroid-based centerline. Instead of every millimeter, the process 200 may be repeated for another sampling distance determined by the spatial resolution of the 2D image upon which the 3D lumen model is based. For example, if the slice thickness of a CT scan is 1.25 mm, then the sampling distance may be less than 1.25 mm to prevent aliasing.
In certain embodiments, the 3D lumen model used by the process 200 may first be filtered or smoothed. For example, the 3D lumen model may include an initial set of raw voxels that make up a corresponding airway. After initially skeletonizing the voxels (using, for example, a marching cubes algorithm), the model may have rough edges due to the relatively low resolution of the CT image scans. To address this problem, the 3D model, or mesh, may be run through a smoothing algorithm.
By doing this smoothing, the process 200 may find a more accurate centerline than if the voxels that make up the cross-section were averaged.
The process 400 begins at operation 401 by determining the training data to be used for training the neural network. The training data may include labeled data. For example, the training data may have representations of lumen cross-sections with center points labeled as being a branching region edge or a non-branching region edge. The cross-sections may be generated using the process 200, the process 600, or a combination thereof, among other things. In some embodiments, the training data may be labeled as being within a branching region or outside of a branching region. The process 400 may determine the training data by labeling data or accessing stored, pre-labeled training data. In some embodiments, the training data is labeled manually by a user analyzing each cross-section representation of the training data.
The process 400 proceeds to operation 403 by training the neural network to determine a branching status based on a provided lumen cross section. Training the neural network may include selecting a number of inputs in the input layer, a number of outputs in the output layer, and a number of hidden layers, as well as a number of nodes in the hidden layers. The output of the neural network is configured to output an indication of whether the cross-section is a branching region edge. The indication may be a classification or a probability, among other things.
The process 400 proceeds to operation 405 by inputting a representation of a cross-section of a lumen with a center point into the neural network. In some embodiments, inputting the representation of the lumen cross-section may include dividing the representation into subsections, such as pixels or voxels, and applying pre-processing filters before providing the representation to the input layer of the neural network.
In some embodiments, the representation of the cross-section of the lumen may include the locations of points along the surface of the lumen, which may be represented by a distance between each point and the center point. The representation may also include a normalized version of the distances. For example, the distances may be normalized on a scale between 0 and 1. In some embodiments, the representation of the cross-section of the lumen may include a section identifier which indicates a section of the lumen model where the cross-section may be found. For example, an airway may be partitioned into 23 generations of branching, extending from the trachea (generation 0) to the last order of the terminal bronchioles. At each generation, the airway divides into two smaller child airway branches. The section identifier would then indicate which generation includes the cross-section.
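The distance normalization mentioned above might look like the following sketch; min-max scaling is one assumed way of mapping the distances onto [0, 1]:

```python
def normalize_distances(distances):
    # Min-max scale surface-point distances into the range [0, 1].
    lo, hi = min(distances), max(distances)
    if hi == lo:
        return [0.0] * len(distances)  # degenerate: all points equidistant
    return [(d - lo) / (hi - lo) for d in distances]
```

Normalizing removes the dependence on absolute lumen size, so cross-sections from large and small airway generations present comparable inputs to the network.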
The process 400 proceeds to operation 407 by outputting a branching status for the lumen cross-section. The branching status may be a probability the cross-section is a branching region edge or a probability the cross-section is within the branching region of the lumen, among other things.
After completing operations 405 and 407 for one cross-section representation, operation 409 includes repeating operations 405 and 407 for a set of cross-section representations of the same lumen in order to determine branching statuses for each cross-section.
Once the neural network has output a set of branching statuses, the process 400 proceeds to operation 411 by determining the branching region edges using the branching statuses. In some embodiments, where the branching statuses include a probability, the operation 411 may include comparing the probabilities to one another or to a threshold to determine which cross-section is the edge of the branching region. For example, the operation 411 may determine the branching region edge by selecting the cross-section with the highest probability determined by the neural network. In another example, the operation 411 may determine the branching region edge by comparing the probabilities and selecting the cross-section with the largest change in probability compared to an adjacent cross-section, in addition to or in place of having a probability above a threshold.
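A sketch of one such selection rule for operation 411, combining the threshold and largest-change criteria described above (the threshold value and the fallback behavior are illustrative assumptions):

```python
def branching_region_edge(probs, threshold=0.5):
    """Select the index of the cross-section most likely to be the
    branching region edge from per-cross-section probabilities."""
    # Candidates above the threshold, scored by the change in
    # probability relative to the preceding cross-section.
    candidates = [i for i in range(1, len(probs)) if probs[i] >= threshold]
    if not candidates:
        # Fallback: the single highest probability output by the network.
        return max(range(len(probs)), key=probs.__getitem__)
    return max(candidates, key=lambda i: abs(probs[i] - probs[i - 1]))
```

Scoring by the change in probability favors the cross-section where the network's output jumps, which is where the lumen transitions into the branching region.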
As shown, the modeled lumen may have a branching structure, such as an airway. Similarly, the centerline also has a branching structure such that the centerline has a tree structure.
The process 600 begins at operation 601 where the stent design system determines training data for generating a neural network. The stent design system may also determine testing or validation data for generating the neural network. In some embodiments, the training data includes 3D lumen models with centerlines determined by the recursive center point adjustment process of
The process 600 proceeds to operation 603, where the stent design system trains a neural network using the training, testing, and/or validation data from operation 601.
With continuing reference to
The process 600 proceeds to operation 605 where the stent design system determines, for one voxel, a voxel data set configured to be input into the neural network. The stent design system may determine the voxel data set by casting rays from one voxel of the 3D model to points on the outer surface of the modeled lumen. In some embodiments, the number of casted rays is at least 42 rays for the voxel. The voxel data set input into the neural network may include one or more of the following items, which may be determined using the casted rays: a centricity value, a mean radius of the casted rays, a minimum radius of the casted rays, position coordinates (x, y, or z coordinate), or ray length values for each casted ray. The voxel data set may also include a voxel density or a number of neighboring voxels. In some embodiments, the stent design system may apply a Gaussian blur filter to the voxels of the 3D model to smooth the 3D model before casting the rays to determine the voxel data set.
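The per-voxel data set of operation 605 might be assembled as follows. This is a sketch; in particular, the definition of centricity as the minimum-to-mean radius ratio is an assumption, since the source does not define the term:

```python
def voxel_data_set(ray_lengths, position):
    """Build the feature set for one voxel from its casted-ray lengths
    and its position coordinates."""
    mean_radius = sum(ray_lengths) / len(ray_lengths)
    min_radius = min(ray_lengths)
    return {
        # Hypothetical centricity measure: 1.0 when the voxel is
        # equidistant from the lumen wall in every sampled direction.
        "centricity": min_radius / mean_radius,
        "mean_radius": mean_radius,
        "min_radius": min_radius,
        "x": position[0], "y": position[1], "z": position[2],
        "ray_lengths": list(ray_lengths),
    }
```

A voxel on the centerline of a roughly tubular lumen sees similar ray lengths in all directions, so a centricity near 1.0 is a useful input signal for the network.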
The process 600 proceeds to operation 607 where the voxel data set is input into the neural network. The process 600 proceeds to operation 609 where the neural network outputs centerline status for the voxel. The centerline status may be a classification of the voxel. For example, the centerline status may indicate whether the voxel is a centerline voxel. The centerline status may also be a probability that the voxel is a centerline voxel.
The process 600 proceeds to operation 611 where operations 605 to 609 are repeated for each voxel of the interior of the lumen. After the neural network outputs a centerline status for each voxel, the voxels indicated by the neural network as being centerline voxels may be grouped together. In another embodiment, voxels may be grouped together based on a probability of the centerline status. For example, voxels with a high probability of being centerline voxels may be grouped together. The process 600 proceeds to operation 613, where operations 605 through 611 are repeated for the group of centerline voxels. Operation 613 may be repeated until the group of centerline voxels forms a centerline with a thickness below a thickness threshold. After operation 613, the final centerline may be post-processed. Among other things, the final centerline may be smoothed or invalid branches of the centerline may be removed.
Input/output device 1004 enables computing device 1000 to communicate with external device 1010. For example, input/output device 1004 in different embodiments may be a network adapter, network credential, interface, or a port (e.g., a USB port, serial port, parallel port, an analog port, a digital port, VGA, DVI, HDMI, FireWire, CAT 5, Ethernet, fiber, or any other type of port or interface), to name but a few examples. Input/output device 1004 may be comprised of hardware, software, or firmware. It is contemplated that input/output device 1004 includes more than one of these adapters, credentials, or ports, such as a first port for receiving data and a second port for transmitting data.
External device 1010 in different embodiments may be any type of device that allows data to be input to or output from computing device 1000. For example, external device 1010 in different embodiments is a mobile device, a reader device, equipment, a handheld computer, a diagnostic tool, a controller, a computer, a server, a printer, a display, an alarm, a visual indicator, a keyboard, a mouse, a user device, a cloud device, a circuit, or a touch screen display. Furthermore, it is contemplated that external device 1010 may be integrated into computing device 1000. It is further contemplated that more than one external device is in communication with computing device 1000.
Processing device 1002 in different embodiments is a programmable type, a dedicated, hardwired state machine, or a combination thereof. Device 1002 may further include multiple processors, Arithmetic-Logic Units (ALUs), Central Processing Units (CPUs), Digital Signal Processors (DSPs), or Field-Programmable Gate Arrays (FPGAs), to name but a few examples. For forms of processing device 1002 with multiple processing units, distributed, pipelined, or parallel processing may be used as appropriate. Processing device 1002 may be dedicated to performance of just the operations described herein or may be utilized in one or more additional applications. In the illustrated form, processing device 1002 is of a programmable variety that executes processes and processes data in accordance with programming instructions (such as software or firmware) stored in memory device 1006. Alternatively or additionally, programming instructions may be at least partially defined by hardwired logic or other hardware. Processing device 1002 may be comprised of one or more components of any type suitable to process the signals received from input/output device 1004 or elsewhere, and provide desired output signals. Such components may include digital circuitry, analog circuitry, or a combination of both.
Memory device 1006 in different embodiments is of one or more types, such as a solid-state variety, electromagnetic variety, optical variety, or a combination of these forms, to name but a few examples. Furthermore, memory device 1006 may be volatile, nonvolatile, transitory, non-transitory or a combination of these types, and some or all of memory device 1006 may be of a portable variety, such as a disk, tape, memory stick, cartridge, to name but a few examples. In addition, memory device 1006 may store data that is manipulated by processing device 1002, such as data representative of signals received from or sent to input/output device 1004 in addition to or in lieu of storing programming instructions, to name but a few examples. As shown in
The process 1100 begins at operation 1101, where the stent design system receives a user input indicating a selection of a point on a 3D lumen model using a cursor. The user may click on the 3D lumen model to indicate the selection, or the user may drag an object into/within the 3D lumen model to indicate a selection. Among other things, the user input may also indicate a centerline adjustment or an object placement.
The process 1100 proceeds to operation 1103 where the stent design system determines 2D coordinates of the 2D cursor position corresponding to the point selection.
The process 1100 proceeds to operation 1105 where the stent design system translates the 2D coordinates of the cursor position to a point on the 3D lumen model. For example, the stent design system may create a ray by determining a 3D position of a camera view and the 2D pixel coordinates of the user's cursor. This forms a ray extending from the camera through the 2D cursor position into the 3D scene.
When finding points inside an airway structure, the ray may intersect with two points of the airway surrounding the lumen: the outside of the airway before the ray enters the lumen and the inside of the airway when the ray exits the lumen. The positions of the two points may be averaged, the result of which is the point in the 3D model selected by the cursor.
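The ray casting and midpoint averaging above can be sketched with a sphere standing in for the airway wall. The sphere is purely illustrative; a real lumen mesh would require a mesh/ray intersection test:

```python
import math

def cursor_lumen_point(camera, cursor_world, center, radius):
    """Cast a ray from the camera through the un-projected cursor
    position, intersect it with a spherical 'airway' wall, and
    average the two hit points."""
    d = [c - a for a, c in zip(camera, cursor_world)]
    n = math.sqrt(sum(c * c for c in d))
    d = [c / n for c in d]                       # unit ray direction
    o = [a - c for a, c in zip(camera, center)]  # camera relative to sphere
    b = sum(oc * dc for oc, dc in zip(o, d))
    c2 = sum(oc * oc for oc in o) - radius * radius
    disc = b * b - c2
    if disc < 0:
        return None                              # ray misses the airway
    t1, t2 = -b - math.sqrt(disc), -b + math.sqrt(disc)
    hit1 = [a + t1 * dc for a, dc in zip(camera, d)]
    hit2 = [a + t2 * dc for a, dc in zip(camera, d)]
    # Average the entry and exit points to select the lumen point.
    return [(h1 + h2) / 2 for h1, h2 in zip(hit1, hit2)]
```

Averaging the entry and exit hits places the selected point inside the lumen rather than on its wall, which is why it pairs naturally with the closest-center-point snapping described below.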
The process 1100 proceeds to operation 1107 where the stent design system updates the 3D lumen model after translating the 2D coordinates of the cursor position to the 3D lumen model point.
Updating the 3D lumen model may include identifying the closest center point of a centerline to the 3D lumen model point. When finding the closest center point to the user's cursor, the stent design system may determine the distance between the ray of operation 1105 and each center point along the centerline. The center point with the smallest distance to the ray may be selected as the closest point on the centerline.
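Selecting the center point with the smallest distance to the ray might look like the following sketch; function and variable names are illustrative, and the ray direction is assumed to be a unit vector:

```python
def closest_center_point(ray_origin, ray_dir, centerline):
    """Return the centerline point with the smallest distance to the
    selection ray (ray_dir assumed normalized)."""
    def dist_to_ray(p):
        v = [pc - oc for oc, pc in zip(ray_origin, p)]
        # Project the point onto the ray, clamped to start at the origin.
        t = max(0.0, sum(vc * dc for vc, dc in zip(v, ray_dir)))
        foot = [oc + t * dc for oc, dc in zip(ray_origin, ray_dir)]
        return sum((pc - fc) ** 2 for pc, fc in zip(p, foot)) ** 0.5
    return min(centerline, key=dist_to_ray)
```

Because the comparison is against the whole ray rather than a single picked point, the cursor snaps to the nearest center point even when the click lands slightly off the lumen wall.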
When updating the 3D lumen model includes placing one or more objects, each object may be centered at the center point closest to the 2D cursor positions of the selected points. The object may be a volume-defining object. In some embodiments, the volume-defining object is a 3D shape having a surface for defining a stent dimension (e.g., inner diameter). In some embodiments, the volume-defining object may be a different shape, a set of rays, or a set of points, among other things, defining a stent dimension (e.g., inner diameter). It should be appreciated that any discussion of spheres herein is an example of an object which may be used to guide the design of a customized stent for a user, and should not be understood to be a limitation of a type of object. In some embodiments, the illustrative uses of a sphere defined herein may be applied to another type of object.
A sphere may also be automatically sized according to diameters of a lumen cross-section including the center point. In some embodiments, the diameter of the sphere may represent the average lumen diameter in the center point cross-section.
Designing a customized stent for a live subject using a 3D model generated from observing the live subject provides an opportunity for designing a stent which is more likely to effectively treat a patient without causing complications. However, the accuracy of the stent design is limited by the manipulation of the 3D model using a 2D interface. By translating 2D cursor positions to a 3D model location and providing stent model guides, such as the volume-defining objects, the stent design system reduces opportunities for user error during the stent design process.
It should be appreciated that any or all of the foregoing operations and features of the process 1100 may also be present in the other processes disclosed herein.
After a sphere is added to a branch of the 3D lumen model, the stent design system may determine a branch position value for the sphere indicating the position of the sphere within the branch. For example, the branch position value may include a range of values inclusive of 0 and 1, where 0 indicates the sphere is placed at a beginning of a branch and 1 indicates the sphere is placed at the end of the branch.
The stent design system may use the branch position values for each sphere, as well as the size of the sphere and the position of the sphere, to determine the configuration of a stent surface configured to cover the spheres, as shown in
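The branch position value can be computed as a normalized arc length along the branch centerline; parameterizing by arc length, rather than by center point index, is an illustrative assumption:

```python
import math

def branch_position(branch_centerline, idx):
    """Map a center point index on a branch centerline to a value in
    [0, 1]: 0 at the beginning of the branch, 1 at its end."""
    lengths = [math.dist(a, b)
               for a, b in zip(branch_centerline, branch_centerline[1:])]
    total = sum(lengths)
    if total == 0:
        return 0.0
    # Cumulative arc length up to idx, normalized by the branch length.
    return sum(lengths[:idx]) / total
```

Arc-length parameterization keeps the value meaningful even when center points are unevenly spaced along the branch.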
The lumen cross-section view 1410 is configured to display a selected cross-section of the lumen. The lumen cross-section view 1410 may include cross-section information. For example, the lumen cross-section view may include a representation of the centerline, an indication of the shortest and longest diameters of the cross-section, or a 3D compass showing the relative position of the cross-section in the 3D space of the model. The lumen cross-section view 1410 may also include the current, pathological lumen and a simulated non-pathological version of the same cross-section.
For each cross-section displayed in the lumen cross-section view 1410, the stent design system may determine a center point to display using the recursive center point adjustment process of
Process 1500 begins at operation 1501, where a user interface displays multiple views of a 3D lumen model, including a perspective view and a cross-section view. The cross-section view corresponds to a cross-section of the 3D model indicated in the perspective view by a slider configured to receive user input.
Process 1500 proceeds to operation 1503, where the stent design system receives user input from the slider indicating a new cross-section to view in the cross-section view.
Process 1500 proceeds to operation 1505, where the stent design system determines a center point of the new cross-section by performing the recursive center point adjustment process of
Process 1500 proceeds to operation 1507 where the user interface displays the new cross-section in the cross-section view, along with an indication of the center point determined in operation 1505.
It should be appreciated that any or all of the foregoing operations and features of the process 1500 may also be present in the other processes disclosed herein.
As a user is designing a 3D model of a stent within a 3D model of a lumen, a user interface of the stent design system may display a view of the stent within a 2D CT image slice. Process 1600 allows the view of the stent within the 2D CT image slice to be updated in real time or near real time as the user modifies the 3D stent model. This allows the stent to appear, in a CT view that medical professionals are accustomed to viewing in daily practice, as though it had been placed within the body when the image was acquired.
Process 1600 begins at operation 1601 where the stent design system receives a 3D stent model change to an existing 3D stent model from a 3D modeling interface.
Process 1600 proceeds to operation 1603 where the stent design system determines an image slice (e.g., a CT image slice, among other things) that corresponds to the 3D stent model change. Determining the CT image slice may include determining the CT image slice thickness. The CT image slice thickness may be derived from the spacing between successive CT images at the time of the image acquisition. The volume of stacked slice images defines a relative coordinate system which may be used to determine which slices intersect the 3D stent model.
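Determining which slices intersect the stent reduces to dividing the stent's extent along the stack axis by the slice thickness; a sketch, in which the zero-based indexing and z-origin of the stack are assumptions:

```python
def intersecting_slice_indices(stent_z_min, stent_z_max, slice_thickness,
                               n_slices, z_origin=0.0):
    """Return indices of image slices whose z-span overlaps the stent's
    bounding extent along the slice-stacking axis (all values in mm)."""
    # Slice i is assumed to cover [z_origin + i*t, z_origin + (i+1)*t).
    first = max(0, int((stent_z_min - z_origin) // slice_thickness))
    last = min(n_slices - 1, int((stent_z_max - z_origin) // slice_thickness))
    return list(range(first, last + 1))
```

For example, with the 1.25 mm slice thickness mentioned earlier, a stent spanning 2.0-4.0 mm along the stack axis intersects slices 1 through 3.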
Process 1600 proceeds to operation 1605 where the stent design system translates the 3D stent model into a sliced object reflecting the 3D stent model change. The sliced object corresponds to a cross-section of the 3D stent model. In some embodiments, the 3D stent model, the 2D CT image slice, and the 3D lumen model share a coordinate system.
In certain embodiments, the 3D stent model and the 3D lumen model are in a patient coordinate system derived from CT scan images (measured in mm), while the 2D slice images are in a 2D pixel coordinate system (e.g., 512×512 pixels). To align the 2D images with the 3D objects, the 2D images may be stretched in the x or y dimension based on the voxel size (i.e., if the voxels are not square, the images need to be stretched). Furthermore, the third dimension of the 2D slice may need to be determined by the stent design system, which may be done by taking the index of the image slice in the 2D image stack of the CT scan and comparing it to the size of the 3D volume of one or more of the 3D models.
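The alignment described above amounts to scaling pixel coordinates by the in-plane voxel size and mapping the slice index to the stack axis; a minimal sketch, in which the axis conventions and the shared origin are assumptions:

```python
def pixel_to_patient(px, py, slice_index, voxel_size_mm):
    """Convert a 2D pixel coordinate plus its slice index into the 3D
    patient coordinate system (mm) shared with the 3D models."""
    sx, sy, sz = voxel_size_mm  # mm per pixel in x/y, mm per slice in z
    # Non-square voxels (sx != sy) are handled by the per-axis scale,
    # which is the 'stretching' of the 2D images described above.
    return (px * sx, py * sy, slice_index * sz)
```

The same scale factors, applied in reverse, map a 3D stent contour back into pixel coordinates for overlay on the slice.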
Process 1600 proceeds to operation 1607 where the stent design system overlays the sliced object onto the CT image slice. The sliced object may be overlaid in any plane, such as the coronal plane, axial plane, or sagittal plane, to name but a few examples. Process 1600 proceeds to operation 1609 where the CT image slice with the overlay of the 3D stent model slice is displayed to the user. In some embodiments, the stent design system may update the CT image slice in real time as the user modifies the 3D stent model in the 3D modeling interface. It should be appreciated that process 1600 could be adapted such that the user could modify the 3D stent model slice and the 3D stent model would be updated in the 3D modeling interface.
It should be appreciated that any or all of the foregoing operations and features of the process 1600 may also be present in the other processes disclosed herein.
It is contemplated that the various aspects, features, processes, and operations from the various embodiments may be used in any of the other embodiments unless expressly stated to the contrary. Certain operations illustrated may be implemented by a computer executing a computer program product on a non-transient, computer-readable storage medium, where the computer program product includes instructions causing the computer to execute one or more of the operations, or to issue commands to other devices to execute one or more operations.
While the present disclosure has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only certain exemplary embodiments have been shown and described, and that all changes and modifications that come within the spirit of the present disclosure are desired to be protected. It should be understood that while the use of words such as “preferable,” “preferably,” “preferred” or “more preferred” utilized in the description above indicate that the feature so described may be more desirable, it nonetheless may not be necessary, and embodiments lacking the same may be contemplated as within the scope of the present disclosure, the scope being defined by the claims that follow. In reading the claims, it is intended that when words such as “a,” “an,” “at least one,” or “at least one portion” are used there is no intention to limit the claim to only one item unless specifically stated to the contrary in the claim. The term “of” may connote an association with, or a connection to, another item, as well as a belonging to, or a connection with, the other item as informed by the context in which it is used. The terms “coupled to,” “coupled with” and the like include indirect connection and coupling, and further include but do not require a direct coupling or connection unless expressly indicated to the contrary. When the language “at least a portion” or “a portion” is used, the item can include a portion or the entire item unless specifically stated to the contrary. Unless stated explicitly to the contrary, the terms “or” and “and/or” in a list of two or more list items may connote an individual list item, or a combination of list items. Unless stated explicitly to the contrary, the transitional term “having” is open-ended terminology, bearing the same meaning as the transitional term “comprising.”
Various embodiments of the invention may be implemented at least in part in any conventional computer programming language. For example, some embodiments may be implemented in a procedural programming language (e.g., “C”), or in an object-oriented programming language (e.g., “C++”). Other embodiments of the invention may be implemented as a pre-configured, stand-alone hardware element and/or as preprogrammed hardware elements (e.g., application specific integrated circuits, FPGAs, and digital signal processors), or other related components.
In an alternative embodiment, the disclosed apparatus and methods (e.g., see the various flow charts described above) may be implemented as a computer program product for use with a computer system. Such implementation may include a series of computer instructions fixed either on a tangible, non-transitory medium, such as a computer readable medium (e.g., a diskette, CD-ROM, ROM, or fixed disk). The series of computer instructions can embody all or part of the functionality previously described herein with respect to the system.
Those skilled in the art should appreciate that such computer instructions can be written in a number of programming languages for use with many computer architectures or operating systems. Furthermore, such instructions may be stored in any memory device, such as semiconductor, magnetic, optical or other memory devices, and may be transmitted using any communications technology, such as optical, infrared, microwave, or other transmission technologies.
Among other ways, such a computer program product may be distributed as a removable medium with accompanying printed or electronic documentation (e.g., shrink wrapped software), preloaded with a computer system (e.g., on system ROM or fixed disk), or distributed from a server or electronic bulletin board over the network (e.g., the Internet or World Wide Web). In fact, some embodiments may be implemented in a software-as-a-service model (“SAAS”) or cloud computing model. Of course, some embodiments of the invention may be implemented as a combination of both software (e.g., a computer program product) and hardware. Still other embodiments of the invention are implemented as entirely hardware, or entirely software.
The embodiments of the invention described above are intended to be merely exemplary; numerous variations and modifications will be apparent to those skilled in the art. Such variations and modifications are intended to be within the scope of the present invention as defined by any of the appended claims. It shall nevertheless be understood that no limitation of the scope of the present disclosure is hereby created, and that the present disclosure includes and protects such alterations, modifications, and further applications of the exemplary embodiments as would occur to one skilled in the art with the benefit of the present disclosure.
This patent application claims priority from provisional United States patent application Nos. 63/347,916 (filed Jun. 1, 2022), 63/348,316 (filed Jun. 2, 2022), 63/348,299 (filed Jun. 2, 2022), 63/348,304 (filed Jun. 2, 2022), 63/348,306 (filed Jun. 2, 2022), 63/396,932 (filed Aug. 10, 2022), and 63/396,934 (filed Aug. 10, 2022), the disclosures of which are incorporated herein, in their entireties, by reference.
Number | Date | Country
---|---|---
63396934 | Aug 2022 | US
63396932 | Aug 2022 | US
63348306 | Jun 2022 | US
63348304 | Jun 2022 | US
63348299 | Jun 2022 | US
63348316 | Jun 2022 | US
63347916 | Jun 2022 | US