Systems and methods for medical image analysis

Information

  • Patent Grant
  • Patent Number
    11,769,251
  • Date Filed
    Tuesday, December 22, 2020
  • Date Issued
    Tuesday, September 26, 2023
Abstract
Systems, instruments, and methods for medical treatment are disclosed. The methods comprise, by a computing device: receiving information identifying at least one first point on a body part shown in a medical image; overlaying a first mark on the medical image for the at least one first point; generating a spline based at least on the first mark; overlaying a second mark for the spline on the medical image; identifying a location of at least one second point on the body part shown in the medical image based on the first and second marks; overlaying a third mark for the at least one second point on the medical image; and using at least the third mark to facilitate the medical treatment of an individual whose body part is shown in the medical image.
Description
BACKGROUND

Spinal disorders such as degenerative disc disease, disc herniation, osteoporosis, spondylolisthesis, stenosis, scoliosis and other curvature abnormalities, kyphosis, tumor, and fracture may result from factors including trauma, disease and degenerative conditions caused by injury and aging. Spinal disorders typically result in symptoms including pain, nerve damage, and partial or complete loss of mobility.


Non-surgical treatments, such as medication, rehabilitation, and exercise, can be effective but may fail to relieve the symptoms associated with these disorders. Surgical treatment of these spinal disorders includes correction, fusion, fixation, discectomy, laminectomy, and/or implantable prosthetics. As part of these surgical treatments, spinal constructs, which include implants such as bone fasteners, connectors, plates, and vertebral rods, are often used to provide stability to a treated region. These implants can redirect stresses away from a damaged or defective region while healing takes place to restore proper alignment and generally support the vertebral members. The particular curvature, length and/or other parameters of the implants can be a key factor in obtaining successful results from surgery.


SUMMARY

The present disclosure relates to implementing systems and methods for medical treatment. The methods may comprise performing the following operations by a computing device: causing an imaging device to capture a medical image; receiving information identifying at least one first point (e.g., a first vertebra endpoint) on a body part (e.g., a spine) shown in the medical image; overlaying a first mark on the medical image for the first point; generating a spline (e.g., a piecewise polynomial curve) based on the first mark and/or machine learned model(s) (e.g., defining possible structure(s) of the body part); overlaying a second mark for the spline on the medical image; identifying a location of at least one second point (e.g., a second, different vertebra endpoint) on the body part (e.g., the spine) shown in the medical image based on the first mark, the second mark, machine learned model(s) and/or contents of a scientific database; overlaying a third mark for the second point on the medical image; addressing an error in at least one characteristic of the third mark; and/or using at least the third mark to facilitate the medical treatment of an individual whose body part is shown in the medical image.


In some scenarios, the second mark comprises a curved line that (i) extends between a mid-point of the first mark and a mid-point of another mark, and (ii) extends along a centerline of the body part. The error is addressed by: analyzing differences in gray levels for pixels residing within a given area surrounding an end of the third mark to determine a precise location of an object corner (e.g., a vertebra corner); and modifying at least one of a shape of the third mark, a size of the third mark and a location of the third mark relative to the medical image in accordance with results of the analyzing.


The present disclosure also relates to a system comprising a processor and a non-transitory computer-readable storage medium. The non-transitory computer-readable storage medium comprises programming instructions that are configured to cause the processor to implement the above-described method for medical treatment.


The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular descriptions of exemplary embodiments of the invention as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts of the disclosure.





BRIEF DESCRIPTION OF DRAWINGS

The following drawings are illustrative of particular embodiments of the present disclosure and therefore do not limit the scope of the present disclosure. The drawings are not to scale and are intended for use in conjunction with the explanations in the following detailed description.



FIG. 1 provides an illustration of an illustrative system.



FIG. 2 provides an illustration of an illustrative computing device.



FIG. 3 provides an illustration of an illustrative process for analyzing images.



FIG. 4 provides an illustration of a spine.



FIG. 5 provides an illustration of a vertebra.



FIGS. 6A-6B (collectively referred to as “FIG. 6”) provide a flow diagram of an illustrative method for medical treatment.





DETAILED DESCRIPTION

The following discussion omits or only briefly describes certain conventional features related to surgical systems for treating the spine, which are apparent to those skilled in the art. It is noted that various embodiments are described in detail with reference to the drawings, in which like reference numerals represent like parts and assemblies throughout the several views. Reference to various embodiments does not limit the scope of the claims appended hereto. Additionally, any examples set forth in this specification are intended to be non-limiting and merely set forth some of the many possible embodiments for the appended claims. Further, particular features described herein can be used in combination with other described features in each of the various possible combinations and permutations.


Unless otherwise specifically defined herein, all terms are to be given their broadest possible interpretation including meanings implied from the specification as well as meanings understood by those skilled in the art and/or as defined in dictionaries, treatises, etc. It must also be noted that, as used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural referents unless otherwise specified, and that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.


Embodiments of the present disclosure generally relate to implementing systems and methods for analyzing medical images. The methods involve: causing an imaging device to capture a medical image; receiving information identifying at least one first point (e.g., a first vertebra endpoint) on a body part (e.g., a spine) shown in the medical image; overlaying a first mark on the medical image for the first point; generating a spline (e.g., a piecewise polynomial curve) based on the first mark and/or machine learned model(s) (e.g., defining a possible structure of a spine); overlaying a second mark for the spline on the medical image; identifying a location of at least one second point (e.g., a second, different vertebra endpoint) on the body part (e.g., the spine) shown in the medical image based on the first mark, the second mark, machine learned model(s) and/or contents of a scientific database; overlaying a third mark for the second point on the medical image; addressing an error in a characteristic of the third mark; and/or using at least the third mark to facilitate the medical treatment of an individual whose body part is shown in the medical image.


Referring now to FIG. 1, there is provided an illustration of an illustrative system 100. System 100 is generally configured to analyze images to detect a spine and/or vertebrae of a patient 104. During operation, the patient 104 may be placed on a support structure 106 (e.g., a table) such that imaging device(s) 102 is(are) aligned with a body part of interest (e.g., a back).


The imaging device(s) 102 can include, but is(are) not limited to, an X-ray system. The imaging device(s) 102 is(are) generally configured to capture at least one image 118 of a treatment area (e.g., at least a portion of the patient's spine). The image can include, but is not limited to, a Digital Tomosynthesis (DT) scan image, a Computed Tomography (CT) scan image, a Magnetic Resonance Imaging (MRI) image and/or a Positron Emission Tomography (PET) scan image.


The image 118 is communicated from the imaging device 102 to a computing device 108 via a communications link 120 and/or via a network 112 using communication links 122, 124. Additionally or alternatively, the image 118 is communicated from the imaging device 102 to a server 114 via network 112 using communication links 122, 126. The server 114 is configured to access a datastore 116 for reading information 128 therefrom and writing information 128 thereto. The network can include, but is not limited to, a Local Area Network (LAN), a Wide Area Network (WAN), an Intranet and/or the Internet. The communication links 120, 122, 124, 126 can be wired communication links and/or wireless communication links.


The image 118 is analyzed by the computing device 108 and/or server 114. The image analysis is generally performed automatically to detect a vertebral object and/or generate measurement data for the same. The manner in which such detection is made will become apparent as the discussion progresses. In some scenarios, the detection is achieved using anatomical knowledge and/or scientific literature to identify approximate positions of vertebrae in a given image, and then analyzing the content of each approximate position to determine whether a vertebral object resides thereat.


Referring now to FIG. 2, there is provided an illustration of an illustrative architecture for a computing device 200. The computing device 108 and/or server 114 of FIG. 1 is at least partially the same as or similar to computing device 200. As such, the discussion of computing device 200 is sufficient for understanding the computing device 108 and server 114 of FIG. 1.


Computing device 200 may include more or fewer components than those shown in FIG. 2. However, the components shown are sufficient to disclose an illustrative embodiment implementing the present solution. The hardware architecture of FIG. 2 represents one implementation of a representative computing device configured to perform medical image processing, as described herein. As such, the computing device 200 of FIG. 2 implements at least a portion of the method(s) described herein.


Some or all components of the computing device 200 can be implemented as hardware, software and/or a combination of hardware and software. The hardware includes, but is not limited to, one or more electronic circuits. The electronic circuits can include, but are not limited to, passive components (e.g., resistors and capacitors) and/or active components (e.g., amplifiers and/or microprocessors). The passive and/or active components can be adapted to, arranged to and/or programmed to perform one or more of the methodologies, procedures, or functions described herein.


As shown in FIG. 2, the computing device 200 comprises a user interface 202, a Central Processing Unit (CPU) 206, a system bus 210, a memory 212 connected to and accessible by other portions of computing device 200 through system bus 210, a system interface 260, and hardware entities 214 connected to system bus 210. The user interface can include input devices and output devices, which facilitate user-software interactions for controlling operations of the computing device 200. The input devices include, but are not limited to, a physical and/or touch keyboard 250. The input devices can be connected to the computing device 200 via a wired or wireless connection (e.g., a Bluetooth® connection). The output devices include, but are not limited to, a speaker 252, a display 254, and/or light emitting diodes 256. Display 254 can include, but is not limited to, a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), a plasma display and/or a touch-screen display. System interface 260 is configured to facilitate wired or wireless communications to and from external devices (e.g., network nodes such as access points, etc.).


At least some of the hardware entities 214 perform actions involving access to and use of memory 212, which can be a Random Access Memory (RAM), a disk drive, flash memory, a Compact Disc Read Only Memory (CD-ROM) and/or another hardware device that is capable of storing instructions and data. Hardware entities 214 can include a disk drive unit 216 comprising a computer-readable storage medium 218 on which is stored one or more sets of instructions 220 (e.g., software code) configured to implement one or more of the methodologies, procedures, or functions described herein. The instructions 220 can also reside, completely or at least partially, within the memory 212 and/or within the CPU 206 during execution thereof by the computing device 200. The memory 212 and the CPU 206 also can constitute machine-readable media. The term “machine-readable media”, as used here, refers to a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 220. The term “machine-readable media”, as used here, also refers to any medium that is capable of storing, encoding or carrying a set of instructions 220 for execution by the computing device 200 and that cause the computing device 200 to perform any one or more of the methodologies of the present disclosure.


The methodologies of the present disclosure may be implemented at least partly by application(s) 224. The application(s) 224 may include, but are not limited to, web-enabled applications that use text, widgets, graphics, audio, video and other media to present data and to allow interaction with data via a network (e.g., network 112 of FIG. 1). The widgets are configured to facilitate user-software interactions with computing device 200. In this regard, the widgets can include, but are not limited to, menus, windows, dialogue boxes, tool bars and/or controls (for example, radio buttons, check boxes, and/or sliding scales).


In some embodiments, one or more features of the present solution described herein can utilize a Uniform Resource Locator (URL) and/or cookies, for example for storing and/or transmitting data or user information. The URL can include a web address and/or a reference to a web resource that is stored in a datastore (e.g., datastore 116 of FIG. 1). The URL can specify the location of the resource on a computer and/or a computer network. The URL can include a mechanism to retrieve the network resource. The source of the network resource can receive a URL, identify the location of the web resource, and transmit the web resource back to the requestor. A URL can be converted to an IP address. A Domain Name System (DNS) can look up the URL and its corresponding IP address. URLs can be references to web pages, file transfers, emails, database accesses and/or other applications. The URLs can include a sequence of characters that identify a path, domain name, a file extension, a host name, a query, a fragment, a scheme, a protocol identifier, a port number, a username, a password, a flag, an object and/or a resource name. The systems disclosed herein can generate, receive, transmit, apply, parse, serialize, render, and/or perform an action on a URL.
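
For illustration only, the following sketch shows how a URL decomposes into several of the components listed above using Python's standard urllib.parse module; the URL itself is a made-up example, not one used by the present solution.

```python
from urllib.parse import urlparse

# Hypothetical URL assembled for illustration.
parts = urlparse("https://user:secret@records.example.com:8443/imaging/spine?patient=104#report")

print(parts.scheme)    # protocol identifier, e.g., "https"
print(parts.username)  # username
print(parts.hostname)  # host name
print(parts.port)      # port number
print(parts.path)      # path to the resource
print(parts.query)     # query string
print(parts.fragment)  # fragment
```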


A cookie (also referred to as an HTTP cookie, a web cookie, an internet cookie, and a browser cookie) can include data sent from a website and/or stored on the computing device 200 and/or another computing device. This data can be stored by a web browser while the user is browsing. The cookies can include useful information for websites to remember prior browsing information. The useful information can include, but is not limited to, a shopping cart on an online store, a clicking of buttons, login information and/or records of web pages or network resources visited in the past. Cookies can also include information that the user enters such as names, addresses, passwords, credit card information, etc. Cookies can also perform computer functions. For example, authentication cookies can be used by applications 224 (e.g., a web browser) to identify whether the user is already logged in (e.g., to a web site). The cookie data can be encrypted to provide security for the consumer. Tracking cookies can be used to compile records of individuals' browsing histories. Systems disclosed herein can generate and use cookies to access data of an individual. Systems can also generate and use JSON web tokens to store authenticity information, HTTP authentication as authentication protocols, IP addresses to track session or identity information and/or URLs.


Referring now to FIG. 3, there is provided an illustration of an illustrative method 300 for analyzing a medical image 302. It should be noted that machine learned models are used in method 300 for various purposes. The manner in which the machine learned models are used will become evident as the discussion progresses. The machine learned models are generated based on medical images for a plurality of individuals. Each machine learned model represents a possible structure of a body part (e.g., a spine). In this regard, each machine learned model may define relative locations of femoral heads to vertebrae, relative locations of vertebrae to each other, dimensions of vertebrae, relative locations of vertebrae edges, angles of vertebrae edges relative to a reference line, a centerline of the spine, and a curvature of the centerline.
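
As a rough illustration of the geometric quantities such a model could store, consider the following minimal sketch; the class and field names are assumptions made for exposition, not data structures disclosed by the present solution.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in image coordinates

@dataclass
class LearnedSpineModel:
    femoral_head_offsets: List[Point]   # relative locations of femoral heads to vertebrae
    vertebra_spacings: List[float]      # relative locations of vertebrae to each other
    vertebra_dimensions: List[Point]    # (width, height) of each vertebra
    edge_angles_deg: List[float]        # angles of vertebrae edges relative to a reference line
    centerline: List[Point]             # sampled centerline of the spine
    centerline_curvature: List[float]   # curvature along the centerline
```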


After the machine learned models have been generated, an individual (e.g., individual 110 of FIG. 1) of a medical practice (e.g., a doctor, a nurse, or a technician) performs a user-software action to launch a software application (e.g., software application 224 of FIG. 2) installed on a computing device (e.g., computing device 108 of FIG. 1, server 114 of FIG. 1, and/or computing device 200 of FIG. 2). The software application is generally configured to cause a medical image 302 of a patient (e.g., patient 104 of FIG. 1) to be captured by an imaging device (e.g., imaging device 102 of FIG. 1). The medical image 302 may be captured during an office visit. The medical image 302 comprises an image of a part of the patient's body (e.g., a spine).


Next, the medical image 302 is presented to the individual on a display (e.g., display 254 of FIG. 2) of the computing device. The individual visually analyzes the medical image 302 to identify at least one point on the patient's body (e.g., spine 314) shown in the medical image 302. In the spine scenarios, each point can include, but is not limited to, a point on a femur, a cervical vertebra C1-C7, a thoracic vertebra T1-T12, a lumbar vertebra L1-L5, a sacral vertebra S1-S5 or other part along the vertebral column of the patient (e.g., patient 104 of FIG. 1). An illustration showing the listed types of vertebrae is provided in FIG. 4.


Once the point(s) on the patient's body (e.g., spine 314) has(have) been identified, a mark is overlaid or superimposed on the medical image 302 for each identified point. Techniques for overlaying or superimposing marks on images are well known. Any known or to be known technique can be used herein. For example, a mark can be overlaid or superimposed on the medical image 302 via user-software interactions performed by the individual using an annotation or drawing tool of the software application. The present solution is not limited in this regard. Annotation and drawing tools are well known.


The mark can include, but is not limited to, a line, a circle, and/or a square. For example, a circular mark 315 is drawn on the medical image 302 so as to encompass a femoral edge. A first linear mark 316 is drawn on the medical image 302 so as to extend parallel to and/or along a top end plate of the sacral vertebra S1. A second linear mark 317 is drawn on the medical image 302 so as to extend parallel to and/or along a top end plate of the lumbar vertebra L1. The present solution is not limited to the particulars of this example.


Next in 308, the computing device performs operations to generate, create or select a spline based on the machine learned models and/or the marks 315, 316, 317 overlaid/superimposed on the medical image 302. In some scenarios, the computing device inputs information about the marks 315, 316, 317 into a machine-learning algorithm. The machine-learning algorithm can employ supervised machine learning, semi-supervised machine learning, unsupervised machine learning, and/or reinforcement machine learning. Each of these listed types of machine-learning is well known in the art.


In some scenarios, the machine-learning algorithm includes, but is not limited to, a decision tree learning algorithm, an association rule learning algorithm, an artificial neural network learning algorithm, a deep learning algorithm, an inductive logic programming based algorithm, a support vector machine based algorithm, a Bayesian network based algorithm, a representation learning algorithm, a similarity and metric learning algorithm, a sparse dictionary learning algorithm, a genetic algorithm, a rule-based machine-learning algorithm, and/or a learning classifier systems based algorithm. The machine-learning process implemented by the present solution can be built using Commercial-Off-The-Shelf (COTS) tools (e.g., SAS available from SAS Institute Inc. of Cary, N.C.).


The machine-learning algorithm uses the inputted information and the previously generated machine learned models to generate/create a spline using curve fitting and/or select a pre-defined spline from a plurality of pre-defined possible splines. Curve fitting is well known in the art. In some scenarios, the spline includes a piecewise polynomial curve. Accordingly, a mathematical function may be employed to generate the spline as an alternative to or in addition to the machine-learning algorithm. The mathematical function can be defined by one or more polynomial equations representing one or more numerically complex contours on a given body part (e.g., a spine).
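
A minimal sketch of this curve-fitting step follows, assuming the mark mid-points are available in image coordinates and using an off-the-shelf cubic (piecewise polynomial) spline; the sample points, the arc-length parameterization, and the library choice are illustrative assumptions rather than the claimed implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Illustrative mid-points, e.g., from the S1 mark, an intermediate estimate
# suggested by a learned model, and the L1 mark (x, y in pixels).
midpoints = np.array([
    [312.0, 640.0],
    [298.0, 525.0],
    [305.0, 410.0],
])

# Parameterize by cumulative chord length, then fit piecewise cubics
# through x(t) and y(t) separately.
t = np.concatenate([[0.0], np.cumsum(np.linalg.norm(np.diff(midpoints, axis=0), axis=1))])
sx = CubicSpline(t, midpoints[:, 0])
sy = CubicSpline(t, midpoints[:, 1])

# Sample the spline densely to draw a centerline mark on the image.
ts = np.linspace(t[0], t[-1], 200)
centerline = np.stack([sx(ts), sy(ts)], axis=1)
```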


A mark for the spline is then overlaid or superimposed on the medical image 302. For example, a curved line 318 is overlaid or superimposed on the medical image 302 as shown in FIG. 3. Line 318 extends between the first mark 316 and the second mark 317. Line 318 starts at a mid-point 324 of the first mark 316 and ends at a mid-point 326 of the second mark 317. In effect, the line 318 generally follows and extends along a centerline of the vertebrae L1-L5. The present solution is not limited in this regard.


The mark 318 may be presented to the individual via the display (e.g., display 254 of FIG. 2) of the computing device. In some scenarios, the software application is configured to (i) allow the individual to adjust or modify the mark 318 and/or (ii) approve the mark 318, prior to proceeding with the operations of 310.


In 310, the computing device performs operations to identify locations and/or positions of vertebra endplates in the medical image 302. As shown in FIG. 5, a vertebral endplate 504, 506 of a vertebra 500 comprises a surface defining a transition region where a vertebral body 502 and an intervertebral disc (not shown) interface with each other. These identifications can be made based on the machine learned models, the marks 315, 316, 317, 318 overlaid/superimposed on the medical image 302, and/or contents of a scientific database (e.g., datastore 116 of FIG. 1 that stores models of a normal spine and relative vertebrae locations/positions). In some scenarios, the computing device inputs information about the marks 315, 316, 317, 318 into a machine-learning algorithm. The machine-learning algorithm can be the same as or different than the machine-learning algorithm employed in 308.


The machine-learning algorithm uses the inputted information about marks 315, 316, 317, 318 and the previously generated machine learned models to generate or select possible locations/positions for vertebrae endplates. For example, the machine-learning algorithm can determine a location or position of a vertebra endplate based on a body part identifier (e.g., L1 or S1) associated with each mark 315, 316, 317, a distance between marks 316, 317 in the medical image 302, a distance between mark 315 and each mark 316, 317 in the medical image 302, a location of spline mark 318 in the medical image 302, a shape of the spline mark 318, a curvature of the spline mark 318, an angle 330 of mark 316 relative to a reference line 328, and an angle 332 of mark 317 relative to the reference line 328. The listed information is used to identify a machine learned model from a plurality of pre-defined machine learned models. The locations/positions for vertebrae endplates specified in the identified machine learned model are then considered to be the locations/positions of vertebrae endplates in the medical image 302. The present solution is not limited to the particulars of this example.
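
One way to picture this model-selection step is sketched below: the listed quantities are packed into a feature vector and the closest pre-defined model is chosen by Euclidean distance. The feature set, the distance rule, and the candidate models are assumptions made for illustration.

```python
import numpy as np

def select_model(features: np.ndarray, candidates: list) -> dict:
    """Return the pre-defined model whose stored features best match."""
    return min(candidates, key=lambda m: np.linalg.norm(features - m["features"]))

# Illustrative features: inter-mark distances, spline arc length, and the
# angles of marks 316/317 relative to the reference line (degrees).
features = np.array([115.0, 230.0, 232.1, 8.5, -3.2])

candidates = [
    {"name": "model_a", "features": np.array([110.0, 220.0, 225.0, 7.0, -2.0]),
     "endplates": [(303.0, 468.0), (301.0, 430.0)]},  # stored endplate centers
    {"name": "model_b", "features": np.array([120.0, 240.0, 240.5, 1.0, 0.5]),
     "endplates": [(300.0, 470.0), (299.0, 428.0)]},
]

chosen = select_model(features, candidates)
endplate_locations = chosen["endplates"]  # reused as positions in the image
```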


A mark 320 is overlaid or superimposed on the medical image 302 at each identified location or position for a vertebra endplate. Each mark 320 can include, but is not limited to, a straight line. The mark 320 can have a center point 340 that resides on the spline mark 318 such that the line extends an equal amount in opposing directions from the spline mark 318.
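
A small sketch of this mark layout, assuming the center point on the spline, the mark's angle, and its length are known; the numeric values are illustrative.

```python
import numpy as np

def endplate_mark(center: np.ndarray, angle_deg: float, length: float):
    """Return the two end points of a straight mark centered on `center`,
    extending an equal half-length in opposing directions."""
    direction = np.array([np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))])
    half = 0.5 * length * direction
    return center - half, center + half

end_a, end_b = endplate_mark(np.array([303.0, 468.0]), angle_deg=6.0, length=84.0)
```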


The mark(s) 320 may be presented to the individual via the display (e.g., display 254 of FIG. 2) of the computing device. In some scenarios, the software application is configured to (i) allow the individual to adjust or modify the mark(s) 320 and/or (ii) approve the mark(s) 320, prior to proceeding with the operations of 312.


In 312, the computing device performs operations to address any errors in the positions/locations/sizes of the mark(s) 320 in the medical image 302. These operations involve defining circles 334 within the medical image 302. Each circle 334 has a pre-defined size and encompasses an end point 336 of a given mark 320. The computing device then analyzes the area of the medical image contained in each circle 334 to determine a precise location of a vertebra's corner. For example, the computing device analyzes differences in pixel color, gray level and/or contrast within the given area to determine the precise location of a vertebra's corner. As shown in FIG. 3, the pixel(s) associated with the vertebra's corner has(have) a weaker gray level than the pixels associated with a surrounding structure. This difference in color, gray level and/or contrast is used to identify the precise location of a vertebra's corner. The present solution is not limited to the particulars of this example.
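
The gray-level search inside each circle 334 might be sketched as follows, assuming the image is a 2-D array of gray levels and treating the weakest (darkest) pixel inside the circular window as the corner; the radius and the minimum-intensity rule are simplifying assumptions.

```python
import numpy as np

def refine_corner(image: np.ndarray, end_point: tuple, radius: int = 12) -> tuple:
    """Search a circular window around a mark end point for the corner pixel."""
    h, w = image.shape
    ys, xs = np.ogrid[:h, :w]
    x0, y0 = end_point
    in_circle = (xs - x0) ** 2 + (ys - y0) ** 2 <= radius ** 2

    # Corner pixels have a weaker gray level than the surrounding structure,
    # so take the minimum-intensity pixel inside the window.
    masked = np.where(in_circle, image.astype(float), np.inf)
    y, x = np.unravel_index(np.argmin(masked), masked.shape)
    return (x, y)
```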


Once the precise location(s) of the vertebra corner(s) is(are) identified, the computing device may perform operations to adjust or modify the shape(s), size(s) and/or location(s) of the mark(s) 320 relative to the medical image 302. For example, a length of a given mark 320 is increased or decreased, and/or an angle of a given mark 320 is changed relative to the reference line 328 based on displacements or differences between the locations of the mark corners and the respective precise location(s) of the vertebra corner(s). The present solution is not limited to the particulars of this example.
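
A sketch of the adjustment itself, assuming the refined corner locations from the previous step: snapping a mark's end points to the corners updates its length and its angle relative to the reference line in one move. The helper below is illustrative.

```python
import numpy as np

def adjust_mark(corner_a: np.ndarray, corner_b: np.ndarray):
    """Redraw a mark between two refined corners; return its new geometry."""
    delta = corner_b - corner_a
    length = float(np.linalg.norm(delta))
    # Angle vs. a horizontal reference line, in degrees.
    angle_deg = float(np.degrees(np.arctan2(delta[1], delta[0])))
    return (corner_a, corner_b), length, angle_deg

new_ends, new_length, new_angle = adjust_mark(np.array([261.0, 465.0]),
                                              np.array([345.0, 473.0]))
```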


Subsequent to completing the operations of 312, the computing device may use information about the marks 315, 316, 317, 318 and/or 320 to create a treatment plan for the patient. For example, the information about the marks 315, 316, 317, 318 and/or 320 is used by the computing system to identify a possible abnormal condition or medical disease of the patient and/or make a prediction about a future medical disease or abnormal condition for the patient. This identification/prediction can be made by comparing the particulars of marks 315, 316, 317, 318 and/or 320 to particulars of marks contained in pre-defined machine learned model(s). An abnormal condition or medical disease is identified/predicted when the particulars of the marks 315, 316, 317, 318 and/or 320 match the particulars of marks contained in pre-defined machine learned model(s) by a certain amount (e.g., 70%). The identified or predicted medical disease or abnormal condition is then used to generate a treatment plan. The treatment plan comprises an electronic and/or paper document that describes the patient's individualized diagnosis, needs, goals, treatment interventions and treatment providers. The treatment plan may also include measurements of radiographic parameters (e.g., spinopelvic and frontal) at different stages (e.g., preoperatively and surgical planning stages). The radiographic measurements can include, but are not limited to, pelvic tilt, pelvic incidence, sacral slope, lumbar lordosis, thoracic kyphosis, pelvic angle, sagittal vertical axis, and vertebra lordosis. The treatment plan may further include planning specifications (e.g., a rod is to be lengthened according to coronal deformity).
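
A hedged sketch of the matching rule follows, assuming the mark particulars reduce to numeric parameters and using a simple normalized-distance score in place of whatever match measure an implementation would actually use; the 70% threshold mirrors the example above.

```python
import numpy as np

def match_score(measured: np.ndarray, model: np.ndarray) -> float:
    """Return a 0..1 similarity between measured and modeled mark parameters."""
    return 1.0 - np.linalg.norm(measured - model) / (np.linalg.norm(model) + 1e-9)

# Illustrative parameters derived from the marks (e.g., angles and spacings).
measured = np.array([48.0, 52.0, 11.0])
condition_model = np.array([45.0, 50.0, 10.0])   # parameters stored with a learned model

if match_score(measured, condition_model) >= 0.70:
    print("possible match: flag the modeled condition for review")
```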


In some scenarios, a treatment system (e.g., treatment system 150 of FIG. 1) may then be programmed such that it can perform operations in accordance with the treatment plan. The treatment system can include, but is not limited to, an implant creating system, a surgical tool, and/or a radiation therapy device. Each of the listed treatment systems is well known. The treatment system may then perform operations to (i) produce, locate, order or otherwise obtain a medical device to be implanted in the patient (e.g., a cage, a hook, a plate, a screw, a rod and/or a spinal cord stimulator) and/or (ii) apply a treatment to the patient during a medical procedure (e.g., facilitate the control of and/or the implanting of one or more medical devices or tools).


The above described solution can be employed and incorporated in other systems such as those disclosed in U.S. patent application Ser. No. 16/837,461, U.S. patent application Ser. No. 16/404,276, U.S. patent application Ser. No. 15/958,409, U.S. patent application Ser. No. 16/182,466, and International Patent Application No. PCT/FR17/053506, the entire contents of which are incorporated herein by reference.


Referring now to FIG. 6, there is provided a flow diagram of an illustrative method 600 for medical treatment. As shown in FIG. 6A, method 600 begins with 602 and continues with 604 where medical images for a plurality of individuals are obtained. The medical images can include, but are not limited to, a DT scan image, a CT scan image, an MRI image, and/or a PET scan image. In 606, machine learned models are generated based on the medical images. Each machine learned model represents a possible structure of a body part (e.g., a spine). For example, each machine learned model may define relative locations of a femoral edge to vertebrae, relative locations of vertebrae to each other, dimensions of vertebrae, relative locations of vertebrae edges, angles of vertebrae edges relative to a reference line, a centerline of the spine, and a curvature of the centerline. The present solution is not limited to the particulars of this example.


In 608, a software application (e.g., software application 224 of FIG. 2) is launched. Techniques for launching software applications installed on computing devices are well known in the art. The software application is configured to cause a medical image (e.g., medical image 302 of FIG. 3) of an individual (e.g., patient 104 of FIG. 1) to be captured by at least one imaging device (e.g., imaging device(s) 102 of FIG. 1). The medical image is presented on a display (e.g., display 254 of FIG. 2) of a computing device (e.g., computing device 108 of FIG. 1, server 114 of FIG. 1, and/or computing device 200 of FIG. 2).


Next in 614, the medical image is visually analyzed or inspected to identify point(s) on a body part shown therein. For example, one or more vertebrae endplates of a spine is identified in the medical image. A mark is overlaid or superimposed on the medical image for each identified point, as shown by 616. An annotation tool or drawing tool can be used to overlay or superimpose the mark(s) on the medical image.


In 618, a spline is generated based on the machine learned model(s) and/or particulars of the mark(s) overlaid/superimposed on the medical image in 616. The spline may be generated in the same or substantially similar manner as that described above in relation to FIG. 3. A mark for the spline is overlaid/superimposed on the medical image in 620. The mark for the spline may be presented on the display of the computing device, as shown by 622. Manual adjustments to, modifications to and/or approval of the spline mark may optionally be allowed in 624. The manual actions can be achieved using one or more widgets of a Graphical User Interface (GUI) presented on the display of the computing device. The GUI can be the same as or different than the GUI in which the medical image is displayed.


In 626, the computing device performs operations to identify second point(s) on the body part shown in the medical image. For example, the computing device identifies location(s) and/or position(s) of vertebra endplate(s) in a spine shown in the medical image. The manner in which the identification(s) is(are) made is the same as or substantially similar to that described above in relation to FIG. 3. In this regard, the identification(s) is(are) made based on the machine learned model(s), the marks overlaid/superimposed on the medical image in 616 and/or 620, and/or contents of a scientific database (e.g., datastore 116 of FIG. 1).


In 628, a mark is overlaid or superimposed on the medical image for each identified second point (e.g., a location/position for a vertebra endplate). Upon completing 628, method 600 continues with 630 of FIG. 6B.


As shown in FIG. 6B, 630 involves optionally presenting the mark(s) for the vertebra endplates on the display of the computing device. The software application may allow manual adjustments to, modification of and/or approval of the mark(s) for the vertebra endplates, as shown by 632. The manual actions can be achieved using one or more widgets of a GUI presented on the display of the computing device. The GUI can be the same as or different than the GUI in which the medical image is displayed.


In 634, image recognition is used to identify and address any errors in the mark(s) for the vertebra endplate(s). The manner in which the image recognition is performed is the same as or substantially similar to that described above in relation to block 312 of FIG. 3. Next, method 600 may continue with 636, 638 and/or 640. 636-640 involve: using the particulars of the marks to create a treatment plan; programming a treatment system (e.g., treatment system 150 of FIG. 1) with the treatment plan; and/or performing operations by the treatment system to treat the individual. Subsequent to completing 634, 636, 638 or 640, 642 is performed where method 600 ends or other operations are performed (e.g., return to 602 of FIG. 6A).


It will be understood that various modifications may be made to the embodiments disclosed herein. Therefore, the above description should not be construed as limiting, but merely as exemplification of the various embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.

Claims
  • 1. A method for medical treatment, comprising: receiving, by a computing device, information identifying at least one first point on a body part shown in a medical image; overlaying, by the computing device, a first mark on the medical image for the at least one first point; generating, by the computing device, a spline based at least on the first mark; overlaying, by the computing device, a second mark for the spline on the medical image; identifying, by the computing device, a location of at least one second point on the body part shown in the medical image based on the first and second marks; overlaying, by the computing device, a third mark for the at least one second point on the medical image; analyzing differences in gray levels for pixels residing within a given area surrounding an end of the third mark to determine a precise location of an object corner; and modifying at least one of a shape of the third mark, a size of the third mark and a location of the third mark relative to the medical image in accordance with results of the analyzing; and using at least the third mark to facilitate the medical treatment of an individual whose body part is shown in the medical image.
  • 2. The method according to claim 1, further comprising performing operations, by the computing device, to cause an imaging device to capture the medical image.
  • 3. The method according to claim 1, wherein at least one machine learned model is additionally used to generate the spline.
  • 4. The method according to claim 3, wherein the at least one machine learned model defines a possible structure of a spine.
  • 5. The method according to claim 1, wherein the spline comprises a piecewise polynomial curve.
  • 6. The method according to claim 1, wherein the second mark comprises a curved line that (i) extends between a mid-point of the first mark and a mid-point of another mark, and (ii) extends along a centerline of the body part.
  • 7. The method according to claim 1, wherein the first and second points comprise vertebrae endpoints.
  • 8. The method according to claim 1, wherein the location of the at least one second point is identified further based on at least one of a machine learned model and contents of a scientific database.
  • 9. The method according to claim 1, wherein the precise location of the object corner is determined by: defining a circle within the medical image that encompasses an end point of the third mark; and considering the given area as comprising an area of the medical image contained in the circle.
  • 10. A system, comprising: a processor; a non-transitory computer-readable storage medium comprising programming instructions that are configured to cause the processor to implement a method for medical treatment, wherein the programming instructions comprise instructions to: receive information identifying at least one first point on a body part shown in a medical image; overlay a first mark on the medical image for the at least one first point; generate a spline based at least on the first mark; overlay a second mark for the spline on the medical image; identify a location of at least one second point on the body part shown in the medical image based on the first and second marks; overlay a third mark for the at least one second point on the medical image; analyze differences in gray levels for pixels residing within a given area surrounding an end of the third mark to determine a precise location of an object corner; and modify at least one of a shape of the third mark, a size of the third mark and a location of the third mark relative to the medical image in accordance with results of the analyzing; and use at least the third mark to facilitate the medical treatment of an individual whose body part is shown in the medical image.
  • 11. The system according to claim 10, wherein the programming instructions further comprise instructions to cause an imaging device to capture the medical image.
  • 12. The system according to claim 10, wherein at least one machine learned model is additionally used to generate the spline.
  • 13. The system according to claim 12, wherein the at least one machine learned model defines a possible structure of a spine.
  • 14. The system according to claim 10, wherein the spline comprises a piecewise polynomial curve.
  • 15. The system according to claim 10, wherein the second mark comprises a curved line that (i) extends between a mid-point of the first mark and a mid-point of another mark, and (ii) extends along a centerline of the body part.
  • 16. The system according to claim 10, wherein the first and second points comprise vertebrae endpoints.
  • 17. The system according to claim 10, wherein the location of the at least one second point is identified further based on at least one of a machine learned model and contents of a scientific database.
  • 18. The system according to claim 10, wherein the precise location of an object corner is determined by: defining a circle within the medical image that encompasses an end point of the third mark; and considering the given area as comprising an area of the medical image contained in the circle.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 62/953,677, which is entitled “Systems, Devices, And Methods For Medical Image Analysis” and was filed on Dec. 23, 2019, the contents of which are incorporated herein in their entirety.

18045086 Mar 2018 WO
18055494 Mar 2018 WO
18055518 Mar 2018 WO
18078636 May 2018 WO
18087758 May 2018 WO
18131044 Jul 2018 WO
18131045 Jul 2018 WO
18183314 Oct 2018 WO
18185755 Oct 2018 WO
18193316 Oct 2018 WO
18193317 Oct 2018 WO
18203100 Nov 2018 WO
18203101 Nov 2018 WO
19014452 Jan 2019 WO
2019036039 Feb 2019 WO
2019043426 Mar 2019 WO
2019068085 Apr 2019 WO
2019070729 Apr 2019 WO
2019118844 Jun 2019 WO
2019140240 Jul 2019 WO
Non-Patent Literature Citations (20)
Entry
Roberts, M. G., Oh, T., Pacheco, E. M. B., Mohankumar, R., Cootes, T. F., & Adams, J. E. (2011). Semi-automatic determination of detailed vertebral shape from lumbar radiographs using active appearance models. In Osteoporosis International (vol. 23, Issue 2, pp. 655-664). Springer Science and Business Media LLC. https://doi.org/10.1007/s00198-011-1604-3 (Year: 2011).
Damopoulos, D., Glocker, B., Zheng, G. (2018). Automatic Localization of the Lumbar Vertebral Landmarks in CT Images with Context Features. In: Glocker, B., Yao, J., Vrtovec, T., Frangi, A., Zheng, G. (eds) Computational Methods and Clinical Applications in Musculoskeletal Imaging. MSKI 2017. Lecture Notes in Computer Science, vol. 10734. Springer, Cham. https://doi.org/10.1007/978-3-319-74113-0_6 (Year: 2018).
Abe et al., "Scoliosis corrective force estimation from the implanted rod deformation using 3D FEM analysis", 2015, Scoliosis 10(Suppl 2):S2, 6 pages.
Aubin et al. “Preoperative Planning Simulator for Spinal Deformity Surgeries”, Spine 2008, 33(20):2143-2152.
Barton et al., Mar./Apr. 2016, Early experience and initial outcomes with patient-specific spine rods for adult spinal deformity, Orthopedics, 39(2):79-86.
Fiere et al., Jul. 2016, Preoperative planning and patient-specific rods for surgical treatment of thoracolumbar sagittal imbalance, chapter 40 in Surgery of the Spine and Spinal Cord: A Neurosurgical Approach, Van de Kelft ed., Springer International Publishing, Switzerland, pp. 645-662.
Foroozandeh et al., Summer 2012, 3D reconstruction using cubic Bezier spline curves and active contours (case study), Iranian Journal of Medical Physics, 9(3):169-176.
Galbusera et al., Feb. 2019, Artificial intelligence and machine learning in spine research, JOR Spine, 2:E1044, 20 pp.
Grove, 2011, Heterogeneous modeling of medical image data using B-spline functions, doctoral dissertation, Department of Computer Science and Engineering, University of South Florida, 212 pp.
Lazarus, Jun. 21, 2013, An introduction to splines, 29 pp.
Li et al., 2009, Modeling and measurement of 3D deformation of scoliotic spine using 2D x-ray images, Lecture Notes in Computer Science, 8 pp.
Lin, Sep. 17-21, 2003, The simplified spine modeling by 3-D Bezier curve based on the orthogonal spinal radiographic images, Proceedings of the 25th Annual International Conference of the IEEE EMBS, Cancun, Mexico, pp. 944-946.
Pasha et al., 2018, Data-driven classification of the 3D spinal curve in adolescent idiopathic scoliosis with an applications in surgical outcome prediction, Scientific Reports, 8:16296, 10 pp.
Poredos et al., 2015, Determination of the human spine curve based on laser triangulation, BMC Medical Imaging, 15(2):1-11.
Prautzsch et al., Mar. 26, 2001, Bezier and B-spline techniques, 58 pp.
Ratnakar et al., 2011, Predicting thoracic spinal postures in finite element model with Bezier technique, IRCOBI Conference 2011, IRC-11-57, 4 pp.
Reinshagen et al., "A novel minimally invasive technique for lumbar decompression, realignment, and navigated interbody fusion", J Clin Neurosci, 2015, 22(9):1484-1490, XP055503028.
Rickert et al., "Posterior lumbar interbody fusion implants", Der Orthopäde, Springer Verlag, Berlin, DE, vol. 44, no. 2, Jan. 28, 2015, pp. 162-169.
Solla et al., Mar. 2019, Patient-specific rods for surgical correction of sagittal imbalance in adults: technical aspects and preliminary results, Clin Spine Surg, 32(2), 7 pp.
Spontech Medical AG, Aug. 29, 2013, Vertaplan—die Software für Wirbelsäulenchirurgen [Vertaplan—the software for spine surgeons], retrieved from the Internet: https://www.youtube.com/watch?v=q0qhW1T1cp8, 1 page.
Related Publications (1)
Number Date Country
20210201483 A1 Jul 2021 US
Provisional Applications (1)
Number Date Country
62953677 Dec 2019 US