Methods of selecting surgical implants and related devices

Information

  • Patent Grant
  • Patent Number
    11,134,862
  • Date Filed
    Friday, November 10, 2017
  • Date Issued
    Tuesday, October 5, 2021
Abstract
Methods may be provided to identify a medical implant from a plurality of medical implants to be fixed to an anatomical surface. Dimensional parameters for each of the plurality of medical implants may be provided, and dimensional parameters corresponding to the anatomical surface may be provided. The dimensional parameters for each of the plurality of medical implants may be compared with the dimensional parameters corresponding to the anatomical surface, and one of the medical implants may be selected from the plurality of medical implants based on comparing the dimensional parameters for each of the plurality of medical implants with the dimensional parameters corresponding to the anatomical surface. An identification of the medical implant selected from the plurality of medical implants may be provided through a user interface. Related devices and computer program products are also discussed.
Description
TECHNICAL FIELD

The present disclosure relates to medical procedures and, more particularly, to medical implants and related methods, devices, and computer program products.


BACKGROUND

An orthopedic implant (such as a bone plate) may be used, for example, to support a damaged bone. The implant may be fabricated from stainless steel and/or titanium alloys, and a plurality of screw holes through the implant may allow fixation to the bone using bone screws. The surgeon may thus expose the damaged bone and screw the implant to the bone.


To facilitate variations in bone sizes and/or shapes, an implant for a particular bone may be manufactured in different sizes and/or shapes. The unique anatomy and injury pattern of each individual patient may thus require selection of a properly sized and contoured implant from a set of many available sizes and contours for the same type of implant. Positive treatment outcomes may correlate with well-fitting implants.


Accordingly, the surgeon may select from a number of implant sizes/shapes during surgery to fit the bone being repaired. The selection of a particular implant may involve the surgeon visually inspecting the exposed bone surface during surgery and selecting one or more of the implants based on the visual inspection. Selecting a best-fitting implant from among many implants may thus be a time-consuming and imprecise process for the surgeon, thereby increasing the time required to perform the surgery. Moreover, the surgeon may try to fit multiple implants to the bone before selecting the final implant, resulting in waste due to contamination of implants that are tried but not used.


Accordingly, there continues to exist demand for improved methods of selecting orthopedic implants.


SUMMARY

Some embodiments of the present disclosure are directed to methods to identify a medical implant from a plurality of medical implants to be fixed to an anatomical surface. Dimensional parameters for each of the plurality of medical implants may be provided, and dimensional parameters corresponding to the anatomical surface may be provided. The dimensional parameters for each of the plurality of medical implants may be compared with the dimensional parameters corresponding to the anatomical surface, and one of the medical implants may be selected from the plurality of medical implants based on comparing the dimensional parameters for each of the plurality of medical implants with the dimensional parameters corresponding to the anatomical surface. An identification of the medical implant selected from the plurality of medical implants may be provided through a user interface. Related devices and computer program products are also discussed.


Other systems, methods, and computer program products according to embodiments of the inventive subject matter will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional systems, methods, and computer program products be included within this description, be within the scope of the present inventive subject matter, and be protected by the accompanying claims. Moreover, it is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.





BRIEF DESCRIPTION OF THE DRAWINGS

Other features of embodiments will be more readily understood from the following detailed description of specific embodiments thereof when read in conjunction with the accompanying drawings, in which:



FIG. 1 is a block diagram illustrating a selection device according to some embodiments of inventive concepts;



FIG. 2A illustrates a template material and FIG. 2B illustrates a template on a bone according to some embodiments of inventive concepts;



FIGS. 3A and 3B illustrate orthogonal views of the template of FIG. 2B according to some embodiments of inventive concepts;



FIG. 4A illustrates a template and FIG. 4B illustrates a corresponding bone plate according to some embodiments of inventive concepts;



FIG. 5 illustrates an image cradle according to some embodiments of inventive concepts;



FIG. 6 is a diagram illustrating an image cradle using two camera positions according to some embodiments of inventive concepts;



FIGS. 7A and 7B illustrate orthogonal images of a template taken using the image cradle of FIG. 6 according to some embodiments of inventive concepts;



FIGS. 8A, 8B, and 8C illustrate an image cradle and alignment markings thereof according to some embodiments of inventive concepts;



FIG. 9 is a diagram illustrating an image cradle with a mirror according to some embodiments of inventive concepts;



FIGS. 10A and 10B illustrate lines generated using autodetection from different images of a template according to some embodiments of inventive concepts;



FIGS. 11A and 11B illustrate a fitting of splines using lines of FIGS. 10A and 10B according to some embodiments of inventive concepts;



FIG. 12 illustrates a template in an image cradle including alignment markers according to some embodiments of inventive concepts;



FIG. 13 illustrates different spacings of alignment markings from the image cradle of FIG. 12 according to some embodiments of inventive concepts;



FIGS. 14A and 14B illustrate orthogonal images of a template in an image cradle using alignment markers of FIGS. 12 and 13 according to some embodiments of inventive concepts;



FIGS. 15A, 15B, and 15C are screenshots illustrating renderings of a spline according to some embodiments of inventive concepts;



FIGS. 16A and 16B illustrate transformations of a template oriented on a z-axis according to some embodiments of inventive concepts; and



FIG. 17 is a flow chart illustrating operations of a selection device according to some embodiments of inventive concepts.





DETAILED DESCRIPTION

In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention. It is intended that all embodiments disclosed herein can be implemented separately or combined in any way and/or combination.



FIG. 1 is a block diagram illustrating elements of selection device 100 configured to provide assistance in the selection of a medical implant (e.g., an orthopedic implant, such as a bone plate) according to some embodiments of inventive concepts. As shown, selection device 100 may include user interface 101, processor 103, memory 105, camera 107, and/or wired/wireless interface 109, and processor 103 may be coupled with each of user interface 101, memory 105, camera 107, and/or wired/wireless interface 109. Selection device 100 of FIG. 1, for example, may be implemented using a smartphone, a tablet computer, a laptop computer, a desktop computer, a dedicated computing device, etc., configured to perform operations to select a medical implant according to embodiments herein. Selection device 100, for example, may be a smartphone, tablet computer, laptop computer, or desktop computer running an app/software configured to perform operations discussed herein. According to some other embodiments, selection device 100 may be provided in/as a head mounted device worn by the surgeon. According to still other embodiments, selection device 100 may be integrated with other operating room equipment.


As discussed herein, operations of the selection device 100 of FIG. 1 may be performed by processor 103, user interface 101, wired/wireless interface 109, and/or camera 107. For example, processor 103 may accept data regarding the implant surface through camera 107 and/or wired/wireless interface 109, select one of a plurality of implants, and provide an identification of the selected implant through user interface 101. Moreover, modules may be stored in memory 105, and these modules may provide instructions so that when instructions of a module are executed by processor 103, processor 103 performs respective operations (e.g., operations discussed below with respect to FIG. 17). According to other embodiments, processor circuit 103 may be defined to include memory so that a separate memory is not required.


According to some embodiments, camera 107 may be used to capture images, where the images are used by processor 103 to provide/generate dimensional parameters corresponding to an anatomical surface (e.g., a bone surface) to which the implant (e.g., an orthopedic implant such as a bone plate) is to be fixed. According to some other embodiments, images or other data may be captured outside selection device 100 and received by processor 103 through wired/wireless interface 109, or dimensional parameters corresponding to the anatomical surface may be generated outside selection device 100 and received by processor 103 through wired/wireless interface 109, such that camera 107 may be omitted. Wired/wireless interface 109, for example, may include a wired interface (e.g., a Universal Serial Bus or USB port), a short range wireless interface (e.g., a Bluetooth transceiver, a Wi-Fi transceiver, etc.), and/or a long range wireless interface (e.g., a cellular radio telephone transceiver).


As shown, user interface 101 may include one or more of a plurality of input/output devices. For example, keypad 101a, one or more buttons 101b, touch screen 101c, and/or microphone 101e may be provided to accept user input, and touch screen 101c and/or speaker 101d may provide user output (e.g., an identification of a selected medical implant). According to some other embodiments, a conventional display (non-touch screen) may be used to provide user output with keypad 101a and/or button(s) 101b being used to accept user input. Camera 107, for example, may be operated responsive to user input through keypad 101a, button(s) 101b, touch screen 101c, and/or microphone 101e.


According to some embodiments of inventive concepts, methods, devices, and/or computer program products may be provided to log the unique morphology of an intended implant site (e.g., bone surface) intraoperatively and to apply best-fit algorithms to assist selection of a most suitable implant. For example, a best-fitting anatomically contoured bone plate may be selected from a plurality of bone plates according to some embodiments.


As discussed below with respect to FIGS. 2-11, a surgeon may use selection device 100 with the following operations of templating, imaging, and image analysis to select a particular implant from a plurality of implants of varying sizes and shapes/contours. While selection of a clavicle plate is discussed by way of example in the following embodiments, embodiments of inventive concepts may be applied for other medical/orthopedic implants and/or bone plates.


The surgeon may first surgically expose the site of intended plate fixation (e.g., a surface of clavicle 201), and then the surgeon may shape a malleable template to fit a 3-dimensional contour of the intended implant site. As shown in FIGS. 2A and 2B, a desired length of malleable template material 203a may be broken off (by hand) to represent a desired length of the implant, and the surgeon may shape the resulting malleable template 203b to fit a 3-dimensional contour of the exposed implant site as shown in FIG. 2B. The resulting shaped template 203b is shown in the two (substantially orthogonal) views of FIGS. 3A and 3B.


As shown in FIGS. 3A and 3B, the template 203b may have lengthwise segments 205 with lengths equal to the spacing between implant screw holes of the implant. The template material 203a may preferentially break at notches 207 between these segments, and each segment 205 may be marked with lines 209 perpendicular with respect to a trajectory of the segment. These lines 209 contrast with the template itself to facilitate image recognition. Lines 209 (or other markings) may be provided on only one of the two primary faces of the template to ensure reading of a proper orientation of the template (and not an inverted orientation). According to some embodiments, lines 209 may be provided on a face away from the bone to reduce obstruction of lines 209 due to blood or other material resulting from contact with the exposed bone. According to some other embodiments, lines 209 may be provided on a face adjacent to the bone so that the marked face of the template more closely matches the contour of the bone.


As shown in FIG. 4A, one end of template 203b may have a larger segment 211 to indicate/represent a specific end of the implant. For example, the larger segment 211 of FIG. 4A may represent a metaphyseal region 411 of bone plate 401 of FIG. 4B. This larger segment 211 of template 203b may alternatively be broken off if not wanted. In addition, lines 209 may be provided on template 203b to correspond to screw holes 409 of bone plate 401.


After shaping template 203b based on the contour and length of the implant site as shown in FIG. 2B, template 203b may be placed in imaging cradle 501 with sides 503 and 505 oriented at 90 degrees to each other (orthogonal) as shown in FIG. 5. These perpendicular surfaces 503 and 505 may have horizontal and/or vertical reference lines or other markings to facilitate image analysis by providing information about scaling and orientation of template 203b. With its V-shaped trough, imaging cradle 501 may enable complementary imaging of template 203b at orthogonal angles, as shown in FIG. 6.


The orthogonal images of template 203b may be captured with camera 107 of selection device 100 of FIG. 1 from positions 100a and 100b as shown in FIG. 6. As discussed above, selection device 100 may be implemented using a smart phone, a tablet computer, or other similar device with on-board software. According to some other embodiments, the orthogonal images may be captured by a camera or other imaging device outside selection device 100, and the orthogonal images (or data relating thereto) may be provided through wired/wireless interface 109 of selection device 100 to processor 103. Center position and/or focal distance may be calibrated using alignment markers 511a-b, 515a-b, and 519a-b on image cradle 501. Capturing the projection images at the proper 90 degree angle may be assisted using gyroscopic and/or accelerometer feedback commonly available in smartphones used as selection device 100.


As shown in FIG. 6, selection device 100 including camera 107 may be held in positions 100a and 100b to take the respective images of FIGS. 7A and 7B. By aligning markers 511a and 515a and markers 511b and 515b in the image taken from position 100a and by aligning markers 519a and 515a and markers 519b and 515b in the image taken from position 100b, desired orientations of the orthogonal images may be provided. As discussed above, one camera 107 from selection device 100 (or one camera outside of selection device 100) may be used to take the orthogonal images of FIGS. 7A and 7B from positions 100a and 100b. According to some other embodiments, separate cameras may be mounted (permanently or detachably) in positions 100a and 100b to take the images of FIGS. 7A and 7B without requiring manual alignment. An image capture device, for example, may include imaging cradle 501 and two cameras that are mounted in positions 100a and 100b to capture the images of FIGS. 7A and 7B, and the resulting images (or data such as dimensional parameters relating thereto) may be provided to selection device 100 through wired/wireless interface 109.



FIG. 8A shows a view of imaging cradle 501 taken from camera position 100b with alignment markers 515a-b and 519a-b. FIG. 8B shows alignment of markers 515a and 519a, and FIG. 8C shows alignment of markers 515b and 519b as will occur when the camera is properly aligned at position 100b. Once the alignment of FIGS. 8B and 8C has been achieved, the image can be taken with assurance that the camera is properly positioned. If either of markers 515a and 519a or markers 515b and 519b is out of alignment after taking the image, processor 103 may reject the image and generate a request (provided through a screen/display and/or speaker of user interface 101) for the user to retake the image. According to some embodiments, processor 103 may use such visual misalignment of markers 515a, 519a, 515b, and/or 519b to adjust the image and/or data derived therefrom.


As shown in FIG. 9, a mirror 521 (or multiple mirrors) may be added to image cradle 501 to enable two (or more) projection angles to be captured by camera 107 of selection device 100 in a single image taken from one angle. Mirror 521 may be a non-reversing mirror. Position and distance may again be calibrated with markers on the cradle as discussed above with respect to FIGS. 8A-C. Use of mirror 521 may allow the orthogonal images of template 203b to be captured in one photo instead of two. Here, the one image may include both a direct image of template 203b corresponding to the image of FIG. 7B and a reflection 203b′ of template 203b corresponding to the image of FIG. 7A. As discussed above, camera 107 of selection device 100 may be used in FIG. 9, or a separate camera may be used with the image or data relating thereto being provided to processor 103 through wired/wireless interface 109. According to some other embodiments, a separate camera may be mounted with respect to cradle 501 to maintain a desired alignment.


In addition to a camera in a hand-held device, a head-mounted camera worn by the surgeon may be used to capture the shape of template 203b. A tracking system associated with such a head-mounted camera may provide a pose/orientation of the head-mounted camera relative to the imaging cradle 501 or holder for the implant and can provide/ensure that frames for analysis are captured at 90° or any desired angle for analysis.


In addition to or instead of a deformable template, the surgeon could place visual markers (such as reflective fiducial markers) at the anatomical site (e.g., on the exposed bone surface). Reflective fiducial markers could be tracked stereophotogrammetrically using tracking cameras, and their locations detected in 3D (3-dimensional) space. These 3D surface points could be analyzed similarly to template points and used by processor 103 to select the appropriate implant.


According to some other embodiments, surgical ink may be used wherein the surgical ink has a property/properties that cause it to selectively adhere to bone and not to surrounding soft tissue. Such ink could be detectable using visual tracking or could be radio-opaque, detectable using x-rays. Photos or radiographs of the bone with adhered ink could be processed by processor 103 to detect the bone surface contours. Selected points along the contours could be analyzed by processor 103 similarly to template points to select an appropriate implant.


Regardless of the method used to determine points on the surface of the bone where the implant is intended to rest, fitting operations discussed below may be used by processor 103.


Image analysis may then be performed by processor 103 using the images of FIGS. 7A and 7B, or using the combined image resulting from FIG. 9 (or using other images or related data). In embodiments using a smartphone to implement selection device 100, a software application on the device (e.g., an Android app, an iOS app, etc.) may use operations to auto-detect the lines 209 on each segment of template 203b in each projection (e.g., in the images of FIGS. 7A and 7B). By matching lines 209 between the projections, processor 103 may map lines 209 into 3D space. FIG. 10A illustrates an example of processor 103 autodetecting lines 209 from the image of template 203b in FIG. 7A, and FIG. 10B illustrates an example of processor 103 autodetecting lines 209 from the image of template 203b in FIG. 7B. Processor 103 may then fit a spline through the midpoint of each line 209 as shown in FIGS. 11A and 11B.
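
The following is a minimal Python sketch of this step under assumptions not stated in the patent: that lines 209 can be isolated from the template by simple grayscale thresholding and that OpenCV and SciPy are available. The function names and threshold value are illustrative only.

```python
import cv2
import numpy as np
from scipy.interpolate import splprep, splev

def detect_line_midpoints(image_path, thresh=60):
    """Detect the contrast lines (209) on the template in one projection
    and return the midpoint of each detected line, sorted along x."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Lines 209 are assumed darker than the template body; isolate them
    # by inverse binary thresholding.
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    midpoints = []
    for c in contours:
        if cv2.contourArea(c) < 10:      # ignore speckle noise
            continue
        m = cv2.moments(c)
        midpoints.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return sorted(midpoints)             # ordered along the template axis

def fit_spline(midpoints, n_samples=100):
    """Fit a smoothing spline through the 2D midpoints (FIGS. 11A/11B)."""
    x, y = zip(*midpoints)
    tck, _ = splprep([x, y], s=1.0)
    u = np.linspace(0.0, 1.0, n_samples)
    return np.column_stack(splev(u, tck))  # (n_samples, 2) sampled curve
```

Corresponding midpoints detected in the two orthogonal projections would then supply the two transverse coordinates needed to place each line 209 in 3D space.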


Using the app, processor 103 may then generate prompts for questions relevant to the surgical procedure, such as anatomical placement (e.g., superior vs anterior). Processor 103, for example, may provide the prompts/questions visually through a touch screen or other display of user interface 101 and/or audibly through a speaker of user interface 101. Using the app, processor 103 may follow a best-fit algorithm to match the spline that is associated with template 203b or other method of surface geometry detection with a library or lookup table of splines corresponding to sizes/curvatures of implants available for the procedure. Using the app, processor 103 may then provide a recommendation to the user (e.g., surgeon) regarding the implant with the best-fitting spline. The recommendation may include an identification of the selected implant (e.g., a part number), a quantitative closeness of fit (e.g., 0% to 100%), and/or next-best alternatives. Using the app, processor 103 may also suggest where to bend the implant for a better fit, such as between two specific screw holes. Processor 103 could provide (on a display such as touch screen 101c or an external monitor) a graphic of the selected plate overlaid on a graphic of the template with arrows indicating where and how much to bend the plate to achieve a better fit. The recommendation(s) may be provided visually through a touch screen or other display of user interface 101 and/or audibly through a speaker of user interface 101.


Another embodiment may use the bending template to define and apply all necessary bending to a straight plate. That is, the curvature defined by the template and read from the optical or other sensing algorithm would then be applied to a straight plate either manually or automatically.


If using the system to apply bending to a straight plate through a manual process, the processor 103 could provide (on a display such as touch screen 101c or an external monitor) a graphic of the desired curvature with arrows indicating locations and magnitudes of necessary bends. Processor 103 could also show an actual size “blueprint” of the plate in its final form that could be printed or shown actual size on a monitor. The system could also assist the surgeon or technician in determining whether starting from a straight plate is a better decision than starting from a pre-bent plate and further bending the plate or back-bending it. During manual bending, the surgeon or technician could periodically hold the plate up to the template to check whether the desired curvature was achieved. Such an on-screen template might be a better visual guide for the surgeon than the physical template that was laid on bone because it may have thickness, hole spacing and general appearance more similar to the actual plate than the template itself.


If using the system to apply bending to a straight plate through an automatic process, processor 103 could electronically feed information on the locations and magnitudes of bends through wired/wireless interface 109 to an automatic bending device. The bending device would activate a bending mechanism that could include computer-controlled rollers, clamps, and/or actuators that would apply the desired bending to the straight plate so that it best matches the template.


Use of selection device 100 and/or methods thereof may thus automate implant selection during surgery. Using such automation may reduce human error, for example, due to a surgeon overlooking and/or misjudging a best-fitting implant, and using such automation may provide quantitative evaluation to augment subjective human judgment.


Moreover, it may be difficult for a surgeon to visually assess relative fits of different implants sitting in a case, and it may be impractical to try all of them. For example, each implant that is tried but not used may be thereafter unusable due to contamination from contact with the implant site. Virtual fitting of implants according to methods/devices herein may spare unused implants from unnecessary contamination at the surgical site, thereby reducing waste. Moreover, assisted implant selection using methods/devices herein may also reduce the time of the procedure, thereby reducing time that the patient is under anesthesia, benefitting both the surgical team and the patient. By improving initial selection of the implant, bending of implants to fit the patient's anatomy may be reduced. Because excessive bending of an implant may weaken the implant, a reduction in bending may reduce a risk of implant failure.


Methods, devices, and computer program products discussed herein may thus provide a combination of speed of selection and initial accuracy of fit that is not attainable using manual selection. Such speed and accuracy can reduce the time required for surgery, reduce the time that a patient is subject to anesthesia, improve a fit of the implant, and improve the ultimate patient outcome. Moreover, by providing coordinates incrementally for both the template and for the available implants, an efficiency of the comparisons may be improved thereby improving an efficiency of operations of processor 103 to improve a computer-related technology.


According to some embodiments, clavicle plate (also referred to as an implant) selection software may be provided using a Windows-based interface or a smartphone/tablet app. Image processing may include detecting curvature of template 203b (also referred to as a plate surrogate) in 2 planes and extracting the 3D shape parameters for comparison to a database of implant shapes/sizes. VTK, Qt, OpenCV, and other open-source options may be used by processor 103 to detect colors and contours from photo images. Moreover, a Structured Query Language (SQL) database may be used to store a library of information regarding shapes, dimensions, etc., regarding the available implants. Such a database may be stored in memory 105 of selection device 100, or the database may be stored external to selection device 100 with processor 103 accessing information from the database through wired/wireless interface 109.
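
The patent names SQL only generically; the sketch below shows one plausible SQLite layout for the implant library, with all table and column names invented for illustration. Each implant stores x,y coordinates at fixed z increments, matching the layout of Table 1 later in this description.

```python
import sqlite3

conn = sqlite3.connect("implants.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS implant (
    implant_id   INTEGER PRIMARY KEY,
    part_number  TEXT NOT NULL,
    length_mm    REAL NOT NULL
);
CREATE TABLE IF NOT EXISTS implant_point (
    implant_id   INTEGER REFERENCES implant(implant_id),
    z_mm         REAL NOT NULL,   -- fixed increment along the plate axis
    x_mm         REAL NOT NULL,
    y_mm         REAL NOT NULL,
    PRIMARY KEY (implant_id, z_mm)
);
""")

def candidates_by_length(conn, length_mm, tol_mm=5.0):
    """Return implants whose stored length is within tol_mm of the target,
    pre-filtering the library before any point-by-point comparison."""
    rows = conn.execute(
        "SELECT implant_id, part_number FROM implant"
        " WHERE ABS(length_mm - ?) <= ?", (length_mm, tol_mm))
    return rows.fetchall()
```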


Upon receiving images of FIGS. 7A and 7B, for example, processor 103 may automatically process and reorient the images using alignment marks, because orientations of the images from different cameras (of different types) cannot be guaranteed and because there may be uncertainty due to variations in how the user holds the device while taking the photo(s). Alignment marks according to some embodiments are discussed above with respect to FIGS. 6, 8A-C, and 9. According to some other embodiments, alignment may be provided using marks illustrated in FIG. 12.


As shown in FIG. 12, alignment marks G1 and G2 and alignment marks Y3 and Y4 may be used for one image of template 203b, and alignment marks Y1 and Y2 and alignment marks R1 and R2 may be used for another image of template 203b. These different colored alignment marks may be used to specify left-right-up-down, and these markings may be autodetected by processor 103 using OpenCV color masking operations. The alignment marks of FIG. 12 may also be used to distinguish a series of shots from one another to reduce/prevent accidental loading of duplicate shots instead of a valid pair. To distinguish the shots, one side of image cradle 501 may display yellow and green alignment marks (half circles that form circular dots when properly aligned), and the other side of image cradle 501 may display yellow and red alignment marks (half circles that form circular dots when properly aligned). As further shown in FIG. 12, lines may extend from the intersection of the two sides 503 and 505, for example, to provide scale.
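
As a rough illustration of the color masking described above, the following sketch uses OpenCV's inRange on HSV images. The HSV ranges are assumptions that would need tuning for the actual cradle colors and lighting (red in particular wraps around the hue axis and may need a second range).

```python
import cv2
import numpy as np

# Assumed HSV ranges for the colored half-circle marks (illustrative).
HSV_RANGES = {
    "yellow": ((20, 100, 100), (35, 255, 255)),
    "green":  ((45, 100, 100), (75, 255, 255)),
    "red":    ((0, 100, 100), (10, 255, 255)),
}

def find_marks(bgr_image, color):
    """Mask one mark color and return the centroid of each detected blob."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lo, hi = HSV_RANGES[color]
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.minEnclosingCircle(c)[0] for c in contours
            if cv2.contourArea(c) > 50]

def which_shot(bgr_image):
    """Distinguish the two shots of a pair: one side of the cradle shows
    green+yellow marks, the other shows red+yellow marks."""
    return "green-yellow side" if find_marks(bgr_image, "green") \
        else "red-yellow side"
```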


As shown in the view of FIG. 13, half circles at the intersection of the sides (e.g., Y2 and R2) may be spaced apart by 178 mm, and half circles at the top of cradle 501 (e.g., G1 and Y3) may be spaced apart by 136 mm. Moreover, half circles at the top of cradle 501 (e.g., G1 and Y3) may have a smaller diameter than half circles at the intersection of sides (e.g., Y2 and R2) because the half circles at the intersection of sides (e.g., Y2 and R2) will be further from the camera when alignment is performed and the image is taken. Alignment using this arrangement of alignment half circles may facilitate/force the user to take photos from a distance of about 35 cm from template 203b. Having this focal distance as a known value while inter-dot distance is also known may allow scaling of pixels to millimeters. Additionally, the known spacing of vertical blue lines may be 1.0 cm to provide a secondary check of the scaling. This information can be used to determine a length of template 203b. FIGS. 14A and 14B provide the resulting orthogonal images of template 203b taken using image cradle 501 with the alignment markers of FIG. 12.
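
A small sketch of the pixel-to-millimeter scaling implied here, using the stated 136 mm and 178 mm mark spacings; the example pixel values are made up.

```python
import math

def mm_per_pixel(mark_a, mark_b, known_mm):
    """Scale factor from two detected mark centroids with known real-world
    spacing (e.g., 136 mm between G1 and Y3, 178 mm between Y2 and R2)."""
    return known_mm / math.dist(mark_a, mark_b)

# Made-up example: top marks detected 544 px apart -> 0.25 mm per pixel.
scale = mm_per_pixel((100.0, 80.0), (644.0, 80.0), known_mm=136.0)
# The 1.0 cm spacing of the vertical lines provides an independent check.
```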


After processor 103 has processed the images of FIGS. 14A and 14B to identify segment locations, processor 103 may join the segment locations together and fit the segment locations with a spline. Processor 103 may calculate and render splines, for example, using VTK. FIGS. 15A, 15B, and 15C are screenshots showing a spline of 10 points joined together with a cylinder. According to some embodiments, it may be possible to grab each white spherical handle with a left mouse click and drag to adjust its position. It may also be possible to change the perspective view by clicking and dragging on the screen anywhere other than handle locations. In the app, processor 103 may adjust handle positions to modify a fit factor and possibly also the selection of the best-fitting plate. Accordingly, it may be a useful feature to enable the user (surgeon) to proactively explore possibilities for plates/implants. For example, if the user makes the end of the plate a little more curvy, processor 103 may suggest a different plate.
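
A minimal VTK sketch (Python bindings) of rendering a spline through a set of handle points, roughly as in FIGS. 15A-15C; interactive handle dragging and the exact styling of the screenshots are omitted, and the tube and sphere radii are arbitrary display choices.

```python
import vtk

def render_spline(points_xyz):
    """Render a spline through 3D handle points as a tube, with a white
    sphere glyph at each handle point."""
    pts = vtk.vtkPoints()
    for p in points_xyz:
        pts.InsertNextPoint(*p)

    spline = vtk.vtkParametricSpline()
    spline.SetPoints(pts)
    src = vtk.vtkParametricFunctionSource()
    src.SetParametricFunction(spline)
    src.Update()

    tube = vtk.vtkTubeFilter()            # cylinder-like body of the spline
    tube.SetInputConnection(src.GetOutputPort())
    tube.SetRadius(1.5)

    tube_mapper = vtk.vtkPolyDataMapper()
    tube_mapper.SetInputConnection(tube.GetOutputPort())
    tube_actor = vtk.vtkActor()
    tube_actor.SetMapper(tube_mapper)

    # Static sphere per handle point (dragging handles is omitted here).
    poly = vtk.vtkPolyData()
    poly.SetPoints(pts)
    sphere = vtk.vtkSphereSource()
    sphere.SetRadius(3.0)
    glyphs = vtk.vtkGlyph3D()
    glyphs.SetInputData(poly)
    glyphs.SetSourceConnection(sphere.GetOutputPort())
    glyph_mapper = vtk.vtkPolyDataMapper()
    glyph_mapper.SetInputConnection(glyphs.GetOutputPort())
    glyph_actor = vtk.vtkActor()
    glyph_actor.SetMapper(glyph_mapper)

    ren = vtk.vtkRenderer()
    ren.AddActor(tube_actor)
    ren.AddActor(glyph_actor)
    win = vtk.vtkRenderWindow()
    win.AddRenderer(ren)
    iren = vtk.vtkRenderWindowInteractor()
    iren.SetRenderWindow(win)             # default style: drag to rotate view
    win.Render()
    iren.Start()
```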


According to additional embodiments, processor 103 may render images on a display of user interface 101 as flat strips joined by spheres instead of cylinders to represent actual plates. Moreover, processor 103 may only allow the user to move handles laterally on the display while keeping the longitudinal spacing between handles constant to facilitate easier comparison of the spline with a database of implant splines. Processor 103 may also use a data structure for passing spline points to code that compares the spline points to stored splines in a database.


Moreover, different layouts may be provided for the database of implants. One configuration may orient the found plate shape from template 203b such that the starting end is at (xs,ys,zs)=(0,0,0) and the opposite end is at (xf,yf,zf)=(0,0,zf) as shown in FIGS. 16A and 16B. That is, processor 103 may spatially transform the shape to orient it along the z axis. Then, processor 103 may rotate the shape to place it with the flat surface of the plate parallel to the xz plane and perpendicular to the yz plane, with dorsal direction toward +y and ventral direction toward −y.
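
A numpy sketch of the transformation just described, assuming the shape is given as an ordered array of 3D points. The final roll about z that levels the plate's flat surface against the xz plane depends on how the dorsal direction is detected, so only the axis alignment is shown.

```python
import numpy as np

def orient_along_z(points):
    """Translate the first point to (0,0,0) and rotate so the last point
    lies on the +z axis, as in FIGS. 16A and 16B."""
    pts = np.asarray(points, dtype=float)
    pts = pts - pts[0]                       # start point at the origin
    v = pts[-1] / np.linalg.norm(pts[-1])    # current end direction
    z = np.array([0.0, 0.0, 1.0])
    axis = np.cross(v, z)
    s, c = np.linalg.norm(axis), float(np.dot(v, z))
    if s < 1e-12:                            # already parallel or opposite
        return pts if c > 0 else pts @ np.diag([1.0, -1.0, -1.0])
    k = axis / s                             # unit rotation axis
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    theta = np.arctan2(s, c)
    # Rodrigues' rotation formula: R = I + sin(t)K + (1 - cos(t))K^2.
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    return pts @ R.T                         # rotate every point
```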


Processor 103 may then evaluate x,y values of the template at fixed increments of z and compare these x,y values against x,y values stored in the database for the different available implants. The z increment should be fine enough to capture the curvature of the plate without having to store an excessive number of values. For example, 5 mm increments may be used. Since plates may have different lengths (in addition to different curves/contours), the number of entries evaluated and compared may depend on the desired plate length. Table 1 below shows an example for a 50 mm plate. In this example, processor 103 may query the database and measure the average vector distance from the measured 9 incremental points (i.e., points 2-10) relative to each of the corresponding points stored in database entries for plates with matching length (or close to the same length). The smallest mean would indicate the best matching plate. This mean value would also provide a gauge of goodness of fit.


TABLE 1

Measured data for template/implants.

Point      X      Y      Z
  1        0      0      0
  2       X2     Y2      5
  3       X3     Y3     10
  4       X4     Y4     15
  5       X5     Y5     20
  6       X6     Y6     25
  7       X7     Y7     30
  8       X8     Y8     35
  9       X9     Y9     40
 10      X10    Y10     45
 11        0      0     50
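
A short sketch of the comparison just described, assuming the template and each candidate implant are already reduced to the 11-point layout of Table 1 and pre-filtered by length; the library structure is illustrative.

```python
import numpy as np

def best_fit_implant(template_pts, implant_library):
    """Pick the implant with the smallest mean vector distance over the
    interior points (points 2-10 of Table 1, at 5 mm z increments).

    template_pts: (11, 3) array ordered as in Table 1.
    implant_library: dict of part number -> (11, 3) array, same layout.
    """
    interior = np.asarray(template_pts, dtype=float)[1:-1]  # points 2-10
    scores = {}
    for part_number, pts in implant_library.items():
        d = np.linalg.norm(np.asarray(pts, dtype=float)[1:-1] - interior,
                           axis=1)
        scores[part_number] = d.mean()       # mean vector distance
    best = min(scores, key=scores.get)
    return best, scores[best]                # smallest mean = best fit
```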

Operations of selection device 100 to identify a medical implant (e.g., a bone plate) from a plurality of medical implants will now be discussed with reference to the flow chart of FIG. 17.


At block 1701, processor 103 may provide dimensional parameters for each of the plurality of medical implants. Providing the dimensional parameters for each of the plurality of medical implants may include providing access to a stored database of the dimensional parameters that define a shape/dimensions for each of the respective medical implants of the plurality of medical implants. The stored database may be provided in memory 105 of selection device 100, or the stored database may be provided outside of selection device 100 with processor 103 accessing the stored database through wired/wireless interface 109. The stored database, for example, may include a table of points for each available implant as discussed above with respect to Table 1, such that dimensional parameters for each of the plurality of medical implants include coordinate values (e.g., x and y coordinate values) corresponding to increments along each of the medical implants (e.g., along the z-axis).


At block 1703, processor 103 may provide dimensional parameters corresponding to an anatomical surface (e.g., a surface of a bone) to which the implant is to be fixed. The dimensional parameters corresponding to the anatomical surface, for example, may include a table of points as discussed above with respect to Table 1, such that dimensional parameters corresponding to the anatomical surface include coordinate values (e.g., x and y coordinate values) corresponding to increments along the anatomical surface (e.g., along the z-axis). The dimensional parameters corresponding to the anatomical surface may be provided based on digital image data including first and second digital images that are different. Moreover, the digital image data may be taken from a template representing the anatomical surface, or may be taken from the anatomical surface directly. Selection device 100, for example, may include camera 107 that captures digital images to be processed by processor 103 to generate the dimensional parameters. According to some other embodiments, images may be captured outside of selection device 100, received by processor 103 through wired/wireless interface 109, and processed by processor 103 to generate the dimensional parameters. According to still other embodiments, images may be captured outside selection device 100, the images may be processed outside of selection device 100 to generate the dimensional parameters, and the dimensional parameters may be received by processor 103 through wired/wireless interface 109.


At block 1705, processor 103 may compare the dimensional parameters for each of the plurality of medical implants with the dimensional parameters corresponding to the anatomical surface. For example, processor 103 may compare coordinate values (e.g., x-y coordinate values taken at increments along a z-axis) corresponding to the plurality of medical implants with coordinate values (e.g., x-y coordinate values taken at increments along a z-axis) corresponding to the anatomical surface. For such a comparison, processor 103 may determine differences between the coordinate values corresponding to the anatomical surface and respective ones of the coordinate values corresponding to each of the plurality of medical implants.


At block 1707, processor 103 may select one of the medical implants from the plurality of medical implants based on comparing the dimensional parameters for each of the plurality of medical implants with the dimensional parameters corresponding to the anatomical surface. Processor 103, for example, may select the one of the medical implants having a least average difference between the coordinate values corresponding to the anatomical surface and the coordinate values corresponding to the one of the medical implants that is selected.


At block 1709, processor 103 may provide an identification of the medical implant selected from the plurality of medical implants through user interface 101. The identification of the selected medical implant, for example, may be provided visually through a display (e.g., touch screen 101c) of user interface 101 and/or audibly through speaker 101d of user interface 101. The identification may include a name, a part number, a size, etc. In addition, processor 103 may provide additional information (visually or audibly), such as a recommended location to bend the selected implant.


As discussed above according to some embodiments, the dimensional parameters corresponding to the anatomical surface may be provided at block 1703 based on digital image data including first and second digital images that are different, with the first and second digital images being taken from template 203b representing the anatomical surface as shown, for example, in FIGS. 14A and 14B.


The first and second digital images, for example, may be taken of template 203b in image cradle 501 that includes first alignment markers (Y1, Y2, R1, and R2) for the first digital image and second alignment markers (Y3, Y4, G1, and G2) for the second digital image as discussed above, for example, with respect to FIGS. 12, 13, 14A, and 14B. Based on the images of FIGS. 14A and 14B, processor 103 may provide the dimensional parameters corresponding to the anatomical surface at block 1703 responsive to verifying alignment of the first digital image of FIG. 14A based on the first alignment markers (Y1, Y2, R1, and R2) and responsive to verifying alignment of the second digital image of FIG. 14B based on the second alignment markers (Y3, Y4, G1, and G2).


As discussed above, the first and second digital images of FIGS. 14A and 14B may be taken of template 203b in cradle 501 that includes first alignment markers (Y1, Y2, R1, and R2) for the first digital image and second alignment markers (Y3, Y4, G1, and G2) for the second digital image. At block 1703, processor 103 may use alignment markers for a digital image to determine alignment/misalignment of the image and either accept an image that is aligned, or reject an image that is misaligned and request that the user (surgeon) retake the rejected image. Processor 103, for example, may provide the dimensional parameters corresponding to the anatomical surface at block 1703 using the following operations. Responsive to first user input, a first version of the first digital image of FIG. 14A may be captured through camera 107 including template 203b in cradle 501 with first alignment markers (Y1, Y2, R1, and R2). Responsive to detecting misalignment of the first alignment markers in the first version of the first digital image of FIG. 14A, processor 103 may provide an instruction to retake the first digital image of FIG. 14A through the user interface 101 (e.g., a visual instruction through a display and/or an audible instruction through a speaker). Responsive to second user input after providing the instruction, a second version of the first digital image of FIG. 14A may be captured through camera 107 including template 203b in cradle 501 with the first alignment markers. Responsive to third user input, the second digital image of FIG. 14B may be captured through camera 107 including template 203b in cradle 501 with the second alignment markers. Responsive to the second version of the first digital image of FIG. 14A being aligned based on alignment markers (Y1, Y2, R1, and R2) and the second digital image of FIG. 14B being aligned based on alignment markers (Y3, Y4, G1, and G2), processor 103 may provide the dimensional parameters corresponding to the anatomical surface based on the second version of the first digital image and the second digital image. Either image may be rejected any number of times until each image is captured with proper alignment.
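
Sketched below is the capture-verify-retake loop just described, with the camera and user-interface calls as hypothetical placeholders; verify_alignment() stands in for the alignment-marker check discussed with respect to FIGS. 8A-8C and 12.

```python
def capture_aligned_image(camera, verify_alignment, prompt_user):
    """Loop until an image passes the alignment-marker check; reject the
    image and request a retake otherwise."""
    while True:
        image = camera.capture()         # captured on user input
        if verify_alignment(image):      # e.g., half circles form full dots
            return image
        prompt_user("Alignment markers misaligned; please retake the image.")
```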


As discussed above, the first and second digital images of FIGS. 14A and 14B may be taken of template 203b in cradle 501 that includes first alignment markers (Y1, Y2, R1, and R2) for the first digital image and second alignment markers (Y3, Y4, G1, and G2) for the second digital image. Processor 103 may provide the dimensional parameters corresponding to the anatomical surface based on at least one of the first alignment markers and/or the second alignment markers. Processor 103, for example, may use an alignment/misalignment of the alignment markers to determine a camera distance from cradle/template 501/203b, a camera angle relative to cradle/template 501/203b, and/or other information that may be used to determine dimensional parameters corresponding to the anatomical surface.


According to some other embodiments, processor 103 may provide the dimensional parameters corresponding to the anatomical surface based on digital image data including first and second digital images that are different, wherein the first and second digital images are taken from the anatomical surface (directly). Such images may be taken either before or during the operation, and the digital image data may include at least one of x-ray image data, computed tomography image data, ultrasound image data, magnetic resonance image data, and/or photographic image data.


According to some embodiments, providing dimensional parameters corresponding to the anatomical surface at block 1703 may include providing a curve (e.g., a spline) to represent a shape of the anatomical surface, and selecting at block 1707 may include selecting the one of the medical implants to match the shape of the anatomical surface based on the curve.


Further Definitions and Embodiments are discussed below.


In the above description of various embodiments of the present disclosure, aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware that may all generally be referred to herein as a "circuit," "module," "component," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product comprising one or more computer readable media having computer readable program code embodied thereon.


Any combination of one or more computer readable media may be used. The computer readable media may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an appropriate optical fiber with a repeater, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.


A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.


Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS).


Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


These computer program instructions may also be stored in a computer readable medium that when executed can direct a computer processor, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which when executed, cause a computer processor to implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer processor, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatuses or other devices to produce a computer implemented process such that the instructions which execute on the computer processor or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.


When an element is referred to as being “connected”, “coupled”, “responsive”, or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected”, “directly coupled”, “directly responsive”, or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, “coupled”, “connected”, “responsive”, or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term “and/or” includes any and all combinations of one or more of the associated listed items.


It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.


The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various aspects of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.


The terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Like reference numbers signify like elements throughout the description of the figures.


The corresponding structures, materials, acts, and equivalents of any means or step plus function elements in the claims below are intended to include any disclosed structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. The aspects of the disclosure herein were chosen and described in order to best explain principles of the disclosure and practical applications, and to enable others of ordinary skill in the art to understand the disclosure with various modifications as are suited to the particular use contemplated.


Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts are to be determined by the broadest permissible interpretation of the present disclosure including the following claims and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims
  • 1. A method of a selection device for identifying a bone plate implant from a plurality of bone plate implants to be fixed to an anatomical surface, the method comprising: providing a three dimensional contour corresponding to the anatomical surface based on digital image data taken from a template representing the anatomical surface of a particular patient; wherein the template includes markings to facilitate image recognition, the markings provided on only one of two primary faces of the template to ensure proper orientation of the template, wherein the three dimensional contour corresponding to the anatomical surface is prepared by templating, imaging, and image analysis; receiving, by a processor of the selection device, the provided three dimensional contour; comparing, by the processor of the selection device, the received three dimensional contour of the anatomical surface of the particular patient against stored three dimensional contours of a plurality of bone plate implants for optimal fit, the stored three dimensional contour including a shape of the bone plate implant; selecting, by the processor of the selection device, one of the bone plate implants from the plurality of bone plate implants based on the comparison of the optimal fit; and providing, by the processor of the selection device, an identification of the selected bone plate implant through a user interface.
  • 2. The method of claim 1, further comprising providing access to a database storing the three dimensional contours of the plurality of bone plate implants that define the shape for each of the respective implants of the plurality of bone plate implants.
  • 3. The method of claim 2, wherein providing access comprises providing a curve to represent a shape of the anatomical surface, and wherein selecting comprises selecting the one of the implants to match the shape of the anatomical surface based on the curve.
  • 4. The method of claim 1, wherein the three dimensional contour for each of the plurality of implants includes coordinate values corresponding to increments along each of the implants, the method further comprising providing coordinate values corresponding to increments along the anatomical surface, and wherein comparing comprises comparing the coordinate values corresponding to the plurality of implants with the coordinate values corresponding to the anatomical surface.
  • 5. The method of claim 4, wherein comparing comprises determining differences between the coordinate values corresponding to the anatomical surface and respective ones of the coordinate values corresponding to each of the plurality of implants, and wherein selecting comprises selecting the one of the implants having a least average difference between the coordinate values corresponding to the anatomical surface and the coordinate values corresponding to the one of the implants that is selected.
  • 6. The method of claim 1, wherein the digital image data comprises first and second digital images taken from the template positioned in a cradle that includes first alignment markers for the first digital image and second alignment markers for the second digital image, and wherein providing the three dimensional contour corresponding to the anatomical surface comprises providing the three dimensional contour based on the first and second digital images responsive to verifying alignment of the first digital image based on the first alignment markers and responsive to verifying alignment of the second digital image based on the second alignment markers.
  • 7. The method of claim 1, wherein the digital image data comprises first and second digital images taken from the template positioned in a cradle including first alignment markers for the first digital image and second alignment markers for the second digital image, and wherein providing the three dimensional contour corresponding to the anatomical surface comprises: responsive to first user input, capturing a first version of the first digital image including the template in the cradle with the first alignment markers; responsive to misalignment of the first alignment markers in the first version of the first digital image, providing an instruction to retake the first digital image through the user interface; responsive to second user input after providing the instruction, capturing a second version of the first digital image including the template in the cradle with the first alignment markers; responsive to third user input, capturing the second digital image including the template in the cradle with the second alignment markers; and providing the three dimensional contour corresponding to the anatomical surface based on the second version of the first digital image and the second digital image.
  • 8. The method of claim 1, wherein the digital image data comprises first and second digital images taken of the template in a cradle that includes first alignment markers for the first digital image and second alignment markers for the second digital image, and wherein providing the three dimensional contour corresponding to the anatomical surface comprises providing the three dimensional contour based on at least one of the first alignment markers and/or the second alignment markers.
  • 9. The method of claim 1, wherein the digital image data is taken from the anatomical surface.
  • 10. The method of claim 9, wherein the digital image data comprises at least one of x-ray image data, computed tomography image data, ultrasound image data, magnetic resonance image data, and/or photographic image data.
  • 11. The method of claim 1, wherein the anatomical surface is a surface of a bone.
  • 12. A computer program product, comprising: a tangible computer readable storage medium comprising computer readable program code embodied in the medium that when executed by a processor causes the processor to perform operations comprising: providing a three dimensional contour corresponding to an anatomical surface of a particular patient based on digital image data taken from a template representing the anatomical surface; wherein the template includes markings to facilitate image recognition, the markings provided on only one of two primary faces of the template to ensure proper orientation of the template; wherein the three dimensional contour corresponding to the anatomical surface is prepared by templating, imaging, and image analysis; receiving, by the processor, the provided three dimensional contour; comparing the received three dimensional contour of the anatomical surface of the particular patient against stored three dimensional contours of a plurality of bone plate implants for optimal fit, each stored three dimensional contour including a shape of the respective bone plate implant; selecting one of the bone plate implants from the plurality of bone plate implants based on the comparison for optimal fit; and providing an identification of the selected bone plate implant through a user interface.
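
For readers tracing the claimed operations into software, the following sketches are illustrative only and form no part of the claims. First, a minimal sketch of the contour comparison and selection recited in claims 1 and 12, assuming each stored implant contour and the received anatomical contour are sampled as point arrays of equal length; the Implant class, the select_implant function, and the mean point-to-point distance used as the fit metric are hypothetical choices, not requirements of the claims:

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Implant:
        identifier: str
        contour: np.ndarray  # (N, 3) points sampled along the implant

    def select_implant(anatomical_contour: np.ndarray,
                       implants: list[Implant]) -> Implant:
        # Compare the received anatomical contour against each stored
        # implant contour and return the implant with the smallest misfit.
        def misfit(implant: Implant) -> float:
            # Hypothetical fit metric: mean distance between corresponding
            # sample points, assuming both contours use the same N increments.
            return float(np.mean(np.linalg.norm(
                implant.contour - anatomical_contour, axis=1)))
        return min(implants, key=misfit)

An identification of the returned implant (for example, its identifier attribute) would then be presented through the user interface, per the final step of claims 1 and 12.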
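Claims 4 and 5 narrow the comparison to coordinate values at increments along the implants and the anatomical surface, selecting the implant with the least average difference. A sketch of that selection rule, again assuming equal sampling increments; the least_average_difference function and its signature are hypothetical:

    import numpy as np

    def least_average_difference(anatomy_coords: np.ndarray,
                                 implant_coords: dict[str, np.ndarray]) -> str:
        # anatomy_coords holds coordinate values at N increments along the
        # anatomical surface; implant_coords maps an implant identifier to
        # coordinate values at the same N increments along that implant.
        best_id, best_avg = None, float("inf")
        for implant_id, coords in implant_coords.items():
            avg = float(np.mean(np.abs(coords - anatomy_coords)))
            if avg < best_avg:
                best_id, best_avg = implant_id, avg
        return best_id

Averaging absolute differences is one plausible reading of "least average difference"; a least-squares variant would simply square the differences before averaging.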
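Claims 6 through 8 add a cradle with alignment markers and, in claim 7, a capture-verify-retake loop driven by user input. A sketch of such a loop, with the camera, marker verification, and user-interface notification injected as callables; all names and the retry limit are hypothetical:

    from typing import Callable

    def capture_verified_image(capture: Callable[[], object],
                               is_aligned: Callable[[object], bool],
                               notify: Callable[[str], None],
                               max_attempts: int = 3) -> object:
        # In the claimed method each capture is triggered by user input;
        # here the caller supplies the capture, verification, and
        # notification operations.
        for _ in range(max_attempts):
            image = capture()
            if is_aligned(image):
                return image
            notify("Alignment markers misaligned; please retake the image.")
        raise RuntimeError("no aligned image of the template was captured")

This loop would run once per required view (the first digital image against the first alignment markers, then the second digital image against the second alignment markers), with the verified images passed to the contour-preparation step.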
Related Publications (1)
Number Date Country
20190142304 A1 May 2019 US