The present disclosure relates to a method and device for ordering a custom orthopedic device including providing security provisions, receiving measurement information, capturing an image of the appropriate portion of the limb, and submitting ordering requests.
To provide customized support for a joint, a clinician may provide a patient with a custom fitted orthopedic device adapted to the specific anatomical dimensions of the individual patient. A common orthopedic device for customization is a knee brace. A patient will typically obtain a customized brace through a clinician having the expertise to assure that the orthopedic device fits the patient properly.
A clinician can prepare the brace himself, or order a custom orthopedic device remotely by submitting an order over mail, phone, fax, or the Internet. During the ordering process, the clinician typically provides the manufacturer or seller (“provider”) with an image of a portion of the limb including the joint and measurements of the limb around the joint. The custom orthopedic brace is produced based on the submitted image of the limb and the measurements. The provider may require that the image of the limb be captured at a certain orientation, angle, height, and distance relative to the limb to ensure that the captured image accurately portrays the dimensions and proportions of the limb. Appropriate tags, reference indicia, and reference markings of anatomy are often placed on the limb to identify the patient, the limb, and any other necessary information in case the photo becomes separated from an order form.
It is undesirable for the picture to be taken when the camera is held at an angle relative to the limb (an angle normal to the line of progression of the limb) or when the limb is not aligned with the center of the image, since such an image inaccurately portrays the dimensions and proportions of the limb. Producing a custom orthopedic brace based on such a misaligned image results in a poorly fitting brace. Likewise, poor resolution or a lack of indicia applied on the limb may impede the provider in fully understanding the contours of the patient's limb.
Using a conventional camera, the photographer must estimate or otherwise determine the specified distance between the camera and the limb, the specified portion of the limb to capture in the image, and the appropriate orientation of the camera relative to the limb. Since a conventional camera does not provide feedback about the angle or orientation at which it is held, it is difficult for the photographer to determine, without additional aids, whether the image of the limb being captured meets the requirements of the manufacturer or seller.
The patient may need to use different devices to complete the entire ordering process. If the image of the limb is captured with a conventional digital camera, the image must then be transferred to a computer before the order and image can be uploaded over the Internet to the server of the manufacturer or seller.
While providing a photo is useful in understanding the patient's anatomy, dimensional measurements are likewise required. Various forms must be completed by the clinician to record measurement data and the patient's personal information. Other forms require the clinician to indicate brace models, features, accessories, colors, etc. The combined requirement for forms and a photo makes the ordering process complicated and risks a mismatch of documents for the order.
The features of the disclosure address the need to reduce misalignment between the image and the limb, and to simplify both capturing an image of the limb and ordering a custom orthopedic device without multiple forms.
According to a method for ordering a custom orthopedic device for a joint, the method includes aligning a viewfinder image displayed on a screen and generated by an image sensor of a portable device with at least one predetermined portion of a limb including a joint. The method involves capturing and storing at least one image of the portion of the limb using the image sensor of the portable device based on at least one guideline. The at least one captured image is associated with measurements of the limb, and patient information entered into the portable device. The order is transmitted to the provider and contains the at least one captured image, the measurements of the limb, and the patient information from the portable device. The at least one guideline may be a depth of field guideline, a horizontal orientation guideline, a vertical orientation guideline, a tilt guideline, or a limb alignment guideline.
The at least one image of the limb may satisfy the depth of field guideline, the horizontal or the vertical orientation guideline, the tilt angle guideline, and the limb alignment guideline. The limb alignment guideline is a depth of field guideline overlaid on the viewfinder image. The depth of field guideline may be a reference frame for a first distance above a joint, a second distance below a joint, and a centering of the limb and joint in the captured image.
According to a variation, the first and second distances are the same and referenced from a knee axis line. The distances above and below the joint may be aligned with the depth of field guideline in the viewfinder image before capturing the image.
Once the orientation of the portable device relative to the limb satisfies the horizontal angle guideline or vertical angle guideline and the tilt angle guideline, the portable device enables image capture. The method may also include calibrating the image sensor of the portable device.
The method may include executing an ordering application and determining whether the ordering application has been previously executed. The image sensor may be calibrated upon a determination that the ordering application has not been previously executed. Capture of the image of the limb is enabled upon a determination that the ordering application has been previously executed.
The method may include reviewing the captured image of the limb and selecting a custom orthopedic device configuration. The step of reviewing the captured image of the limb includes viewing the captured image with an overlaid depth of field guideline to confirm the captured portion of the limb satisfies the overlaid depth of field guideline. The method may also include entering basic patient information into the portable device including measurements of the limb at various locations on the limb. The captured image may be overlaid with the basic patient information. The overlaid captured image may be stored in the portable device.
The method can involve configuring the custom orthopedic device, reviewing the order, and storing the order in a memory of the portable device. At least one previous order may be stored in the memory of the portable device. The order may be transmitted as an e-mail containing the patient information and the saved, captured image of the limb.
In an embodiment of the device, the device includes an image sensor configured to capture an image, a display, a gyroscope and/or accelerometer, a communication interface, a processor, and a memory. The processor is configured to enable capturing an image of a portion of a limb including a joint using the image sensor based on at least one guideline, and the image of the limb satisfies the at least one guideline. The gyroscope and/or accelerometer is configured to provide orientation data to the processor. The communication interface is configured to transmit an order containing the captured image and patient information from the apparatus over a network to a provider. The at least one guideline is at least one of a depth of field guideline, a horizontal orientation guideline, a vertical orientation guideline, a tilt guideline, or a limb alignment guideline.
The method may include a login page requiring clinician and patient input. Upon entry of the information on the login page, the user is directed to an order configuration home screen or page. From the home screen, the user may select among many pages for making the customized order. The user may first select image capture and input measurements, followed by entering patient information, orthopedic device (brace) configuration, and any other order information. The user is not limited to a particular sequence of pages; once all data fields are entered and an appropriate image is captured, the order is sent to the provider.
The limb alignment guideline is a depth of field guideline overlaid on a viewfinder image. The depth of field guideline is a reference frame for a first distance above a joint, a second distance below a joint, and a centering of the limb and joint in the captured image. The processor is configured to provide an indication of the orientation of the device on the display. The processor is configured to calibrate the image sensor by setting the image sensor to a first resolution and a first zoom level. At least one image of anatomical landmarks or markings on the limb is captured. A three-dimensional model of the limb is generated from the markings along with circumferential measurements.
The device and method for ordering a custom orthopedic device is described with reference to the accompanying drawings which show preferred embodiments according to the device described herein. It will be noted that the device as disclosed in the accompanying drawings is illustrated by way of example only. The various elements and combinations of elements described below and illustrated in the drawings can be arranged and organized differently to result in embodiments which are still within the spirit and scope of the device described herein.
A better understanding of different embodiments of the disclosure may be had from the following description read with the accompanying drawings in which like reference characters refer to like elements.
While the disclosure is susceptible to various modifications and alternative constructions, certain illustrative embodiments are in the drawings and are described below. It should be understood, however, there is no intention to limit the disclosure to the specific embodiments disclosed, but on the contrary, the intention covers all modifications, alternative constructions, combinations, and equivalents falling within the spirit and scope of the disclosure.
It will be understood that unless a term is expressly defined in this disclosure to possess a described meaning, there is no intent to limit the meaning of such term, either expressly or indirectly, beyond its plain or ordinary meaning.
Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. § 112, paragraph 6.
The embodiments of the method 2 and device 4 disclosed enable the user to easily capture accurate images of the limb and complete the ordering process on a single device. The method 2 may be implemented in an application executed on a portable device. The application guides the user through the ordering and image capturing process. Such a device and process reduces misalignment issues while integrating picture capturing and ordering into a single device.
The device used to capture the image and order the custom orthopedic brace may be any device having a display and an image sensor such as a mobile phone (iPhone®, Android® phone, Blackberry®, Windows® phone, etc.), a tablet (iPad®, Android® tablet, Windows® tablet etc.), a personal digital assistant (PDA), a computer, or any other portable device.
The preferred device 4 has an image sensor, a display, a processor, a gyroscope and/or accelerometer, memory, and a communication interface to allow communication of the order directly from the device over a network to a server of the manufacturer or seller.
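By way of illustration only, the hardware components relied on by the ordering application could be abstracted as in the following Swift sketch; the protocol and member names are hypothetical and not part of the disclosure.

```swift
import UIKit
import CoreMotion
import AVFoundation

// Hypothetical abstraction of the components of device 4; names are illustrative only.
protocol OrderingDevice {
    var camera: AVCaptureDevice? { get }        // image sensor used to capture the limb
    var motionManager: CMMotionManager { get }  // gyroscope and/or accelerometer for orientation data
    var display: UIScreen { get }               // display used as the viewfinder
    func store(order: Data, forKey key: String) // memory for saved orders
    func transmit(order: Data, to provider: URL) throws // communication interface over a network
}
```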
At step 200, the application determines whether the current session is the first use of the device 4 to order the custom orthopedic device. In one embodiment, if the current session is the first use, the application calibrates a camera of the device at 300. During calibration 300, the application automatically sets the camera to specific settings such as a specific resolution, zoom, and color setting.
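A minimal sketch of the first-use check at step 200 and the automatic calibration at step 300 follows, assuming an iOS implementation in Swift using UserDefaults and AVFoundation; the particular resolution, zoom, and color settings are placeholders, as the disclosure does not fix their values.

```swift
import AVFoundation
import Foundation

// Hypothetical first-use check (step 200) and automatic camera calibration (step 300).
func calibrateIfFirstUse(session: AVCaptureSession, camera: AVCaptureDevice) {
    let defaults = UserDefaults.standard
    // Skip calibration if the ordering application has been executed before.
    guard !defaults.bool(forKey: "orderingAppHasRun") else { return }

    session.beginConfiguration()
    session.sessionPreset = .photo                 // placeholder "specific resolution" setting
    session.commitConfiguration()

    do {
        try camera.lockForConfiguration()
        camera.videoZoomFactor = 1.0               // placeholder zoom setting (no digital zoom)
        if camera.isWhiteBalanceModeSupported(.continuousAutoWhiteBalance) {
            camera.whiteBalanceMode = .continuousAutoWhiteBalance   // placeholder color setting
        }
        camera.unlockForConfiguration()
    } catch {
        print("Calibration failed: \(error)")
    }

    defaults.set(true, forKey: "orderingAppHasRun")
}
```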
Alternatively, the calibration 300 of the camera may be omitted. In this embodiment, the preferred camera settings are presented as text within the application, and the calibration is performed manually by the user adjusting the settings of the camera. The preferred camera settings may be displayed during the image capturing process and during a review of the captured image. The preferred camera settings can also be displayed and included in a “help” section of the application.
If the current session is not the first use or calibration of the camera is unnecessary, the application enables capture of an image using guidelines and a specific alignment or orientation at step 400. In step 500, the captured image is reviewed by the user to determine whether the captured image meets the specified alignment. At step 600, patient information is entered, and a brace configuration is selected at 700. The user is prompted to review the order at 800 before the order is saved on the device and transmitted over a network to a server of the custom orthopedic device manufacturer or seller at 900.
The capturing of the image of the limb at step 400 is described in more detail below.
In determining whether the device 4 is in an acceptable portrait or landscape orientation, the application uses the gyroscope and/or accelerometer of the device 4 to determine whether the orientation of the device 4 is within a certain degree range, for example whether the longitudinal axis 6 of the device 4 is within five degrees of the horizontal axis or within five degrees of the vertical axis.
The application provides a visual indication 8 on a display 18 of the device 4 whether the device 4 is properly oriented and allows the user to adjust the device 4 until the orientation requirements are met.
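The five-degree orientation check and the resulting visual indication 8 could be driven by gyroscope/accelerometer data as sketched below, assuming CoreMotion on iOS; the tolerance constant, the simplified pitch/roll classification, and the callback are illustrative.

```swift
import CoreMotion

// Hypothetical orientation monitor: reports whether the longitudinal axis of the device
// is within five degrees of the vertical axis (portrait) or the horizontal axis (landscape).
final class OrientationMonitor {
    private let motion = CMMotionManager()
    private let toleranceDegrees = 5.0

    func start(onUpdate: @escaping (_ acceptable: Bool) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 0.1
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let attitude = data?.attitude else { return }
            let pitchDeg = abs(attitude.pitch * 180.0 / .pi)   // rotation about the lateral axis
            let rollDeg = abs(attitude.roll * 180.0 / .pi)     // rotation about the longitudinal axis
            // Simplified classification: upright portrait or upright landscape within tolerance.
            let portraitUpright = abs(pitchDeg - 90.0) <= self.toleranceDegrees
            let landscapeUpright = pitchDeg <= self.toleranceDegrees
                && abs(rollDeg - 90.0) <= self.toleranceDegrees
            onUpdate(portraitUpright || landscapeUpright)      // drives the visual indication 8
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```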
Once the orientation requirements are met, the display shows the viewfinder image and enables capture of an image. The display 18 of the device 4 becomes the viewfinder for the camera and a depth of field guideline 16 is overlaid on the viewfinder image. The depth of field guideline 16 provides the user with assistance in centering the joint in the photo and capturing the appropriate portion of the joint. Before capturing the image of the limb, the limb may be measured and marked at multiple points to indicate specific distances above and below the joint, to aid in alignment with the depth of field guideline, and to provide reference points for circumference measurements of the limb. The markings may be at specific points or at regular intervals along and around the limb.
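By way of illustration, the depth of field guideline 16 could be overlaid on the viewfinder image as in the following sketch, which draws upper and lower reference lines, a vertical centerline, and a center hash mark with Core Animation; the fractional line positions are assumptions.

```swift
import UIKit

// Hypothetical overlay: upper/lower depth of field lines, a vertical centerline,
// and a short center hash mark, drawn on top of the camera preview view.
func addDepthOfFieldGuideline(to previewView: UIView) {
    let bounds = previewView.bounds
    let path = UIBezierPath()

    // Upper and lower reference lines (positions are placeholders, e.g. 20% / 80% of height).
    for y in [bounds.height * 0.2, bounds.height * 0.8] {
        path.move(to: CGPoint(x: 0, y: y))
        path.addLine(to: CGPoint(x: bounds.width, y: y))
    }

    // Vertical centerline used to center the limb in the frame.
    path.move(to: CGPoint(x: bounds.midX, y: 0))
    path.addLine(to: CGPoint(x: bounds.midX, y: bounds.height))

    // Short horizontal hash mark at the center, aligned over the joint.
    path.move(to: CGPoint(x: bounds.midX - 20, y: bounds.midY))
    path.addLine(to: CGPoint(x: bounds.midX + 20, y: bounds.midY))

    let layer = CAShapeLayer()
    layer.path = path.cgPath
    layer.strokeColor = UIColor.systemYellow.cgColor
    layer.lineWidth = 2
    layer.fillColor = nil
    previewView.layer.addSublayer(layer)
}
```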
To aid the user in correctly framing the limb in the picture, markings may first be placed on the limb indicating specific locations on the limb. Anatomical landmarks or markings are placed on the limb about 15 cm below the joint and about 15 cm above the joint, corresponding to the depth of field guidelines. The photographer therefore need only align the markings with the upper and lower guidelines and align the center hash mark over the center of the joint.
The method may require a delay before the image is taken to assure stable and clear focus. Upon alignment with the depth of field guideline, the method requires the device be held steady before the image is taken. A signal, such as a green dot, may be displayed to prompt the user to capture the image by pressing a button on the device.
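A minimal sketch of such a steadiness check follows, assuming the gyroscope's rotation rate is used as the stability measure; the threshold and required hold time are illustrative.

```swift
import CoreMotion
import Foundation

// Hypothetical steadiness gate: enables capture (e.g. shows the green dot)
// only after the device has been held nearly still for a short period.
final class SteadinessGate {
    private let motion = CMMotionManager()
    private var stillSince: Date?
    private let maxRotationRate = 0.05          // rad/s, illustrative threshold
    private let requiredHold: TimeInterval = 1.0

    func start(onReady: @escaping (Bool) -> Void) {
        guard motion.isGyroAvailable else { return }
        motion.gyroUpdateInterval = 0.05
        motion.startGyroUpdates(to: .main) { data, _ in
            guard let rate = data?.rotationRate else { return }
            let magnitude = sqrt(rate.x * rate.x + rate.y * rate.y + rate.z * rate.z)
            if magnitude < self.maxRotationRate {
                if self.stillSince == nil { self.stillSince = Date() }
                let heldLongEnough = Date().timeIntervalSince(self.stillSince!) >= self.requiredHold
                onReady(heldLongEnough)         // true -> show the green dot and enable the shutter
            } else {
                self.stillSince = nil
                onReady(false)
            }
        }
    }
}
```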
In a variation, the clinician first identifies the medial joint space and marks it appropriately. The clinician then measures approximately 2 cm above the medial joint space and draws a line across the knee, from medial to lateral sides, to define the knee axis line. The clinician then identifies and marks the lateral joint space. The clinician then may measure and mark points both 7.5 cm and 15 cm above and below the knee axis line. The clinician marks the tibial peak below the 15 cm mark and the tibial peak above the 7.5 cm mark, and connects the two with a line.
Alternatively, the application may directly enable the capture of the image instead of checking the device orientation before enabling the display 18 to function as the viewfinder of the camera. The application may provide on-screen guides as to the current orientation of the device relative to the preferred orientation regarding each of the axes next to or over the viewfinder image.
In this manner, the application assists and guides the user in capturing the optimal image of the limb for a custom orthopedic device. Through immediate on-screen guidance as to the orientation of the device 4, the user can easily and quickly adjust the angle, orientation, and alignment of the device to obtain a well aligned and consistent photograph of the limb. The possibility of misalignment between the image capturing device and the limb is therefore greatly reduced.
According to a variation, the clinician takes both an anterior view of the limb with the markings and a lateral view of the limb. In both instances, the depth of field guideline may vary according to the orientation of the limb, which is selected on the device by the clinician. The method may include image capture from any number of sides of a patient's limb, including anterior, posterior, lateral, and medial views and angles.
Once an image is captured, the user is prompted to review the captured image. During the review of the captured photograph at step 500, the captured image with the overlaid guidelines 16 is shown, and the user determines whether the correct portion of the limb is captured within the image and whether the limb is centered within the image. If the limb and the device 4 were not aligned when the image was captured, a new image may be taken. If a new image is captured, the application returns to step 400 to guide the user through the correct orientation and framing of the limb in the image. Once the photograph is confirmed, the user is prompted to enter measurements of the limb. The measurements of the limb may include the medial-lateral (M-L) width of the limb and the circumference of the limb at various points above and below the joint. The measurements may be obtained with a measuring device such as a caliper or measuring tape.
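The entered measurements might be modeled as in the short sketch below; the field names and units are assumptions, as the disclosure only identifies an M-L width and circumferences above and below the joint.

```swift
import Foundation

// Hypothetical measurement record accompanying the captured image.
struct LimbMeasurements: Codable {
    var mediolateralWidthCm: Double           // M-L width across the joint
    var circumferencesCm: [Double: Double]    // offset from joint in cm (+ above / - below) -> circumference in cm
}

let example = LimbMeasurements(
    mediolateralWidthCm: 10.2,
    circumferencesCm: [15.0: 46.5, 7.5: 38.0, -7.5: 36.0, -15.0: 34.5]
)
```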
To associate the captured image 22 with the patient, the captured image 22 is stored with an identification label 24. The identification label 24 may be text overlaid on the captured image 22.
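Overlaying the identification label 24 as text on the captured image 22 could be performed as sketched below using UIGraphicsImageRenderer; the label content and styling are illustrative.

```swift
import UIKit

// Hypothetical rendering of the identification label as text over the captured image.
func overlayIdentificationLabel(_ label: String, on image: UIImage) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: image.size)
    return renderer.image { _ in
        image.draw(at: .zero)
        let attributes: [NSAttributedString.Key: Any] = [
            .font: UIFont.boldSystemFont(ofSize: image.size.height * 0.03),
            .foregroundColor: UIColor.white,
            .backgroundColor: UIColor.black.withAlphaComponent(0.6)
        ]
        let inset: CGFloat = 16
        (label as NSString).draw(at: CGPoint(x: inset, y: inset), withAttributes: attributes)
    }
}

// Example: let tagged = overlayIdentificationLabel("Patient 1234 - Left knee", on: captured)
```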
At 600, additional patient information is entered to fill out the order form. The patient information may include the name of the user, the prescriber of the orthopedic device, and the diagnosis or symptoms of the user with identification of the problem joint. After the patient information is entered, the application can create a partial order on the device in encrypted XML format including the captured photo and patient information. Other security measures may protect the patient's data under HIPAA regulations.
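The disclosure does not specify the encryption scheme; the following sketch assumes AES-GCM from CryptoKit applied to a hand-built XML string, with key management and XML escaping omitted for brevity.

```swift
import Foundation
import CryptoKit

// Hypothetical partial order serialized as XML and encrypted before being stored on the device.
struct PartialOrder {
    var patientName: String
    var prescriber: String
    var diagnosis: String
    var imageFileName: String
}

func encryptedOrderXML(_ order: PartialOrder, key: SymmetricKey) throws -> Data {
    // Values would need XML escaping in practice; omitted here for brevity.
    let xml = """
    <order>
      <patient>\(order.patientName)</patient>
      <prescriber>\(order.prescriber)</prescriber>
      <diagnosis>\(order.diagnosis)</diagnosis>
      <image>\(order.imageFileName)</image>
    </order>
    """
    let sealed = try AES.GCM.seal(Data(xml.utf8), using: key)
    return sealed.combined!   // nonce + ciphertext + tag; non-nil with the default 12-byte nonce
}

// Example: let key = SymmetricKey(size: .bits256)
//          let blob = try encryptedOrderXML(order, key: key)
```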
At step 700, the user or patient selects the appropriate orthopedic device configuration such as the orthopedic device model and color. The application guides the user through the different orthopedic devices and provides the user with options based on the selected orthopedic device. After selection and configuration of the orthopedic device, the partial order is saved. The user can enter a menu listing all orders saved on the device or continue to a review of the order. The list also indicates the status of the order such as whether the order has been transmitted to the maker or seller. If the user selects a link for an order not yet transmitted, the user is prompted to review the order.
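The saved-orders list and its status indication might be modeled as in the brief sketch below; the enumeration cases and fields are assumptions.

```swift
import Foundation

// Hypothetical status shown in the saved-orders list.
enum OrderStatus: String, Codable {
    case draft          // saved on the device, not yet transmitted
    case transmitted    // sent to the manufacturer or seller
}

struct SavedOrder: Codable {
    var id: UUID
    var braceModel: String
    var color: String
    var status: OrderStatus
    var createdAt: Date
}
```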
At step 800, the order is displayed for review. The saved image and the collected information are displayed. The user can edit portions of the order, with any changes made during the review being saved with the order. The user can also enter the clinician information and payment and shipping information. The user may then save the order or transmit the order.
At step 900, the order is saved and/or transmitted to the server of the custom orthopedic device provider. The application sends the order in an e-mail and automatically populates the fields of the e-mail based on the data in the saved order. The patient information, brace information, and clinician or user information are inserted into the body of the e-mail while the saved image associated with the order is automatically attached to or inserted into the e-mail. When the user elects to send the e-mail containing the order information, the device can send the order directly to a server of the seller through a network. The application and/or the server may display or send a notification to the user to confirm the order.
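On iOS, the pre-populated order e-mail could be composed with MessageUI as sketched below; the recipient address, subject format, and body contents are illustrative.

```swift
import UIKit
import MessageUI

// Hypothetical composition of the order e-mail with pre-populated fields and the attached image.
func composeOrderEmail(orderID: String,
                       orderSummary: String,
                       imageData: Data,
                       presenter: UIViewController & MFMailComposeViewControllerDelegate) {
    guard MFMailComposeViewController.canSendMail() else { return }
    let mail = MFMailComposeViewController()
    mail.mailComposeDelegate = presenter
    mail.setToRecipients(["orders@provider.example"])        // placeholder provider address
    mail.setSubject("Custom orthopedic device order \(orderID)")
    mail.setMessageBody(orderSummary, isHTML: false)         // patient, brace, and clinician information
    mail.addAttachmentData(imageData, mimeType: "image/jpeg", fileName: "limb.jpg")
    presenter.present(mail, animated: true)
}
```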
While the embodiments described relate to a method and device for ordering a custom orthopedic device which can be accomplished with a single image, in another embodiment the method and device are used with or within a custom orthopedic device production method and system which produces the custom orthopedic device based on a three-dimensional model of the limb generated from a plurality of captured images.
To generate a three-dimensional model, markers or reference points are placed on the limb and are subsequently captured in the image. The markers or reference points assist in determining the dimensions of the limb from the image by providing information related to the surface of the limb. To place the markers or reference points on the limb, a sock or sleeve having markings may be worn on the limb, or the limb may be marked at particular intervals. The markings can be contrasting colored markings in many shapes, such as a circular shape, a rectangular shape, a triangular shape, or any combination thereof, and are preferably the same size. The distribution and density of the markings over the surface of the sock, sleeve, or limb varies depending on the type of limb and the desired three-dimensional modeling resolution.
Providing more markings or a higher density of markings in a certain area produces a more accurate three-dimensional model of the limb since more reference points would be provided in the captured image. The markings may be concentrated in areas where there are more variations in the continuity of the limb surface such as around the joint area. Additional references may be added to the limb or the sock. In ordering a custom knee brace, additional markings or references are added to indicate the center of the knee, the angle of the tibia, and the locations of the condyles.
Once the ordering application is executed and before the image capturing step 400, the user is prompted to select whether to use a three-dimensional model. If the user does not select the three-dimensional model, the method continues as described above.
Before beginning the image capturing process 402, markings or reference points are added to the limb in the manner described. In the image capturing process 402 starting with step 404, the photographer captures at least two images of the limb at various angles or views. Preferably, at least four individual images are captured of the limb with each image capturing a different angle or view of the limb such that the entire circumference of the appropriate portion of the limb is captured within the plurality of images. The application may instruct the user to capture a certain number of images of the limb from different angles, orientations, heights, or different portions of the limb.
Alternatively, the application may use continuous image capturing where the application automatically captures images at different intervals such that the photographer need only move the device and indicate when images of all views or sides of the limb have been captured.
During the image capturing process 402, the application can guide the photographer in capturing the appropriate angles or views of the limb with the appropriate alignment of the limb in the image using accelerometer and/or gyroscope data from the device using the guidelines described. Depending on the depth of field of the images, the application can determine the appropriate number of images, angles, or views needed to generate an accurate three-dimensional model of the limb.
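One way the application could guide the photographer through views covering the full circumference is sketched below, using attitude yaw from CoreMotion to prompt a capture roughly every 90 degrees for a four-view set; the spacing and the handling of heading wraparound are simplifying assumptions.

```swift
import CoreMotion

// Hypothetical multi-view capture guide: as the photographer walks around the limb,
// attitude yaw (rotation about gravity) is used to prompt a capture roughly every 90 degrees.
final class MultiViewCaptureGuide {
    private let motion = CMMotionManager()
    private var capturedYaws: [Double] = []
    private let viewsNeeded = 4
    private let minSpacingDegrees = 72.0   // slightly under 90 degrees to tolerate imprecision

    func start(promptCapture: @escaping () -> Void, finished: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 0.1
        motion.startDeviceMotionUpdates(using: .xArbitraryZVertical, to: .main) { data, _ in
            guard let yaw = data?.attitude.yaw else { return }
            let yawDeg = yaw * 180.0 / .pi
            // Wraparound near ±180 degrees is ignored here for brevity.
            let farEnough = self.capturedYaws.allSatisfy { abs($0 - yawDeg) >= self.minSpacingDegrees }
            if farEnough {
                self.capturedYaws.append(yawDeg)
                promptCapture()                         // ask the user to capture this view
                if self.capturedYaws.count >= self.viewsNeeded {
                    self.motion.stopDeviceMotionUpdates()
                    finished()
                }
            }
        }
    }
}
```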
At step 406, the application analyzes and processes the plurality of captured images to stitch the images together and form a continuous view of the limb. The application can perform the stitching of the images automatically or with user assistance. The stitching can also be performed while the images are being captured or afterward from the plurality of individually captured images.
At 408, from the stitched image, the application generates a three-dimensional model of the limb using the markings on the sock, sleeve, or limb. The model may be a computer-aided design (CAD) point cloud surface where the markings shown in the captured images translate into points in the point cloud surface. The application preferably generates a 360° view of the limb.
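The following is a deliberately crude illustration of how marker detections and circumferential measurements could be combined into a point cloud, using a cylindrical approximation; an actual reconstruction would use the marker image coordinates and camera geometry, which are beyond this sketch.

```swift
import Foundation

// Hypothetical marker detection taken from one registered view of the limb.
struct MarkerDetection {
    var azimuthDegrees: Double     // angular position around the limb, from the view it was seen in
    var heightCm: Double           // vertical offset from the knee axis line
    var circumferenceCm: Double    // circumference measured at this height
}

struct Point3D { var x: Double; var y: Double; var z: Double }

// Crude cylindrical back-projection: place each marker on a circle whose radius
// follows from the circumference measured at that height.
func pointCloud(from markers: [MarkerDetection]) -> [Point3D] {
    markers.map { m in
        let radius = m.circumferenceCm / (2.0 * .pi)
        let theta = m.azimuthDegrees * .pi / 180.0
        return Point3D(x: radius * cos(theta), y: radius * sin(theta), z: m.heightCm)
    }
}
```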
Once the image capturing process 402 is completed, the application prompts the user to review the generated three-dimensional model to ensure that the three-dimensional model accurately depicts the surface shape of the corresponding portion of the limb at 502. During review of the three-dimensional model, the user can rotate and zoom into the model to view all sides of and different levels of detail of the limb. The user is given the option of approving the generated model or re-capturing the limb to generate a more accurate model.
If the user performs the image capturing process 402 again, the current generated model may be saved for comparison with later models during the review stage. If the user approves of the generated model, the application continues with the general ordering process stages 600-900 as described above.
Many of the elements described in the disclosed embodiments may be implemented as modules. A module is defined here as an isolatable element that performs a defined function and has a defined interface to other elements. The modules described in this disclosure may be implemented in hardware, a combination of hardware and software, firmware, or a combination, all of which are behaviorally equivalent. Modules may be implemented using computer hardware in combination with software routine(s) written in a computer language (such as C, C++, Fortran, Java, Basic, Matlab or the like). It may be possible to implement modules using physical hardware that incorporates discrete or programmable analog and/or digital hardware. Examples of programmable hardware include: computers, microcontrollers, microprocessors, application-specific integrated circuits (ASICs); field programmable gate arrays (FPGAs); and complex programmable logic devices (CPLDs). Computers, microcontrollers and microprocessors are programmed using languages such as assembly, C, C++ or the like. Finally, the above mentioned technologies may be used in combination to achieve the result of a functional module.
The application may be software embodied on a computer readable medium which when executed by a processor of a computer performs a sequence of steps. A computer readable medium may be a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, a RAM, a ROM, or any other medium from which a computer can read. Various forms of computer readable media may carry one or more sequences of one or more instructions to a processor for execution. The software may be transmitted over a wired or wireless network to the device.
While the foregoing embodiments have been described and shown, alternatives and modifications of these embodiments, such as those suggested by others, may be made to fall within the scope of the invention. A preferred order for the steps in the method of ordering the custom orthopedic device has been described. It is noted that the order of the steps in the method may be rearranged.