1. Field
The present disclosure relates generally to medical interface devices, and more specifically to electronics for orthopedic instrumentation and measurement.
2. Introduction
Clinicians rely on information or media during an operative workflow. Such media may be in various visual and auditory formats. As sophisticated instruments are introduced into the clinical environment, clinicians may experience a learning curve for user interface applications. Customizing the user experience and incorporating new wireless techniques into the operative workflow can advance and facilitate the use of surgical instruments during surgery.
While the specification concludes with claims defining the features of the invention that are regarded as novel, it is believed that the invention will be better understood from a consideration of the following description in conjunction with the drawing figures, in which like reference numerals are carried forward.
In one embodiment, an apparatus can include a receiver that receives ultrasonic waveforms from a probe that emits the ultrasonic waveforms from three or more ultrasonic transducers in a three-dimensional sensing space, and a controller coupled to a memory having computer instructions which, when executed by the controller, cause the controller to digitally sample ultrasonic waveforms from the three or more microphones on the receiver to produce sampled received ultrasonic waveforms and track a relative location and movement of the probe in the three-dimensional ultrasonic sensing space from differential time of flight waveform analysis of the sampled received ultrasonic waveforms.
In a second embodiment, a probe comprises three or more ultrasonic transducers that emit ultrasonic waveforms towards a receiver that receives the ultrasonic waveforms in a three-dimensional ultrasonic sensing space where the receiver has a controller that digitally samples the ultrasonic waveforms from three or more microphones on the receiver to produce sampled received ultrasonic waveforms and tracks a relative location and movement of the probe in the three-dimensional ultrasonic sensing space from differential time of flight waveform analysis of the sampled received ultrasonic waveforms.
In a third embodiment, a portable measurement system comprises a probe having a plurality of ultrasonic transducers that emit ultrasonic waveforms for creating a three-dimensional sensing space, a user interface control that captures a location and position of the probe in the three-dimensional sensing space and a receiver. The receiver can comprise a plurality of microphones to capture the ultrasonic waveforms transmitted from the probe to produce captured ultrasonic waveforms and a digital signal processor that digitally samples the captured ultrasonic waveforms and tracks a relative location and movement of the probe with respect to the receiver in the three-dimensional ultrasonic sensing space from time of flight waveform analysis.
The receiver 104 and particularly the probe 110 can be used in conjunction with a variety of instruments from an instrument set. The instrument set comprises a clamp 121 for attaching the receiver 104 or probe 110 to a surgical device, such as a reamer, impactor, or other guided tool. A calibration plate 122 is used on an end of the surgical tool to calibrate the receiver 104 to the probe 110, namely, mapping a desired geometry or landmark (e.g., reamer hemispherical center, rod length, rod offset, etc.) of the attached tool (e.g., reamer, impactor, etc.). The attachable/detachable probe tip 123 inserts within the probe 110 to mark or register (anatomical) landmarks. The probe tip position relative to the transducers on the probe 110 and the corresponding geometry is predetermined and fixed. The double T end of the cam lock 124 attaches/detaches to the receiver 104 (or probe 110) when such device is to be affixed to an object (e.g., bone, operating tray, bed, stand, etc.). The second end of the cam lock 124 opens/closes to clamp around the ball end of the mounting pin 125. The second end of the mounting pin 125 may be of a bi-cortical screw design to insert within bone. The ball end may also be affixed to another object (e.g., operating tray, bed, stand, etc.). The cam lock 124 and ball mount 125 permit a wide range of angulation to support a 120 degree line-of-sight between the receiver 104 and the probe 110. An example of device guidance with the probe is disclosed in U.S. patent application Ser. No. 13/277,408, filed Oct. 20, 2011, entitled “Method and System for Media Presentation During Operative Work-flow”, the entire contents of which are hereby incorporated by reference.
As illustrated, the system 200 comprises the pod 102, the probe 110, and the receiver 104. Not all the components shown are required; fewer components can be used depending on required functionality as explained ahead.
The pod 102 is communicatively coupled to the transmitter or probe 110 and the receiver 104 over a communication link. In the configuration shown, the pod 102 contains the primary electronics for performing the sensory processing of the communicatively coupled sensory devices. The transmitter 110 and the receiver 104 contain minimal components for operation, which permits the sensory devices to be low-cost and lightweight for mounting and handling. In another configuration, the primary electronic components of the pod 102 can be miniaturized onto and/or integrated with the receiver 104 along with the battery 235 and other pod components, thus removing the pod and permitting a completely wireless system.
The probe 110 can receive control information from the pod 102 over a wired connection 109, which is used for transmitting sensory signals (ultrasonic waveforms). The control information can be in the form of digital pulses or analog waveforms. Control information can be multiplexed at the pod 102 to each transmitter 110 to reduce GPIO port use. In one embodiment, the transmitter or probe 110 comprises three ultrasonic transmitters 211-213, each for transmitting signals (e.g., ultrasonic waveforms) through the air in response to the received control information. Material coverings for the transmitters 211-213 are transparent to sound (e.g., ultrasound) and light (e.g., infrared) yet impervious to biological material such as water, blood or tissue. In one arrangement, a clear plastic membrane (or mesh) is stretched taut. The transmitter or probe 110 may contain more or fewer components than the number shown; certain component functionalities may be shared as integrated devices. One such example of an ultrasonic sensor is disclosed in U.S. patent application Ser. No. 11/562,410 filed Nov. 13, 2006, the entire contents of which are hereby incorporated by reference. Additional ultrasonic sensors can be included to provide an over-determined system for three-dimensional sensing. The ultrasonic sensors can be MEMS microphones, receivers, ultrasonic transmitters or a combination thereof. As one example, each ultrasonic transducer can perform separate transmit and receive functions.
The probe 110 may include a user interface 214 (e.g., an LED or buttons 302 and/or 304) that receives user input for requesting positional information. It can be a multi-action button that communicates directives to control or complement the user interface. With a wired connection, the probe 110 receives amplified line drive signals from the pod 102 to drive the transducers 211-213. The line drive signals pulse or continuously drive the transducers 211-213 to emit ultrasonic waveforms. In a wireless connection, the electronic circuit (or controller) 214 generates the drive signals for the three ultrasonic transmitters 211-213, and the battery 215 of the probe 110 provides energy for operation (e.g., amplification, illumination, timing, etc.). The IR Link 216 can be an IR transmitter or photo-diode that communicates with respective elements of the corresponding IR link 229 on the receiver 104. The transmitter on either end device can send an optical synchronization pulse coinciding with an ultrasonic pulse transmission when used in wireless mode, that is, without a wire line. A photo diode on the receiving end terminates the IR Link. A battery 218 can be provided for the wireless configuration if a wired line is not available to provide power or control information from the pod 102. The communications port 216 relays the user input to the pod 102, for example, when the button of the interface 214 on the probe 110 is pressed.
The probe 110, by way of control information from the pod 102, can intermittently transmit ultrasonic waves from the three (3) ultrasonic transducers. The transmission cycle can vary over a 5-10 ms interval across the three transmitters; each transmitter takes a turn transmitting an ultrasonic waveform. The ultrasonic waveforms propagate through the air and are sensed by the microphones on the Receiver 104. The system 200 can support a polling rate of less than 500 Hz. The Receiver 104 determines positional information of the probe 110 or “Wand” from range and localization of the transmitted ultrasonic waveforms. The system can support short-range tracking with the Receiver 104 and the probe 110 between 10 and 90 cm apart. The Receiver 104 measures the position and orientation of the probe 110 with respect to the Receiver 104 coordinate system in three dimensions (3D) within about a 120 degree conical line of sight.
The probe 110 includes an inertial measurement unit (IMU) 242 to detect and stabilize orientation. The inertial measurement unit 242 comprises at least one among an accelerometer for measuring gravitational vectors, a magnetometer for measuring the intensity of the earth's magnetic field and the corresponding (north) vector, and a gyroscope for stabilizing absolute spatial orientation and position. This permits a micro-controller operating the inertial measurement unit 242 to detect abrupt physical motion of a hand-operated tool (e.g., a reamer, impactor, etc. attached to the probe 110) exceeding threshold limits of a pre-specified plan, and to report, by way of the micro-controller, an indication status associated with the position and orientation when the hand-operated tool exceeds the threshold limits of the pre-specified plan.
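The threshold reporting described in the preceding paragraph can be illustrated with a minimal sketch. The following Python code, which is illustrative only and not taken from the disclosure, compares accelerometer and gyroscope magnitudes from an IMU sample against assumed limits of a pre-specified plan and returns a status indication when a limit is exceeded; the sample structure, limit values and function names are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class ImuSample:
    # Hypothetical IMU sample: accelerometer (g) and gyroscope (deg/s) readings
    ax: float
    ay: float
    az: float
    gx: float
    gy: float
    gz: float

# Hypothetical limits taken from a pre-specified plan
ACCEL_LIMIT_G = 2.5     # abrupt motion if acceleration magnitude exceeds this
GYRO_LIMIT_DPS = 180.0  # abrupt rotation if angular-rate magnitude exceeds this

def check_abrupt_motion(sample: ImuSample) -> dict:
    """Return a status indication when the hand-operated tool exceeds plan limits."""
    accel_mag = math.sqrt(sample.ax**2 + sample.ay**2 + sample.az**2)
    gyro_mag = math.sqrt(sample.gx**2 + sample.gy**2 + sample.gz**2)
    exceeded = accel_mag > ACCEL_LIMIT_G or gyro_mag > GYRO_LIMIT_DPS
    return {
        "accel_g": accel_mag,
        "gyro_dps": gyro_mag,
        "exceeds_plan_limits": exceeded,
    }

# Example: an abrupt jolt of roughly 3 g triggers the status indication
print(check_abrupt_motion(ImuSample(2.0, 1.5, 1.8, 10.0, 5.0, 2.0)))
```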
The Receiver 104 includes a plurality of microphones 221-224, an amplifier 225 and a controller 226. The microphones capture both acoustic and ultrasonic signals transmitted by the transducers 211-213 of the transmitter 110. The frequency response characteristics of the microphones permit a low Q at the transmitter 110 resonant frequency (e.g., 40, 60, 80 KHz) and also provide uniform gain for wideband acoustic waveforms in the audio range of 20 Hz to 20 KHz. The amplifier 225 amplifies the captured acoustic signals to improve the signal-to-noise ratio and dynamic range. It should be noted that ultrasonic signals are also acoustic signals, yet at a higher frequency than the audio range. The controller 226 can include discrete logic and other electronic circuits for performing various operations, including analog-to-digital conversion, sample and hold, and communication functions with the pod 102. The captured, amplified ultrasonic signals are conveyed over the wired connection 109 to the pod 102 for processing, filtering and analysis.
A thermistor 227 measures ambient air temperature for assessing propagation characteristics of acoustic waves when used in conjunction with a transmitter 210 configured with ultrasonic sensors. An optional IR Link 229 may be present for supporting wireless communication with the transmitter 210, as will be explained ahead. An Inertial Measurement Unit (IMU) 241 may also be present for determining relative orientation and movement. The IMU 241 includes an integrated accelerometer, a gyroscope and a compass. This device can sense the rate and direction of motion in multiple degrees of freedom, including 6-axis tilt, during motion and while stationary. The IMU can be used to refine position estimates as well as to detect a pivot point from pattern recognition of circular movements approximating a hemispherical surface.
The Receiver 104 responds to ultrasonic waves transmitted by the transmitter or probe 110. If more than one probe is used, the receiver can respond in a round-robin fashion; that is, it responds to multiplexed transmit signals from respective transmitters 110 that emit at specific known times and within certain timing intervals. The Receiver 104 determines positional information of the transmitter 110 from range and localization of received ultrasonic waves captured at the microphones, and also from knowledge of which transmitter is pulsed. Notably, one or more transmitters 110 can be present for determining orientation among a group of transmitters 110. The pod 102 wirelessly transmits this information as positional data (i.e., translation vectors and rotation matrices) to a Display Unit. Aspects of ultrasonic sensing are disclosed in U.S. patent application Ser. No. 11/839,323 filed Aug. 15, 2007, the entire contents of which are incorporated by reference herein. An IMU 241 with operation similar to the IMU 242 on the probe 110 can be present on the receiver 104.

The Pod 102 can comprise a processor 231, a communications unit 232, a user interface 233, a memory 234 and a battery 235. The processor 231 controls overall operation and communication between the transmitter 110 and the receiver 104, including digital signal processing of signals, communication control, synchronization, user interface functionality, temperature sensing, optical communication, power management, optimization algorithms, and other processor functions. The processor 231 supports transmitting timing information, including line drive signals, to the transmitter 110, receiving captured ultrasonic signals from the receiver 104, and signal processing for determination of positional information related to the orientation of the transmitter 110 relative to the receiver 104 for assessing and reporting cut angle information.
The processor 231 can utilize computing technologies such as a microprocessor (uP) and/or digital signal processor (DSP) with associated storage memory such as Flash, ROM, RAM, SRAM, DRAM or other like technologies for controlling operations of the aforementioned components of the terminal device. The instructions may also reside, completely or at least partially, within other memory, and/or a processor during execution thereof by another processor or computer system.
The electronic circuitry of the processor 231 (or controller) can comprise one or more Application Specific Integrated Circuit (ASIC) chips or Field Programmable Gate Arrays (FPGAs), for example, specific to a core signal processing algorithm or control logic. The processor can be an embedded platform running one or more modules of an operating system (OS). In one arrangement, the storage memory 234 may store one or more sets of instructions (e.g., software) embodying any one or more of the methodologies or functions described herein.
The communications unit 232 can further include a transceiver that can support, singly or in combination, any number of wireless access technologies including without limitation Bluetooth, Wireless Fidelity (WiFi), ZigBee and/or other short or long range radio frequency communication protocols. This provides for wireless communication to a remote device.
The memory 234 stores received ultrasonic waveforms and processing output related to tracking of received ultrasonic waveforms and other timing information, state logic, power management operation and scheduling. The battery 235 powers the processor 231 and associated electronics thereon and also the transmitter 110 and the receiver 104 in the wired configuration.
The user interface 233 can include one or more buttons to permit handheld operation and use (e.g., on/off/reset button) and illumination elements 237 to provide visual feedback.
In a first arrangement, the receiver 104 is wired via a tethered electrical connection (109 and 103) to the transmitter 110. Timing information from the pod 102 tells the transmitter 110 when to transmit, and includes optional parameters that can be applied for pulse shaping and noise suppression. The processor 231 on the pod establishes Time of Flight measurements according to the timing with respect to a reference time base in the case of ultrasonic signaling. One example of pulse shaping is taught in U.S. Pat. No. 7,414,705, the entire contents of which are hereby incorporated by reference. In a second arrangement, the receiver 104 is wirelessly coupled to the transmitter 110 via an optical signaling connection. The infrared transmitter 216 on the transmitter 110 transmits an infrared timing signal with each transmitted pulse-shaped signal. The infrared timing signal is synchronized with the transmitting of the ultrasonic signals to the receiver 104. The receiver 104 can include the IR Link 229 (e.g., IR emitter or photo diode), which the pod 102 monitors to determine when the infrared timing signal is received. The pod 102 can synchronize to the infrared timing information to establish Time of Flight measurements with respect to a reference transmit time. The infrared transmitter and photo diode establish transmit-receive timing information to within microsecond accuracy.
As one example of a suitable microphone for the microphones 221-224, the SiSonic SPM0404UD5 condenser microphone is a square die manufactured entirely from silicon. The operational temperature range is −40° C. to +105° C. and therefore far wider than standard Electret Condenser Microphone (ECM) ratings. Under specified environmental conditions, the standard deviation of the sensitivity is at most 1 dB. After the test conditions are performed, the sensitivity of the microphone does not deviate more than 3 dB from its initial value.
The ultrasound-based tracking technology determines spatial position through measurement of ultrasonic waves. The distance D between a transmitter and a microphone is computed from the measured time of flight (dTOF) and the speed of sound c, which depends on the air temperature t:

D=dTOF*c, where
c=331.5+0.607*t (m/s) and t=temperature (° C.)  EQ 1
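As a worked illustration of EQ 1, the following Python sketch computes the temperature-compensated speed of sound and converts a measured time of flight into a distance; the thermistor temperature and the time-of-flight value used in the example are illustrative placeholders.

```python
def speed_of_sound(temperature_c: float) -> float:
    """Speed of sound in air (m/s) per EQ 1: c = 331.5 + 0.607 * t."""
    return 331.5 + 0.607 * temperature_c

def tof_to_distance(tof_seconds: float, temperature_c: float) -> float:
    """Distance (m) from a measured time of flight: D = dTOF * c."""
    return tof_seconds * speed_of_sound(temperature_c)

# Example: a 1.5 ms time of flight at 22 °C (e.g., a thermistor 227 reading)
# corresponds to roughly 0.52 m between transducer and microphone.
d = tof_to_distance(1.5e-3, 22.0)
print(f"{d:.3f} m")  # ~0.517 m
```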
During operative workflow, the user can index the three-way switch 302 left or right to navigate forward or backward over GUI components as well as pages of a tab menu of the GUI. As illustrated, a hip nav page is displayed in the tab menu. Each page of the tab menu is associated with an operative workflow, for example, as shown for hip replacement surgery. In the exemplary illustration, the tab menu can present various pages (Patient Info, HIP Nav, Tool Nav, HIP-Leg Alignment) corresponding to an operative workflow of a Hip replacement. The operative workflow and accordingly the GUI 112 can be designed specific to an orthopedic procedure (e.g., knee, hip and spine) with pages of the tab menu similarly designed. The pod 102 thus presents the media according to a customized use of the probe during an operation workflow. It permits navigating a menu system of a Graphical User Interface via the tracking of the probe relative to the receiver. Furthermore, the pod 102 can recognize an operation workflow and report measurement data from the probe associated with the operation workflow. As one example, upon moving the probe in a circular pattern the device can automatically detect femur head identification and proceed to the corresponding user component and page of the tab menu. Aspects of detecting a femur head are disclosed in U.S. patent application Ser. No. 12/853,987 filed Aug. 10, 2011, the entire contents of which are incorporated by reference herein. Aspects of pattern recognition using neural networks and hidden Markov models in ultrasonic sensing applications for recognizing user interface gestures are also disclosed in U.S. patent application Ser. No. 11/936,777 filed Nov. 7, 2007, the entire contents of which are incorporated by reference herein.
Each measured TOF defines a range rk between a microphone k, located at known coordinates (xk, yk, zk) in the Receiver coordinate system, and the transmitter at unknown position (x, y, z), such that:

√((xk−x)^2+(yk−y)^2+(zk−z)^2)=rk  EQ 2

The trilateration method applied to multiple TOFs generates a unique position <x,y,z> of a single transmitter.
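A least-squares solution of EQ 2 can be sketched as follows, assuming known microphone coordinates and noiseless measured ranges; the microphone layout, function name and values are illustrative assumptions rather than the disclosed implementation. Subtracting the sphere equation of the first microphone from the others linearizes the system so it can be solved directly.

```python
import numpy as np

def trilaterate(mics: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Solve EQ 2 for the transmitter position <x, y, z>.

    mics   -- (N, 3) known microphone coordinates (N >= 4 for a unique 3D fix)
    ranges -- (N,) measured distances r_k from the transmitter to each microphone
    Subtracting the first sphere equation from the rest yields a linear system
    A p = b that is solved in the least-squares sense.
    """
    m0, r0 = mics[0], ranges[0]
    A = 2.0 * (mics[1:] - m0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(mics[1:]**2, axis=1) - np.sum(m0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Illustrative 4-microphone receiver geometry (meters); the fourth microphone is
# slightly out of plane so the solution is unique.
mics = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0], [0.0, 0.2, 0.0], [0.2, 0.2, 0.05]])
true_pos = np.array([0.1, 0.2, 0.5])
ranges = np.linalg.norm(mics - true_pos, axis=1)
print(trilaterate(mics, ranges))  # ~[0.1, 0.2, 0.5]
```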
For bone surgeries involving hips, knees or elbows, a determination of rotation and translation is used. The spatial coordinates for each transmitter of the device are used to create a transformation matrix that represents the rotation and translation (position) of the device. This transformation matrix establishes the orientation (angle) and position (location) of the device relative to the Receiver.
The 3×3 rotation matrix identifies the rotation about the device's local X, Y and Z coordinate axes.
The 1×3 translation matrix P identifies the (x,y,z) location of the device, as defined by its local coordinate system origin, relative to the origin of the Receiver coordinate system.
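One way the rotation and translation described above might be assembled, sketched here under the assumption that three transmitter positions on the device have been trilaterated in the Receiver frame, is to construct a local coordinate frame from those positions and combine the resulting 3×3 rotation matrix with the 1×3 translation into a 4×4 homogeneous transform. The choice of which transmitter defines the origin and axes is an illustrative assumption.

```python
import numpy as np

def device_pose(t1: np.ndarray, t2: np.ndarray, t3: np.ndarray) -> np.ndarray:
    """Build a 4x4 transform (rotation + translation) of the device relative to
    the Receiver frame from three transmitter positions given in Receiver
    coordinates. t1 is taken as the local origin; the local X axis points toward t2."""
    x_axis = t2 - t1
    x_axis /= np.linalg.norm(x_axis)
    # Z axis is normal to the plane containing the three transmitters
    z_axis = np.cross(x_axis, t3 - t1)
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)
    R = np.column_stack((x_axis, y_axis, z_axis))  # 3x3 rotation matrix
    P = t1                                         # 1x3 translation (device origin)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = P
    return T

# Illustrative transmitter positions in the Receiver coordinate system (meters)
T = device_pose(np.array([0.10, 0.20, 0.50]),
                np.array([0.15, 0.20, 0.50]),
                np.array([0.10, 0.25, 0.50]))
print(np.round(T, 3))
```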
The Pod (102) can generate the Spatial Data in accordance with the trilateration TOF analysis of the received ultrasonic signals. The calculated spatial data from the Pod is then communicated over a USB connection, for example, to a computer and presentation device 114 (the link station).
The Pod also provides, along with the spatial data, error codes that report data integrity and device status indications. It does this by way of a device driver (dynamic library) on the computer that receives the data communication. The computer provides error control and renders the data to the display through the GUI.
The instrument set described above includes a calibration Plate and a Pointer whose geometries are specified for calibration. The Plate is specified by four spatial locations, each a specified coordinate.
The spatial coordinates of the Plate and/or Pointer are measured during manufacture and saved to a memory in the instrument device cable. The Pod retrieves these coordinates, as well as calibration data, from the memory of the device cable once connected. The Computer then retrieves the Plate and/or Pointer coordinates from the Pod once connected. The Computer then uses the Plate and/or Pointer coordinates within the GUI as anatomical landmarks are captured, for creating the reference bone coordinate systems, for reporting bone cut angles and resection depth measurements, and for placement of cups at the hip center and related adjustments.
Broadly stated, a system and method are provided for touchlessly resolving a pivot point and other measurements in a three-dimensional space. This is particularly suited for situations where one end of a rigid object is inaccessible but remains stationary at a pivot point, while the other end is free to move and is accessible to an input pointing device. The system as noted above comprises a probe or wand and a receiver that are spatially configurable to touchlessly locate the pivot point without direct contact. The wand and receiver uniquely track each other's relative displacement to geometrically resolve the location of the pivot point. The pivot point can be a hip joint, and the rigid object can be a femur bone with one end at the hip joint and the other end free to move.
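One way to resolve such a pivot point geometrically, sketched below under the assumption that the tracked free end sweeps a portion of a sphere centered on the pivot (e.g., during circumduction of the femur about the hip joint), is to fit a sphere to the tracked positions; the sphere center is the pivot point. The Python code and the sample points are illustrative only.

```python
import numpy as np

def fit_pivot_point(points: np.ndarray) -> tuple[np.ndarray, float]:
    """Fit a sphere to tracked positions of the free end of a rigid object.

    Expanding |p - center|^2 = radius^2 gives the linear system
    [2x 2y 2z 1] [cx cy cz (r^2 - |c|^2)]^T = x^2 + y^2 + z^2,
    solved in the least-squares sense. Returns (center, radius).
    """
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic circumduction: points on a sphere of radius 0.45 m about a hidden pivot
rng = np.random.default_rng(0)
pivot, radius = np.array([0.05, -0.10, 0.60]), 0.45
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
samples = pivot + radius * dirs
center_est, radius_est = fit_pivot_point(samples)
print(np.round(center_est, 3), round(radius_est, 3))  # ~[0.05, -0.10, 0.60], ~0.45
```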
The embodiments will be further described below in the context of a hip replacement. Hip replacement surgery is a very successful procedure to alleviate pain from an arthritic hip joint. A successful outcome requires the resurfacing of the pelvic acetabular cup and the diseased femoral head. The hip joint articulates in a ball-and-socket configuration, and the hip joint replacement incorporates an artificial acetabular cup, a femoral stem, and various modular necks and heads. The surgeon removes the diseased portions of the joint and attempts to create a stable joint by appropriate positioning of the acetabular cup and femoral components, mainly through visual landmarks and surgical experience. Common errors include a poorly positioned acetabular cup as it relates to version, inclination, depth, and position. This can lead to dislocations, subluxations, component edge loading, early poly wear, cup loosening and migration. The femur can be malpositioned as well with respect to version, position, and varus-valgus orientation. This can lead to hip instability and dislocation, or to a tight hip leading to pain, limp, and limited motion. The use of modular heads and necks is intended to improve hip stability. A leg can be inappropriately lengthened, or the hip offset may be over- or under-tensioned, leading to a failure of the surgery.
Described herein is a surgical approach integrating ultrasonic (US) positioning technology as it relates to implant preparation and implantation of the prosthesis. The system can be integrated with image-based or imageless systems. The US sensors can be attached to bony landmarks for tracking and attached to standard instruments to allow appropriate bony preparation. The surgeon pre-operatively images the hip and determines the implant sizing and positioning. The key element is defining the existing center of hip rotation and the hip center the surgeon wants to achieve intra-operatively. The surgeon positions the patient according to the chosen approach. A pin can be mounted in the ipsilateral ASIS and stabilized to the pelvis. With the positioning of the pelvis known, a sensor is mounted to the pin. The pelvis and acetabulum can be registered to a pre-op image (e.g., digital X-ray, Computer Assisted Tomography (CAT), Magnetic Resonance Imaging (MRI), etc.) or an imageless system.
In another arrangement, as part of the surgical workflow, the probe 110 may thereafter be attached to an impactor (e.g., 171) with a temporarily mounted prosthetic cup (e.g., 172) that is guided into position in the acetabulum of the pelvis 173. In such an arrangement, the link station 114, by way of the GUI 112, permits the user to guide the cup 172 to the center of the acetabulum (hip joint) in the pelvis 173 with a known version angle and inclination angle with respect to the reference coordinate system of the pelvis. Briefly, the reference coordinate system is created from the previous registration of pelvic anatomical landmarks (i.e., left ASIS, right ASIS, pubis, etc.) by way of the probe 110 pointer tip marking these anatomical landmarks. This coordinate system also serves as the reference for the acetabular cup center, which is referenced for determination of leg offset and leg length.
One example of capturing the acetabular cup (hip joint) center with the probe and receiver is described below.
In an anterior approach, the contra-lateral ASIS (anterior superior iliac spine) can be similarly mounted with a pin and sensor, and the hip joint defined by the rotation method described here. By knowing the non-diseased hip joint center and leg length, this information can be used intra-operatively to compare against the operated hip outcome, and adjustments can be made as necessary to achieve balanced hips and leg lengths/offset.
The pelvis is registered by collecting a set of points on the pelvis, and in the acetabular cup, for example, as described above using probe pointer registration. The probe 110 tracker wirelessly sends this information to the receiver 104. A particular way to register the cup is to remove the femoral head from the joint and utilize a sensorized initial cup trial 105.
A strap (not shown) can be securely attached to the distal thigh, and if the hip joint is mobile, the center of the hip is registered with circumductive movements.
Once the cup is reamed under guided navigation, the surgeon may attach the final acetabular cup that will be the final implant. The US sensor, attached to the insertion handle, now guides the insertion of the cup; as the surgeon impacts the implant into place, the software on the GUI reports the version, inclination and depth and indicates when optimal implantation has been achieved.
The femoral stem preparation now begins. The femoral canal is broached and trial stems are inserted to achieve appropriate canal fill and stability while appropriate version and depth are defined. A modular head and neck are attached to the stem and the hip joint is reduced. With an ultrasonic (US) tracker on the greater trochanter, or attached to the lateral aspect of the trial itself, and with points registered distally on the femur, the hip offset and leg length can be identified and compared to the pre-op values or to what was planned on the GUI. Adjustments can be made to achieve optimal positioning and stability by adjusting the modular components (femoral head, neck, acetabular liner, depth of the cup or femoral stem).
In summary, described here is a highly accurate, ultrasonic-based, disposable navigation system that allows the sensorization of standard hip instruments and implants to provide the surgeon real-time knowledge of implant positioning and of the optimal leg length, offset, and joint stability to achieve a successful joint procedure. The system can be integrated into an image-based or imageless platform and communicates wirelessly with a GUI that houses the necessary software for intra-operative surgical adjustments. The mobility of the sensors allows attachment to the bony pelvis and femur, or incorporation into the trials themselves. They can be attached to instruments controlled by the surgeon or to a robotic, haptically controlled instrumented arm. The data of the surgery can then be wirelessly sent to a data registry. Other sensor placement arrangements include, but are not limited to, the following.
A sensorized acetabular prosthesis may be utilized in this method, whether a modular prosthesis in which the outer shell is separate from an insert or liner that provides the articulation surface against the femoral head, or a non-modular prosthesis in which the acetabular outer shell and the insert or liner comprise a single complete implant. A sensorized femoral component may be utilized in this method as well.
Other sources of sound distortion may, however, be present during transmit and receive operation of the tracking system, for example, voiced or noise signals in the operating environment. Thus, the microphones capture both ultrasonic and acoustic waveforms, which are electrically converted to combined acoustic signals. In order to remove the external acoustic waveforms from the captured signal, the processor applies noise suppression and other digital filters to isolate the ultrasonic signals from the audio and noise signals.
During transmit-receive communications between a transmitter 110 and the receiver 220, the pod 102 digitally samples the captured signals, which as described above may be a combination of acoustic and ultrasonic waveforms, to produce sampled received ultrasonic waveforms. The pod tracks a relative location and movement of the probe in the three-dimensional ultrasonic sensing space from differential time of flight waveform analysis of the sampled received ultrasonic waveforms. For precise tracking, the ultrasonic waveforms that overlap with digitally sampled acoustic waveforms received at the microphones are first isolated, as indicated above, through noise suppression and filtering, and thereafter, or in conjunction therewith, conditioned to suppress a ringing portion of the received ultrasonic waveforms. This signal conditioning minimizes distortion associated with ultrasonic transducer ring-down during generation of high-resolution position tracking of the probe.
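As a rough sketch of the isolation and conditioning described above, the following Python code band-pass filters the sampled microphone signal around an assumed 40 kHz ultrasonic carrier to reject audio-band noise and then applies an exponential window after the detected pulse to suppress the transducer ring-down tail. The sample rate, filter design, carrier frequency and time constants are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250_000       # sample rate (Hz), assumed
CARRIER = 40_000   # ultrasonic carrier (e.g., a 40 kHz transducer)

def isolate_ultrasound(samples: np.ndarray) -> np.ndarray:
    """Band-pass around the carrier to reject audio-band speech and noise."""
    sos = butter(4, [CARRIER - 5_000, CARRIER + 5_000],
                 btype="bandpass", fs=FS, output="sos")
    return sosfiltfilt(sos, samples)

def suppress_ringdown(samples: np.ndarray, pulse_end_idx: int,
                      tau_s: float = 200e-6) -> np.ndarray:
    """Attenuate the exponential ring-down tail after the main pulse."""
    out = samples.copy()
    n_tail = len(samples) - pulse_end_idx
    t = np.arange(n_tail) / FS
    out[pulse_end_idx:] *= np.exp(-t / tau_s)   # progressively damp the tail
    return out

# Example: a 1 ms 40 kHz tone burst with a residual tail, buried in 1 kHz noise
t = np.arange(0, 5e-3, 1 / FS)
burst = np.where(t < 1e-3, np.sin(2 * np.pi * CARRIER * t),
                 0.2 * np.sin(2 * np.pi * CARRIER * t))
noisy = burst + 0.5 * np.sin(2 * np.pi * 1_000 * t)        # acoustic interference
clean = suppress_ringdown(isolate_ultrasound(noisy), pulse_end_idx=int(1e-3 * FS))
```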
The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Other embodiments may be utilized and derived there from, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Figures are also merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Accordingly, this application also incorporates by reference the following applications: U.S. Pat. No. 7,788,607 (Attorney Docket No. 80009) entitled “Method and System for Mapping Virtual Coordinates”, U.S. patent application Ser. No. 11/683,410 (Attorney Docket No. B0011) entitled “Method and System for Three-Dimensional Sensing”, U.S. patent application Ser. No. 11/683,412 (Attorney Docket No. B00.12) entitled “Application Programming Interface (API) for Sensory Events”, U.S. patent application Ser. No. 11/684,413 (Attorney Docket No. 800.13) entitled “Visual Toolkit for a Virtual User Interface”, U.S. patent application Ser. No. 11/683,415 (Attorney Docket No. B00.14) entitled “Virtual User Interface Method and Device Thereof”, U.S. patent application Ser. No. 11/683,416 (Attorney Docket No. B00.15) entitled “Touchless Tablet Method and Device Thereof”, and U.S. patent application Ser. No. 12/050,790 (Attorney Docket No. 80023) entitled “Method and Device for Touchless Media Searching”.
Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
These are but a few examples of embodiments and modifications that can be applied to the present disclosure without departing from the scope of the claims stated below. Accordingly, the reader is directed to the claims section for a fuller understanding of the breadth and scope of the present disclosure.
This application is a continuation-in-part of U.S. patent application Ser. No. 11/683,410 filed Mar. 7, 2007 entitled “Method and Device for Three-Dimensional Sensing”, the entire contents of which are hereby incorporated by reference. This application also claims the priority benefit of U.S. Provisional Patent Application No. 61/597,026 entitled “Anatomical Pivot Point for Leg Length and Offset Calculations” filed Feb. 9, 2012, the entire contents of which are hereby incorporated by reference.
Provisional Applications:

Number | Date | Country
60/779,868 | Mar. 2006 | US
61/597,026 | Feb. 2012 | US

Parent Case Data:

Relation | Number | Date | Country
Parent | 11/683,410 | Mar. 2007 | US
Child | 13/424,359 | | US