User interface and methods for sonographic display device

Information

  • Patent Grant
  • Patent Number
    8,937,630
  • Date Filed
    Friday, April 26, 2013
  • Date Issued
    Tuesday, January 20, 2015
Abstract
A user interface for a sonographic device is disclosed that displays a sonographic image and at least one reference object corresponding to a catheter size. The reference object may be scaled in proportion to the sonographic image. In addition, the user interface may further display a plurality of vertical lines and a plurality of horizontal lines arranged in a grid-like pattern and placed over the sonographic image. The size of both the sonographic image and the at least one reference object may change in proportion to a chosen insertion depth. A display device, a sonographic device and corresponding methods are also disclosed.
Description
BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting and non-exhaustive embodiments of the disclosure are described, including various embodiments of the disclosure with reference to the figures, in which:



FIG. 1 is an illustration of a first view of an exemplary user interface for a sonographic device according to one embodiment;



FIG. 2 is an illustration of a second view of an exemplary user interface for a sonographic device according to one embodiment;



FIG. 3 is an illustration of a third view of an exemplary user interface for a sonographic device according to one embodiment;



FIG. 4 is a semi-schematic perspective view of a sonographic system for providing a user interface of the type illustrated in FIGS. 1-3 according to one embodiment;



FIG. 5 is a schematic block diagram of a sonographic system according to one embodiment; and



FIG. 6 is a flowchart of a method for displaying a sonographic image according to certain embodiments.







DETAILED DESCRIPTION

A sonographic device is a diagnostic imaging device commonly used by medical professionals to visualize the size, structure, and/or condition of a patient's muscles, internal organs, vascular structures, or the like. Sonographic devices typically comprise a processing unit, a probe connected to the processing unit, and a display device in communication with the processing unit.


In certain embodiments described herein, a graphical user interface (generally referred to herein as a “user interface”) for a sonographic system is used to select the size of a catheter for placement in a vascular structure. However, one of skill in the art will recognize from the disclosure herein that the systems and methods described herein are not so limited. For example, the systems and methods described herein may also be used to select needle sizes and/or the sizes of other devices for placement (e.g., intraoperatively or percutaneously) in vascular structures and/or various organs and structures of the body.


According to at least one embodiment, a user interface for a sonographic device may be configured to display a sonographic image and at least one reference object corresponding to a catheter size. In one embodiment, the reference object is circular and includes a diameter corresponding to a catheter size, such as an outer diameter of a catheter. In other embodiments, the reference object can be any other suitable shape.


In at least one embodiment, the at least one reference object is scaled in proportion to the sonographic image. In addition, the user interface may be configured to display a plurality of vertical lines and a plurality of horizontal lines arranged in a grid-like pattern and placed over the sonographic image. The area defined by a first vertical line and a second vertical line adjacent to the first vertical line may correspond to a dimensional measurement unit. Similarly, the area defined by a first horizontal line and a second horizontal line adjacent to the first horizontal line may correspond to a dimensional measurement unit. In certain embodiments, the size of both the sonographic image and the at least one reference object changes in proportion to a chosen insertion depth.


In at least one embodiment, the user interface is further configured to allow a user to select the catheter size corresponding to the first reference object. In addition, or in other embodiments, the user interface is further configured to selectively display a selected reference object over the sonographic image. For example, a user may drag or otherwise reposition a reference object from a first area of the user interface to a second area of the user interface over the sonographic image. The user may also be allowed to change the size of the selected reference object displayed over the sonographic image so as to correspond to a desired catheter size.


A method for displaying a sonographic image on a sonographic display device may comprise displaying a sonographic image on the sonographic display device, displaying a plurality of vertical and horizontal lines arranged in a grid-like pattern on the sonographic display device, and placing the grid-like pattern over the sonographic image. The method may also comprise displaying, on the sonographic display device, at least one reference object corresponding to a catheter size.


The embodiments of the disclosure will be best understood by reference to the drawings, wherein like elements are designated by like numerals throughout. In the following description, numerous specific details are provided for a thorough understanding of the embodiments described herein. However, those of skill in the art will recognize that one or more of the specific details may be omitted, or other methods, components, or materials may be used. In some cases, operations are not shown or described in detail.


Furthermore, the described features, operations, or characteristics may be combined in any suitable manner in one or more embodiments. It will also be readily understood that the order of the steps or actions of the methods described in connection with the embodiments disclosed may be changed as would be apparent to those skilled in the art. Thus, any order in the drawings or Detailed Description is for illustrative purposes only and is not meant to imply a required order, unless specified to require an order.


Embodiments may include various steps, which may be embodied in machine-executable instructions to be executed by a general-purpose or special-purpose computer (or other electronic device). In other embodiments, the steps may be performed by hardware components that include specific logic for performing the steps or by a combination of hardware, software, and/or firmware.


Embodiments may also be provided as a computer program product including a machine-readable medium having stored thereon instructions that may be used to program a computer (or other electronic device) to perform processes described herein. The machine-readable medium may include, but is not limited to, hard drives, floppy diskettes, optical disks, CD-ROMs, DVD-ROMs, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, solid-state memory devices, or other types of media/machine-readable medium suitable for storing electronic instructions.


Several aspects of the embodiments described will be illustrated as software modules or components. As used herein, a software module or component may include any type of computer instruction or computer executable code located within a memory device. A software module may, for instance, comprise one or more physical or logical blocks of computer instructions, which may be organized as a routine, program, object, component, data structure, etc., that performs one or more tasks or implements particular abstract data types.


In certain embodiments, a particular software module may comprise disparate instructions stored in different locations of a memory device, which together implement the described functionality of the module. Indeed, a module may comprise a single instruction or many instructions, and may be distributed over several different code segments, among different programs, and across several memory devices. Some embodiments may be practiced in a distributed computing environment where tasks are performed by a remote processing device linked through a communications network. In a distributed computing environment, software modules may be located in local and/or remote memory storage devices. In addition, data being tied or rendered together in a database record may be resident in the same memory device, or across several memory devices, and may be linked together in fields of a record in a database across a network.


Generally, one or more of the exemplary embodiments described and/or illustrated herein may be configured for use in connection with a sonographic device, also known as a sonograph, an ultrasonograph, or an ultrasound device. FIGS. 1-3 are illustrations of an exemplary user interface 10 for a sonographic device according to certain embodiments. The user interface 10 may be displayed on any display device capable of displaying sonographic images. The display device according to certain embodiments may be connected to a processing unit of a sonographic device. Optionally, as discussed below in relation to FIGS. 4 and 5, the display device may comprise a portion of, and be integrally formed with, a sonographic device.


In at least one embodiment, the user interface 10 may comprise a first display area 20 and a second display area 30. In certain embodiments, the first display area 20 may be configured to display text or graphics-based information. For example, the first display area 20 may be configured to display certain status information, such as the current date or time. The second display area 30 generally represents an area configured to display various sonographic images and/or data. In many embodiments, the second display area 30 may be configured to display a sonographic image 90. The sonographic image 90 generally represents any image generated using sonographic or ultrasonographic techniques, as known to those of ordinary skill in the art. In certain embodiments, the sonographic image 90 may depict the muscle, internal organs, or vasculature of a patient.


As seen in FIGS. 1-3, the user interface 10 may also include an insertion depth indicator 40. In at least one embodiment, the insertion depth indicator 40 displays an insertion depth for the sonographic image 90 displayed in the second display area 30. In other words, in this embodiment, the insertion depth indicator 40 indicates, graphically and/or using text, the distance (in measurement units, such as centimeters or inches) from the surface of a patient's skin to the cross-sectional image 90 displayed in the second display area 30. In certain embodiments, the insertion depth for the image 90 may range from about 1.5 cm to about 6 cm and may be chosen by an operator of the sonographic device.


In at least one embodiment, the size of the sonographic image 90 displayed in the second display area 30 may change as the insertion depth is varied by the operator of the sonographic device. That is, as the insertion depth is increased, the relative size of the sonographic image 90 displayed in the second display area 30 may also increase. In contrast, as the insertion depth is decreased, the relative size of the sonographic image 90 displayed in the second display area 30 may also decrease. For example, the size of the sonographic image 90 may increase as the insertion depth is increased from about 3.0 cm (illustrated in FIG. 1) to about 4.5 cm (illustrated in FIGS. 2 and 3).
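As a sketch only (the linear mapping, function name, and constants below are assumptions, not taken from the patent), the depth-dependent enlargement might be modeled as a scale factor that grows with the chosen insertion depth:

```python
REFERENCE_DEPTH_CM = 3.0  # depth at which the image is shown at its base size (assumption)
BASE_PX_PER_CM = 40.0     # base on-screen scale in pixels per centimeter (assumption)

def px_per_cm(insertion_depth_cm: float) -> float:
    """On-screen scale for a chosen insertion depth.

    The displayed image enlarges as the insertion depth is increased
    (e.g., from about 3.0 cm to about 4.5 cm), so in this sketch the
    scale grows linearly with depth.
    """
    return BASE_PX_PER_CM * (insertion_depth_cm / REFERENCE_DEPTH_CM)
```

With these assumed constants, moving the depth from 3.0 cm to 4.5 cm multiplies the on-screen scale by 1.5.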


As seen in FIGS. 1-3, the second display area 30 may also include a plurality of vertical reference lines 32 and a plurality of horizontal reference lines 34 arranged in a grid-like pattern over the sonographic image 90. In certain embodiments, the vertical reference lines 32 may be configured to extend from, and be in alignment with, a plurality of vertical reference marks 36 positioned along the periphery of the second display area 30. Similarly, the horizontal reference lines 34 may be configured to extend from, and be in alignment with, a plurality of horizontal reference marks 38 positioned along the periphery of the second display area 30. In certain embodiments, a user may select whether or not to display the reference lines 32, 34 and/or the reference marks 36, 38.


In at least one embodiment, the area defined by adjacent vertical reference marks 36 may correspond to a dimensional measurement unit, such as a centimeter, inch, or a fraction thereof. Similarly, the area defined by adjacent horizontal reference marks 38 may correspond to a dimensional measurement unit, such as a centimeter, inch, or a fraction thereof. For example, each vertical reference mark 36 may be spaced exactly one-half of a centimeter apart from each adjacent vertical reference mark 36. Similarly, each horizontal reference mark 38 may be spaced exactly one-half or one-quarter of a centimeter apart from each adjacent horizontal reference mark 38.
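A hypothetical helper (the name and parameters are assumptions) shows how evenly spaced reference marks at a fixed dimensional unit, such as one-half of a centimeter, could be computed for a display area:

```python
def grid_mark_positions(extent_px: int, px_per_cm: float, unit_cm: float = 0.5) -> list[int]:
    """Pixel offsets for reference marks spaced one dimensional unit apart.

    extent_px is the width or height of the display area in pixels, and
    unit_cm is the dimensional measurement unit between adjacent marks
    (e.g., 0.5 cm between adjacent vertical reference marks).
    """
    step_px = px_per_cm * unit_cm
    count = int(extent_px // step_px)
    return [round(i * step_px) for i in range(count + 1)]
```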


As seen in FIGS. 1-3, the user interface 10 may also include a third display area 50 configured to display one or more reference circles 51, 53, 55, 57, 59. In certain embodiments, each reference circle displayed in the third display area 50 may be sized and shaped so as to correspond to a French catheter size. In other words, each reference circle may include an outer diameter that equals the outer diameter of a chosen catheter size. For example, the outer diameter of the reference circle 51 may correspond to the outer diameter of a 2 French catheter, the outer diameter of the reference circle 53 may correspond to the outer diameter of a 3 French catheter, the outer diameter of the reference circle 55 may correspond to the outer diameter of a 4 French catheter, the outer diameter of the reference circle 57 may correspond to the outer diameter of a 5 French catheter, and the outer diameter of the reference circle 59 may correspond to the outer diameter of a 6 French catheter.


One skilled in the art will recognize from the disclosure herein that the third display area 50 of the user interface 10 may display fewer than five reference circles or more than five reference circles. For example, in one embodiment, the third display area 50 displays seven reference circles (respectively corresponding to the outer diameter of a 2 French catheter, a 3 French catheter, a 4 French catheter, a 5 French catheter, a 6 French catheter, a 7 French catheter, and an 8 French catheter). In addition, or in other embodiments, a user may select the number of reference circles and the size of each reference circle to display in the third display area 50.


Further, as discussed above, one skilled in the art will recognize from the disclosure herein that the reference circles 51, 53, 55, 57, 59 or other icons displayed in the third display area 50 may correspond to the size of other objects besides catheters. For example, objects displayed in the third display area 50 may correspond to various needle sizes or the sizes of other insertable or implantable objects for vascular structures, organs or other bodily structures.


In at least one embodiment, the catheter sizes represented by each reference circle displayed in the third display area 50 may be accurately proportioned in size with the sonographic image 90 displayed in the second display area 30. That is, each reference circle displayed in the third display area 50 may be drawn on a 1:1 scale with the sonographic image 90. Thus, in certain embodiments, the various reference circles displayed in the third display area 50 may be used by a medical professional in determining a preferred catheter size for insertion within a patient. For example, by comparing one or more of the reference circles displayed in the third display area 50 with the sonographic image 90 displayed in the second display area 30, a medical professional may be able to determine the catheter size that will be best suited for insertion within, for example, the vasculature of a patient.


In at least one embodiment, the size of each reference circle displayed in the third display area 50 may vary in relation to a chosen insertion depth (imaging depth), indicated in the insertion depth indicator 40. In particular, the size of each reference circle displayed in the third display area 50 may increase or decrease as the insertion depth is increased or decreased, such that the scale between the sonographic image 90 and each catheter size displayed in the third display area 50 remains 1:1. For example, the size of each reference circle 51, 53, 55, 57, 59 may increase as an insertion depth is increased from about 3.0 cm (illustrated in FIG. 1) to about 4.5 cm (illustrated in FIG. 2) such that the scale between the sonographic image 90 and each reference circle remains 1:1. Thus, the various reference circles displayed in the third display area 50 may be used by a medical professional in determining a preferred catheter size for insertion within a patient, regardless of a chosen insertion depth.
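On the French scale, a catheter's outer diameter in millimeters is its French size divided by three (so a 6 French catheter is 2 mm across). Combining that conversion with the current on-screen scale gives a sketch of the 1:1 drawing described above; the function name and signature are assumptions:

```python
def reference_circle_diameter_px(french_size: float, px_per_cm: float) -> float:
    """Diameter, in pixels, of a reference circle drawn at 1:1 scale.

    1 French = 1/3 mm outer diameter. Converting to centimeters and
    multiplying by the current on-screen scale keeps the circle 1:1
    with the sonographic image at any insertion depth.
    """
    outer_diameter_cm = (french_size / 3.0) / 10.0
    return outer_diameter_cm * px_per_cm
```

Because the same pixels-per-centimeter value would drive both the image and the circles, increasing the insertion depth enlarges both together and the 1:1 relationship is preserved.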


Referring to FIG. 3, a user according to certain embodiments may select and position a reference circle 61 within the second display area 30. Overlaying the reference circle 61 on the sonographic image 90 allows the user to more accurately compare the size of the reference circle 61 with vascular structures, organs or other bodily parts represented by the sonographic image 90 in the second display area 30.


The selected reference circle size may correspond to one of the reference circles 51, 53, 55, 57, 59 displayed in the third display area 50. In addition, or in other embodiments, the user may select a reference circle size for display in the second display area 30 that is not displayed in the third display area 50. For example, as shown in FIG. 3, the reference circle 61 corresponding to the outer diameter of a 7 French catheter size has been positioned over the sonographic image 90 in the second display area 30.


In certain embodiments, the user may place the reference circle 61 within the second display area 30 by selecting it from the third display area 50 and dragging (or otherwise repositioning) it to a desired location within the second display area. The user may use, for example, a touch screen, a mouse, an action button 60, a keyboard, combinations of the foregoing, or other input devices to make the selection and position the reference circle 61 over the sonographic image 90 within the second display area 30. The user may also use such inputs to resize the reference circle 61 after positioning it within the second display area 30.


While FIGS. 1-3 illustrate the sonographic image 90 and the reference circles 51, 53, 55, 57, 59 displayed in different display areas (the second display area 30 and the third display area 50, respectively), the disclosure herein is not so limited. Indeed, an artisan will recognize from the disclosure herein that both the sonographic image 90 and the reference circles 51, 53, 55, 57, 59 may be displayed in the same display area of the user interface 10. For example, the sonographic image 90 and the reference circles 51, 53, 55, 57, 59 may each be displayed in the second display area 30.


In certain embodiments, the user interface 10 illustrated in FIGS. 1-3 may include various types of control objects to provide input and output functionality. Examples of suitable control objects include, without limitation, push buttons, by which a user may indicate acceptance of a particular action, radio buttons for selecting one of a number of available choices for a particular parameter, and check boxes for activating or deactivating various features. Additional examples of control objects include scroll bars for displaying different portions of a display area within a display window, sliders for adjusting variable values, and minimization buttons for displaying or hiding the contents of a folder or a pop-up menu.


In many embodiments, a user may activate one or more of these various control objects by touching the control object on a touch-sensitive display screen. In another embodiment, in a case where the display connected to the ultrasound device is a computer monitor, a user may activate one or more of these various control objects by positioning a cursor above the control object using a user input device (such as a mouse) connected to the ultrasound device and actuating the object by pushing a button or the like on the user input device.


In one embodiment, the user interface 10 illustrated in FIGS. 1-3 may include a plurality of action buttons 60. In at least one embodiment, the action buttons 60 may correspond to, and allow a user to select, various actions or functions that may be performed. Examples of actions that may be performed upon actuation of the action buttons 60 may include, without limitation: 1) a print operation for printing the area displayed within the second display area 30; 2) a save operation for saving an image of the picture displayed in the second display area 30; 3) a freeze operation for freezing a live picture displayed within the second display area 30; 4) a depth marker operation for allowing a user to select and position one or more depth markers within the second display area 30; 5) a grid operation for allowing a user to selectively display or hide reference lines 32, 34 and/or reference marks 36, 38; and 6) a reference circle operation for allowing a user to select and position a reference circle 51, 53, 55, 57, 59 within the second display area 30.


As seen in FIGS. 1-3, the user interface 10 may also include a display attribute interface 70. In at least one embodiment, a user may adjust various display attributes, such as the contrast, brightness, or hue of the display device by interacting with the display attribute interface 70. In addition, the user interface 10 may also include a power indicator 80. In certain embodiments, the power indicator 80 is configured to display the amount of charge remaining in a battery used to power the ultrasound device and the accompanying display device.


Referring to FIG. 4, there is shown a semi-schematic perspective view of a sonographic system 100 for providing a user interface 10 of the type illustrated in FIGS. 1-3. The sonographic system 100 may include, in one embodiment, a housing 102 including a display device 104 and one or more user controls 106. In one embodiment, the sonographic system 100 also includes a probe 108 including one or more user controls 110. The probe 108 is configured to transmit ultrasonic signals and to receive reflected ultrasonic signals. The sonographic system 100 processes the received ultrasonic signals for display on the display device 104 as discussed herein.


The user controls 106 may include, for example, image gain controls to adjust the amplification of a received ultrasonic signal, image depth controls to image structures at different depths and adjust the focus of a sonograph displayed on the display device 104, depth marker controls to selectively display depth markers and/or grid lines as discussed above, print and/or save controls to print/save an image currently displayed on the display device 104, image freeze controls to pause an image currently displayed on the display device 104, time/date set controls, and other controls for operating the sonographic system 100 as discussed herein. Such controls, or a subset thereof, may also be included in the user controls 110 on the probe 108 and/or on the display device (e.g., selectable using a touch screen). In addition, or in other embodiments, the functionality of the user controls 106, 110 may be provided by a keyboard, mouse, or other suitable input device.



FIG. 5 is a schematic block diagram of the sonographic system 100 according to one embodiment. The illustrated components may be implemented using any suitable combination of hardware, software, and/or firmware. In one configuration, the sonographic system 100 includes ultrasound circuitry 120 for generating, transmitting, and processing ultrasound signals, as is known to one skilled in the art. All or a portion of the ultrasound circuitry 120 may be included within the probe 108. One skilled in the art will recognize that at least a portion of the ultrasound circuitry 120 may also be included within the housing 102.


The sonographic system 100 may also include, according to one embodiment, a communication interface 122 for communicating with a network such as the Internet or World Wide Web, an intranet such as a local area network (LAN) or a wide area network (WAN), a public switched telephone network (PSTN), a cable television network (CATV), or any other network of communicating computerized devices.


The sonographic system 100 may further include a memory 124, such as a random access memory (RAM), hard drives, floppy diskettes, optical disks, CD-ROMs, DVD-ROMs, ROMs, EPROMs, EEPROMs, magnetic or optical cards, and/or solid-state memory devices. The memory 124 may store an operating system (OS) for the sonographic system 100 (e.g., a Windows CE® OS or a Linux® OS), application program code, and various types of data. In one embodiment, the memory 124 stores sonographic images 90 and/or images of reference circles 51, 53, 55, 57, 59, as illustrated in FIGS. 1-3.


An input/output interface 126 may be provided for receiving commands from an input device, such as a mouse, keyboard, or remote control. The input/output interface 126 may detect, for example, a user pressing the user controls 106, 110. The input/output interface 126 may also send data to an output device, such as a printer or external storage device.


The sonographic system 100 may further include a display interface 128 for rendering graphical data, including the user interface 10, on the display device 104 and/or an external display device.


A processor 130 controls the operation of the sonographic system 100, including the other components described above, which may be in electrical communication with the processor 130 via a bus 132. The processor 130 may be embodied as a microprocessor, a microcontroller, a digital signal processor (DSP), or other device known in the art. The processor 130 may perform logical and arithmetic operations based on program code and data stored within the memory 124 or an external storage device.


Of course, FIG. 5 illustrates only one possible configuration of a sonographic system 100. An artisan will recognize that various other architectures and components may be provided.



FIG. 6 is a flowchart of a method 140 for displaying a sonographic image according to certain embodiments. The method 140 includes displaying 142 a sonographic image in a first area of a user interface and displaying 144 reference objects in a second area of a user interface. The reference objects may include, for example, the reference circles 51, 53, 55, 57, 59 shown in FIGS. 1-3. The reference objects are not so limited, however, and may be of any shape. As discussed herein, a dimension of the reference objects corresponds to a dimension of a device configured to be inserted or implanted within a patient's body.


In one embodiment, the method 140 also includes determining 146 whether a user has requested a change in the size of one or more of the reference objects displayed in the second area of the user interface. If a change has been requested, the method 140 includes resizing 148 the default sizes of the reference objects based on the user's request. For example, the user may request that the size of a particular reference object corresponding to a 6 French catheter size be changed so as to correspond to a 7 French catheter size.


In one embodiment, the method 140 also includes determining 150 whether a user has requested that grid lines be displayed over the sonographic image in the first area of the user interface. If the user has turned on the grid lines, the method 140 displays 152 a plurality of vertical and horizontal lines in a grid-like pattern over the sonographic image. As discussed above, an area defined by a first vertical line and a second vertical line adjacent to the first vertical line corresponds to a dimensional measurement unit. Similarly, an area defined by a first horizontal line and a second horizontal line adjacent to the first horizontal line corresponds to a dimensional measurement unit. If the user has turned off the grid lines, the method 140 hides 154 any grid lines displayed in the first area of the user interface.
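The grid step of method 140 might be sketched as a single hypothetical helper (the name and parameters are assumptions) that yields line positions when the grid is on and empty lists when it is off, corresponding to hiding the grid:

```python
def visible_grid_lines(show_grid: bool, area_w_px: int, area_h_px: int,
                       px_per_cm: float, unit_cm: float = 0.5):
    """Return (vertical_line_xs, horizontal_line_ys) for the overlay grid.

    Adjacent lines are spaced one dimensional measurement unit apart.
    When the user has turned the grid off, both lists are empty, which
    corresponds to hiding any displayed grid lines.
    """
    if not show_grid:
        return [], []
    step_px = px_per_cm * unit_cm
    xs = [round(i * step_px) for i in range(int(area_w_px // step_px) + 1)]
    ys = [round(i * step_px) for i in range(int(area_h_px // step_px) + 1)]
    return xs, ys
```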


In one embodiment, the method 140 also includes determining 156 whether a user has requested that a selected reference object be overlaid on the sonographic image in the first area of the user interface. If yes, the method 140 includes receiving 158 the user's selection of a reference object size for the overlay. As discussed above, the user may select the size by selecting one of the reference objects displayed in the second area of the user interface. In addition, or in another embodiment, the user may define a size that does not correspond to any reference objects displayed in the second area of the user interface.


The method 140 also receives 160 the user's selection of an overlay position within the first area of the user interface. As discussed above, the user may position the selected reference object by dragging it (e.g., using a touch sensitive screen or mouse), using arrows (e.g., on a touch sensitive screen, display device, or keyboard), or otherwise defining the coordinates (e.g., using a keyboard) where the reference object is moved or placed. The method 140 then displays 162 the selected reference object at the selected position over the sonographic image in the first area of the user interface. If the user turns off the overlay, the method 140 hides 164 the selected reference object displayed in the first area of the user interface.
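Receiving the overlay position could be sketched as follows; clamping the dragged circle so it remains fully inside the display area is an assumption of this sketch, not a behavior stated in the patent:

```python
def place_overlay(x_px: float, y_px: float, area_w_px: float, area_h_px: float,
                  radius_px: float) -> tuple[float, float]:
    """Constrain a dragged reference circle's center so the circle stays
    fully inside the image display area (hypothetical behavior)."""
    x = min(max(x_px, radius_px), area_w_px - radius_px)
    y = min(max(y_px, radius_px), area_h_px - radius_px)
    return x, y
```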


In one embodiment, the method 140 determines 166 whether a user has changed an insertion depth (imaging depth). If the insertion depth has been changed, the method 140 resizes 168 the sonographic image and the selected reference object, if any, displayed in the first area of the user interface in proportion to the selected insertion depth. Although not shown in FIG. 6, in one embodiment, the method 140 also adjusts the distance between lines in the grid, if displayed, in proportion to the selected insertion depth. In another embodiment, the method may change the dimensional measurement unit corresponding to the area defined by successive vertical and/or horizontal lines in the grid, if displayed, in proportion to the selected insertion depth.
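The resize step might be sketched as applying one scale factor to the image and any overlaid circle together (the names and the linear factor are assumptions), which is what keeps their relationship intact across a depth change:

```python
def rescale_on_depth_change(old_depth_cm: float, new_depth_cm: float,
                            image_size_px: tuple[float, float],
                            circle_diameters_px: list[float]):
    """Scale the sonographic image and every reference circle by the same
    factor so their 1:1 relationship survives the depth change."""
    factor = new_depth_cm / old_depth_cm
    width_px, height_px = image_size_px
    return ((width_px * factor, height_px * factor),
            [d * factor for d in circle_diameters_px])
```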


The method 140 also resizes 170 the reference objects displayed in the second area of the user interface in proportion to the selected insertion depth. Thus, as the insertion depth changes such that the sonographic image displayed in the first area of the user interface is enlarged (zoom in) or reduced (zoom out), the respective sizes of the reference objects (whether displayed in the first or second area of the user interface) are maintained relative to the size of a vascular structure, organ, or other bodily structure represented in the sonographic image. Accordingly, a user may determine the reference object size most suitable or desired for the particular vascular structure, organ, or other bodily structure.


Various modifications, changes, and variations apparent to those of skill in the art may be made in the arrangement, operation, and details of the methods and systems of the disclosure without departing from the spirit and scope of the disclosure. Thus, it is to be understood that the embodiments described above have been presented by way of example, and not limitation, and that the invention is defined by the appended claims.

Claims
  • 1. A sonographic device, comprising: a user interface including a display, the user interface configured to display a live sonographic image and at least a first reference object on the display, wherein the first reference object represents a medical instrument and is movable independent from the medical instrument, and wherein the user interface scales the first reference object in proportion to the live sonographic image when a size of the live sonographic image changes.
  • 2. The sonographic device according to claim 1, further configured to display a plurality of vertical lines and a plurality of horizontal lines arranged in a grid-like pattern over the live sonographic image.
  • 3. The sonographic device according to claim 2, wherein an area defined by a first vertical line and a second vertical line adjacent to the first vertical line corresponds to a dimensional measurement unit.
  • 4. The sonographic device according to claim 2, wherein an area defined by a first horizontal line and a second horizontal line adjacent to the first horizontal line corresponds to a dimensional measurement unit.
  • 5. The sonographic device according to claim 1, further including an insertion depth indicator that indicates an insertion depth of the live sonographic image.
  • 6. The sonographic device according to claim 5, wherein a size of the live sonographic image and a size of the first reference object change in proportion to a change in insertion depth.
  • 7. The sonographic device according to claim 1, wherein the first reference object represents a catheter.
  • 8. The sonographic device according to claim 1, further including a plurality of control objects allowing a user to select various actions or functions to be performed.
  • 9. The sonographic device according to claim 8, wherein at least one of the control objects allows a user to perform a freeze operation for freezing the live sonographic image.
  • 10. A computer program stored in an executable format on one or more than one non-transitory machine-readable medium including instructions for performing a method for facilitating selection of a medical instrument, the method comprising: displaying a live sonographic image of an internal structure of a body in a user interface; displaying a reference object in the user interface, the reference object representing a medical instrument and being movable independent from the medical instrument; and automatically scaling the reference object in proportion to the live sonographic image such that the scale between the live sonographic image and the reference object remains proportionate.
  • 11. The computer program according to claim 10, wherein disparate instructions for performing the method are stored in different locations on more than one machine-readable medium, and wherein the disparate instructions implement the method together.
  • 12. The computer program according to claim 10, wherein multiple blocks of instructions are distributed over several different code segments and across more than one machine-readable medium.
  • 13. The computer program according to claim 10, wherein the one or more than one non-transitory machine-readable medium is selected from the group consisting of hard drives, floppy diskettes, optical disks, CD-ROMs, DVD-ROMs, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, solid-state memory devices, and combinations of these.
  • 14. The computer program according to claim 13, wherein the method includes linking to a remote processing device through a communications network.
  • 15. The computer program according to claim 13, wherein the user may request that the reference object be overlaid on the live sonographic image by selecting the reference object from an area of the user interface including multiple different reference objects, and the method includes determining which reference object is selected by the user.
  • 16. The computer program according to claim 13, wherein the method includes determining whether a user has requested that the reference object be overlaid on the live sonographic image, and wherein the instructions provide for displaying the reference object in an overlay position over the live sonographic image, if the user has requested that the reference object be overlaid on the live sonographic image.
  • 17. The computer program according to claim 16, wherein the method includes determining the overlay position based on input from the user.
  • 18. The computer program according to claim 16, wherein the method includes allowing the user to select the overlay position by dragging the reference object to the overlay position.
  • 19. The computer program according to claim 13, wherein the method includes determining whether a user has changed an insertion depth, and the instructions provide for automatically scaling the reference object and live sonographic image in proportion to the changed insertion depth, if the user has changed the insertion depth.
  • 20. The computer program according to claim 13, wherein the medical instrument is a catheter.
RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/546,900, filed Jul. 11, 2012, now U.S. Pat. No. 8,432,417, which is a continuation of U.S. patent application Ser. No. 11/745,756, filed May 8, 2007, now U.S. Pat. No. 8,228,347, which claims the benefit of priority under 35 U.S.C. §119(e) of U.S. Provisional Application No. 60/746,741, filed May 8, 2006, each of which is hereby incorporated by reference herein in its entirety.

US Referenced Citations (233)
Number Name Date Kind
3858325 Goerler Jan 1975 A
4141347 Green et al. Feb 1979 A
4280047 Enos Jul 1981 A
4520671 Hardin Jun 1985 A
5182728 Shen et al. Jan 1993 A
5285785 Meyer Feb 1994 A
5305203 Raab Apr 1994 A
5325293 Dorne Jun 1994 A
5400513 Duffield Mar 1995 A
5416816 Wenstrup et al. May 1995 A
5534952 Zanecchia et al. Jul 1996 A
5609485 Bergman et al. Mar 1997 A
5722412 Pflugrath et al. Mar 1998 A
5795297 Daigle Aug 1998 A
5817024 Ogle et al. Oct 1998 A
5893363 Little et al. Apr 1999 A
5908387 LeFree et al. Jun 1999 A
5920317 McDonald Jul 1999 A
5970119 Hofmann Oct 1999 A
6019724 Gronningsaeter et al. Feb 2000 A
6048314 Nikom Apr 2000 A
6063030 Vara et al. May 2000 A
6063032 Grunwald May 2000 A
6117079 Brackett et al. Sep 2000 A
6132373 Ito et al. Oct 2000 A
6132379 Patacsil et al. Oct 2000 A
6135961 Pflugrath et al. Oct 2000 A
6154576 Anderson et al. Nov 2000 A
6203498 Bunce et al. Mar 2001 B1
6213944 Miller et al. Apr 2001 B1
6231231 Farrokhnia et al. May 2001 B1
6251072 Ladak et al. Jun 2001 B1
6360116 Jackson, Jr. et al. Mar 2002 B1
D456509 Schultz Apr 2002 S
6364839 Little et al. Apr 2002 B1
6371918 Bunce Apr 2002 B1
6379302 Kessman et al. Apr 2002 B1
6383139 Hwang et al. May 2002 B1
6413217 Mo Jul 2002 B1
6416475 Hwang et al. Jul 2002 B1
D461895 Barnes et al. Aug 2002 S
6443894 Sumanaweera et al. Sep 2002 B1
6447451 Wing et al. Sep 2002 B1
6450978 Brosseau et al. Sep 2002 B1
6468212 Scott et al. Oct 2002 B1
6471651 Hwang et al. Oct 2002 B1
6485422 Mikus et al. Nov 2002 B1
6511427 Sliwa, Jr. et al. Jan 2003 B1
6512942 Burdette et al. Jan 2003 B1
6516215 Roundhill Feb 2003 B1
6524247 Zhao et al. Feb 2003 B2
6569101 Quistgaard et al. May 2003 B2
6575908 Barnes et al. Jun 2003 B2
6579262 Mick et al. Jun 2003 B1
6592565 Twardowski Jul 2003 B2
6604630 Cabatic et al. Aug 2003 B1
6638223 Lifshitz et al. Oct 2003 B2
6648826 Little et al. Nov 2003 B2
6687386 Ito et al. Feb 2004 B1
6702763 Murphy et al. Mar 2004 B2
6754608 Svanerudh et al. Jun 2004 B2
D496596 Dalrymple Sep 2004 S
6793391 Zimmermann Sep 2004 B2
6817982 Fritz et al. Nov 2004 B2
6835177 Fritz et al. Dec 2004 B2
6857196 Dalrymple Feb 2005 B2
6863655 Bjaerum et al. Mar 2005 B2
6928146 Broyles et al. Aug 2005 B2
D509900 Barnes et al. Sep 2005 S
6941166 MacAdam et al. Sep 2005 B2
6962566 Quistgaard et al. Nov 2005 B2
6968227 MacAdam et al. Nov 2005 B2
6979294 Selzer et al. Dec 2005 B1
7006955 Daft et al. Feb 2006 B2
7022075 Grunwald et al. Apr 2006 B2
7169108 Little et al. Jan 2007 B2
7174201 Govari et al. Feb 2007 B2
D538432 Diener et al. Mar 2007 S
7223242 He et al. May 2007 B2
D544962 Diener et al. Jun 2007 S
D558351 Diener et al. Dec 2007 S
D559390 Diener et al. Jan 2008 S
7349522 Yan et al. Mar 2008 B2
7449640 Coleman Nov 2008 B2
7453472 Goede et al. Nov 2008 B2
7457672 Katsman et al. Nov 2008 B2
7466323 Krishnamurthy et al. Dec 2008 B2
D591423 Diener et al. Apr 2009 S
D592750 Diener et al. May 2009 S
D592760 Diener et al. May 2009 S
7534211 Hwang et al. May 2009 B2
7549961 Hwang Jun 2009 B1
7588541 Floyd et al. Sep 2009 B2
7591786 Holmberg et al. Sep 2009 B2
7599730 Hunter et al. Oct 2009 B2
7604596 Hwang et al. Oct 2009 B2
7604601 Altmann et al. Oct 2009 B2
7606402 Heimdal et al. Oct 2009 B2
7643040 Gabrielson et al. Jan 2010 B1
7656418 Watkins et al. Feb 2010 B2
7686766 Quistgaard et al. Mar 2010 B2
7694814 Cristobal et al. Apr 2010 B1
7724680 Karlsson May 2010 B2
7727153 Fritz et al. Jun 2010 B2
7728821 Hillis et al. Jun 2010 B2
7740586 Hwang et al. Jun 2010 B2
D625014 Hansen et al. Oct 2010 S
D625015 Hansen et al. Oct 2010 S
7809400 Hwang Oct 2010 B1
7819807 Barnes et al. Oct 2010 B2
7831449 Ying et al. Nov 2010 B2
7840040 Wilcox et al. Nov 2010 B2
7846098 Bakircioglu et al. Dec 2010 B2
7849250 Diener et al. Dec 2010 B2
7867168 Little et al. Jan 2011 B2
7876945 Lotjonen Jan 2011 B2
7883467 Akaki et al. Feb 2011 B2
7955265 Burla et al. Jun 2011 B2
7978461 Diener et al. Jul 2011 B2
7996688 Little Aug 2011 B2
8004572 Figueredo et al. Aug 2011 B2
8007438 Osaka et al. Aug 2011 B2
8025622 Rold et al. Sep 2011 B2
8228347 Beasley et al. Jul 2012 B2
20010056235 Quistgaard et al. Dec 2001 A1
20020056047 Lehman May 2002 A1
20020143256 Wing et al. Oct 2002 A1
20020173721 Grunwald et al. Nov 2002 A1
20020177774 Hwang et al. Nov 2002 A1
20020193686 Gilboa Dec 2002 A1
20020198454 Seward et al. Dec 2002 A1
20030009102 Quistgaard et al. Jan 2003 A1
20030013965 Quistgaard et al. Jan 2003 A1
20030013966 Barnes et al. Jan 2003 A1
20030047126 Tomaschko Mar 2003 A1
20030069897 Roy et al. Apr 2003 A1
20030074650 Akgul et al. Apr 2003 A1
20030078501 Barnes et al. Apr 2003 A1
20030141205 Cabatic et al. Jul 2003 A1
20030163047 Little et al. Aug 2003 A1
20030195418 Barnes et al. Oct 2003 A1
20030199762 Fritz et al. Oct 2003 A1
20040015079 Berger et al. Jan 2004 A1
20040066398 Dolimier et al. Apr 2004 A1
20040099815 Sfez et al. May 2004 A1
20040111183 Sutherland et al. Jun 2004 A1
20040116808 Fritz et al. Jun 2004 A1
20040133110 Little et al. Jul 2004 A1
20040138564 Hwang et al. Jul 2004 A1
20040143181 Damasco et al. Jul 2004 A1
20040150963 Holmberg et al. Aug 2004 A1
20040152982 Hwang et al. Aug 2004 A1
20040169673 Crampe et al. Sep 2004 A1
20040215072 Zhu Oct 2004 A1
20040240715 Wicker et al. Dec 2004 A1
20050020911 Viswanathan et al. Jan 2005 A1
20050054917 Kitson Mar 2005 A1
20050075544 Shapiro et al. Apr 2005 A1
20050096528 Fritz et al. May 2005 A1
20050101868 Ridley et al. May 2005 A1
20050107688 Strommer May 2005 A1
20050119555 Fritz et al. Jun 2005 A1
20050124885 Abend et al. Jun 2005 A1
20050131291 Floyd et al. Jun 2005 A1
20050215896 McMorrow et al. Sep 2005 A1
20050215904 Sumanaweera et al. Sep 2005 A1
20050228276 He et al. Oct 2005 A1
20050228287 Little et al. Oct 2005 A1
20050235272 Skinner Oct 2005 A1
20050265267 Hwang Dec 2005 A1
20050288586 Ferek-Petric Dec 2005 A1
20060015039 Cassidy et al. Jan 2006 A1
20060020206 Serra et al. Jan 2006 A1
20060025684 Quistgaard et al. Feb 2006 A1
20060058652 Little Mar 2006 A1
20060058655 Little Mar 2006 A1
20060058663 Willis et al. Mar 2006 A1
20060085068 Barry Apr 2006 A1
20060111634 Wu May 2006 A1
20060116578 Grunwald et al. Jun 2006 A1
20060173303 Yu et al. Aug 2006 A1
20060174065 Kuzara et al. Aug 2006 A1
20060253032 Altmann et al. Nov 2006 A1
20070049822 Bunce et al. Mar 2007 A1
20070071266 Little et al. Mar 2007 A1
20070085452 Coleman Apr 2007 A1
20070116373 Hwang et al. May 2007 A1
20070121769 Gabrielson et al. May 2007 A1
20070127792 Virtue Jun 2007 A1
20070199848 Ellswood et al. Aug 2007 A1
20070232910 Hwang et al. Oct 2007 A1
20070239008 Nakajima et al. Oct 2007 A1
20070239019 Richard et al. Oct 2007 A1
20070239021 Oonuki et al. Oct 2007 A1
20070258632 Friedman et al. Nov 2007 A1
20070271507 Connor et al. Nov 2007 A1
20070282263 Kalafut et al. Dec 2007 A1
20080033759 Finlay Feb 2008 A1
20080099638 Diener et al. May 2008 A1
20080104300 Diener et al. May 2008 A1
20080119731 Becerra et al. May 2008 A1
20080137922 Catallo et al. Jun 2008 A1
20080141107 Catallo et al. Jun 2008 A1
20080144896 Burke Jun 2008 A1
20080183079 Lundberg Jul 2008 A1
20080208047 Delso Aug 2008 A1
20080247618 Laine et al. Oct 2008 A1
20080249407 Hill et al. Oct 2008 A1
20080287789 Hwang et al. Nov 2008 A1
20080310816 Allen et al. Dec 2008 A1
20090043195 Poland Feb 2009 A1
20090069681 Lundberg et al. Mar 2009 A1
20090069725 Diener et al. Mar 2009 A1
20090076385 Jackson et al. Mar 2009 A1
20090143672 Harms et al. Jun 2009 A1
20090171212 Garon Jul 2009 A1
20090270722 Floyd et al. Oct 2009 A1
20090275835 Hwang et al. Nov 2009 A1
20100053197 Gabrielson et al. Mar 2010 A1
20100081930 Dunbar Apr 2010 A1
20100094132 Hansen et al. Apr 2010 A1
20100121189 Ma et al. May 2010 A1
20100121190 Pagoulatos et al. May 2010 A1
20100121196 Hwang et al. May 2010 A1
20100130855 Lundberg et al. May 2010 A1
20100145195 Hyun Jun 2010 A1
20100260398 Ma et al. Oct 2010 A1
20100274131 Barnes et al. Oct 2010 A1
20100295870 Baghdadi et al. Nov 2010 A1
20110066031 Lee et al. Mar 2011 A1
20110152684 Altmann et al. Jun 2011 A1
20110234630 Batman et al. Sep 2011 A1
20120281021 Beasley et al. Nov 2012 A1
Foreign Referenced Citations (3)
Number Date Country
9838486 Sep 1998 WO
0124704 Apr 2001 WO
2005008418 Jan 2005 WO
Non-Patent Literature Citations (12)
Entry
EP 07874457.0 filed May 8, 2007 Office Action dated May 31, 2011.
PCT/US2007/011192 filed May 8, 2007 International Preliminary Report on Patentability dated Nov. 11, 2008.
PCT/US2007/011192 filed May 8, 2007 Search Report dated Sep. 8, 2008.
PCT/US2007/011192 filed May 8, 2007 Written Opinion dated Sep. 8, 2008.
U.S. Appl. No. 11/745,756, filed May 8, 2007 Advisory Action dated Jun. 8, 2010.
U.S. Appl. No. 11/745,756, filed May 8, 2007 Final Office Action dated Jan. 20, 2011.
U.S. Appl. No. 11/745,756, filed May 8, 2007 Final Office Action dated Mar. 26, 2010.
U.S. Appl. No. 11/745,756, filed May 8, 2007 Non-Final Office Action dated Aug. 4, 2010.
U.S. Appl. No. 11/745,756, filed May 8, 2007 Non-Final Office Action dated Aug. 5, 2011.
U.S. Appl. No. 11/745,756, filed May 8, 2007 Non-Final Office Action dated Nov. 2, 2009.
U.S. Appl. No. 13/546,900, filed Jul. 11, 2012 Non-Final Office Action dated Aug. 16, 2012.
U.S. Appl. No. 13/546,900, filed Jul. 11, 2012 Notice of Allowance dated Jan. 15, 2013.
Related Publications (1)
Number Date Country
20130245431 A1 Sep 2013 US
Provisional Applications (1)
Number Date Country
60746741 May 2006 US
Continuations (2)
Number Date Country
Parent 13546900 Jul 2012 US
Child 13871982 US
Parent 11745756 May 2007 US
Child 13546900 US