Aspects of the present disclosure generally relate to operating room devices, methods, and systems. Some aspects are suitable for urological procedures.
Numerous treatment devices may be used in a typical urological procedure to diagnose conditions and perform treatments (e.g., kidney stone management, BPH treatments, prostatectomies, tumor resection, uterine fibroids management, etc.). Each treatment device (e.g., a fluid source or a laser source) may rely upon specific input (e.g., flow rate or power level). These inputs are typically provided with device-specific controls, such as a keypad attached to a source module (e.g., the fluid source or laser source) located outside of the operating room.
To perform certain treatments, a surgeon may be required to configure and operate each of these treatment devices, individually and/or in combination. For example, a single urological procedure may employ multiple scopes, each having their own display and controls; multiple treatment devices, each having their own display and controls; and multiple patient monitoring devices, each having their own display and controls—all of which must be coordinated during the various stages of a typical procedure. Procedural inefficiency can arise because these devices do not communicate with one another at each stage, increasing operating times and costs. For example, the surgeon may be required to operate each device separately, requiring mastery of many devices, and/or to utilize an assistant to operate one or more of the devices, increasing the communication burden. Administrative efficiency after the procedure can also be reduced, for example, by requiring staff to separately record the use of each device, further increasing effort and costs.
Aspects of the operating room devices, methods, and systems described herein may address these issues, and/or other deficiencies of the art.
One disclosed aspect is a method. For example, the method may comprise: receiving, at a processing unit, data associated with a patient; determining, with the processing unit, a treatment for the patient; identifying, with the processing unit, control settings associated with one or more treatment devices that are (i) in communication with the processing unit, and (ii) operable to perform the treatment; and generating, with the processing unit, a display including the control settings and at least one view of the data. Numerous aspects of exemplary methods are now described.
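The four summarized steps (receiving, determining, identifying, generating) can be sketched as a small routine. This is an illustrative sketch only; the function and field names (`plan_display`, `capabilities`, `views`) are assumptions for demonstration and do not appear in this disclosure.

```python
# Hypothetical sketch of the summarized method: receive patient data,
# determine a treatment, identify control settings for connected devices
# operable to perform it, and generate a display payload.

def plan_display(patient_data, connected_devices, treatment_lookup):
    """Return a display payload pairing control settings with a data view."""
    treatment = treatment_lookup(patient_data)            # determining step
    settings = {
        dev["id"]: dev["control_settings"]
        for dev in connected_devices
        if treatment in dev["capabilities"]               # identifying step
    }
    view = patient_data["views"][0]                       # at least one view
    return {"treatment": treatment,
            "control_settings": settings,
            "view": view}                                 # generating step
```

The sketch filters the connected devices down to those operable to perform the determined treatment, mirroring conditions (i) and (ii) above.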
According to some aspects, the data may include a three-dimensional model of a portion of the patient, and the receiving step may comprise: receiving, at the processing unit, the three-dimensional model from a data source; and selecting, with the processing unit, the at least one view from the three-dimensional model. The data may include images of the patient, and the receiving step may comprise: receiving, at the processing unit, the images from the data source; and/or generating, with the processing unit, a three-dimensional model of the patient from the images. The determining step may further comprise: identifying, with the processing unit, potential treatments for the patient; determining, with the processing unit, whether the one or more treatment devices are operable to perform the potential treatments; and generating, with the processing unit, the display to include a listing of the potential treatments performable by one or more treatment devices.
In some aspects, the identifying step may comprise obtaining, with the processing unit, the control settings from the one or more treatment devices. For example, the identifying step may comprise: receiving, at the processing unit, a device identifier from the one or more treatment devices; delivering, with the processing unit, the device identifier to a data source; and receiving, at the processing unit, the control settings from the data source. The one or more treatment devices may include a first treatment device in communication with the processing unit and a second treatment device in communication with the processing unit. Accordingly, the identifying step may further comprise: receiving, at the processing unit, a first control setting for the first treatment device and a second control setting for the second treatment device; and identifying, with the processing unit, one or more computer applications for operating the first and second treatment devices in a coordinated manner to perform a treatment. For example, identifying the one or more computer applications may comprise: delivering, with the processing unit, the first and second control settings to a data source; and receiving, from the data source, the one or more applications configured to operate the first and second devices simultaneously.
The at least one view may include a first view different from a second view, and the generating step may comprise: positioning a distal end of the one or more treatment devices in the patient; and locating, with the processing unit, said distal end in the first and second views. For example, the distal end of the treatment device may include a locator beacon, and the processing unit may include a tracking module configured to identify the locator beacon in the first and second views. The data may include images of interior surfaces of the patient, the first view may include the interior surfaces, and the method may comprise: overlaying, with the processing unit, a grid onto the interior surfaces depicted in the first view of the patient; tagging, with the processing unit, at least one area defined by the grid; and locating, with the processing unit, the at least one area in the second view of the patient.
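The grid overlay and tagging steps just described can be sketched as follows. The grid geometry and the linear view-to-view transform below are assumptions chosen for illustration, not details from this disclosure.

```python
# Illustrative sketch: overlay a grid on a first view, tag one cell, and
# locate that cell's center in a second view via a known transform.

def grid_cell_center(width, height, rows, cols, row, col):
    """Center point (x, y) of a tagged grid cell in the first view."""
    cell_w, cell_h = width / cols, height / rows
    return (col + 0.5) * cell_w, (row + 0.5) * cell_h

def locate_in_second_view(point, transform):
    """Map a first-view point into the second view."""
    x, y = point
    return transform(x, y)

# Example: tag cell (row 1, col 2) of a 4x4 grid on a 400x400 first view,
# then map it into a half-scale second view.
tagged = grid_cell_center(400, 400, rows=4, cols=4, row=1, col=2)
located = locate_in_second_view(tagged, lambda x, y: (x / 2, y / 2))
```

In practice the transform would come from registration between the two views (e.g., between an endoscopic view and a model view) rather than a fixed scaling.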
According to some aspects, the method may comprise: receiving, with the processing unit, an input from an input device in communication with the processing unit; and operating, with the processing unit, the treatment device according to the input and the control settings, wherein the input is a user-generated signal including at least one of an audio signal, a tactile signal, and a visual signal. The method may further comprise: identifying, with the processing unit, an object in the patient; and locating, with the processing unit, the object in the at least one view. The generating step may comprise: determining, with the processing unit, a characteristic of the object using at least one sensor; and modifying, with the processing unit, the input or the control settings based on the characteristic. In addition, the method may further comprise: generating, with the processing unit, one or more reports including the control settings and the input; and outputting, with the processing unit, the one or more reports.
Another disclosed aspect is a method. For example, this method may comprise: generating, with a processing unit, a display including control settings for a treatment device and at least one view of data associated with a patient; overlaying onto the at least one view, with the processing unit, depictions of (i) a treatment site, (ii) a path to the treatment site, and (iii) a location of the treatment device on the path; moving the treatment device along the path responsive to the display, and continuously updating the display with the processing unit responsive to the movements, until the location of the treatment device arrives at the treatment site; obtaining from an input device, with the processing unit, an input responsive to the control settings; and operating, with the processing unit, the treatment device to apply a treatment energy at the treatment site according to the input and the control settings. Numerous aspects of exemplary methods are now described.
According to some aspects, the input may comprise a user-generated signal including at least one of an audio signal, a tactile signal, and a visual signal; and the method may further comprise: converting, with the processing unit, the user-generated signal into a control signal; and outputting, with the processing unit, the control signal to the treatment device. The method may further comprise: determining, with the processing unit, a characteristic of the object or the treatment energy using one or more sensors; and modifying, with the processing unit, the control signal based on the determined characteristic. For example, the method may comprise: obtaining, with the processing unit, a computer application configured for use with the one or more sensors; and determining, with the processing unit, the characteristic with the computer application and one or more sensors. The object may include one or more stones, the characteristic may include a measure of stone burden, stone size, or stone type associated with the one or more stones, and the computer application may be configured to modify the control signal based on the measure. The characteristic also may include the composition of the one or more stones, and the computer application may be configured to modify the control signal based on the composition. In some aspects, the treatment energy may comprise a laser energy, the characteristic may include a measure of the laser energy, and the computer application may be configured to modify the control signal based on the measure of the laser energy.
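The conversion of a user-generated signal into a control signal, modified by a determined stone characteristic, can be sketched as below. The base power values, size threshold, and composition labels are hypothetical values for illustration; a real application would derive them from the sensors and computer applications described above.

```python
# Hedged sketch: map a user-generated input signal to a laser control
# signal, then modify it based on sensed stone size and composition.

def to_control_signal(user_signal, stone):
    """Return laser control parameters scaled by stone characteristics."""
    base = {"audio": 10.0, "tactile": 15.0, "visual": 12.0}[user_signal["type"]]
    power = base
    if stone["size_mm"] > 10:          # larger stone burden: raise power
        power *= 1.5
    if stone["type"] == "uric_acid":   # softer composition: lower power
        power *= 0.8
    return {"laser_power_W": round(power, 2)}
```

For example, a tactile input against a 12 mm calcium oxalate stone would yield a higher commanded power than the same input against a small uric acid stone.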
Another disclosed aspect is a system. For example, the system may comprise: a processing unit in communication with one or more treatment devices; a display generated by the processing unit to comprise at least one view of data associated with a patient, and control settings for the one or more treatment devices; and an input device operable with the display to receive an input responsive to the control settings, and activate the one or more treatment devices. Numerous aspects of exemplary systems are now described.
According to some aspects, the processing unit may be configured to: obtain the data from a data source; and obtain the control settings from the treatment device or the data source. The data may include a three-dimensional model of a portion of the patient, and the processing unit may be configured to generate the at least one view based on the three-dimensional model. For example, the data may include images of the patient, and the processing unit may be configured to: obtain, from a data source, a computer application configured to generate the three-dimensional model from the images; and/or generate the three-dimensional model with the application.
The processing unit may be configured to: identify capabilities associated with the one or more treatment devices; obtain, from a data source, a computer application based on the identified capabilities; and generate the control settings with the computer application. The one or more treatment devices may include a first treatment device and a second treatment device, and the control settings may include at least one option for operating the first and second treatment devices in a coordinated manner to perform a treatment. For example, the first treatment device may be a laser source, the second treatment device may be a fluid source, and the at least one option may be configured to operate the laser source and fluid source according to a predetermined sequence.
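The coordinated, predetermined sequence for a laser source and a fluid source can be sketched as a simple coordinating script. The device class, command names, and step ordering below are hypothetical stand-ins, not part of this disclosure.

```python
# Illustrative sketch: operate a laser source and a fluid source in a
# predetermined sequence from one coordinating routine.

class DeviceLog:
    """Stand-in for a treatment device; records the commands it receives."""
    def __init__(self, name):
        self.name, self.log = name, []

    def command(self, action, **params):
        self.log.append((action, params))

def run_sequence(laser, fluid):
    """Predetermined sequence: irrigate, fire, then flush and stop."""
    fluid.command("start_flow", mL_min=40)        # clear the field first
    laser.command("fire", power_W=20, pulses=5)   # discharge laser energy
    fluid.command("flush", mL=10)                 # carry away fragments
    fluid.command("stop_flow")
```

The value of the coordinated option is that the user selects one sequence rather than operating each source module separately.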
Another disclosed aspect is an input device. For example, a treatment device may include a handle, and the input device may include a display actuator mounted on the handle and operable with the at least one view. The treatment device may be a scope, and the handle may include the display actuator and one or more scope actuators configured to operate a steerable portion of the scope. The input device may be configured to receive a user-generated signal including at least one of an audio signal, a tactile signal, and a visual signal. For example, the input device may include one or more sensors configured to receive the user-generated signal, such as a display actuator with one or more buttons configured to receive the tactile signal, or a movement sensor configured to receive the visual signal by tracking movements of the display actuator or the user. In one aspect, one of the above-described systems further comprises a projector configured to output the display onto a surface, and the input device may comprise an eye movement sensor configured to receive the visual signal by tracking movements of at least one eye of a user relative to the surface. For example, the surface and/or the eye movement sensor may be head-mounted so as to position the display and sensor within a field of view of the at least one eye.
It is understood that both the foregoing summary and the following detailed descriptions are exemplary and explanatory only, neither being restrictive of the inventions claimed below.
The accompanying drawings are incorporated in and constitute a part of this specification. These drawings illustrate aspects of the present disclosure that, together with the written descriptions herein, serve to explain this disclosure. Each drawing depicts one or more exemplary aspects according to this disclosure, as follows:
Aspects of the present disclosure are now described with reference to operating room devices, methods, and systems. Some aspects are described with reference to urological procedures, wherein a treatment device (e.g., a scope) may be advanced through a path or passage in a body (e.g., a ureter) for removal of an unwanted object (e.g., a stone) from a cavity in the body (e.g., a calyx of a kidney). References to a particular type of procedure, such as a urological procedure; treatment device, such as a scope; unwanted material, such as a stone; or bodily part, such as a ureter, are provided for convenience and not intended to limit this disclosure. Accordingly, the devices, methods, and systems described herein may be utilized for any analogous purposes—medical or otherwise.
The terms “proximal” and “distal,” and their respective initials “P” and “D,” may be utilized along with terms such as “parallel” and “transverse” to describe relative aspects in this disclosure. Proximal refers to a position closer to the exterior of the body (or closer to a user), whereas distal refers to a position closer to the interior of the body (or further away from the user). Appending the initials “P” or “D” to an element number signifies a proximal or distal location or direction. The term “elongated” as used herein refers to any object that is substantially longer in relation to its width, such as an object having a length that is at least two times longer than its width. Unless claimed, however, these terms are provided for convenience and not intended to limit this disclosure to a particular location, direction, or orientation.
As used herein, terms such as “comprises,” “comprising,” or like variations, are intended to cover a non-exclusive inclusion, such that any aspect that comprises a list of elements does not include only those elements or steps, but may include other elements or steps not expressly listed or inherent thereto. Conversely, the terms “consists of” and “consisting of” are intended to cover an exclusive inclusion, such that an aspect that consists of a list of elements includes only those elements. Unless stated otherwise, the term “exemplary” is used in the sense of “example” rather than “ideal.” As used herein, terms such as “about,” “substantially,” “approximately,” or like variations, may indicate a range of values within +/−5% of a stated value.
Aspects of hardware and software are disclosed. Accordingly, some aspects may be entirely hardware, entirely software, or a combination of hardware and software. Some aspects may be described as a computer application, such as a computer program product stored on a computer-usable data storage medium or data source. Such applications may be executed by one or more processors in communication with the data source. Any data source may be utilized, including hard disks, CD-ROMs, optical storage devices, or other electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or like propagation medium.
Any combination of local or remote resources for data processing and storage may be used to execute the described applications, including the combinations described herein. The relative locations of these resources may be optimized to realize useful advances in processing power. For example, the one or more processors may be local to an operating room, such as behind a sterile field; and the data source may be remote from the operating room, such as at a server farm located miles away. Accordingly, some described aspects are particularly useful when obtaining large amounts of data from the data source in real-time, such as during a urological procedure. Unless claimed, however, these examples are provided for convenience and not intended to limit this disclosure to a particular location and/or relativity.
Some aspects may be described using conceptual and/or flowchart illustrations, such as the exemplary method steps depicted in
Numerous aspects of the present disclosure are now described with reference to a base system 100. An exemplary base system 100 is depicted in
Display 10 may comprise at least one view of data associated with a patient, and at least one view of one or more control settings associated with one or more treatment devices. In
Navigation view 20 and map view 30 may be generated from the same or different types of data. As shown in
Control view 40 may be generated by processing unit 50 using data associated with any device. As shown in
Each treatment device 62 may have its own control settings, and control view 40 may include a toggle switch 43 for switching between one or more of the control settings. For example, in
Processing unit 50 may communicate with a plurality of devices, receive data from one or more of the devices, generate display 10 from the data, and control one or more of the devices with display 10 and/or input device 80. As shown in
An exemplary circuit architecture for processing unit 50 is depicted in
Processing unit 50 may serve as a local communication and control hub for base system 100, configured for use in the operating room, behind the sterile field. Unit 50 may be a stand-alone device. For example, as shown in
Processing unit 50 also may be part of another device. For example, processing unit 50 may be formed integral with: a sensor 61, allowing for more direct receipt of sensor data; or an imaging device 69, allowing for more direct receipt of image data (e.g., a real-time feed of fluoroscopic images). For a urology procedure, for example, unit 50 may be formed integral with a vitals monitoring device, allowing for direct receipt of sensor data associated with the patient's vitals (e.g., an EKG signal); or integral with a fluoroscope, allowing for direct receipt of image data including fluoroscopic images of the patient. Processing unit 50 may also be a peripheral attachment for another device. For example, memory 52 and processors 53 may be housed in a USB stick, and transceiver 55 may be configured to establish communications with the one or more devices using WiFi and/or USB protocols. Processing unit 50 may be similarly incorporated into any device described herein.
Many functions of processing unit 50 described herein may be performed with one or more computer applications. Some portion of these applications may be stored on memory 52. Data sources 67 may be used to enhance the capabilities of processing unit 50 and/or memory 52 by providing and/or executing all or portions of the computer applications. In some aspects, data sources 67 may serve as an application store configured to promote selection of computer applications via processing unit 50, and support the ongoing development of such applications. The various capabilities of each device in communication with processing unit 50 may guide the selection. For example, data sources 67 may be used to: provide and/or update computer applications that are stored locally on memory 52 and configured to identify the capabilities of any device in communication with processing unit 50; provide specialized applications configured to leverage the combined capabilities of one or more devices in communication with unit 50, including any combination of sensors 61, treatment devices 62, and/or imaging devices 69; execute analytical applications that would otherwise exceed the local capabilities of processors 53, including the diagnostic and feedback control methods described herein; and access any capabilities provided by third party computer applications, such as applications for environmental control of the operating room, inventory tracking and management, and operating room monitoring.
The one or more treatment devices 62 may include (or be delivered with) a scope 70 configured for use in noninvasive procedures, such as any ureteroscope sold by Boston Scientific® under the brand name LithoVue®. As shown in
An exemplary input device 80 is depicted in
Transceiver 82 and/or display actuator 84 may be mounted on scope body 72. In
As described herein, base system 100 may be uniquely configured to realize operating room benefits by leveraging the capabilities of any device described herein, individually and/or in combination. For example, by utilizing processing unit 50 to obtain and/or access various computer applications for operating these devices (e.g., from data sources 67), system 100 may be further configured to utilize and/or enhance the capabilities of each device, and/or create new combinations of these capabilities. As a further example, by providing an application store (e.g., data sources 67), system 100 also promotes development and support of these computer applications by a greater development community, including doctors seeking medical advances, and software developers seeking technological advances. Different benefits of system 100 may be realized during the preoperative, intraoperative, and postoperative stages of a procedure, such as a urological procedure. Many exemplary benefits are now described with reference to methods 200, 300, and 400.
Exemplary uses for base system 100 are now described with reference to method 200, which may be a preoperative method. As shown in
Method 200 may comprise intermediate steps for receiving data from one or more data sources. As noted above, display 10 may be based on any data associated with a patient, including one or more of the following data types: (i) image data; (ii) sensor data; and (iii) geometric data; and processing unit 50 may be configured to receive the data in real-time or in advance, from any data source, local or remote. Accordingly, receiving step 210 may comprise receiving the data from any data source, including any intermediate steps required to establish a communication between processing unit 50 and the data source. For example, step 210 may comprise: establishing a communication with data sources 67, and obtaining the data therefrom.
Receiving step 210 may include data gathering steps. For example, receiving step 210 may comprise: generating image data (e.g., X-ray images) with one or more imaging sources 69 in advance of a procedure, storing the image data on imaging sources 69 and/or data sources 67, and/or obtaining the image data from sources 69 and/or 67. If the image data is to be received in real-time, then receiving step 210 may comprise: establishing a communication with the imaging devices 69 (e.g., imaging element 78 of scope 70), and receiving the image data therefrom (e.g., as a real-time video feed). The sensor data may be similarly obtained from one or more sensors 61, such that receiving step 210 comprises: establishing a communication with and receiving sensor data from one or more sensors 61. For example, sensors 61 may include one or more of an ultrasound transducer, a pressure sensor, a light sensor, an irradiation sensor, or like sensor, each sensor 61 being configured to output sensor data, wherein receiving step 210 comprises receiving the data.
Geometric data may be received or generated in receiving step 210. For example, geometric data including a three-dimensional model of the patient, or a portion of the patient, may be generated in advance of a procedure and stored on a data source 67, such that receiving step 210 comprises: receiving the three-dimensional model from the data source 67; and selecting the at least one view from the three-dimensional model. Processing unit 50 may comprise a graphics module 57 (e.g.,
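The step of selecting the at least one view from a stored three-dimensional model can be sketched by treating the model as a voxel volume from which two-dimensional slices are extracted. The volume representation and shape below are assumptions for illustration only.

```python
# Minimal sketch: select one 2-D view (slice) from a 3-D model stored as
# a voxel volume.
import numpy as np

def select_view(volume, axis, index):
    """Extract one 2-D view from a 3-D model along the given axis."""
    return np.take(volume, index, axis=axis)

model = np.arange(2 * 3 * 4).reshape(2, 3, 4)  # placeholder 3-D model
axial = select_view(model, axis=0, index=1)    # one selected view
```

A graphics module such as module 57 could apply the same idea with projections or oblique reformats rather than axis-aligned slices.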
Step 210 may comprise generating the three-dimensional model using one or more computer applications. For example, receiving step 210 may comprise: analyzing the image and/or sensor data to identify a computer application configured to generate a three-dimensional model of the patient with said data; obtaining the identified application from data sources 67; and generating the three-dimensional model from said data with the application. Graphics module 57, for example, may be configured to execute the computer application using image and/or sensor data stored local or remote to memory 52.
Method 200 may be utilized to enrich the three-dimensional model. For example, receiving step 210 may comprise: identifying capabilities of one or more sensors 61 and/or imaging sources 69 in communication with processing unit 50; identifying one or more computer applications configured to generate layers for the three-dimensional model with the identified capabilities; and incorporating these layers into the model. For a urology procedure, for example, step 210 may comprise: identifying capabilities of sensors 61 and devices 69; identifying a stone identification application configured to map locations and characteristics of each stone at a treatment site 32 using the identified capabilities; utilizing the stone identification application to generate a layer including the mapped locations and characteristics; and incorporating the layer into map view 30. Additional examples are described below.
Determining step 220 may include intermediate planning steps. As shown in
Aspects of determining step 220 may be responsive to user-generated signals, such as a first tactile signal for operating portion 30A with display device 60, and/or a second tactile signal for operating portion 30B with input device 80. A computer application may be used to perform any portion of determining step 220. For example, the stone identification program described above may be used in step 220 to automatically identify and locate stones at each treatment site 32A-D based on image and/or sensor data. At any point in step 220, additional information 35 (e.g., procedure notes) also may be automatically and/or manually associated with each treatment site 32A-D and/or listing of potential treatments 34A-D.
Step 220 may include intermediate configuration steps. For example, as with other method steps described herein, determining step 220 may comprise: identifying capabilities of one or more sensors 61 in communication with processing unit 50; identifying one or more computer applications configured to leverage the identified capabilities; and/or obtaining the one or more computer applications. For a urology procedure, for example, step 220 may comprise: identifying the capabilities of any sensors 61 configured to detect heat and radiation at a treatment site 32; identifying a laser treatment application configured to control the discharge of laser energy in response to the detected heat and radiation; and obtaining the laser treatment application from data sources 67. Similar configuration steps may be performed for any device described herein. Additional configuration steps also may comprise: updating planning views 34A-D; obtaining additional inputs for the aforementioned computer applications; and/or selecting or enabling selection of potential treatments using display device(s) 60 and/or input device 80.
Identifying step 230 may include intermediate steps for receiving control settings for the one or more treatment devices 62. The control settings may be received directly from each device 62. For example, step 230 may comprise receiving the control settings directly from each treatment device 62 during a handshaking process performed to establish communication with processing unit 50. In some aspects, the control settings may be obtained from one or more data sources 67. For example, step 230 also may comprise obtaining a device identifier from each treatment device 62 (e.g., during the handshaking process); delivering the device identifier to data sources 67; and receiving the control settings therefrom. The device identifier also may be used to promote the development of additional settings. For example, the identifier may be associated with a device specification that can be utilized by third-party developers to develop new control settings, and processing unit 50 may be configured to make those developments available for immediate use via sources 67.
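The handshake described in step 230 (obtain a device identifier, forward it to a data source, receive control settings back) can be sketched as a lookup. The registry dictionary, identifier strings, and setting ranges below are hypothetical stand-ins for data sources 67.

```python
# Hypothetical sketch of identifying step 230: exchange a device
# identifier during a handshake, then fetch control settings for that
# identifier from a data source.

SETTINGS_REGISTRY = {   # stand-in for remote data sources 67
    "laser-1234": {"power_W": (5, 60), "pulse_Hz": (1, 20)},
    "pump-5678": {"flow_mL_min": (0, 100)},
}

def handshake(device):
    """Return the device identifier exchanged when connecting."""
    return device["id"]

def fetch_control_settings(device_id, registry=SETTINGS_REGISTRY):
    """Look up control settings for a device identifier at the data source."""
    return registry.get(device_id, {})

settings = fetch_control_settings(handshake({"id": "laser-1234"}))
```

An unknown identifier simply returns no settings, which is where the third-party development path described above could supply new entries.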
A computer application may be used to determine the control settings. For example, identifying step 230 may comprise: identifying capabilities of one or more treatment devices 62 in communication with processing unit 50; identifying one or more computer applications configured to generate a procedure-specific set of control settings based on identified capabilities; and/or obtaining the one or more computer applications. For a urology procedure, for example, identifying step 230 may comprise: identifying the capabilities of a fluid source configured to deliver fluid to a treatment site 32; identifying a urology-specific fluid management application configured to generate control settings for controlling the fluid source; and/or obtaining the fluid management application from data sources 67. New and/or combined capabilities may be realized in this manner. For example, step 230 may comprise utilizing said computer applications to generate control settings for operating a first treatment device 62 together with a second treatment device 62 to perform a particular treatment. In keeping with the previous urology example, step 230 may further comprise: identifying the capabilities of a laser source configured to discharge laser energy to the treatment site 32; identifying a treatment application configured to generate control settings for controlling the laser source together with the fluid source to perform a treatment; and obtaining the treatment application from data sources 67.
Generating step 240 may include intermediate steps for generating display 10. In some aspects, step 240 may comprise: identifying a location 24 (e.g.,
Other uses for system 100 are now described with reference to method 300 of
Generating step 310 may include intermediate steps for configuring display 10, including any steps described above with respect to generating step 240 of method 200. For example, generating step 310 may comprise identifying location 24, establishing communications between tracking module 58 and locator beacon 79, and/or configuring display 10 for use with one or more display devices 60.
Overlaying step 320 may be utilized to augment display 10. Navigation view 20 of
Overlaying step 320 also may include intermediate steps for selecting or “tagging” portions of display 10. One example is provided in
Moving step 330 may include intermediate steps for using display 10 to guide movements of treatment device 62. As shown in
Moving step 330 also may include intermediate steps for operating scope 70 and/or delivery mechanism 76. As shown in
Obtaining step 340 may be utilized to obtain whatever inputs are necessary for operating treatment device 62 (or other device described herein). For example, obtaining step 340 may comprise receiving an input from user 1 including at least one of an audio signal, a tactile signal, and a visual signal; converting the input into a control signal; and outputting the control signal. Each signal may be user-generated and/or specific to treatment types, power levels, times, or like quantities. In some aspects, display 10 and/or input device 80 may be configured to receive the input. For example, control view 40 may be output to a display device 60, and the configuration of display actuator 84 may correspond with the configuration of control view 40, such that either display 10 or input device 80 may be utilized to receive a similar tactile signal. One or more sensors 61 also may be configured to receive the input. For example, sensors 61 may include a movement sensor configured to receive a visual signal by tracking movements of display actuator 84 and/or a portion of user 1.
Obtaining step 340 may comprise intermediate steps for modifying the control signal. For example, obtaining step 340 may comprise: identifying an object at a treatment site 32; determining a characteristic of the identified object with one or more sensors 61; and modifying the control signal based on the determined characteristic. Any characteristic may be determined and utilized within step 340, and any number of computer applications may be used to make these determinations. One example is depicted in
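One hypothetical way to picture the modification of a control signal based on a sensed characteristic is a scaling factor keyed to that characteristic. The composition labels and modifier values below are invented for illustration only:

```python
# Hypothetical lookup: power modifier per sensed object characteristic
# (here, an assumed stone composition determined by sensors 61).
POWER_MODIFIER = {
    "calcium_oxalate": 1.0,
    "uric_acid": 0.8,
    "cystine": 1.2,
}

def modify_control_signal(power_level: float, composition: str) -> float:
    """Scale a requested power level by a factor chosen from the
    determined characteristic; unknown characteristics leave it unchanged."""
    return power_level * POWER_MODIFIER.get(composition, 1.0)
```

A usage example: a requested power level of 10.0 would be scaled to 8.0 for the hypothetical "uric_acid" label.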
Operating step 350 may be utilized to apply the treatment energy. At this point in method 300, all of the control settings may have been determined, such that step 350 comprises: activating treatment device 62 with activation switch 42 of control view 40 and/or display actuator 84 of input device 80; and discharging the treatment energy towards a targeted object at the treatment site 32. Similar to above, operating step 350 may comprise intermediate modification steps. For example, control view 40 of
Method 300 may be utilized to establish a control loop for one or more treatment devices 62, such as a feedback control loop that modifies an output of a treatment device 62 responsive to a feedback signal generated by a sensor 61. For example, operating step 350 may comprise: determining a characteristic of the targeted object with one or more sensors 61; and modifying a treatment energy based on the determined characteristic. For a urology procedure, for example, treatment device 62 may be configured to discharge laser energy toward a stone located at a treatment site 32, and operating step 350 may comprise: receiving a reflected portion of the laser energy from the stone and/or site 32; analyzing the reflected portion with one or more sensors 61; and modifying the laser energy responsive to an output from sensors 61. In this example, the feedback control loop may prevent unwanted tissue damage by stopping or tapering the discharge of the laser energy if/when the output from sensors 61 indicates that the stone has been destroyed. Similar control loops may be established for any device described herein. Continuing the previous urology example, source elements 68 may include a fluid or medicine source, and step 350 may comprise: analyzing a characteristic of site 32 with one or more sensors 61, such as internal pressure or irradiation levels; and modifying an amount of fluid or medicine flow responsive to the output from the sensor 61. One or more computer applications may be used to establish the feedback control loops, such that operating step 350 may include any intermediate steps for identifying, obtaining, and utilizing the applications.
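The feedback control loop described above can be sketched, purely hypothetically, as a loop that modulates discharged energy based on an analyzed reflection and stops when the reflection indicates the target is destroyed. The function names, thresholds, and modulation rule are all illustrative assumptions:

```python
def run_feedback_loop(read_reflection, set_energy, max_steps=100,
                      initial_energy=1.0, threshold=0.1):
    """Hypothetical feedback loop: read an analyzed reflected signal
    (0..1), modulate the discharged energy accordingly, and stop the
    discharge once the reflection falls below a destruction threshold."""
    energy = initial_energy
    for _ in range(max_steps):
        reflection = read_reflection()
        if reflection < threshold:
            set_energy(0.0)  # target destroyed: stop discharge
            return True
        # Taper energy as the reflection weakens (invented modulation rule).
        energy = min(initial_energy, energy * (0.5 + reflection))
        set_energy(energy)
    set_energy(0.0)  # safety stop if the target never reads as destroyed
    return False
```

The same loop shape would apply to the fluid/medicine example, with the reflection reading replaced by an internal-pressure reading and the energy setter replaced by a flow-rate setter.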
Still other exemplary uses for system 100 are now described with reference to method 400, which may be a postoperative method. As shown in
Recording step 410 may include, for example, intermediate steps for recording any data generated during the performance of methods 200 and 300. The recorded data may assume any form. For example, recording step 410 may comprise generating a video stream of display 10 during a procedure, thereby recording each selection, input, or output relative thereto. Because display 10 includes at least one view of data associated with the patient (e.g., navigation view 20 and/or map view 30), and a view of the associated control settings (e.g., control view 40), said video stream may be used to archive any number of decisions by user 1 and/or system 100. The recorded data also may include any data generated by one or more patient monitors 63 (e.g., a vital signs monitor), and/or one or more operating room monitors 64 (e.g., an observation camera). For example, in step 410, one operating room monitor 64 may be configured to record a quantum of materials used during a procedure by tracking the usage and/or weight of each device in the operating room.
Reporting step 420 may be utilized to perform various archival functions with the recorded data. For example, reporting step 420 may comprise: analyzing the recorded data; and generating one or more reports therefrom. In one aspect, the one or more reports may be utilized to improve operating room efficiency by, for example, automatically summarizing treatments, treatment times, results, and like performance measures. These reports also may be used for inventory management purposes. For example, reporting step 420 may comprise: generating a report based on the quantum of materials used during the procedure; and delivering the report to a third party for automatic restocking of said materials. The quantum of materials may be determined from a user input, or with one or more operating room monitors 64, as noted above.
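As an illustrative sketch of the inventory-management aspect of reporting step 420 (with all field names and the one-for-one reorder policy being assumptions), the quantum of materials used might be tallied from a usage log and turned into a restock request:

```python
from collections import Counter

def generate_restock_report(usage_log):
    """Tally the materials recorded during a procedure and produce a
    restock request, one entry per material, sorted by name."""
    counts = Counter(usage_log)
    return [{"material": name, "used": n, "reorder_qty": n}
            for name, n in sorted(counts.items())]
```

The resulting list could then be delivered to a third party for automatic restocking, as the step contemplates.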
Sending step 430 may be utilized to archive the recorded data and/or one or more reports for future use. For example, step 430 may comprise: sending the recorded data and/or one or more reports to memory 52 and/or data sources 67 together with a patient identifier, such as a reference number associated with the patient's electronic medical records or a social security number. Some portion of the recorded data and/or the one or more reports also may be output to data sources 67 without the patient identifier for use in various statistical reports.
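The two archival paths of sending step 430 (with and without the patient identifier) can be sketched, hypothetically, as a copy operation that strips the identifier when the copy is destined for statistical reporting. The record layout and key name are invented:

```python
def archive_record(record, include_identifier=True):
    """Return a copy of the recorded data for archival; strip the
    patient identifier when destined for statistical reporting."""
    archived = dict(record)  # copy, so the original record is untouched
    if not include_identifier:
        archived.pop("patient_id", None)
    return archived
```

The de-identified copy could then be output to data sources 67 for statistical use, while the full copy remains linked to the patient's records.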
Additional aspects of system 100 are now described with reference to input device 180 of
Aspects of input device 280 may be configured for hands-free operation of display 10 within the sterile field. As shown in
Visual sensor 288 may include any type of camera or motion sensor, any of which may be operable with any type of display device 60. As shown in
Numerous benefits may be realized with aspects of the operating room devices, methods, and systems described herein. For example, aspects of system 100 may be configured to increase operating room efficiency, reduce burdens placed on operating room assistants and administrative staff, and improve patient safety, all without sacrificing desirable outcomes. According to this disclosure, some benefits may be realized by utilizing system 100 as a hub configured to communicate with a plurality of devices local to or remote from the operating room, and generate display 10 as one means for controlling at least some of those devices from a position inside the operating room, such as behind a sterile field. Other benefits may be realized by expanding the capabilities of system 100 to account for the unique capabilities of each device in communication therewith, such as sensors 61, treatment devices 62, and/or imaging devices 69, any of which may be used to generate data associated with the patient, generate or modify a three-dimensional model of the patient with said data, analyze portions of the data, and/or perform like functions, any of which may be further expanded by use of one or more computer applications.
While principles of the present disclosure are described herein with reference to illustrative aspects for particular applications, the disclosure is not limited thereto. Those having ordinary skill in the art and access to the teachings provided herein will recognize that additional modifications, applications, aspects, and substitutions of equivalents all fall within the scope of the aspects described herein. Accordingly, the present disclosure is not to be considered as limited by the foregoing description.
This patent application is a continuation under 37 CFR § 1.53(b) of U.S. application Ser. No. 15/974,403, filed May 8, 2018, now U.S. Pat. No. 10,881,482, which claims the benefit of priority under 35 U.S.C. § 119 to U.S. Provisional Patent Application No. 62/503,774, filed May 9, 2017, each of which is herein incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5383874 | Jackson | Jan 1995 | A |
6017354 | Culp | Jan 2000 | A |
6106460 | Panescu | Aug 2000 | A |
6540685 | Rhoads | Apr 2003 | B1 |
6603494 | Banks | Aug 2003 | B1 |
7263397 | Hauck | Aug 2007 | B2 |
7285117 | Krueger | Oct 2007 | B2 |
7567233 | Garibaldi | Jul 2009 | B2 |
7633502 | Willis | Dec 2009 | B2 |
7720520 | Willis | May 2010 | B2 |
7885707 | Hauck | Feb 2011 | B2 |
8073528 | Zhao | Dec 2011 | B2 |
8412307 | Willis | Apr 2013 | B2 |
8535303 | Avitall | Sep 2013 | B2 |
9078565 | Profio | Jul 2015 | B2 |
9192788 | Vahala | Nov 2015 | B2 |
9220567 | Sutherland | Dec 2015 | B2 |
9320646 | Todd | Apr 2016 | B2 |
9375288 | Robinson | Jun 2016 | B2 |
9439736 | Olson | Sep 2016 | B2 |
9457168 | Moll | Oct 2016 | B2 |
9545192 | Braun | Jan 2017 | B2 |
9629567 | Porath | Apr 2017 | B2 |
9636031 | Cox | May 2017 | B2 |
9730602 | Harlev | Aug 2017 | B2 |
9733119 | Schmid | Aug 2017 | B2 |
9818231 | Coffey | Nov 2017 | B2 |
9820802 | Boveja | Nov 2017 | B1 |
9888862 | Harlev | Feb 2018 | B2 |
9918792 | Boveja | Mar 2018 | B1 |
9955986 | Shah | May 2018 | B2 |
10034637 | Harlev | Jul 2018 | B2 |
10576263 | Botzer | Mar 2020 | B2 |
10758212 | Wiemker | Sep 2020 | B2 |
10779796 | Hiltner | Sep 2020 | B2 |
10888235 | Hagfors | Jan 2021 | B2 |
10973584 | Grunwald | Apr 2021 | B2 |
10978184 | Sorenson | Apr 2021 | B2 |
11071602 | Pereira | Jul 2021 | B2 |
11406352 | Coolidge | Aug 2022 | B2 |
20050154314 | Quistgaard | Jul 2005 | A1 |
20070066911 | Klingenbeck-Regn | Mar 2007 | A1 |
20070083193 | Werneth | Apr 2007 | A1 |
20070167702 | Hasser | Jul 2007 | A1 |
20080081982 | Simon | Apr 2008 | A1 |
20080194918 | Kulik | Aug 2008 | A1 |
20080269572 | Kanz | Oct 2008 | A1 |
20090153548 | Rabben | Jun 2009 | A1 |
20090156926 | Messerly | Jun 2009 | A1 |
20090234328 | Cox | Sep 2009 | A1 |
20090299175 | Bernstein | Dec 2009 | A1 |
20100228249 | Mohr | Sep 2010 | A1 |
20100268059 | Ryu | Oct 2010 | A1 |
20100286518 | Lee | Nov 2010 | A1 |
20100286519 | Lee | Nov 2010 | A1 |
20100286520 | Hazard | Nov 2010 | A1 |
20110060219 | Small | Mar 2011 | A1 |
20110176490 | Mehta | Jul 2011 | A1 |
20110282188 | Burnside | Nov 2011 | A1 |
20120016239 | Barthe | Jan 2012 | A1 |
20120029504 | Afonso | Feb 2012 | A1 |
20120136242 | Qi | May 2012 | A1 |
20120150035 | Seip | Jun 2012 | A1 |
20120323233 | Maguire | Dec 2012 | A1 |
20120330190 | Gliner | Dec 2012 | A1 |
20130041243 | Byrd | Feb 2013 | A1 |
20130053651 | Tarn | Feb 2013 | A1 |
20130096575 | Olson | Apr 2013 | A1 |
20130123773 | Schwartz | May 2013 | A1 |
20130274582 | Afonso | Oct 2013 | A1 |
20130316318 | Frank | Nov 2013 | A1 |
20130317351 | Case | Nov 2013 | A1 |
20130330701 | Rubinstein | Dec 2013 | A1 |
20140031808 | Phan | Jan 2014 | A1 |
20140046261 | Newman | Feb 2014 | A1 |
20140066764 | Subramaniam | Mar 2014 | A1 |
20140081262 | Koblish | Mar 2014 | A1 |
20140114297 | Woodley | Apr 2014 | A1 |
20140128881 | Tyc | May 2014 | A1 |
20140176554 | Cohen | Jun 2014 | A1 |
20140180083 | Hoseit | Jun 2014 | A1 |
20140181716 | Merritt | Jun 2014 | A1 |
20140188133 | Misener | Jul 2014 | A1 |
20140276036 | Collins | Sep 2014 | A1 |
20150018701 | Cox | Jan 2015 | A1 |
20150119725 | Martin | Apr 2015 | A1 |
20150164592 | Elhawary | Jun 2015 | A1 |
20150209013 | Tsymbalenko | Jul 2015 | A1 |
20150230863 | Youngquist | Aug 2015 | A1 |
20150238102 | Rubinstein | Aug 2015 | A1 |
20150265242 | Stonefield | Sep 2015 | A1 |
20150306340 | Giap | Oct 2015 | A1 |
20160038047 | Urman | Feb 2016 | A1 |
20160147308 | Gelman | May 2016 | A1 |
20160183824 | Severino | Jun 2016 | A1 |
20160183841 | Duindam | Jun 2016 | A1 |
20160294951 | Durrant | Oct 2016 | A1 |
20160331461 | Cheatham, III | Nov 2016 | A1 |
20170042449 | Deno | Feb 2017 | A1 |
20170084027 | Mintz | Mar 2017 | A1 |
20170086700 | Stewart | Mar 2017 | A1 |
20170119353 | Nielsen | May 2017 | A1 |
20170120080 | Phillips | May 2017 | A1 |
20170151027 | Walker | Jun 2017 | A1 |
20170161936 | Katz | Jun 2017 | A1 |
20170202534 | Crotty | Jul 2017 | A1 |
20170325901 | Harlev | Nov 2017 | A1 |
20170333125 | Lepak | Nov 2017 | A1 |
20170340389 | Otto | Nov 2017 | A1 |
20180160978 | Cohen | Jun 2018 | A1 |
20180177383 | Noonan | Jun 2018 | A1 |
20180199995 | Odermatt | Jul 2018 | A1 |
20180240237 | Donhowe | Aug 2018 | A1 |
20180296113 | Stewart | Oct 2018 | A1 |
20180296167 | Stewart | Oct 2018 | A1 |
20190110843 | Ummalaneni | Apr 2019 | A1 |
20190125361 | Shelton, IV | May 2019 | A1 |
20190125455 | Shelton, IV | May 2019 | A1 |
20190167366 | Ummalaneni | Jun 2019 | A1 |
20190216540 | Melsky | Jul 2019 | A1 |
20190254759 | Azizian | Aug 2019 | A1 |
20190282301 | Bonillas Vaca | Sep 2019 | A1 |
20190320878 | Duindam | Oct 2019 | A1 |
20190350659 | Wang | Nov 2019 | A1 |
20190365350 | Chiang | Dec 2019 | A1 |
20200054399 | Duindam | Feb 2020 | A1 |
20200078103 | Duindam | Mar 2020 | A1 |
20200242767 | Zhao | Jul 2020 | A1 |
20210121251 | Aljuri | Apr 2021 | A1 |
20230372673 | Gu | Nov 2023 | A1 |
Number | Date | Country |
---|---|---|
103221976 | Jul 2013 | CN |
2007516809 | Jun 2007 | JP |
2015526111 | Sep 2015 | JP |
2014053010 | Apr 2014 | WO |
2014062219 | Apr 2014 | WO |
Entry |
---|
Office Action issued in Japanese Patent Application No. 2019-561216 dated Sep. 26, 2022 (2 pages). |
Office Action issued in Japanese Patent Application No. 2019-561216 dated May 30, 2022 (3 pages). |
International Search Report and Written Opinion for corresponding International Application No. PCT/US2018/031655, dated Jul. 11, 2018 (12 pages). |
Anonymous: “Service Location Protocol—Wikipedia”, Apr. 12, 2014 (Apr. 12, 2014), XP055385733, Retrieved from the Internet: URL:https://en.wikipedia.org/w/index.php?title=Service_Location_Protocol&oldid=603840275 [retrieved on Jun. 27, 2017] (5 pages). |
Communication pursuant to Article 94(3) EPC in European Application No. 18726704.2, dated Feb. 24, 2023 (6 pages). |
Office Action in Chinese Application No. 201880030337.3, dated Mar. 7, 2023 (10 pages). |
Australian Examination Report in AU2018266132, dated Aug. 22, 2023 (5 pages). |
Number | Date | Country | |
---|---|---|---|
20210085425 A1 | Mar 2021 | US |
Number | Date | Country | |
---|---|---|---|
62503774 | May 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15974403 | May 2018 | US |
Child | 17111595 | US |