The invention relates to a control device for a medical apparatus, in particular to a control device that enables a contactless operation of the medical apparatus.
Medical apparatuses, e.g. surgical tables or surgical lamps, which are operated at the apparatuses themselves, at operating panels on the wall, or via remote controls, are known. For this, it is necessary that the operator is in the vicinity of these operating elements. Furthermore, a surgeon cannot operate the apparatuses himself, since the operating elements are usually located outside of the sterile area and the surgeon would become nonsterile by touching them.
Therefore, an increased coordination effort is necessary to operate the apparatuses. First, the operator has to be addressed by the surgeon; the operator then has to approach the operating elements or, as the case may be, find the remote control, check the orientation of the operating elements with respect to the apparatus, start and stop the action, and, as the case may be, correct it.
The invention is based on the object of remedying the above disadvantages and providing a control device for medical apparatuses that allows a contactless, simple operation of the medical apparatuses so that the surgeon can operate the apparatuses himself without becoming nonsterile.
The object is achieved by a control device according to claim 1 and a method for controlling medical apparatuses according to claim 15.
By the control device according to the invention, it is possible to detect an object, e.g. a finger of a surgeon, and its direction. By means of a sensor, it can be detected toward which element to be activated of a medical apparatus the object is directed, whereupon a predefined action of that element is executed. Thereby, a contactless operation of the medical apparatus is possible, and an intuitive operation of the medical apparatus is also enabled.
The invention is now elucidated by means of embodiments with reference to the attached drawings.
Furthermore, a 3D sensor 5 is connected to the control device 2. By the 3D sensor 5, objects in the room as well as their shape, position, and motion are detected. Here, detecting the motion means that a translation or rotation of the entire object in a certain motion sequence, a deformation by a motion of certain portions of the object, as well as a combination of the translation or rotation and the deformation of the object are detected. In an alternative embodiment, several 3D sensors are provided to detect the objects from different directions. Thus, an enhanced reliability in detecting the object and a larger space in which the object can be detected are possible. Optionally, the 3D sensor is or the 3D sensors are fixed to the room ceiling so that the installation is less susceptible to damage. Alternatively, the at least one 3D sensor 5 is attached to or integrated in the surgical table 1′ or the surgical lamp 1″, i.e., in the medical apparatus itself. The utilization of a 3D sensor represents a suitable sensor selection; however, the invention is not limited to a 3D sensor, and another suitable sensor can be employed. The gesture intended for the control can also be recognized by the utilization of camera systems for image recognition.
In one embodiment, the 3D sensor 5 is a so-called ToF (time-of-flight) camera. Here, the distances from the camera are measured for several pixels in such a way that the object is illuminated by a light pulse and the light is reflected by the object. The time needed by the light from emission until its return to the ToF camera is measured for each pixel, wherein this time is proportional to the distance. Thereby, the object is not scanned point by point; rather, the entire object is recorded simultaneously.
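For illustration only (not part of the claimed subject matter), the per-pixel distance follows from the measured round-trip time as d = c·t/2, since the light travels to the object and back. A minimal sketch in Python, assuming a ToF frame is given as an array of round-trip times in nanoseconds:

```python
import numpy as np

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distances_from_round_trip_times(round_trip_ns: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times (ns) into distances (m).

    The light travels to the object and back, so the one-way
    distance is half the round-trip path: d = c * t / 2.
    """
    round_trip_s = round_trip_ns * 1e-9
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

# Example: a 2x2 "frame" of round-trip times.
frame_ns = np.array([[6.67, 13.34], [20.0, 6.67]])
print(distances_from_round_trip_times(frame_ns))  # ~[[1.0, 2.0], [3.0, 1.0]] m
```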
The control device 2 is adapted to transform the motions of the objects into predefined actions of the medical apparatuses.
The actions of the surgical lamp 1″ are e.g. a brightness change, a change of the light field diameter, a change of a focus situation, a change of a color temperature of the emitted light, or a change of the active zones where illuminants of the surgical lamp 1″ are activated.
The surgical table 1′ comprises a patient support plate 6 and a column 7. In the patient support plate 6 and in the column 7, several drives are provided by which the patient support plate 6 can be moved.
The actions of the surgical table 1′ are e.g. a longitudinal displacement L, whereupon the patient support plate 6 is displaced along its longitudinal axis, or a transversal displacement Q, whereupon the patient support plate 6 is displaced transversely with respect to its longitudinal axis. Furthermore, a swing K, whereupon the patient support plate 6 is pivoted around its longitudinal axis, and a Trendelenburg adjustment or an anti-Trendelenburg adjustment, whereupon the patient support plate 6 is pivoted around its transversal axis, are possible. Moreover, there is the option to change the height of the entire patient support plate 6 by a height adjustment H. Finally, individual segments of the patient support plate 6 can be pivoted with respect to one another to realize specific support positions for certain procedures on a patient.
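For illustration only, the table actions named above could be represented in control software as an enumeration onto which the control device 2 maps detected gestures; the names below are hypothetical and merely mirror the letters L, Q, K, and H used in the description:

```python
from enum import Enum, auto

class TableAction(Enum):
    """Predefined actions of the surgical table (hypothetical naming)."""
    LONGITUDINAL_DISPLACEMENT = auto()  # L: along the longitudinal axis
    TRANSVERSAL_DISPLACEMENT = auto()   # Q: transverse to the longitudinal axis
    SWING = auto()                      # K: pivot around the longitudinal axis
    TRENDELENBURG = auto()              # pivot around the transversal axis
    ANTI_TRENDELENBURG = auto()
    HEIGHT_ADJUSTMENT = auto()          # H: raise or lower the entire plate
    SEGMENT_PIVOT = auto()              # pivot individual segments
```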
The selection of the medical apparatus to be controlled among several medical apparatuses (here, whether an action of the surgical table 1′ or of the surgical lamp 1″ is to be triggered) is done by activating the medical apparatus via a physical or virtual operating unit, by executing a gesture in a detection space assigned to the medical apparatus, to one of its components, or to its functions, by the use of medical-apparatus-dependent gestures, and/or by a gesture-based selection as specified below.
Basically, depending on the medical apparatus and/or the selected component to be controlled, identical gestures can be transformed into different control instructions. The detection space assigned to a medical apparatus, a component, and/or a function can also be relevant only for a login or logout gesture, whereas the gesture for the control instruction itself can again be executed outside of this space if this seems advantageous. With defined detection spaces, however, a login/logout gesture can be waived if the detection space is usually not used without an intention to control.
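A minimal sketch of this context dependence, with hypothetical apparatus and gesture labels: one and the same gesture key resolves to different control instructions depending on the currently selected medical apparatus:

```python
# Hypothetical dispatch table: (selected apparatus, gesture) -> control instruction.
GESTURE_MAP = {
    ("surgical_table", "two_hands_up"): "height_adjustment_up",
    ("surgical_table", "two_hands_down"): "height_adjustment_down",
    ("surgical_lamp", "two_hands_up"): "increase_brightness",
    ("surgical_lamp", "two_hands_down"): "decrease_brightness",
}

def resolve_instruction(selected_apparatus: str, gesture: str) -> str | None:
    """Return the control instruction for a gesture in the current context,
    or None if the gesture is not defined for the selected apparatus."""
    return GESTURE_MAP.get((selected_apparatus, gesture))

print(resolve_instruction("surgical_table", "two_hands_up"))  # height_adjustment_up
print(resolve_instruction("surgical_lamp", "two_hands_up"))   # increase_brightness
```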
Illustration A shows a waving motion by one single hand 8. Here, the hand is held flat and is not deformed, e.g. not clenched, during the waving, thus during the translation or rotation.
Illustration B shows a motion of two hands 8, 8′, wherein the hands 8, 8′ execute a deformation, namely from a flat hand into a shape in which, as in a gripping motion, the tips of the fingers are joined, and a translation, namely from top to bottom. The motion principle can also be directly transferred to a one-hand gesture.
In illustration C, a motion in which both hands 8, 8′ are held flat and are moved from top to bottom is shown.
In illustration D, a gesture is shown in which one of the hands 8, clenched into a fist, remains at one place, whereas the other hand 8′ is deformed as described for illustration B and is then moved from bottom to top or in an arc around the fist of the hand 8.
The control device 2 is adapted to process a specific one of the gestures as a login motion and as a logout motion. Here, illustration A shows the login motion as an authorization gesture. Only after the login motion has been executed is the control device 2 adapted to process further motions of the hand or of the hands such that they are transformed into control instructions for the surgical table 1′.
After execution of the intended motion, the logout motion is executed in turn, wherein the control device 2 is adapted such that this motion is understood as a logout motion, so that no further motions are transformed by the control device into a control instruction for an action of the surgical table 1′. Only a renewed login motion is processed again. Alternatively, the login motion and the logout motion can also be different.
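A minimal sketch of this login/logout gating, with hypothetical gesture labels: detected gestures are transformed into control instructions only while a session opened by the login motion is active:

```python
class GestureSession:
    """Gate control instructions behind a login/logout gesture (sketch)."""

    LOGIN_GESTURE = "wave_flat_hand"   # illustration A; hypothetical label
    LOGOUT_GESTURE = "wave_flat_hand"  # may alternatively differ from login

    INSTRUCTIONS = {  # hypothetical gesture -> control instruction map
        "two_hands_grip_up": "height_adjustment_up",
        "two_hands_flat_down": "height_adjustment_down",
    }

    def __init__(self) -> None:
        self.active = False

    def handle(self, gesture: str) -> str | None:
        """Return a control instruction, or None while gated off."""
        if not self.active:
            if gesture == self.LOGIN_GESTURE:
                self.active = True  # login: authorize further motions
            return None
        if gesture == self.LOGOUT_GESTURE:
            self.active = False  # logout: ignore further motions
            return None
        return self.INSTRUCTIONS.get(gesture)

session = GestureSession()
session.handle("wave_flat_hand")            # login motion
print(session.handle("two_hands_grip_up"))  # height_adjustment_up
session.handle("wave_flat_hand")            # logout motion
print(session.handle("two_hands_grip_up"))  # None: no longer transformed
```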
In illustration B, the gesture by which, in this embodiment of the surgical table 1′, the height adjustment H is activated and the patient support plate 6 is moved upward is illustrated. The adjustment is possible for the entire table; however, it can also be provided only for selected segments, e.g. by executing a gesture in a detection space assigned to the segment. For the height adjustment, the gesture is executed by several hands 8, 8′ executing a predefined motion, namely the deformation, in particular starting from a flat hand into a shape in which the tips of the fingers are joined, and a subsequent bottom-up translation of the hands. The control device 2 is adapted to recognize whether several hands 8, 8′ execute the predefined motion and transforms the detected motions, according to a combination of the detected motions, into the predefined action, here the upward motion of the patient support plate 6. A one-hand gesture with an identical or alternative gesture shape is also conceivable.
Also in illustrations C and D, the gesture is executed by several hands 8, 8′. In illustration C, both flat hands are moved downward in a mere translation so that the height adjustment H is activated and the patient support plate 6 is adjusted downward. In illustration D, however, the one hand is not moved in a translational manner or deformed; merely its shape, namely the hand 8 as a fist, is detected. The other hand 8′ executes a translation and a deformation. Thereby, a Trendelenburg adjustment or an anti-Trendelenburg adjustment is actuated.
The motion as well as the deformation of the hands 8, 8′ are only detected if they are executed at a certain speed within a predetermined tolerance range. If a motion is executed too fast or too slow, it is not detected as a predefined motion but is ignored.
The speed of a motion instruction can basically correspond to the speed of the gesture motion or be in a defined relationship to it; it can be related to a distance of the object or to a plane in the detection space, and/or it can depend on an additional gesture.
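A minimal sketch combining both speed rules, with assumed numeric bounds (the description leaves the concrete values open): a gesture speed outside the tolerance range is ignored, and an accepted gesture speed is mapped to the drive speed by a defined, here linear, relationship:

```python
# Assumed bounds; the description leaves the concrete values open.
MIN_GESTURE_SPEED_M_PER_S = 0.05
MAX_GESTURE_SPEED_M_PER_S = 1.5
TABLE_SPEED_SCALE = 0.02  # hypothetical: 1 m/s hand speed -> 0.02 m/s drive speed

def instruction_speed(gesture_speed: float) -> float | None:
    """Map a measured gesture speed to a drive speed, or reject it.

    Speeds outside the predetermined tolerance range are ignored,
    i.e. not detected as a predefined motion.
    """
    if not (MIN_GESTURE_SPEED_M_PER_S <= gesture_speed <= MAX_GESTURE_SPEED_M_PER_S):
        return None  # too fast or too slow: ignore
    return gesture_speed * TABLE_SPEED_SCALE

print(instruction_speed(0.5))  # 0.01 m/s drive speed
print(instruction_speed(3.0))  # None: too fast, ignored
```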
The predefined actions of the surgical table 1′ are optionally executed in real time, which means that the control device 2 is adapted such that the detected motions are immediately executed by the surgical table 1′.
Besides the direct control of the medical apparatus, the user can alternatively control it via a model. The model of the medical apparatus is shown on a user interface or a reproducing unit, such as a monitor. If the model is actuated by motion instructions, the medical apparatus itself moves analogously. If not a motion but another action is to be initiated, the position and/or the respective motion or activation instruction can be followed on the reproducing unit via a cursor, as the case may be in different appearances depending on the instruction. Hereby, operating safety is increased by the visual comprehensibility of the user's action. Moreover, the operation can be spatially separated from the medical apparatus to be operated while visual surveillance of the instructions is maintained.
In alternative embodiments, the assignment of the gestures to the predefined actions of the medical apparatus differs from this embodiment. Furthermore, there is the option to additionally or alternatively detect other gestures or gestures of other objects, e.g. of legs.
In this embodiment, the 3D sensor 5 detects several elements of the surgical table 1′ which are controllable, i.e., to be activated. Here, the element is exemplarily the drive in the column 7 for the height adjustment. The control device 2 recognizes, from the direction of the vector 10 and a position of the hand 8, thus from the vector 10 starting from the hand 8, that there is an intersection point of the vector 10 and the column 7, and recognizes from the position of the intersection point the element to be activated that has to execute a predefined action. As described above, the predefined action is then executed by detected predefined motions and/or gestures and/or the detection space of the hand 8 or of the hands 8, 8′ assigned to a functionality or a control instruction. In an alternative embodiment, the control device 2 is adapted such that the predefined action is executed upon detection of an instruction other than a gesture, for example a voice input.
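A minimal sketch of this pointing test, assuming each controllable element is approximated by an axis-aligned bounding box in room coordinates: the pointing ray, given by the position of the hand 8 and the direction of the vector 10, is intersected with each box by the standard slab method, and the nearest hit identifies the element to be activated:

```python
import numpy as np

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: return the ray parameter t at entry into the box, or None.

    For brevity, this sketch assumes a direction with no exactly-zero components.
    """
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    inv = 1.0 / direction
    t1 = (np.asarray(box_min, float) - origin) * inv
    t2 = (np.asarray(box_max, float) - origin) * inv
    t_near = np.max(np.minimum(t1, t2))
    t_far = np.min(np.maximum(t1, t2))
    if t_near <= t_far and t_far >= 0.0:
        return max(t_near, 0.0)
    return None

# Hypothetical bounding boxes of controllable elements (room coordinates, meters).
ELEMENTS = {
    "column_height_drive": ([1.4, 0.0, 0.9], [1.6, 0.8, 1.1]),
}

def element_pointed_at(hand_position, pointing_direction):
    """Return the nearest element hit by the pointing ray, or None."""
    hits = [(t, name) for name, (lo, hi) in ELEMENTS.items()
            if (t := ray_hits_box(hand_position, pointing_direction, lo, hi)) is not None]
    return min(hits)[1] if hits else None

print(element_pointed_at([0.0, 0.4, 1.0], [1.0, 0.05, 0.02]))  # column_height_drive
```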
The control device 2 is here configured such that the vector 10 is in turn determined via points on the finger 9 and along its axis. The control device 2 recognizes the direction of the vector 10 via the detection of these points by the 3D sensor 5. Furthermore, the control device 2 detects, by the 3D sensor, human bodies located in its detection space.
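A minimal sketch of determining the vector 10 from two points detected on the axis of the finger 9 (hypothetical coordinates): the normalized difference of fingertip and finger base yields the pointing direction:

```python
import numpy as np

def pointing_vector(base_point, tip_point):
    """Unit direction of the pointing ray from two points on the finger axis."""
    diff = np.asarray(tip_point, float) - np.asarray(base_point, float)
    norm = np.linalg.norm(diff)
    if norm == 0.0:
        raise ValueError("points coincide; no direction can be derived")
    return diff / norm

# Hypothetical points detected by the 3D sensor (room coordinates, meters).
base = [0.50, 1.20, 0.40]  # base of the finger 9
tip = [0.58, 1.18, 0.42]   # fingertip
print(pointing_vector(base, tip))  # unit vector of the pointing direction
```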
Also here, an intersection point of the vector 10 and the human body (not shown) is detected. The control device 2 is adapted such that it executes an action, selected in advance, with respect to this spot on the human body.
The vector 10 can further be utilized to generate an intersection point with a surface of a target area, e.g. the patient support of the surgical table or the patient located thereon. If this intersection point can be generated, the indication of direction toward a functionality associated therewith is considered plausible at the outset, and the functionality is executed. If the plausibility check fails, e.g. because the selection comes to nothing, the control instruction is not executed or has to be explicitly confirmed by the user.
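A minimal sketch of this plausibility check, modeling the target area as a plane (a simplification; the patient support or patient surface is not actually planar): the control instruction is executed only if the pointing ray actually meets the target surface, otherwise it is withheld:

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect the pointing ray with the plane of the target area.

    Returns the intersection point, or None if the ray runs parallel to the
    plane or the plane lies behind the pointing hand (check fails).
    """
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    plane_point = np.asarray(plane_point, float)
    plane_normal = np.asarray(plane_normal, float)
    denom = direction.dot(plane_normal)
    if abs(denom) < 1e-9:
        return None  # parallel: no intersection point can be generated
    t = (plane_point - origin).dot(plane_normal) / denom
    if t < 0.0:
        return None  # target surface behind the hand: implausible selection
    return origin + t * direction

# Hypothetical: patient support surface as a horizontal plane at 0.9 m height.
hit = ray_plane_intersection([0.0, 1.8, 1.5], [0.5, -0.7, -0.5],
                             [0.0, 0.9, 0.0], [0.0, 1.0, 0.0])
# Execute the instruction only on a successful plausibility check;
# otherwise require explicit confirmation by the user.
print("execute" if hit is not None else "request confirmation")
```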
The functions assigned to the surgical table 1′ and the surgical lamp 1″ can analogously be executed by all suitable medical apparatuses connected to the control device 2. The different embodiments can be combined with one another.
In use, the control device 2 firstly detects, by the 3D sensor, whether the predefined login motion is executed.
Subsequent to the predefined login motion, the motion of the object, i.e., of the hand 8, 8′ or of the arm, is detected by the 3D sensor, and the predefined action of the medical apparatus assigned to the predefined motion of the object is actuated by the control device 2.
After completion of the requested predefined action of the medical apparatus, the logout motion is executed to terminate the operation so that, after the logout motion, no further motions are interpreted as actions to be executed.
Optionally, not only the motion of one object but the motions of several objects are detected. Here, the positions and motions of both hands 8, 8′ are detected, and the predefined action is executed according to the combination of the motions of the several objects.
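A minimal sketch of this combination logic, with hypothetical per-hand motion labels: the predefined action is resolved only from the combination of the motions detected for both hands:

```python
# Hypothetical per-hand motion labels -> combined predefined action.
COMBINATIONS = {
    ("grip_then_up", "grip_then_up"): "height_adjustment_up",
    ("flat_down", "flat_down"): "height_adjustment_down",
    ("fist_static", "grip_then_up"): "anti_trendelenburg",
}

def combined_action(hand_a_motion: str, hand_b_motion: str) -> str | None:
    """Resolve the predefined action from the combination of both hands."""
    return COMBINATIONS.get((hand_a_motion, hand_b_motion))

print(combined_action("fist_static", "grip_then_up"))  # anti_trendelenburg
print(combined_action("flat_down", "grip_then_up"))    # None: no defined combination
```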
With the optional function in which the element of the medical apparatus to be activated can be selected, the requested element to be activated is selected, after the activation and before the execution of the motion for activating, by pointing to this element with the finger 9 (directing the rod-like object). Then, the predefined motion is executed by the objects, and the control device 2 actuates the requested element to be activated so that the action according to the predefined motion is executed.
With the further optional function in which the spot toward which the action of the element of the medical apparatus is directed is selected, the finger 9 points, after an optionally necessary activation of the element, to the spot for which the action is to be executed. Thereby, the control device 2 activates the requested element such that the action relating to this spot is executed; here, the spot on the patient support plate 6 is illuminated.
With the further optional function in which the element that is to execute the action is also activated by the direction from which it is pointed, the direction of the finger 9 (of the rod-like object) is detected and, in addition to the spot where the action is to be executed, the element to be activated lying opposite with respect to the direction of the finger and intended to execute the action is selected. The element to be activated is then actuated so that the action is executed from the direction of the finger 9.
Number | Date | Country | Kind
---|---|---|---
10 2014 212 660.6 | Jun 2014 | DE | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2015/064549 | 6/26/2015 | WO | 00