An embodiment of the invention relates to a robotic surgical pedal with integrated foot sensor for sensing a presence and/or location of a user's foot. Other embodiments are also described.
In a robotic surgical system, a robotic arm has a surgical tool attached to its distal end that is remotely operated by a surgeon. Applications include endoscopic surgery, which involves looking into a patient's body and performing surgery inside, for example in the abdominal cavity, using endoscopes and other surgical tools that are attached to the ends of several robotic arms. The system gives the surgeon a close-up view of the surgery site, and also lets the surgeon operate the tool that is attached to the arm, all in real-time. The tool may be a gripper with jaws, a cutter, a video camera, or an energy emitter such as a laser used for coagulation. The tool is thus controlled precisely and with high dexterity, in accordance with the surgeon's manipulation of a handheld controller. Some functions of the system, such as control of the energy emitter, may be assigned to a foot pedal controller that the surgeon manipulates with their foot.
An embodiment of the invention is directed to a foot pedal assembly (or system) for controlling a robotic surgical system, with integrated foot presence sensing. The foot pedal assembly may include a foot pedal with a foot presence sensor (or multiple sensors) built into the cover portion of the pedal. More specifically, in one embodiment, a sensor such as a proximity sensor may be built directly into the pedal. The proximity sensor may be used to detect the presence of a user's foot before the foot contacts a particular pedal (e.g., while hovering). In this aspect, the system can notify the user that their foot is over a pedal, is about to press a given pedal, or is moving toward a pedal, or can provide other similar information relating to the position of the user's foot with respect to each pedal individually. In addition, in some cases, the foot pedal assembly includes a number of foot pedals having sensors incorporated therein, so that the presence of a foot with respect to each of the pedals individually can be determined. This information can be used by the user and/or the system to, for example, prevent unintentional pedal activation, anticipate a surgical operation and prepare the corresponding instrument, and/or determine a motion or speed of the foot. For example, when one pedal is active and the foot is detected over another pedal, the user can be notified and/or the other pedal deactivated to prevent unintentional activation. The hardware therefore allows for higher precision in notifying the user of the location of their foot prior to activation of any particular pedal. More specifically, the system achieves presence sensing of a foot over any given pedal individually so that the user can be alerted.
Representatively, in one embodiment, the invention may include an assembly of pedals and associated software for running algorithms, processing operations, processes, or the like, to optimize performance and user experience when using the pedals with integrated presence (or hover) sensing. For example, in one aspect, the assembly may include a layout, arrangement or array of seven pedals. There may be three pedals arranged on the left half of the layout and four pedals arranged on the right half of the layout, at known positions. One or more of the four pedals arranged together on the right half may be used to activate energy or advanced tool functionality (e.g., laser, stapler, etc.), while the pedals on the left half may be used to operate cameras and/or switch instruments. Since notifications with respect to energy pedals may be particularly important, each of the four pedals on the right half of the layout may have sensors built therein, although sensors may be built into all seven pedals, or into any combination of pedals where presence sensing is desired.
The integration of the sensors into the pedals as discussed herein may have several additional applications. For example, pedal prioritization can be implemented based on the information obtained by the sensors. When a user places their foot on an upper pedal, both the upper pedal and lower pedal sensors may be triggered (the user's foot overlaps both pedals). The system, however, can prioritize which pedal to alert the user about based on the function of the particular pedal. Representatively, the system knows that upper pedals map to energy functions, which, if activated unintentionally, may be more undesirable than an inadvertent activation of any of the lower pedals. In this scenario, although hover is detected over both pedals, the system alerts the user of the hover over the upper pedal, since unintentional activation of this pedal would result in more harm.
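By way of illustration only, the prioritization logic described above might be sketched as follows; the pedal identifiers, priority values, and function labels are hypothetical and not part of the disclosed system.

```python
# Minimal sketch of hover prioritization (hypothetical names and values).
# Each pedal is assigned a priority reflecting how undesirable an
# unintentional activation would be; energy pedals outrank others.
PEDAL_PRIORITY = {
    "upper_energy": 2,   # maps to energy functions
    "lower_camera": 1,   # maps to camera/instrument functions
}

def select_alert(triggered_pedals):
    """Given the set of pedals whose hover sensors fired, return the
    single pedal the user should be alerted about."""
    if not triggered_pedals:
        return None
    return max(triggered_pedals, key=PEDAL_PRIORITY.get)

# A foot resting on the upper pedal typically also trips the lower
# pedal's sensor; the alert names only the higher-priority pedal.
print(select_alert({"upper_energy", "lower_camera"}))  # -> upper_energy
```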
In addition, by having two sensors placed a known distance apart, the system can detect foot motion and speed when, for example, two or more sensors are triggered in sequence. For example, it can be determined that the user's foot is moving from the floor (no sensors triggered), toward a first pedal (first pedal sensor triggered), and then on to a second pedal (second pedal sensor triggered). This motion (and speed) knowledge can assist in providing the user with critical information about the result of any action they are about to take, and can also assist in optimizing operations.
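As a non-authoritative sketch, and assuming hypothetical timestamped sensor events, the sequence-based inference above could be realized as follows.

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    pedal: str      # which pedal's hover sensor fired (hypothetical IDs)
    time_s: float   # timestamp of the detection, in seconds

def infer_motion(events):
    """Infer the foot's path from the order in which pedal sensors fired;
    an empty event list means the foot is still on the floor."""
    if not events:
        return "foot on floor (no sensors triggered)"
    path = " -> ".join(e.pedal for e in events)
    return f"foot moving: floor -> {path}"

# Foot lifts off the floor, crosses the first pedal, then the second.
events = [SensorEvent("pedal_1", 0.00), SensorEvent("pedal_2", 0.35)]
print(infer_motion(events))  # -> foot moving: floor -> pedal_1 -> pedal_2
```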
Still further, in one aspect, integrating sensors into the pedals on the left and/or right side of the layout may assist in training surgeons who are new to the system, or in optimizing performance on the system. Representatively, it is known that there is a correlation between procedure time and the number of camera clutch activations. Therefore, if the system can inform a user that their feet are unnecessarily resting on the camera clutch pedal, this procedure variable can be optimized. In addition, when sensors are built into all seven pedals, the system may be configured to toggle the left pedal sensors on or off, or to alter the size or frequency of the hover-related notifications.
More specifically, in one embodiment, the invention is directed to a foot pedal assembly for controlling a robotic surgical system. The foot pedal assembly includes a foot pedal base, a foot pedal, and a sensor built into the foot pedal. The foot pedal may be movably coupled to the foot pedal base, such that it moves with respect to the base, and may have a contact surface extending from a distal end to a proximal end of the foot pedal. The sensor may be coupled to the contact surface of the foot pedal at a position closer to the proximal end than the distal end, and is operable to sense a target object positioned a distance over the contact surface. In some cases, the foot pedal may pivot around an axle coupled to the foot pedal base. The axle may be positioned closer to the distal end than the proximal end, and the sensor may be closer to the proximal end than the axle. Still further, the sensor may be coupled to the contact surface at a position that is between the proximal end and a point midway between the distal end and the proximal end. In addition, a cavity may be formed through the contact surface, and the sensor may be mounted within the cavity. In some embodiments, the sensor may be an optical sensor having an emitter and a detector, and the emitter emits a beam of electromagnetic radiation in a direction away from the contact surface. In other cases, the sensor may be a capacitive sensor. Still further, the sensor may be one of an array of sensors coupled to the contact surface of the foot pedal. In addition, the foot pedal assembly may be one of an array of foot pedal assemblies operable to control different surgical robotic operations. In addition, the assembly may include a foot pedal assembly platform having an upper platform and a lower platform to which the array of foot pedal assemblies are mounted. In some cases, a larger number of the foot pedal assemblies in the array of foot pedal assemblies are mounted to the upper platform than to the lower platform of the foot pedal assembly platform.
In another embodiment, the invention is directed to a foot pedal system including a foot pedal assembly platform, a first foot pedal assembly and a second foot pedal assembly coupled to the foot pedal assembly platform, and a processor. Each of the first foot pedal assembly and the second foot pedal assembly may have a foot pedal movably coupled to a foot pedal base and an optical sensor coupled to a contact surface of the foot pedal that faces away from the foot pedal base. The optical sensor may be operable to emit a light beam in a direction away from the contact surface and detect a presence of a target object prior to activation of the first foot pedal assembly or the second foot pedal assembly by the target object. The processor may be configured to determine a position of the target object with respect to the first foot pedal assembly or the second foot pedal assembly based on the presence detected by the optical sensor coupled to the first foot pedal assembly or the second foot pedal assembly. In some embodiments, the optical sensor may be positioned closer to a proximal end of the contact surface than a distal end of the contact surface. In addition, the processor may further be configured to determine whether the target object is positioned closer to the proximal end of the contact surface than the distal end of the contact surface based on the presence detected by the optical sensor. The optical sensor may be one of an array of optical sensors coupled to different regions of the contact surface of a respective foot pedal, and the processor may further be configured to determine that the target object is aligned with one of the different regions. The processor may also be configured to determine that the target object is over the contact surface of the first foot pedal assembly prior to the target object contacting the contact surface of the first foot pedal assembly, based on the presence of the target object being detected by the optical sensor of the first foot pedal assembly. In some cases, the processor may be configured to determine the position of the target object with respect to the first foot pedal assembly based on the presence of the target object being detected by the optical sensor of the second foot pedal assembly. The processor may further be configured to determine a lateral motion of the target object when the presence of the target object is detected by the optical sensors in sequence. Still further, the processor may be configured to alert a user that the presence of the target object is detected over one of the first foot pedal assembly or the second foot pedal assembly when the other of the first foot pedal assembly or the second foot pedal assembly is actively controlling a robotic operation. In addition, the processor may be configured to prevent activation of the first foot pedal assembly when the presence of the target object is detected over the first foot pedal assembly while the second foot pedal assembly is actively controlling a robotic operation. In some cases, the first foot pedal assembly is operable to control a first surgical robotic operation and the second foot pedal assembly is operable to control a second surgical robotic operation that is different than the first surgical robotic operation. The first surgical robotic operation may include an energy device and the second surgical robotic operation may include a non-energy instrument.
In addition, the foot pedal assembly platform may include an upper platform and a lower platform, and the first foot pedal assembly is coupled to the upper platform and the second foot pedal assembly is coupled to the lower platform.
The above summary does not include an exhaustive list of all aspects of the present invention. It is contemplated that the invention includes all systems and methods that can be practiced from all suitable combinations of the various aspects summarized above, as well as those disclosed in the Detailed Description below and particularly pointed out in the claims filed with the application. Such combinations have particular advantages not specifically recited in the above summary.
The embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” embodiment of the invention in this disclosure are not necessarily to the same embodiment, and they mean at least one. Also, in the interest of conciseness and reducing the total number of figures, a given figure may be used to illustrate the features of more than one embodiment of the invention, and not all elements in the figure may be required for a given embodiment.
Several embodiments of the invention with reference to the appended drawings are now explained. Whenever the shapes, relative positions and other aspects of the parts described in the embodiments are not explicitly defined, the scope of the invention is not limited only to the parts shown, which are meant merely for the purpose of illustration. Also, while numerous details are set forth, it is understood that some embodiments of the invention may be practiced without these details. In other instances, well-known circuits, structures, and techniques have not been shown in detail so as not to obscure the understanding of this description.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Spatially relative terms, such as “beneath”, “below”, “lower”, “above”, “upper”, and the like may be used herein for ease of description to describe one element's or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising” specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
The terms “or” and “and/or” as used herein are to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” or “A, B and/or C” mean “any of the following: A; B; C; A and B; A and C; B and C; A, B and C.” An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
In addition, the phrase “configured to,” as used herein, may be interchangeable with, or otherwise understood as referring to, for example, a device that is “suitable for”, “having the capacity to”, “designed to”, “adapted to”, “made to”, or otherwise “capable of” operating together with another device or other components. For example, a “processor configured to perform A, B, and C” may refer to a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic processor (e.g., a central processing unit (CPU) or an application processor) that may perform operations A, B, and C by executing one or more software programs stored in a memory device.
Referring to
Each surgical tool 107 may be manipulated manually, robotically, or both, during the surgery. For example, the surgical tool 107 may be a tool used to enter, view, or manipulate an internal anatomy of the patient 106. In an embodiment, the surgical tool 107 is a grasper that can grasp tissue of the patient. The surgical tool 107 may be controlled manually by a bedside operator 108, or it may be controlled robotically via actuated movement of the robotic surgical arm 104 to which it is attached. The robotic surgical arms 104 are shown as a table-mounted system, but in other configurations the surgical arms 104 may be mounted on a cart, a ceiling or sidewall, or another suitable structural support.
Generally, a remote operator 109, such as a surgeon or other operator, may use the user console 102 to remotely manipulate the surgical arms 104 and/or the attached surgical tools 107, e.g., teleoperation. The user console 102 may be located in the same operating room as the rest of the robotic surgical system 100, as shown in
In some variations, the bedside operator 108 may also operate the robotic surgical system 100 in an “over the bed” mode, in which the bedside operator 108 (user) is now at a side of the patient 106 and is simultaneously manipulating a robotically-driven tool (end effector attached to the surgical arm 104), e.g., with a handheld UID 114 held in one hand, and a manual laparoscopic tool. For example, the bedside operator's left hand may be manipulating the handheld UID to control a robotic surgical component, while the bedside operator's right hand may be manipulating a manual laparoscopic tool. Thus, in these variations, the bedside operator 108 may perform both robotic-assisted minimally invasive surgery and manual laparoscopic surgery on the patient 106.
During an example procedure (surgery), the patient 106 is prepped and draped in a sterile fashion, and anesthesia is achieved. Initial access to the surgical site may be performed manually while the arms of the robotic surgical system 100 are in a stowed or withdrawn configuration (to facilitate access to the surgical site). Once access is completed, initial positioning or preparation of the robotic surgical system 100, including its arms 104, may be performed. Next, the surgery proceeds with the remote operator 109 at the user console 102 utilizing the foot-operated controls 113 and the UIDs 114 to manipulate the various end effectors, and perhaps an imaging system, to perform the surgery. Manual assistance may also be provided at the procedure bed or table by sterile-gowned bedside personnel, e.g., the bedside operator 108, who may perform tasks such as retracting tissues, performing manual repositioning, and tool exchange upon one or more of the surgical arms 104. Non-sterile personnel may also be present to assist the remote operator 109 at the user console 102. When the procedure or surgery is completed, the robotic surgical system 100 and the user console 102 may be configured or set in a state to facilitate post-operative procedures such as cleaning or sterilization, and healthcare record entry or printout via the user console 102.
In one embodiment, the remote operator 109 holds and moves the UID 114 to provide an input command to move a robot arm actuator 117 in the robotic surgical system 100. The UID 114 may be communicatively coupled to the rest of the robotic surgical system 100, e.g., via a console computer system 116. The UID 114 can generate spatial state signals corresponding to movement of the UID 114, e.g. position and orientation of the handheld housing of the UID, and the spatial state signals may be input signals to control a motion of the robot arm actuator 117. The robotic surgical system 100 may use control signals derived from the spatial state signals, to control proportional motion of the actuator 117. In one embodiment, a console processor of the console computer system 116 receives the spatial state signals and generates the corresponding control signals. Based on these control signals, which control how the actuator 117 is energized to move a segment of the arm 104, the movement of a corresponding surgical tool that is attached to the arm may mimic the movement of the UID 114. Similarly, interaction between the remote operator 109 and the UID 114 can generate for example a grip control signal that causes a jaw of a grasper of the surgical tool 107 to close and grip the tissue of patient 106.
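Purely as a simplified, non-authoritative sketch (not the system's actual control law), the proportional mapping from UID spatial-state samples to a motion command could look like the following; the scale factor and function names are hypothetical.

```python
MOTION_SCALE = 0.5  # hypothetical scale: tool moves half the UID distance

def control_signal(uid_pos_prev, uid_pos_curr, scale=MOTION_SCALE):
    """Derive a proportional motion command from two successive UID
    spatial-state samples (positions only, for brevity)."""
    return [scale * (curr - prev) for prev, curr in zip(uid_pos_prev, uid_pos_curr)]

# The UID moved 2 cm along x and 1 cm down along z between samples;
# the commanded tool displacement is scaled proportionally.
print(control_signal([0.0, 0.0, 0.0], [0.02, 0.0, -0.01]))  # [0.01, 0.0, -0.005]
```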
Robotic surgical system 100 may include several UIDs 114, where respective control signals are generated for each UID that control the actuators and the surgical tool (end effector) of a respective arm 104. For example, the remote operator 109 may move a first UID 114 to control the motion of an actuator 117 that is in a left robotic arm, where the actuator responds by moving linkages, gears, etc., in that arm 104. Similarly, movement of a second UID 114 by the remote operator 109 controls the motion of another actuator 117, which in turn moves other linkages, gears, etc., of the robotic surgical system 100. The robotic surgical system 100 may include a right surgical arm 104 that is secured to the bed or table to the right side of the patient, and a left surgical arm 104 that is at the left side of the patient. An actuator 117 may include one or more motors that are controlled so that they drive the rotation of a joint of the surgical arm 104, to for example change, relative to the patient, an orientation of an endoscope or a grasper of the surgical tool 107 that is attached to that arm. Motion of several actuators 117 in the same arm 104 can be controlled by the spatial state signals generated from a particular UID 114. The UIDs 114 can also control motion of respective surgical tool graspers. For example, each UID 114 can generate a respective grip signal to control motion of an actuator, e.g., a linear actuator, that opens or closes jaws of the grasper at a distal end of surgical tool 107 to grip tissue within patient 106.
In some aspects, the communication between the surgical platform 105 and the user console 102 may be through a control tower 103, which may translate user commands that are received from the user console 102 (and more particularly from the console computer system 116) into robotic control commands that are transmitted to the arms 104 on the surgical platform 105. The control tower 103 may also transmit status and feedback from the surgical platform 105 back to the user console 102. The communication connections between the surgical platform 105, the user console 102, and the control tower 103 may be via wired and/or wireless links, using any suitable ones of a variety of data communication protocols. Any wired connections may optionally be built into the floor and/or the walls or ceiling of the operating room. The robotic surgical system 100 may provide video output to one or more displays, including displays within the operating room as well as remote displays that are accessible via the Internet or other networks. The video output or feed may also be encrypted to ensure privacy, and all or portions of the video output may be saved to a server or electronic healthcare record system.
A foot-operated control including a foot pedal assembly or system having an integrated sensor for sensing the presence and/or location of a user's foot on the foot-operated control will now be described. Referring now to
In some cases, the foot pedal 202 may be rotatably coupled to the foot pedal base 204, while in other cases, the foot pedal 202 may be, for example, a floating pedal that remains relatively parallel to the base 204, and moves up and down.
Referring now in more detail to foot pedal 202, foot pedal 202 may include a proximal portion or end 212 and a distal portion or end 214, and a contact surface 216 extending between the two portions or ends 212, 214. During operation, the proximal portion 212 will be near the heel of the foot, and the distal portion 214 will be farther from the heel (e.g., closer to the toe). The contact surface 216 may be a substantially flat or planar surface that, in the neutral pedal position, may be substantially parallel to, and face away from, base portion 204. On the other hand, in the active pedal position (e.g., when a user's foot contacts surface 216), contact surface 216 may be rotated such that it is at an angle with respect to base portion 204. Contact surface 216 may be referred to as a “contact” surface herein because this is the portion of foot pedal 202 which is contacted by the user's foot to activate the pedal assembly such as by rotating, or otherwise moving, foot pedal 202. For example, foot pedal 202 may be manually moved (e.g., rotate, pivot, move up/down) with respect to foot pedal base 204 when a force or pressure is applied against surface 216. Therefore in this configuration, when a user's foot is positioned on surface 216 (e.g. to cause the pedal to rotate), the toes of the user's foot may be near distal end 214 (and used to rotate the distal end 214 toward switch 210B) and the heel may be positioned near proximal end 212 (and used to rotate the proximal end 212 toward switch 210A). In addition, it is noted that, in this embodiment, axle 206 is shown positioned closer to the distal end 214 than the proximal end 212 of foot pedal 202. In other embodiments, however, axle 206 may be located closer to the foot pedal mid point 218 than the distal end 214, closer to the mid point 218 than the proximal end 212 or closer to the proximal end 212 than the distal end 214, or anywhere between the proximal and distal ends 212, 214, respectively.
Foot pedal assembly 200 may further include a sensor 220. As previously discussed, foot pedal assembly 200 is used to control, or otherwise help to control, an operation of a surgical tool 107 (e.g., robotic arm 104) on a patient 106. In this aspect, it is particularly important that the pedal assembly not be accidentally, or otherwise inadvertently, activated (e.g., rotated or pressed) during, for example, another surgical operation being performed by the user (e.g., using another pedal assembly). In addition, in cases where multiple foot pedal assemblies are present, it is further important that a user be aware of which pedal they are about to press before activation of the pedal. Sensor 220 is therefore provided to detect a presence of the user's foot (also referred to herein as a target object) prior to the foot contacting surface 216 and activating assembly 200. In other words, sensor 220 detects the user's foot hovering a distance over surface 216, before the foot contacts surface 216. To achieve this, it is particularly important that sensor 220 be mounted at a specific, known location on foot pedal 202 so that the foot presence can be immediately detected (e.g., before the foot is entirely over the pedal). Sensor 220 may therefore, in one embodiment, be mounted, attached, connected, or otherwise built into an end of foot pedal 202. Representatively, when the pedal is oriented so that the user's foot slides onto foot pedal 202 in a direction from left to right (as illustrated by arrow 222), sensor 220 may be positioned closer to proximal end 212 than distal end 214. For example, in some embodiments, sensor 220 may be positioned anywhere between proximal end 212 and mid point 218 of surface 216. Said another way, the distance (D1) between sensor 220 and proximal end 212 is less than the distance (D2) between sensor 220 and distal end 214. Still further, the position of sensor 220 may be defined with respect to axle 206. For example, where axle 206 is positioned closer to distal end 214 than proximal end 212 as shown, sensor 220 may be positioned closer to the proximal end 212 than axle 206.
Referring now to sensor 220 in more detail, sensor 220 may be any type of sensor suitable for detecting the presence of the user's foot 304 (or any target object) when the foot is spaced a distance (d) from surface 216 as shown in
In addition, it should be further emphasized that because sensor 220 is built directly into the foot pedal 202, the presence (or hovering) of the foot with respect to that particular pedal can be more precisely detected than, for example, where a sensor is positioned near (but not attached to) the pedal, or multiple pedals. In particular, where the sensor is instead built into a structure that is near the pedal or multiple pedals, the system may be more likely to detect false positives (a foot is sensed but is not actually on the pedal) or false negatives (the foot is not sensed but is actually on the pedal).
Referring now to
In some cases, cover 402 includes four sidewalls 404, and two of the sidewalls may be longer than two of the other sidewalls such that cover 402 has a substantially rectangular shape. Sensor 220 may, in some embodiments, be closer to one of the shorter sidewalls (or ends of cover 402), than the longer sidewalls. Other cover sizes and shapes are, however, contemplated (e.g., square, rectangular, elongated, elliptical, circular, semi-circular, or the like). In addition, although not shown, in some embodiments, portions of sidewalls 404 may be dimensioned (e.g., curve downward or protrude) and the axle (axle 206 as previously discussed in connection with
Cover 402 may further include a channel or cavity 406 extending below surface 216 (e.g., into chamber 502). Cavity 406 may include interior sidewalls 408 that are coupled to an opening 410 through surface 216, and therefore define a channel extending below surface 216. In some cases, cavity 406 may be open at both ends as shown. In other cases, however, the cavity 406 could have a closed bottom (e.g., be closed at an end opposite surface 216). Cavity 406 is dimensioned to receive sensor 220 as shown in
In addition, it is contemplated that in still further embodiments, a foot pedal could have an array of sensors built therein so that a position of the target object (e.g., the user's foot) with respect to the foot pedal (or perhaps an adjacent foot pedal) can be more precisely determined. Representatively,
In addition to determining the position or location of the object (in the x, y and/or z-axis) using the array of sensors, a motion and/or speed of the object may further be determined. For example, in addition to the location of each of sensors 220A-220C being known, the distance between each of sensors 220A-220C is also known. Therefore, when the presence of the target object is detected sequentially by sensors 220A, 220B and/or 220C, the direction in which the target object is moving, and/or the speed of the object's movement across pedal 800, can also be determined. For example, if the target object is first detected by sensor 220A, followed by sensor 220B, and then sensor 220C, it can be determined that the target object is moving in an axial direction, toward distal end 214.
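As a rough sketch only, and assuming the inter-sensor spacing and trigger timestamps are available (the positions and units below are hypothetical), the direction and speed could be estimated as follows.

```python
# Sensors 220A-220C sit at known positions along the pedal's long axis
# (hypothetical spacing, in meters). A trigger sequence with increasing
# position implies motion toward the distal end of the pedal.
SENSOR_POSITION_M = {"220A": 0.00, "220B": 0.05, "220C": 0.10}

def direction_and_speed(triggers):
    """triggers: list of (sensor_id, time_s) in the order detected.
    Returns (direction, speed in m/s), or None if underdetermined."""
    if len(triggers) < 2:
        return None
    (first_id, t0), (last_id, t1) = triggers[0], triggers[-1]
    dx = SENSOR_POSITION_M[last_id] - SENSOR_POSITION_M[first_id]
    dt = t1 - t0
    if dt <= 0 or dx == 0:
        return None
    direction = "toward distal end" if dx > 0 else "toward proximal end"
    return direction, abs(dx) / dt

print(direction_and_speed([("220A", 0.0), ("220B", 0.1), ("220C", 0.2)]))
# -> ('toward distal end', 0.5)
```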
It should further be understood that in some embodiments, one or more of the previously discussed foot pedal assemblies or foot pedals may be arranged in an array with respect to one another, and the position, motion and/or speed of the target object may be determined with respect to the assemblies in the array. Representatively,
Each of pedal assemblies 200A-200G may be arranged on, or otherwise mounted to, a foot pedal assembly platform 1202. Platform 1202 may include a lower portion 1204 and an upper portion 1206. The lower portion 1204 is considered to be “lower” because it is in a plane below the upper portion 1206. In some embodiments, the pedal assemblies controlling the more specialized devices such as energy devices (e.g., assemblies 200B and 200D) are arranged on the upper portion 1206 and those that control non-energy devices such as scalpels (e.g., assemblies 200A and 200C) are positioned on the lower portion 1204. In addition, the assemblies may be understood as being within a right hand side or a left hand side of the platform 1202, and may correspond to right or left hand operations of surgical tool 107, as previously discussed. For example, assemblies 200E-200G may be considered on the left hand side, while assemblies 200A-200D may be considered on the right hand side of platform 1202. In addition, within the assemblies on the right hand side, those on the left side (e.g., 200C-200D) may control left hand operations while those on the right side (e.g., 200A-200B) may control right hand operations.
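The layout described above lends itself to a simple declarative mapping. The sketch below encodes one such arrangement; the identifiers mirror the reference numerals in this description, but the tier assignments for the left-side assemblies and the function labels are assumptions for illustration only.

```python
# Hypothetical encoding of the seven-pedal layout: platform tier,
# side of platform 1202, and the function each assembly controls.
# (Left-side tiers are assumed; the description does not specify them.)
PEDAL_LAYOUT = {
    "200A": {"tier": "lower", "side": "right", "function": "non-energy, right hand"},
    "200B": {"tier": "upper", "side": "right", "function": "energy, right hand"},
    "200C": {"tier": "lower", "side": "right", "function": "non-energy, left hand"},
    "200D": {"tier": "upper", "side": "right", "function": "energy, left hand"},
    "200E": {"tier": "lower", "side": "left", "function": "camera/instrument"},
    "200F": {"tier": "lower", "side": "left", "function": "camera/instrument"},
    "200G": {"tier": "lower", "side": "left", "function": "camera/instrument"},
}

def energy_assemblies():
    """Assemblies whose unintended activation is most harmful."""
    return [pid for pid, cfg in PEDAL_LAYOUT.items()
            if cfg["function"].startswith("energy")]

print(energy_assemblies())  # -> ['200B', '200D']
```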
In a similar manner to the previously discussed sensor arrays, the array of pedal assemblies 200A-200G may further be used to determine a position, motion and/or speed of a target object with respect to a particular pedal assembly 200A-200G. In addition, the detection of a target object with respect to one of the assemblies 200A-200G may be used to anticipate, or otherwise control, an activation of an adjacent pedal assembly or prevent activation of an adjacent pedal assembly. Representatively, as illustrated by the schematic diagram in
One exemplary process flow for determining the object motion is illustrated in
In addition, a pedal prioritization algorithm can be used to alert a user of an unintended operation, or to automatically prevent activation of one pedal with respect to another. For example, in some cases, when a user places their foot on an upper pedal (e.g., pedal assemblies 200B, 200D), the pedal sensor of a lower pedal assembly (e.g., pedal assemblies 200A, 200C) will be triggered in addition to the upper pedal sensor. A prioritization is therefore implemented so that if both the upper and lower pedal sensors detect an object on either the left or right side, the user is alerted to the upper pedal hover. In this case, the user may be alerted only to the upper pedal hover because the upper pedal controls energy functions, which, if unintentionally activated, may result in more harm than the lower pedal function. In addition, in some embodiments, the system may automatically prevent or otherwise deactivate the lower pedal function under this scenario.
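Combining the prioritization with the automatic prevention behavior, a minimal non-authoritative sketch (state names are hypothetical) might look like this.

```python
def prioritize_and_lock(upper_hover, lower_hover):
    """Return (alert, locked_pedals) per the prioritization above: if both
    the upper (energy) and lower pedal sensors fire, alert the user to the
    upper-pedal hover and lock out the lower pedal's function."""
    if upper_hover:
        return "hover over upper (energy) pedal", {"lower"} if lower_hover else set()
    if lower_hover:
        return "hover over lower pedal", set()
    return None, set()

# A foot over the upper pedal also trips the lower sensor; only the
# upper-pedal alert is raised and the lower pedal is deactivated.
print(prioritize_and_lock(True, True))
# -> ('hover over upper (energy) pedal', {'lower'})
```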
While certain embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that the invention is not limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those of ordinary skill in the art. For example, while the Figures illustrate pedal assemblies for surgical operations, alternative applications may include any application in which it would be beneficial for a user of a system with multiple pedal-actuated functions to be alerted of which pedal their foot is on or over. Examples include medical devices, aviation and aerospace equipment, or the like. The description is thus to be regarded as illustrative instead of limiting.
This application is a continuation of pending U.S. patent application Ser. No. 18/079,696, filed Dec. 12, 2022, which is a continuation of U.S. patent application Ser. No. 17/115,587, filed Dec. 8, 2020, now U.S. Pat. No. 11,547,500, issued Jan. 10, 2023, which is a continuation of U.S. patent application Ser. No. 16/038,128, filed on Jul. 17, 2018, now U.S. Pat. No. 10,888,383, issued Jan. 12, 2021, which are hereby incorporated by this reference in their entirety.