Patient support systems facilitate care of patients in a health care setting. Patient support systems comprise patient support apparatuses such as, for example, hospital beds, stretchers, cots, tables, wheelchairs, and chairs. A conventional patient support apparatus comprises a base and a patient support surface upon which the patient is supported. Often, the patient support apparatus has one or more powered devices to perform one or more functions on the patient support apparatus. These functions can include lifting and lowering the patient support surface, raising a patient from a slouched position, turning a patient, centering a patient, extending a length or width of the patient support apparatus, and the like. When a user such as a caregiver wishes to operate a powered device to perform a function, the user actuates a user interface. Conventional user interfaces may comprise a panel of buttons configured to selectively operate the various operational functions of the patient support apparatus.
The number and complexity of the operational functions integrated into the patient support apparatus continue to increase, and user interfaces have evolved commensurately. Yet increasingly advanced user interfaces are inherently more difficult to operate, particularly for users unfamiliar with their operation. Users experiencing difficulty with operating the user interface lack adequate guidance and troubleshooting tools. Therefore, a need exists in the art for a patient support system providing improved guidance and/or troubleshooting tools to control the operations of the patient support apparatus. There is a further need for the guidance and/or troubleshooting tools to be easily and readily accessible through the user interface of the patient support apparatus itself.
Advantages of the present disclosure will be readily appreciated as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings.
A support structure 32 provides support for the patient. The support structure 32 illustrated in
A mattress 40 is disposed on the patient support deck 38. The mattress 40 comprises a secondary patient support surface 43 upon which the patient is supported. The base 34, intermediate frame 36, patient support deck 38, and patient support surfaces 42, 43 each have a head end 45 and a foot end 47 corresponding to a designated placement of the patient's head and feet on the patient support apparatus 30. The construction of the support structure 32 may take on any known or conventional design, and is not limited to that specifically set forth above. In addition, the mattress 40 may be omitted in certain embodiments, such that the patient rests directly on the patient support surface 42.
Side rails 44, 46, 48, 50 are coupled to the intermediate frame 36 and thereby supported by the base 34. A first side rail 44 is positioned at a right head end of the intermediate frame 36. A second side rail 46 is positioned at a right foot end of the intermediate frame 36. A third side rail 48 is positioned at a left head end of the intermediate frame 36. A fourth side rail 50 is positioned at a left foot end of the intermediate frame 36. If the patient support apparatus 30 is a stretcher or a cot, there may be fewer side rails. The side rails 44, 46, 48, 50 are movable between a raised position in which they block ingress into and egress out of the patient support apparatus 30, one or more intermediate positions, and a lowered position in which they are not an obstacle to such ingress and egress. In still other configurations, the patient support apparatus 30 may not include any side rails.
A headboard 52 and a footboard 54 are coupled to the intermediate frame 36. In other embodiments, when the headboard 52 and the footboard 54 are included, the headboard 52 and the footboard 54 may be coupled to other locations on the patient support apparatus 30, such as the base 34. In still other embodiments, the patient support apparatus 30 does not include the headboard 52 and/or the footboard 54.
Wheels 58 are coupled to the base 34 to facilitate transport over floor surfaces. The wheels 58 are arranged in each of four quadrants of the base 34 adjacent to corners of the base 34. In the embodiment shown, the wheels 58 are caster wheels able to rotate and swivel relative to the support structure 32 during transport. Each of the wheels 58 forms part of a caster assembly 60. Each caster assembly 60 is mounted to the base 34. It should be understood that various configurations of the caster assemblies 60 are contemplated. In addition, in some embodiments, the wheels 58 are not caster wheels and may be non-steerable, steerable, non-powered, powered, or combinations thereof. Additional wheels are also contemplated. For example, the patient support apparatus 30 may comprise four non-powered, non-steerable wheels, along with one or more powered wheels. In some cases, the patient support apparatus 30 may not include any wheels.
Referring to
The patient support system 28 comprises a control system 100 to control the operational devices 70-92 of the patient support apparatus 30, and a controller 102. The control system 100 controls the operational devices 70-92, or components thereof, to operate their associated actuators, control their pumps, control their valves, or otherwise cause the operational devices 70-92 to perform one or more of the desired functions. The controller 102 may be a functional subsystem of the control system 100. In other embodiments, the controller 102 may be a discrete system separate from the control system 100. In other words, the control system 100 and the controller 102 may be structurally integrated or separate. In one embodiment, the controller 102 is on-board the patient support apparatus 30 (e.g., coupled to the base 34, the footboard 54, or the like), and in another embodiment, the controller 102 is remotely located from the patient support apparatus 30 and in communication with the operational devices 70-92 disposed on-board the patient support apparatus 30. The controller 102 may communicate with the operational devices 70-92 via wired or wireless connections.
The controller 102 may comprise one or more microprocessors for processing instructions or for processing an algorithm stored in non-transitory memory 131 to control the operational devices 70-92. The control system 100 and/or controller 102 may comprise one or more microcontrollers, subcontrollers, field programmable gate arrays, systems on a chip, discrete circuitry, and/or other suitable hardware, software, or firmware that is capable of carrying out the functions described herein. Power to the operational devices 70-92 and/or the controller 102 may be provided by a battery power supply 104 or an external power source 106. Any type and number of sensors S may be included and in communication with the control system 100 and/or controller 102 to facilitate controlling the operational functions of the patient support apparatus 30.
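The command-routing role of the controller 102 described above can be pictured in simplified form. The following Python sketch is purely illustrative: the device names, the `Controller` class, and the `handle_command` dispatch method are invented for this example, and the actual controller may be realized in microcontroller firmware, FPGAs, or other hardware as described.

```python
class OperationalDevice:
    """Hypothetical stand-in for one of the operational devices 70-92."""

    def __init__(self, name):
        self.name = name
        self.active = False

    def activate(self):
        self.active = True

    def deactivate(self):
        self.active = False


class Controller:
    """Minimal sketch of a controller routing commands to registered devices."""

    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.name] = device

    def handle_command(self, device_name, action):
        device = self.devices.get(device_name)
        if device is None:
            return False  # unknown device; a real system might log or alert
        getattr(device, action)()  # e.g., "activate" or "deactivate"
        return True


controller = Controller()
controller.register(OperationalDevice("patient_raising_device"))
handled = controller.handle_command("patient_raising_device", "activate")
```

In this sketch the input signals from the user interface would arrive as `handle_command` calls; the sensors S and power supplies are omitted for brevity.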
The operational devices 70-92 may have many possible configurations for performing the predetermined functions of the patient support apparatus 30. Exemplary embodiments of the operational devices 70-92 are described further below, including the patient raising device 70, the immersion device 72, the patient turning device 74, the patient ingress/egress device 76, the lift device 78, the bed length extension device 80, the bed width extension device 82, the deck adjustment device 84, the temperature device 86, the entertainment device 88, and the lighting device 90. Further specifics regarding the exemplary devices are described in commonly owned U.S. patent application Ser. No. 15/353,179, filed on Nov. 16, 2016, which is hereby incorporated by reference herein in its entirety. Numerous devices other than those specifically described are contemplated, including a gatch adjustment device, a cleaning device, a coordinated motion device, a transport device, a cardiopulmonary resuscitation (CPR) device, an information transmission device (to the patient's electronic medical record (EMR) or electronic health record (EHR)), a sit-to-stand assist device, a cough detection device, a sleep detection device, among others. Any of the described and/or contemplated devices may be integrated into the user menus of the present disclosure.
The patient raising device 70 is configured to perform the function of moving the patient from a slouched position towards a non-slouched position by moving the patient towards the head end of the patient support apparatus 30. The patient raising device 70 may comprise a patient raising bladder structure within the mattress 40. The patient raising bladder structure may comprise patient raising inflation bladders that are connected together longitudinally so that each of the patient raising inflation bladders spans across a majority of a width of the mattress 40 below the patient, and the patient raising inflation bladders span a majority of a length of the mattress 40 below the patient. A progressive inflation scheme with the patient raising bladder structure is used to raise the patient from the slouched position to the non-slouched position. In response to a control signal from the controller 102, the patient raising inflation bladders are inflated and deflated to create a wave-like force directed towards the head end of the patient support apparatus 30 to push the patient toward the head end. In one example, only one of the patient raising inflation bladders is fully inflated at a time to create the wave-like force needed to raise the patient. Once fully inflated, each patient raising inflation bladder begins to deflate and the next adjacent patient raising inflation bladder toward the head end begins to inflate.
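The progressive inflation scheme above can be expressed as a simple sequence. The sketch below is a hypothetical illustration: the bladder indices and the `inflate`/`deflate` callbacks are assumptions, and a real implementation would gate each transition on pressure feedback rather than running open-loop.

```python
def raise_patient_wave(num_bladders, inflate, deflate):
    """Sweep a wave of inflation from the foot end (index 0) toward the
    head end: as each bladder reaches full inflation, the previous one
    begins to deflate while the next adjacent one inflates."""
    for i in range(num_bladders):
        inflate(i)
        if i > 0:
            deflate(i - 1)  # previous bladder deflates as this one inflates
    deflate(num_bladders - 1)  # settle the final bladder


events = []
raise_patient_wave(
    3,
    inflate=lambda i: events.append(("inflate", i)),
    deflate=lambda i: events.append(("deflate", i)),
)
# events records the inflate/deflate ordering of the wave
```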
The immersion device 72 is configured to equalize and distribute pressure over a greater area of the surface of the body over the mattress 40, allowing for immersion of the patient. The immersion device 72 may comprise a bladder structure within the mattress 40 comprising, for example, elongate bladders spanning a majority of the length of the mattress 40 below the patient. In response to a control signal from the controller 102, the elongate bladders are selectively inflated or deflated to control the immersion of the patient within the mattress 40; i.e., the extent to which the patient “sinks into” the mattress. The bladder structure may also be configured to move the patient from an off-center position toward a longitudinal centerline of the mattress 40, such as when the patient has shifted too far to one side or the other of the mattress 40. In response to a control signal from the controller 102, the elongate bladders are selectively inflated to guide the patient toward the longitudinal centerline of the mattress 40 when desired. Movement of the patient toward the longitudinal centerline may not be immediate, but may occur gradually as the elongate bladders remain inflated.
The patient turning device 74 is configured to perform the function of turning the patient and/or providing rotational therapy to the patient. The patient turning device 74 may utilize the same elongate bladder structure as the immersion device 72. In response to a control signal from the controller 102, the elongate bladders are independently inflated to raise one side or the other of the patient. For rotation therapy, the elongate bladders are sequentially inflated and deflated to raise one side of the patient to a desired angle, lower the patient, and then raise the other side of the patient to the desired angle such that the patient experiences a side-to-side rotation that shifts pressures between the patient and the mattress 40.
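The side-to-side rotation sequence lends itself to a short sketch. Everything here is hypothetical scaffolding: the `raise_side` and `lower_patient` callbacks stand in for the bladder inflation and deflation commands the controller 102 would issue, and the target angle is abstracted away.

```python
def rotation_therapy(raise_side, lower_patient, cycles):
    """One rotation-therapy session: per cycle, raise the patient's left
    side to the target angle, lower the patient, then raise the right
    side to the target angle and lower again."""
    log = []
    for _ in range(cycles):
        for side in ("left", "right"):
            raise_side(side)   # sequentially inflate that side's bladders
            lower_patient()    # deflate back to level before switching sides
            log.append(side)
    return log


performed = rotation_therapy(
    raise_side=lambda side: None,  # stand-in for inflation commands
    lower_patient=lambda: None,    # stand-in for deflation commands
    cycles=2,
)
```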
The patient ingress/egress device 76 is configured to perform the function of easing ingress and/or egress of the patient to and/or from the patient support apparatus 30. The patient ingress/egress device 76 comprises a main air bladder positioned within the mattress 40. The main air bladder is sized to extend substantially the full width of the mattress 40 and a majority of the length of the mattress 40. In an exemplary embodiment, the main air bladder comprises a single air bladder that can be inflated and deflated, depending on the needs of the patient or the caregiver. The controller 102 transmits a control signal to fully inflate the main air bladder to ease ingress and egress of the patient. For instance, if the main air bladder is less than fully inflated, e.g., to soften the mattress 40 and provide additional comfort to the patient, it can be difficult for the patient to move across the mattress 40 for ingress or egress. Accordingly, by fully inflating the main air bladder and thereby stiffening the mattress 40, movement across the mattress 40 can be made easier for the patient.
The lift device 78 is configured to lift and lower the patient between the minimum and maximum heights of the patient support apparatus 30, and intermediate positions therebetween. Referring to
The bed length extension device 80 is configured to perform the function of adjusting a length of the patient support apparatus 30 to accommodate patients of greater than average height. In an exemplary embodiment, the bed length extension device 80 comprises a pair of actuators to move a bed extension between an unextended position and extended positions with respect to the intermediate frame 36. In some embodiments, the bed extension is movable from zero to at least twelve inches from the unextended position to a fully-extended position. In other embodiments, the bed extension is able to move less or more than twelve inches and may be extendable to any position between the unextended and fully-extended position with the actuators. The bed extension may have two, three, four, or nearly an infinite number of extended positions in which to be adjusted by the actuators.
The bed width extension device 82 is configured to perform a function of adjusting a width of the patient support apparatus 30 to accommodate patients of greater than average width. The bed width extension device 82 may operate in the same manner as the bed length extension device 80. The bed width extension device 82 may comprise two sets of actuators to move four bed extensions between unextended and extended positions with respect to the intermediate frame 36. In some cases only one actuator or one set of actuators is employed. In some embodiments, each of the bed extensions is movable from zero to at least twelve inches from the unextended position to a fully-extended position. In other embodiments, each of the bed extensions is able to move less or more than twelve inches and may be extendable to any position between the unextended and the fully extended position with the actuators. Each of the bed extensions may have two, three, four, or nearly an infinite number of extended positions in which to be adjusted by the actuators.
The deck adjustment device 84 is configured to articulate one or more of the deck sections of the patient support apparatus 30. In an exemplary embodiment, the deck adjustment device 84 comprises one or more deck actuators to move one or more of the deck sections of the patient support apparatus 30 including but not limited to the fowler section, the seat section, the thigh section, and the foot section. The actuators may comprise electric linear actuators extending between the intermediate frame 36 and the particular deck section being adjusted. For example, in response to a control signal from the controller 102, actuation of the deck actuator raises and lowers the fowler section at various inclination angles relative to the intermediate frame 36. Suitable linear actuators are supplied by LINAK A/S located at Smedevænget 8, Guderup, DK-6430, Nordborg, Denmark. It is contemplated that any suitable deck adjustment system may be utilized in conjunction with the patient support apparatus 30, so long as the deck adjustment is configured to move one or more of the deck sections.
The temperature device 86 is configured to adjust the temperature of the patient, the temperature of patient support apparatus 30, and/or the temperature of the room in which the patient resides for purposes of patient comfort, therapy, or recovery.
An entertainment device 88 may be activated or adjusted for patient comfort or therapeutic purposes. The entertainment device 88 may be activated or adjusted to provide soothing entertainment or background noise to the patient. In some embodiments the entertainment device 88 comprises at least one piece of entertainment equipment (e.g., television, radio, etc.).
The lighting device 90 may comprise one or more light sources and a dimmer apparatus connected to the light sources to provide lighting that makes the patient more comfortable. In some embodiments one or more of the light sources may be adjusted to be on, off, dimmed or brightened to provide soothing lighting to the patient. In other embodiments, active cancelling of noise may also be employed to make the patient more comfortable.
The low air loss device 92 is configured to reduce or relieve pressure and control moisture caused by the body of the patient in contact with the mattress. The low air loss device 92 may comprise bladders (e.g., the elongate bladders of the immersion device 72) that span a majority of the length of the mattress 40 below the patient. Further, the low air loss device 92 comprises microscopic holes within the patient support surface 43 of the mattress 40 that allow air to escape from the elongate bladders. The amount of pressure within each of the elongate bladders may be selectively controlled. The escaped air provides pressure and moisture reduction.
The operational devices 70-92 of the patient support apparatus 30 are controlled by the control system 100 in response to the user providing an input to a user interface 110. Referring to
In certain embodiments, the user interface 110 may be provided as a pendant (not shown) coupled to the patient support apparatus 30. The pendant may be handheld and coupled to the patient support apparatus 30 with a tether, which may also include the electrical and data connection. The pendant may serve as the control suite for some or all of the functions of the patient support system 28 described throughout the present disclosure. In certain embodiments, the pendant integrates the entertainment device 88 and the lighting device 90. In particular, the pendant includes a plurality of tactile and/or touch-sensitive buttons for actuating certain features of the entertainment device 88 and the lighting device 90. Exemplary features include “channel up,” “channel down,” “music up,” “music down,” “television,” “radio,” “room lights,” “reading lights,” and the like. An exemplary pendant suitable for the present application is included on the In-Touch Critical Care Bed manufactured by Stryker Corp. (Kalamazoo, Mich.).
The user interface 110 may be located on one of the side rails 44, 46, 48, 50, the headboard 52, the footboard 54, or other suitable locations.
In some embodiments, the user interface 110 comprises a voice integration system 137 in communication with the controller 102. The voice integration system 137 comprises a voice actuation interface such as a microphone in communication with the controller 102 to receive voice commands from the user. The microphone may be mounted to the base 34, the intermediate frame 36, the side rails 44, 46, 48, 50, the headboard 52, the footboard 54, or other suitable locations on the patient support apparatus 30. The microphone may also be located on the mobile device 156 or otherwise remote from the patient support apparatus 30. Based on the vocal input from the user provided to the voice integration system 137, the voice integration system 137 provides input signals to the controller 102 for the functions disclosed herein.
The patient support system 28 further comprises an information output device 112 in communication with the controller 102 and configured to provide instructions to the user, such as the caregiver or the patient. In one embodiment, the information output device 112 comprises a display displaying the instructions and other information to the user. In another embodiment, the information output device 112 comprises speakers providing audible instructions to the user. Combinations of the display and speakers are preferred in many embodiments. In a further preferred embodiment, the user interface 110 and the information output device 112 are embodied on the touchscreen display 114. Capacitive touchscreens and other types of displays capable of receiving a touch-sensitive input may be employed.
The user interface 110 and/or the information output device 112 may be located on one or more of the side rails 44, 46, 48, 50, the headboard 52, the footboard 54, or other suitable locations. In the embodiment shown in
Referring to
The controller 102 may be configured to execute the software application. The software application is configured to display user menus 130 navigable by the user to control the operational functions of the patient support apparatus 30, such as to control the operational devices 70-92. In general, the user menus 130 may comprise any suitable output displayed with the information output device 112 to facilitate efficient operation of the patient support system 28. Any suitable format of the user menus 130 is contemplated, including but not limited to lists, grids and/or arrays of text, graphics and/or icons comprising indicia 124. The indicia 124, as used herein, may comprise text, graphics, and the like, selectable by the user with the user interface 110. In the exemplary embodiments illustrated in
The user menus 130 may comprise a home menu (not shown). The home menu may comprise the output provided by the information output device 112 upon initializing the software application such as after non-use of the user interface 110 for a predetermined period, a reset of the system, or the like. The home menu may comprise one of the user menus 130 provided by the information output device 112 in response to the user actuating the home button HB. The user menus 130 may further comprise the submenus 152. The submenus 152, in a general sense, are the output provided by the information output device 112 in response to a user selection of the indicia 124, 125, 126 displayed visually on the information output device 112. Often, the submenus 152 provide indicia 124, 125, 126 representative of operational functions of the patient support apparatus 30 more specific relative to the home menu. The submenus 152 may comprise one, two, or three or more submenus for each of the indicia 124, 125, 126 displayed on the home menu. For example, the submenus 152 may comprise primary, secondary and tertiary submenus as the user navigates the software application.
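One way to picture the home menu and the nested submenus 152 is as a tree that the user's selections walk. The menu labels and nesting below are invented solely for illustration; the actual menus and indicia depend on the embodiment.

```python
# Hypothetical menu tree: keys are indicia labels, values are submenus
# (nested dicts) or None for a leaf that triggers an operational function.
USER_MENUS = {
    "Turn Assist": {"Right Turn": None, "Left Turn": None},
    "Bed Adjust": {"Raise": None, "Lower": None},
}


def navigate(menus, selections):
    """Follow a sequence of user selections from the home menu; return the
    submenu (or leaf) reached. A KeyError models an invalid selection,
    i.e., a deviation from the intended navigation path."""
    node = menus
    for label in selections:
        node = node[label]
    return node


submenu = navigate(USER_MENUS, ["Turn Assist"])   # primary submenu
leaf = navigate(USER_MENUS, ["Turn Assist", "Right Turn"])  # leaf action
```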
Controlling the operational functions of the patient support apparatus 30 may require performing several steps with the software application. A navigation protocol may be defined as a series of user-performed actions to control any particular one of the operational functions of the patient support apparatus 30. In one example, the navigation protocol may require the user to provide multiple inputs to the user interface 110 to navigate the user menus 130 to control the desired one or more of the operational functions of the patient support apparatus 30. Should the user accidentally or erroneously provide an incorrect input to the user interface 110, as is not uncommon particularly with touchscreen displays, the information output device 112, in response, may display a submenu 152 unrelated to the desired one or more of the operational functions sought to be operated by the user. The user may have deviated from the navigation protocol. Depending on the familiarity of the user with the software application, touchscreen displays, technology generally, and other factors, any number of undesirable consequences may result. The user may be required to return to the home menu or other previous user menu 130 to reattempt navigating the user menus 130, adding undue time and frustration to the user experience. Alternatively, the user may simply lack the technological savvy to navigate the user menus 130 of the software application. It is therefore one of many advantages of the subject invention to provide improved guidance and/or troubleshooting that is accessible through the user interface 110 and/or information output device 112.
With continued reference to
The operations graphic 134 of the patient support apparatus 30 may also provide the user with information as to which of the operational devices 70-92 are in an active state. For example,
Often, the one or more of the operational devices 70-92 to be controlled by the user may not be represented on the home menu (or one of the user menus 130) being displayed with information output device 112. The user may be required to perform one or more user-performed actions (e.g., providing input(s) to the user interface 110) in order to navigate the user menus 130 of the software application such that the user is provided with the option to control the one or more of the operational devices 70-92. Those unfamiliar with navigating the software application may experience appreciable difficulty with doing so.
According to an exemplary embodiment of the present disclosure, the controller 102 is configured to receive input signals from the user interface 110 based on the inputs from the user to the user interface 110. In certain embodiments, the inputs from the user to the user interface 110 comprise the user touching the touchscreen display 114. For any number of reasons, the user may provide a troubleshooting request to the user interface 110. For example, the user may have unsuccessfully attempted to navigate the user menus 130 of the software application to the menu configured to control the desired one or more of the operational devices 70-92. In another example, the user may anticipate difficulty with navigating the user menus 130 and/or may prefer to save time by seeking assistance. In certain embodiments, the troubleshooting request comprises actuation of a virtual help button 140 on the user interface 110, and more particularly the touchscreen display 114.
The controller 102 is configured to determine a guidance protocol for the user based on the input signals from the user interface 110. Whereas the navigation protocol may be the user-performed actions performed by the user without the instructions being provided to the user, the guidance protocol comprises the user-performed actions to be performed by the user in response to the instructions provided to the user with the information output device 112. In other words, the navigation protocol may be considered the guidance protocol if the user did not require troubleshooting (i.e., correctly navigated the user menus 130). For example, the guidance protocol is the user-performed actions to be performed after the user has accidentally deviated from the navigation protocol.
In certain embodiments, the guidance protocol may comprise a plurality of steps needed to be taken by the user to result in the desired outcome associated with a troubleshooting request from the user provided to the user interface 110. For example, subsequent to the user actuating the virtual help button 140, the information output device 112 outputs a prompt requesting further information. Referring to
The user provides the troubleshooting request, and the troubleshooting request comprises the input from the user to the user interface 110. The input signals received by the controller 102 from the user interface 110 are based on the input comprising the troubleshooting request. The controller 102 determines the guidance protocol based on the input signals. The guidance protocol may comprise a plurality of steps needed to be taken by the user to result in the desired outcome associated with the troubleshooting request. For example, should the troubleshooting request involve operating the immersion device 72, the guidance protocol comprises the steps needed to be taken in order to do so. The steps may each comprise one or more instructions provided to the user with the information output device 112. The instructions 160 may comprise first and second instructions, first and second steps, and the like. The controller 102 is further configured to provide a first of the instructions to the user with the information output device 112, and provide a second one of the instructions to the user with the information output device 112 in response to the user performing a first of the user-performed actions.
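The mapping from a troubleshooting request to an ordered list of instruction/action steps can be sketched as follows. The request keys, instruction strings, and action names are all hypothetical placeholders; the real mapping would cover the operational devices 70-92 and their menus.

```python
# Hypothetical guidance protocols: each step pairs an instruction for the
# information output device with the user-performed action expected in
# response. All labels are invented for illustration.
GUIDANCE_PROTOCOLS = {
    "turn_assist": [
        ("Select the highlighted 'Turn Assist' icon", "select_turn_assist"),
        ("Select 'Right Turn' or 'Left Turn'", "select_direction"),
        ("Press the play button to start", "press_play"),
    ],
}


def determine_guidance_protocol(troubleshooting_request):
    """Map a troubleshooting request to its ordered list of steps."""
    return GUIDANCE_PROTOCOLS.get(troubleshooting_request, [])


steps = determine_guidance_protocol("turn_assist")
first_instruction = steps[0][0]
```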
An exemplary operation of the guidance and troubleshooting is described with reference to
In some embodiments, the information output device 112 may provide the user with a confirmatory request 146. The confirmatory request 146 may simply repeat the provisionally selected one of the operational functions of the operational devices 70-92 (e.g., “Turn Assist” of the patient turning device 74), and/or provide additional information about the same.
The controller 102 is configured to provide a first instruction to the user with the information output device 112. The first instruction may be the first of a plurality of instructions 160 or a first step of a plurality of steps. As previously mentioned the touchscreen display 114 may comprise the taskbar 122 with indicia 125, 126 representative of operational functions of the patient support apparatus 30. The indicia 125, 126 may be selectable by the user with the user interface 110, in many cases the touchscreen display 114. In certain embodiments, providing the instructions 160 to the user on the touchscreen display 114 comprises the controller 102 being configured to visually emphasize on the touchscreen display 114 at least one of the indicia 124, 125, 126.
The visual emphasis may include providing elements to and/or modifying elements of the indicia 124, 125, 126, such as lines, shapes, forms, values, colors, textures, space, and the like, to focus the user on the emphasized indicia 124, 125, 126. In certain embodiments, color(s) of the indicia 124, 125, 126 (e.g., background or foreground color) may be changed to provide contrast different from the other displayed indicia 124, 125, 126. It should be appreciated that the term color comprises hue, tint, shade, tone, lightness, saturation, intensity, and/or brightness such that references made herein to different colors also encompass differences in hue, tint, tone, lightness, saturation, intensity, and/or brightness. In certain embodiments, shapes may be provided and arranged in a manner to focus the user.
The guidance protocol further comprises the user-performed actions to be performed by the user in response to the instructions provided to the user. In other words, in response to the first of the instructions 160 (e.g., the visual emphasis of indicia 126′ and/or audible instruction(s)), the user performs a first user-performed action corresponding to the first instruction. In preferred embodiments, the first user-performed action is performing the action suggested by the information output device 112. In the example shown in
The user may perform a user-performed action that deviates from the first instruction provided to the user. For example, the user may accidentally actuate one of the indicia 124, 125, 126 other than the indicia 126′ visually emphasized and/or audibly described with the first instruction. In certain embodiments, the controller 102 is configured to determine whether the performed user-performed action is the first user-performed action. In other words, the controller 102 determines whether the input provided to the user interface 110 subsequent to providing the first of the instructions 160 correlates, matches, or is otherwise correct based on the first instruction provided to the user. Should the user-performed action be incorrect, the resulting information being displayed on the touchscreen display 114 may deviate from the guidance protocol. In exemplary embodiments, the controller 102 may be configured to automatically determine an updated guidance protocol. The updated guidance protocol is directed to effectuating the same end result as the guidance protocol, but it may require greater, fewer, or different instructions in order to achieve the result. For example, the updated guidance protocol may comprise the original guidance protocol, with the addition of the user first selecting a virtual “Back” button BB (see
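The check for a deviation, and the construction of an updated guidance protocol with a corrective “Back” step prepended, can be sketched as below. The step representation and action names are assumptions carried over from nothing in the specification itself; they merely illustrate the comparison and the protocol update.

```python
def check_action(protocol, step_index, performed_action):
    """Compare the user's action to the one the current step expects.
    On a match, advance to the next step; on a deviation, return an
    updated protocol with a corrective 'Back' step inserted before the
    current step, so the same end result is still reached."""
    _instruction, expected = protocol[step_index]
    if performed_action == expected:
        return protocol, step_index + 1
    corrective = ("Press the 'Back' button", "press_back")
    updated = protocol[:step_index] + [corrective] + protocol[step_index:]
    return updated, step_index


protocol = [
    ("Select 'Turn Assist'", "select_turn_assist"),
    ("Select a turn direction", "select_direction"),
]
# The user actuates the wrong indicia, deviating from the first instruction:
updated, step = check_action(protocol, 0, "select_bed_adjust")
```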
The user-performed action preferably correlates, matches, or is otherwise correct based on the first instruction provided to the user such that the guidance protocol may proceed as originally determined. The controller 102 is configured to provide a second of the instructions to the user with the information output device 112 in response to the user performing the first of the user-performed actions. Based on the user-performed action of selecting the indicia 126′ representative of “Turn Assist,” one of the submenus 152 is provided with the information output device 112. Referring to
In many respects, providing the second of the instructions 160 is performed in the same manner as providing the first of the instructions 160. Providing the second of the instructions 160 to the user on the touchscreen display 114 comprises the controller 102 being configured to visually emphasize on the touchscreen display 114 at least one of the indicia 124, 125, 126.
The guidance protocol may proceed through subsequent iterations of providing instructions 160 or steps in response to user-performed actions consistent with the disclosure above. The guidance protocol may proceed to a subsequent one of the instructions 160 or steps after the user successfully performs the user-performed action, or after indicia such as the play button PB is actuated to indicate the user is ready for the next one of the instructions 160 or steps. In some embodiments, the controller 102 is further configured to provide a third of the instructions 160 to the user with the information output device 112 in response to the user performing a second of the user-performed actions. After the user-performed action of selecting the indicia 124 representative of “Right Turn” in
Referring now to
Once the final user-performed action is completed such that the desired operational function of the patient support apparatus 30 is performed, the software application may return the user to the home menu or a previous user menu 130.
It is to be understood that the operation described above is but one non-exhaustive example. A user may receive troubleshooting assistance for any operational feature of the patient support apparatus 30 controllable from the user interface 110. For example, the guidance protocol facilitates control of the immersion device 72 or the low air loss device 94. As another example, the guidance protocol facilitates ordering replacement parts for the patient support apparatus 30 with the user interface 110.
In some cases, a user experiencing difficulty navigating the user menus 130 may be unaware of the advanced guidance and troubleshooting capabilities of the patient support system 28. The user may be unaware of, or may have failed to notice, the virtual help button 140. The patient support system 28 therefore further provides predictive troubleshooting to initiate the troubleshooting capabilities without being requested by the user. In other words, the controller 102 of the patient support system 28 is configured to determine whether the user is experiencing difficulty navigating the user menus 130 to control the operational functions of the patient support apparatus 30. In a preferred embodiment, the controller 102 is configured to initiate the guidance protocol based on an uncorrelated sequence of inputs from the user to the user interface 110. As described above, controlling operational functions of the patient support apparatus 30 often requires a sequence of user-performed actions. The sequence of user-performed actions often comprises successive inputs to the user interface 110 to navigate the user menus 130 comprising the home menu and the submenus 152. When the inputs are advancing the user towards controlling the desired operational feature(s), the inputs are considered to be correlated. With the user menus 130 of increased complexity (e.g., relative to those shown in
For purposes of the embodiments with predictive troubleshooting, the user-performed actions are described as “selections.” Typically, the selections occur by actuating indicia 124 on the touchscreen display 114. The selections may comprise an initial selection, intermediate selections, and a final selection. The initial selection is associated with the home menu or other user menu 130 and, as described herein, generally directs the user to one of the submenus 152 associated with the initial selection (of indicia 124, 125, 126). The final selection is the actuation of the indicia 124, 125, 126 generally immediately prior to the desired operational feature moving to the active state or being made operational. In one example, the final selection causes the control system 100 to control one or more of the operational devices 70-92. In the earlier described example, the final selection was actuation of the play button PB (see
In some embodiments, the user may be prompted for troubleshooting assistance should the number of intermediate selections exceed a predetermined number. In one embodiment, the final selection is not considered in determining whether to prompt the user for troubleshooting assistance, since the final selection causes the desired action. In other words, if the user is making the final selection, it is unlikely the user is having difficulty navigating the user menus 130; otherwise, he or she would not make the final selection (but rather continue navigating the submenus 152). It is nevertheless to be understood that the final selection may be considered in determining whether the number of selections exceeds the predetermined number such that the user is prompted for troubleshooting assistance.
The predetermined number may be based on any number of factors and/or metrics. In one example, empirical data may show that control of the operational functions of the patient support apparatus 30 averages three selections, excluding the initial and final selections. The predetermined number may be set at seven selections such that if the user makes seven or more intermediate selections, the information output device 112 provides a prompt to the user inquiring whether troubleshooting assistance is desired. In other words, the predetermined number could be, for example, the average number of a correlated sequence of inputs required to control the operational functions. Any number of selections exceeding the predetermined number is considered to be uncorrelated such that the user is “lost” within the submenus 152 of the software application. Often, the predetermined number may be sufficiently above the average such that a buffer is provided to avoid prompting the user too often, perhaps unwarrantedly, which may cause annoyance.
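The threshold logic in this example can be sketched as follows, assuming the hypothetical empirical average of three intermediate selections and a buffer of four (together yielding the predetermined number of seven); the function name and defaults are illustrative only:

```python
# A minimal sketch of prompting for troubleshooting assistance once the
# number of intermediate selections reaches a predetermined number.
# The average and buffer values are assumptions taken from the example.

def should_prompt(num_intermediate_selections, average=3, buffer=4):
    """Return True once intermediate selections reach the predetermined
    number, computed as the empirical average plus a buffer."""
    predetermined_number = average + buffer  # e.g., 3 + 4 = 7
    return num_intermediate_selections >= predetermined_number

should_prompt(5)  # still within the buffer: no prompt
should_prompt(7)  # predetermined number reached: prompt the user
```

The buffer keeps the prompt from firing at the average itself, which would interrupt users who are navigating normally.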
In another exemplary embodiment, the initial selection may be probative as to whether the intermediate selections comprise an uncorrelated sequence of inputs. In such an embodiment, it is assumed that the initial selection on the home menu was proper and that the user is experiencing difficulty navigating the submenus 152. For example, to control the immersion device 72, it is unlikely the user would actuate the indicia 126 on the user menu 130 for “Turn Assist.” The intermediate selections are then analyzed quantitatively and qualitatively relative to the initial selection. Should, for example, the user make several selections in the submenus 152 associated with the patient turning device 74 when the initial selection was the indicia 126 on the home menu for “Immersion,” the information output device 112 provides a prompt to the user inquiring whether troubleshooting assistance is desired. A database, algorithms, AI, and the like may compile the relationships between the operational functions of the patient support apparatus 30 based on user inputs over time, and the predetermined number of intermediate selections before prompting may be adjusted accordingly. For example, a lower predetermined number of intermediate selections may be required between an initial selection directed to the immersion device 72 and intermediate selections involving submenus 152 directed to the patient raising device 78 than between an initial selection directed to the immersion device 72 and intermediate selections involving submenus 152 directed to the low air loss device 94 (i.e., because the immersion device 72 and the low air loss device 94 are related in many ways).
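The relatedness-adjusted threshold can be sketched as follows; the function names, pair keys, and point thresholds are hypothetical values for illustration, not data from the disclosure:

```python
# Hypothetical relatedness table: closely related operational functions
# tolerate more intermediate selections before a prompt is issued.
RELATEDNESS_THRESHOLD = {
    ("immersion", "low_air_loss"): 7,     # strongly related devices
    ("immersion", "patient_raising"): 3,  # weakly related: prompt sooner
}

def should_prompt_related(initial_selection, submenu_function, count,
                          default_threshold=5):
    """Return True when intermediate selections in a given submenu reach
    the threshold for that (initial selection, submenu) pair."""
    threshold = RELATEDNESS_THRESHOLD.get(
        (initial_selection, submenu_function), default_threshold)
    return count >= threshold
```

A learned system could update the table entries over time as user-input patterns accumulate.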
In some embodiments, actuating the “Back” button BB (see
The prompt for troubleshooting assistance may be provided with the information output device 112 in manners previously described and reasonably related thereto. For example, the prompt may comprise a pop-up window on the touchscreen display 114, or an audible question or instruction to the user. The user may elect to accept or forego proceeding with the troubleshooting request. Should the user elect to proceed, the user is further prompted to provide the troubleshooting request in the manners previously described. Subsequently, the guidance protocol is determined and executed.
In another exemplary embodiment, the determination of whether to prompt for troubleshooting assistance is based on analyzing the correlation between the known selections required for controlling a particular operational function and the selections made by the user. The guidance protocol, as mentioned, comprises the user-performed actions to be performed by the user in response to the instructions provided to the user. The navigation protocol may be the user-performed actions performed by the user without the instructions being provided to the user. In other words, the navigation protocol may be the guidance protocol if the user did not require troubleshooting. Consequently, in at least certain embodiments, the selections for controlling a particular operational function may be the same for the navigation protocol and its corresponding guidance protocol. As the user makes the initial selection and subsequent intermediate selections, an algorithm may be implemented to determine which guidance protocol most closely matches the navigation protocol being enacted by the user. If, for example, the user has made a correlated sequence of three selections directed to controlling the operational function of the immersion device 72, the controller 102 determines and stores this information in non-transitory memory 131 in communication with the controller 102. Should the user subsequently deviate from the navigation protocol in an atypical manner, the software application may present the user with the prompt requesting troubleshooting assistance. An atypical manner would include deviating from the navigation protocol within the submenus 152. It may also be considered atypical to actuate the “home” button HB or the “back” buttons BB.
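One simple way to find which known protocol "most closely matches" the selections made so far is a longest-common-prefix comparison; the sketch below assumes hypothetical protocol names and selection sequences that are not taken from the disclosure:

```python
# Illustrative known navigation protocols (names and sequences are
# assumptions for this sketch).
KNOWN_PROTOCOLS = {
    "immersion": ["home", "therapy", "immersion", "start"],
    "turn_assist": ["home", "turn_assist", "right_turn", "play"],
}

def closest_protocol(selections):
    """Return the name of the protocol sharing the longest common prefix
    with the selections the user has made so far."""
    def common_prefix(sequence):
        length = 0
        for made, expected in zip(selections, sequence):
            if made != expected:
                break
            length += 1
        return length
    return max(KNOWN_PROTOCOLS,
               key=lambda name: common_prefix(KNOWN_PROTOCOLS[name]))
```

Once a closest protocol is identified, a later selection outside that protocol's sequence can be treated as an atypical deviation that triggers the prompt.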
In other embodiments, it would not be desirable to prompt the user for troubleshooting assistance after the user opts to merely return to the home menu or previous submenu 152, as the user should be able to freely navigate the software application to a reasonable extent. Yet, as the user makes multiple selections that evidence an intention of a certain course of action, it may be beneficial to prompt the user for troubleshooting assistance after the user deviates from completing the course of action in an atypical manner. AI and algorithms may be developed and implemented to learn and discern patterns of the user selections throughout the software application so as to optimize the timing and manner of the prompt for troubleshooting assistance.
In the above embodiments, regardless of whether the troubleshooting assistance is initiated by the system or by the user actuating the virtual help button 140, the controller 102 is configured to determine the guidance protocol based on the troubleshooting request from the user provided to the user interface 110. For example, as mentioned, the controller 102 determines the troubleshooting request, and consequently the guidance protocol, based on the input signals. Additionally or alternatively, the guidance protocol of the patient support system 28 is determined based on the input signals comprising the uncorrelated sequence of inputs to the user interface 110. Stated differently, the patient support system 28 utilizes predictive troubleshooting to determine what the user is attempting to accomplish without the user providing the troubleshooting request to the user interface 110. This may be in addition to the controller 102 being configured to automatically initiate the guidance protocol and/or prompt troubleshooting assistance based on the uncorrelated sequence of inputs to the user interface 110.
Determining the guidance protocol based on the uncorrelated sequence of inputs to the user interface 110 presents unique challenges beyond prompting for troubleshooting assistance. The present disclosure contemplates several methods for determining the guidance protocol, some of which are described below and should be considered non-limiting examples. In one embodiment, each of the indicia 124, 125, 126 for the home menu and each of the submenus 152 is predefined as being associated with one or more of the operational functions. In some respects, the predefined associations are inherent, as selecting one of the indicia 124, 125, 126 results in a consequent response from the software application (e.g., being directed to a submenu 152, etc.). The response from the software application moves the user towards an end result, whether desired or not. These predefined associations are stored in the non-transitory memory 131 in, for example, a database format. A point value may be provided for each predefined association that is based on the nexus or relationship between the indicia 124, 125, 126 and the operational functions. For example, the indicia 126 of “Turn Assist” (see
The user makes the initial selection on the home menu or the intermediate selection(s) on the submenus 152. As the user navigates the home menu or other user menu 130 and the submenus 152, the point values may be assigned and summed for each operational function. In other words, for each selection from the user to the user interface 110, the controller 102 assigns and sums the point values for each operational function of the patient support apparatus 30 based on its predefined association.
Based on the summed point values for each operational function, the controller 102 determines which operational function the user is intending to operate. For example, following the troubleshooting request from the user, the information output device 112 may provide the user with one or more of the operational features having the highest point total(s). In other words, a higher point total indicates that the selections made by the user to that point have been most relevant to the corresponding operational functions. For example, the initial selection is the indicia 126 of “Immersion” on the user menu 130 of
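The point-value scheme described above can be sketched as follows; the association table, indicium names, and point values are illustrative assumptions rather than values from the disclosure:

```python
# Hypothetical predefined associations: each selectable indicium carries
# point values toward one or more operational functions.
ASSOCIATIONS = {
    "immersion_icon": {"immersion": 3, "low_air_loss": 1},
    "comfort_submenu": {"immersion": 1, "low_air_loss": 1},
    "turn_assist_icon": {"turn_assist": 3},
}

def rank_functions(selections):
    """Sum point values per operational function across the user's
    selections and return the functions ranked highest first."""
    totals = {}
    for selection in selections:
        for function, points in ASSOCIATIONS.get(selection, {}).items():
            totals[function] = totals.get(function, 0) + points
    return sorted(totals, key=totals.get, reverse=True)
```

The top-ranked function (or the top few, when totals are close) would then be offered to the user as the suggested operation.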
The information output device 112, for example, may provide two or more “suggested” operations from which the user may select. Due to the relatedness between, for example, the immersion device 72 and the low air loss device 94, operational function of the low air loss device 94 may also have a higher point total than the operational function of the patient turning device 74. A pop-up menu may be displayed on the information output device 112 titled “Would you like assistance?” with indicia 124 representative of the immersion device 72 and the low air loss device 94, perhaps among others. The predictive guidance and troubleshooting described throughout the present disclosure advantageously facilitates improved patient care and an improved user experience. The user menus 130 provide a well-organized, aesthetically appealing user experience. The indicia 124, 125, 126 may be actuated on the touchscreen display 114, and portions of the working area 128 including the selected operations list 132 and the operations graphic 134 also comprise selectable features that are intuitive for the user.
The patient support system 28 may further comprise a remote assistance system 170 remote from the patient support apparatus 30. Referring to
Referring to
The remote representative is a human at a location remote from the patient support apparatus 30. Often, the remote representative is stationed at a call center configured to field technical support requests. In one embodiment, the image or feed 172 is a static picture, perhaps a stock photograph, and the live support comprises voice conferencing akin to a telephone call. In another embodiment, the image or feed 172 is a video feed, and the live support comprises the videoconference. The videoconference may be one-way (i.e., the user sees and hears the remote representative, and the remote representative only hears the user) or two-way (i.e., the remote representative and the user see and hear one another). A video camera may be coupled to the patient support apparatus 30 in a suitable manner to facilitate the videoconferencing.
The incoming text messages displayed on the information output device 112 may be submitted by the remote representative at a location remote from the patient support apparatus 30. A virtual keyboard comprising alphanumeric keys may be provided on the touchscreen display 114 comprising the user interface 110 to permit the user to prepare and send outgoing text messages to the remote representative.
For the voice conferencing, the videoconferencing, and the text messaging, the remote representative provides the guidance protocol comprising the instructions 160 to the user with the information output device 112. The instructions are provided at least audibly with the information output device 112 with speakers as the remote representative assists the user through the guidance protocol. In certain embodiments, the remote representative may provide visual instructions to the user with the information output device 112 comprising the touchscreen display 114. In one example, the remote representative may visually emphasize the indicia 124, 125, 126 in a manner previously described (see
Exemplary methods of providing guidance to the user for the patient support system 28 are also provided. In certain embodiments, the patient support system 28 comprises the patient support apparatus 30, a user interface 110 configured to receive inputs from the user, and the information output device 112 configured to provide instructions to the user. One exemplary method 200 is shown in
A guidance protocol is determined based on the inputs (step 204). The guidance protocol comprises steps of instruction 160 to be provided to the user on the information output device 112. The guidance protocol is initiated (step 206), and the instructions are outputted with the information output device 112 (step 208). For example, a first of the steps of instruction 160 is outputted with the information output device 112. User-performed actions are received with the user interface 110 in response to the steps of instruction 160 (step 210). For example, a first of the user-performed actions is received with the user interface 110 in response to the first step. The steps of outputting instructions and receiving user-performed actions may continue iteratively. For example, a second of the steps of instruction is outputted with the information output device 112 after the first user-performed action. The outputting of the first or second of the steps of instruction 160 may comprise visually emphasizing one of the indicia 124, 125, 126 on the information output device 112. The steps 208 and 210 may continue until completion of the guidance protocol, after which the method 200 ends.
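The iterative loop of steps 208 and 210 can be sketched as follows; the function name, step labels, and callback structure are hypothetical illustrations:

```python
def run_guidance(steps, get_user_action, output):
    """Output each instruction step (cf. step 208) and wait for the matching
    user-performed action (cf. step 210), re-issuing the step on a mismatch."""
    for step in steps:
        output(step)                 # e.g., visually emphasize an indicium
        while get_user_action() != step:
            output(step)             # repeat the instruction until performed

# Simulated session: the user errs once on the first step.
issued = []
actions = iter(["wrong_selection", "select_turn_assist", "press_play"])
run_guidance(["select_turn_assist", "press_play"],
             lambda: next(actions), issued.append)
# issued now records each instruction output, including the one repeat.
```

In practice the mismatch branch could instead hand off to an updated guidance protocol, as described earlier, rather than simply repeating the step.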
In certain embodiments, the inputs may further comprise a first input and a second input. A correlation between the first input and the second input may be determined. The guidance protocol may be determined (step 204) based on the determined correlation between the first input and the second input. The determined correlation may comprise an uncorrelated sequence of inputs. In some aspects, the guidance protocol may be initiated (step 206) based on the determined correlation comprising an uncorrelated sequence of inputs. More particularly, the guidance protocol may be initiated (step 206) once it is determined, by the controller 102, that the sequence of inputs from the user is uncorrelated as previously described. In other aspects, the guidance protocol is initiated (step 206) in response to a troubleshooting request from the user to the user interface 110.
It may be determined with the controller 102 whether the first step provided to the user is performed by the user with the first user-performed action. If the first user-performed action performed by the user does not correspond to the first step provided to the user, the first step may again be provided to the user with the information output device 112, or a third of the steps of instruction may be provided or outputted (step 208) with the information output device 112, with the third of the steps being different than the first or second steps of instruction 160.
A remote assistance system 170 may be provided and configured to facilitate live support with a representative over a network. The remote assistance system 170 may be in communication with the user interface 110 and the information output device 112 over the network, and may comprise a representative remote from the patient support apparatus 30. The remote assistance system 170 is accessed over the network to request the guidance protocol. The steps of instruction to be provided to the user with the information output device 112 are received from the remote assistance system 170. The remote assistance system 170 may determine the guidance protocol. Accessing the remote assistance system 170 may comprise voice or videoconferencing with the representative on the information output device 112.
Referring to
In some embodiments, indicia 124, 125, 126 may be visually displayed on the information output device 112, with the indicia 124, 125, 126 representative of the operational functions of the patient support apparatus 30. Providing the one or more instructions to the user comprises visually emphasizing the indicia 124, 125, 126 on the information output device 112.
As noted above, the subject patent application is related to U.S. Provisional Patent Application No. 62/525,363 filed on Jun. 27, 2017. In addition, the subject patent application is also related to: U.S. Provisional Patent Application No. 62/525,353 filed on Jun. 27, 2017 and its corresponding Non-Provisional patent application Ser. No. 16/020,068 filed on Jun. 27, 2018, now U.S. Pat. No. 11,337,872; U.S. Provisional Patent Application No. 62/525,359 filed on Jun. 27, 2017 and its corresponding Non-Provisional patent application Ser. No. 16/020,052 filed on Jun. 27, 2018, now U.S. Pat. No. 11,382,812; U.S. Provisional Patent Application No. 62/525,368 filed on Jun. 27, 2017 and its corresponding Non-Provisional patent application Ser. No. 16/019,973 filed on Jun. 27, 2018, now U.S. Pat. No. 11,096,850; U.S. Provisional Patent Application No. 62/525,373 filed on Jun. 27, 2017 and its corresponding Non-Provisional patent application Ser. No. 16/020,003 filed on Jun. 27, 2018, now U.S. Pat. No. 11,202,729; and U.S. Provisional Patent Application No. 62/525,377 filed on Jun. 27, 2017 and its corresponding Non-Provisional patent application Ser. No. 16/019,986 filed on Jun. 27, 2018, now U.S. Pat. No. 10,811,136. The disclosures of each of the above-identified Provisional Patent Applications and corresponding Non-Provisional patent applications are hereby incorporated by reference in their entirety.
It will be further appreciated that the terms “include,” “includes,” and “including” have the same meaning as the terms “comprise,” “comprises,” and “comprising.”
Several embodiments have been discussed in the foregoing description. However, the embodiments discussed herein are not intended to be exhaustive or limit the invention to any particular form. The terminology which has been used is intended to be in the nature of words of description rather than of limitation. Many modifications and variations are possible in light of the above teachings and the invention may be practiced otherwise than as specifically described.
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/525,363 entitled PATIENT SUPPORT SYSTEMS AND METHODS FOR ASSISTING CAREGIVERS WITH PATIENT CARE and filed on Jun. 27, 2017, the contents of which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
5113214 | Nagata et al. | May 1992 | A |
5276432 | Travis | Jan 1994 | A |
5434621 | Yu | Jul 1995 | A |
5640953 | Bishop et al. | Jun 1997 | A |
5645667 | Kusen | Jul 1997 | A |
5664270 | Bell et al. | Sep 1997 | A |
5715548 | Weismiller | Feb 1998 | A |
5971913 | Newkirk et al. | Oct 1999 | A |
6320510 | Menkedick et al. | Nov 2001 | B2 |
6340977 | Lui et al. | Jan 2002 | B1 |
6362725 | Ulrich et al. | Mar 2002 | B1 |
6702314 | Crose | Mar 2004 | B1 |
6876303 | Reeder et al. | Apr 2005 | B2 |
6948592 | Kavounas | Sep 2005 | B2 |
7154397 | Zerhusen et al. | Dec 2006 | B2 |
7296312 | Menkedick et al. | Nov 2007 | B2 |
7319386 | Collins, Jr. et al. | Jan 2008 | B2 |
7336187 | Hubbard, Jr. et al. | Feb 2008 | B2 |
7389552 | Reed et al. | Jun 2008 | B1 |
7443302 | Reeder et al. | Oct 2008 | B2 |
7472439 | Lemire et al. | Jan 2009 | B2 |
7487562 | Frondorf et al. | Feb 2009 | B2 |
7490021 | Holland et al. | Feb 2009 | B2 |
7570152 | Smith et al. | Aug 2009 | B2 |
7690059 | Lemire et al. | Apr 2010 | B2 |
7747644 | Reihl et al. | Jun 2010 | B1 |
7888901 | Larson et al. | Feb 2011 | B2 |
7895519 | Allegrezza et al. | Feb 2011 | B1 |
8069157 | Jam | Nov 2011 | B2 |
8117701 | Bobey et al. | Feb 2012 | B2 |
8121856 | Huster et al. | Feb 2012 | B2 |
8143846 | Herman et al. | Mar 2012 | B2 |
8165908 | Bolle et al. | Apr 2012 | B2 |
8209608 | Linyard et al. | Jun 2012 | B1 |
8266742 | Andrienko | Sep 2012 | B2 |
8308237 | Kunou | Nov 2012 | B2 |
8319633 | Becker et al. | Nov 2012 | B2 |
8334779 | Zerhusen et al. | Dec 2012 | B2 |
8341777 | Hensley et al. | Jan 2013 | B2 |
8344860 | Collins, Jr. et al. | Jan 2013 | B2 |
8410943 | Metz et al. | Apr 2013 | B2 |
8413270 | Turner et al. | Apr 2013 | B2 |
8413271 | Blanchard et al. | Apr 2013 | B2 |
8432287 | O'Keefe et al. | Apr 2013 | B2 |
8442738 | Patmore | May 2013 | B2 |
8464380 | Bobey et al. | Jun 2013 | B2 |
8525682 | Dixon et al. | Sep 2013 | B2 |
8544126 | Elliott | Oct 2013 | B2 |
8552880 | Kopp et al. | Oct 2013 | B2 |
8604917 | Collins et al. | Dec 2013 | B2 |
8641301 | Yang et al. | Feb 2014 | B2 |
8650682 | Herman | Feb 2014 | B2 |
8674839 | Zerhusen et al. | Mar 2014 | B2 |
8716941 | Kim | May 2014 | B2 |
8756078 | Collins, Jr. et al. | Jun 2014 | B2 |
8768520 | Oexman et al. | Jul 2014 | B2 |
8789102 | Pickelsimer et al. | Jul 2014 | B2 |
8847756 | Tallent et al. | Sep 2014 | B2 |
8868542 | Kimball et al. | Oct 2014 | B2 |
8870812 | Alberti et al. | Oct 2014 | B2 |
8896524 | Birnbaum et al. | Nov 2014 | B2 |
8923994 | Laikari et al. | Dec 2014 | B2 |
8924218 | Corpier et al. | Dec 2014 | B2 |
8926535 | Rawls-Meehan | Jan 2015 | B2 |
8984685 | Robertson et al. | Mar 2015 | B2 |
9001038 | Kasahara | Apr 2015 | B2 |
9032510 | Sampathkumaran et al. | May 2015 | B2 |
9038217 | Elliot et al. | May 2015 | B2 |
9088282 | Holenarsipur et al. | Jul 2015 | B2 |
9126571 | Lemire et al. | Sep 2015 | B2 |
9138173 | Penninger et al. | Sep 2015 | B2 |
9173792 | Goffer | Nov 2015 | B2 |
9204823 | Derenne et al. | Dec 2015 | B2 |
9220650 | Bobey et al. | Dec 2015 | B2 |
9228885 | Zerhusen | Jan 2016 | B2 |
9230421 | Reeder et al. | Jan 2016 | B2 |
9233033 | Valentino et al. | Jan 2016 | B2 |
9259369 | Derenne et al. | Feb 2016 | B2 |
9262876 | Wood et al. | Feb 2016 | B2 |
9298884 | Ahmad | Mar 2016 | B1 |
9320664 | Newkirk et al. | Apr 2016 | B2 |
9342677 | Ali et al. | May 2016 | B2 |
9381125 | Herbst et al. | Jul 2016 | B2 |
9424699 | Kusens et al. | Aug 2016 | B2 |
9456938 | Blickensderfer et al. | Oct 2016 | B2 |
9463126 | Zerhusen et al. | Oct 2016 | B2 |
9466163 | Kusens et al. | Oct 2016 | B2 |
9486084 | Connell et al. | Nov 2016 | B2 |
9569591 | Vanderpohl, III | Feb 2017 | B2 |
9593833 | McMannon et al. | Mar 2017 | B2 |
9655798 | Zerhusen et al. | May 2017 | B2 |
9691206 | Kusens et al. | Jun 2017 | B2 |
9774991 | Kusens | Sep 2017 | B2 |
9814410 | Kostic et al. | Nov 2017 | B2 |
9838849 | Kusens | Dec 2017 | B2 |
9844275 | Nunn et al. | Dec 2017 | B2 |
9849051 | Newkirk et al. | Dec 2017 | B2 |
9858741 | Kusens et al. | Jan 2018 | B2 |
9892310 | Kusens et al. | Feb 2018 | B2 |
9892311 | Kusens et al. | Feb 2018 | B2 |
9916649 | Kusens | Mar 2018 | B1 |
9934427 | Derenne et al. | Apr 2018 | B2 |
9940810 | Derenne et al. | Apr 2018 | B2 |
9984521 | Kusens et al. | May 2018 | B1 |
9998857 | Kusens | Jun 2018 | B2 |
9999555 | Magill et al. | Jun 2018 | B2 |
10004654 | Zerhusen et al. | Jun 2018 | B2 |
10034979 | Bechtel et al. | Jul 2018 | B2 |
10052249 | Elliott et al. | Aug 2018 | B2 |
10098796 | Valentino et al. | Oct 2018 | B2 |
10136841 | Alghazi | Nov 2018 | B2 |
10172752 | Goffer | Jan 2019 | B2 |
10188569 | Elku et al. | Jan 2019 | B2 |
20020014951 | Kramer et al. | Feb 2002 | A1 |
20020059679 | Weismiller | May 2002 | A1 |
20030183427 | Tojo et al. | Oct 2003 | A1 |
20040083394 | Brebner et al. | Apr 2004 | A1 |
20050114140 | Brackett | May 2005 | A1 |
20060077186 | Park et al. | Apr 2006 | A1 |
20060102392 | Johnson et al. | May 2006 | A1 |
20060150332 | Weismiller | Jul 2006 | A1 |
20060277683 | Lamire | Dec 2006 | A1 |
20070130692 | Lemire | Jun 2007 | A1 |
20070157385 | Lemire | Jul 2007 | A1 |
20070163045 | Becker | Jul 2007 | A1 |
20070180616 | Newkirk | Aug 2007 | A1 |
20070219950 | Crawford | Sep 2007 | A1 |
20080141459 | Hamberg et al. | Jun 2008 | A1 |
20080172789 | Elliot | Jul 2008 | A1 |
20080235871 | Newkirk | Oct 2008 | A1 |
20080235872 | Newkirk | Oct 2008 | A1 |
20090049610 | Heimbrock | Feb 2009 | A1 |
20090153370 | Cooper et al. | Jun 2009 | A1 |
20100039414 | Bell | Feb 2010 | A1 |
20100212087 | Leib et al. | Aug 2010 | A1 |
20110080421 | Capener | Apr 2011 | A1 |
20110144548 | Elliott | Jun 2011 | A1 |
20110162067 | Shuart et al. | Jun 2011 | A1 |
20110169653 | Wang et al. | Jul 2011 | A1 |
20110205061 | Wilson | Aug 2011 | A1 |
20120023670 | Zerhusen et al. | Feb 2012 | A1 |
20120089419 | Huster | Apr 2012 | A1 |
20120137436 | Andrienko | Jun 2012 | A1 |
20120215360 | Zerhusen et al. | Aug 2012 | A1 |
20120239173 | Laikari et al. | Sep 2012 | A1 |
20120239420 | Stapelfeldt | Sep 2012 | A1 |
20130138452 | Cork et al. | May 2013 | A1 |
20130142367 | Berry et al. | Jun 2013 | A1 |
20130227787 | Herbst et al. | Sep 2013 | A1 |
20130238991 | Jung et al. | Sep 2013 | A1 |
20130300867 | Yoder | Nov 2013 | A1 |
20130318716 | Vanderpohl, III | Dec 2013 | A1 |
20130340169 | Zerhusen | Dec 2013 | A1 |
20140076644 | Derenne et al. | Mar 2014 | A1 |
20140237721 | Lemire | Aug 2014 | A1 |
20140259410 | Zerhusen et al. | Sep 2014 | A1 |
20140259414 | Hayes | Sep 2014 | A1 |
20140265181 | Lambarth et al. | Sep 2014 | A1 |
20140297327 | Heil et al. | Oct 2014 | A1 |
20140313700 | Connell et al. | Oct 2014 | A1 |
20140342330 | Freeman et al. | Nov 2014 | A1 |
20150002393 | Cohen et al. | Jan 2015 | A1 |
20150033295 | Huster | Jan 2015 | A1 |
20150060162 | Goffer | Mar 2015 | A1 |
20150070319 | Pryor | Mar 2015 | A1 |
20150077534 | Derenne et al. | Mar 2015 | A1 |
20150109442 | Derenne et al. | Apr 2015 | A1 |
20150154002 | Weinstein et al. | Jun 2015 | A1 |
20150182400 | Meyer | Jul 2015 | A1 |
20150250669 | Elliott et al. | Sep 2015 | A1 |
20150317068 | Marka et al. | Nov 2015 | A1 |
20160012218 | Perna et al. | Jan 2016 | A1 |
20160022039 | Paul et al. | Jan 2016 | A1 |
20160038361 | Bhimavarapu et al. | Feb 2016 | A1 |
20160045382 | Goffer | Feb 2016 | A1 |
20160049028 | Kusens et al. | Feb 2016 | A1 |
20160050217 | Mare et al. | Feb 2016 | A1 |
20160055299 | Yarnell | Feb 2016 | A1 |
20160063897 | Rusin | Mar 2016 | A1 |
20160065909 | Derenne et al. | Mar 2016 | A1 |
20160095774 | Bobey et al. | Apr 2016 | A1 |
20160140307 | Brosnan et al. | May 2016 | A1 |
20160180668 | Kusens et al. | Jun 2016 | A1 |
20160183864 | Kusens et al. | Jun 2016 | A1 |
20160193095 | Roussy et al. | Jul 2016 | A1 |
20160199240 | Newkirk | Jul 2016 | A1 |
20160213537 | Hayes | Jul 2016 | A1 |
20160224195 | Okabe | Aug 2016 | A1 |
20160247342 | Kusens et al. | Aug 2016 | A1 |
20160296396 | Kolar et al. | Oct 2016 | A1 |
20160324705 | Bach Castillo | Nov 2016 | A1 |
20160338891 | Agdeppa et al. | Nov 2016 | A1 |
20160366327 | Kusens | Dec 2016 | A1 |
20160367420 | Zerhusen et al. | Dec 2016 | A1 |
20160371786 | Kusens et al. | Dec 2016 | A1 |
20170027787 | Huster | Feb 2017 | A1 |
20170027789 | St.John et al. | Feb 2017 | A1 |
20170035631 | Tsusaka | Feb 2017 | A1 |
20170049642 | Valentino et al. | Feb 2017 | A9 |
20170055113 | Kusens | Feb 2017 | A1 |
20170076526 | Kusens et al. | Mar 2017 | A1 |
20170094477 | Kusens et al. | Mar 2017 | A1 |
20170097800 | Vanderpohl, III | Apr 2017 | A1 |
20170098048 | Brosnan et al. | Apr 2017 | A1 |
20170109770 | Kusens et al. | Apr 2017 | A1 |
20170111770 | Kusens | Apr 2017 | A1 |
20170116790 | Kusens et al. | Apr 2017 | A1 |
20170124844 | Huster et al. | May 2017 | A1 |
20170128296 | Kostic et al. | May 2017 | A1 |
20170143565 | Childs | May 2017 | A1 |
20170172827 | Schaaf | Jun 2017 | A1 |
20170193177 | Kusens | Jul 2017 | A1 |
20170193180 | Kusens et al. | Jul 2017 | A1 |
20170193279 | Kusens et al. | Jul 2017 | A1 |
20170193772 | Kusens et al. | Jul 2017 | A1 |
20170195637 | Kusens et al. | Jul 2017 | A1 |
20170213445 | Kusens | Jul 2017 | A1 |
20170224562 | Zerhusen et al. | Aug 2017 | A1 |
20170229009 | Foster et al. | Aug 2017 | A1 |
20170259811 | Coulter et al. | Sep 2017 | A1 |
20170281440 | Puvogel et al. | Oct 2017 | A1 |
20170352212 | Kusens et al. | Dec 2017 | A1 |
20170364644 | Johnson | Dec 2017 | A1 |
20180017945 | Sidhu et al. | Jan 2018 | A1 |
20180039743 | Dixon et al. | Feb 2018 | A1 |
20180040091 | Kusens | Feb 2018 | A1 |
20180041864 | Kusens | Feb 2018 | A1 |
20180055418 | Kostic et al. | Mar 2018 | A1 |
20180056985 | Coulter et al. | Mar 2018 | A1 |
20180084390 | Kusens | Mar 2018 | A1 |
20180096550 | Kusens et al. | Apr 2018 | A1 |
20180110445 | Bhimavarapu et al. | Apr 2018 | A1 |
20180114053 | Kusens et al. | Apr 2018 | A1 |
20180137340 | Kusens et al. | May 2018 | A1 |
20180151010 | Kusens et al. | May 2018 | A1 |
20180161225 | Zerhusen et al. | Jun 2018 | A1 |
20180167816 | Kusens et al. | Jun 2018 | A1 |
20180184984 | Zerhusen et al. | Jul 2018 | A1 |
20180189946 | Kusens et al. | Jul 2018 | A1 |
20180211464 | Kusens et al. | Jul 2018 | A1 |
20180218489 | Kusens | Aug 2018 | A1 |
20180250177 | Magill et al. | Sep 2018 | A1 |
20180271286 | Jacobs et al. | Sep 2018 | A1 |
20180271287 | Jacobs et al. | Sep 2018 | A1 |
20180303687 | Moreno et al. | Oct 2018 | A1 |
20180369035 | Bhimavarapu et al. | Dec 2018 | A1 |
20180369037 | Desaulniers et al. | Dec 2018 | A1 |
20180369038 | Bhimavarapu et al. | Dec 2018 | A1 |
20180369039 | Bhimavarapu et al. | Dec 2018 | A1 |
20180374573 | Bhimavarapu et al. | Dec 2018 | A1 |
20180374577 | Bhimavarapu | Dec 2018 | A1 |
20190008708 | Moreno et al. | Jan 2019 | A1 |
20190024882 | Jonsson et al. | Jan 2019 | A1 |
20190046373 | Coulter et al. | Feb 2019 | A1 |
20190046376 | Chiacchira | Feb 2019 | A1 |
Number | Date | Country |
---|---|---|
101789230 | Jul 2010 | CN |
19505162 | Mar 1996 | DE |
0727298 | Aug 1996 | EP |
0727298 | Aug 1999 | EP |
2489341 | Aug 2012 | EP |
2531159 | Dec 2012 | EP |
2619724 | Jul 2013 | EP |
2918255 | Sep 2015 | EP |
5132312 | Apr 1993 | JP |
2003140631 | May 2003 | JP |
20130076922 | Jul 2013 | KR |
0101913 | Jan 2001 | WO |
2006089399 | Aug 2006 | WO |
2011097569 | Aug 2011 | WO |
2012040554 | Mar 2012 | WO |
2014021873 | Feb 2014 | WO |
2015148578 | Oct 2015 | WO |
2015157402 | Oct 2015 | WO |
2015171365 | Nov 2015 | WO |
2016123595 | Aug 2016 | WO |
2016196403 | Dec 2016 | WO |
2016200556 | Dec 2016 | WO |
2017027427 | Feb 2017 | WO |
2017031111 | Feb 2017 | WO |
2017061471 | Apr 2017 | WO |
2017124056 | Jul 2017 | WO |
2017201513 | Nov 2017 | WO |
2018026979 | Feb 2018 | WO |
2018154819 | Aug 2018 | WO |
2018203476 | Nov 2018 | WO |
2018216387 | Nov 2018 | WO |
Entry |
---|
English language abstract and machine-assisted English translation for WO 2017/061471 extracted from espacenet.com database on Mar. 25, 2019, 26 pages. |
English language abstract and machine-assisted English translation for WO 2018/154819 extracted from espacenet.com database on Mar. 25, 2019, 35 pages. |
English language abstract and machine-assisted English translation for WO 2018/203476 extracted from espacenet.com database on Mar. 25, 2019, 37 pages. |
English language abstract and machine-assisted English translation for WO 2018/216387 extracted from espacenet.com database on Mar. 25, 2019, 43 pages. |
Apple, “Adjust the Brightness on your iPhone, iPad, or iPod Touch”, https://support.apple.com/en-us/HT202613, 2018, 2 pages. |
Astral Healthcare, “Opthalmology Day Surgery Chair Webpage”, Apr. 2018, http://astralhealthcare.com/?product=opthalmology-day-surgery-chair, 6 pages. |
Campbell, Mikey, “Apple Expected to Replace Touch ID With Two-Step Facial, Fingerprint Bio-Recognition Tech”, Apple Insider, Jan. 21, 2017, http://iphone.appleinsider.com/articles/17/01/21/apple-expected-to-replace-touch-id-with-two-step-facial-fingerprint-bio-recognition-tech, 4 pages. |
Doge Medical, “DOC Classic—DOC Surgery Chairs Webpage”, 2014, 2 pages, https://web.archive.org/web/20140214203605/http://www.dogemedical.com/pages/en/products/surgery-chairs/doc-classic.php?lang=EN. |
English language abstract and machine-assisted English translation for CN 101789230 extracted from espacenet.com database on Aug. 30, 2018, 31 pages. |
English language abstract and machine-assisted English translation for JP 2003-140631 extracted from espacenet.com database on Aug. 30, 2018, 19 pages. |
English language abstract and machine-assisted English translation for KR 2013-0076922 A extracted from espacenet.com database on Aug. 16, 2018, 8 pages. |
English language abstract for DE 195 05 162 C1 extracted from espacenet.com database on Aug. 16, 2018, 1 page. |
English language abstract for EP 0 727 298 A1 extracted from espacenet.com database on Aug. 16, 2018, 1 page. |
English language abstract for EP 0 727 298 B1 extracted from espacenet.com database on Aug. 16, 2018, 1 page. |
Hall, Stephen, “Nest's 3rd Generation Thermostat Gets Some New Views for Its Farsight Feature”, 9to5Google, Jun. 14, 2016, https://9to5google.com/2016/06/14/nest-3rd-gen-thermostat-views-farsight/, 4 pages. |
Hill-Rom, “Centrella Smart+Bed Brochure” 2017, 11 pages. |
Imore, “How to Use Night Shift on your iPhone or iPad”, video also found at https://www.imore.com/night-shift, Nov. 1, 2017, 12 pages. |
Recliners.LA “Stellar 550 Large Lift Chair Recliner Webpage”, Apr. 2018, https://www.recliners.la/products/ultra-comfort-stellar-550-large-lift-chair, 4 pages. |
Stryker Medical, “InTouch Critical Care Bed Operations Manual”, Aug. 2014, 125 pages. |
Stryker, “InTouch Critical Care Bed Model FL27 (2130/2140) Operations Manual—Optional Pendant Control”, 2130-009-001 Rev C, Apr. 2008, p. 25. |
Supportec-Trade, “Portfolio Webpage”, 2017, https://supportec-trade.nl/en, 2 pages. |
U.S. Appl. No. 16/019,973, filed Jun. 27, 2018, 90 pages. |
U.S. Appl. No. 16/019,986, filed Jun. 27, 2018, 57 pages. |
U.S. Appl. No. 16/020,003, filed Jun. 27, 2018, 37 pages. |
U.S. Appl. No. 16/020,052, filed Jun. 27, 2018, 48 pages. |
U.S. Appl. No. 16/020,068, filed Jun. 27, 2018, 125 pages. |
U.S. Appl. No. 16/020,085, filed Jun. 27, 2018, 67 pages. |
U.S. Appl. No. 62/525,359, filed Jun. 27, 2017. |
U.S. Appl. No. 62/525,363, filed Jun. 27, 2017. |
U.S. Appl. No. 62/525,368, filed Jun. 27, 2017. |
U.S. Appl. No. 62/525,373, filed Jun. 27, 2017. |
U.S. Appl. No. 62/525,377, filed Jun. 27, 2017. |
Youtube, “Memory Seat Escape Video”, Nov. 4, 2013, https://www.youtube.com/watch?v=xlghNmAK-7A, 1 page. |
Youtube, “Microsoft HoloLens: Partner Spotlight with Stryker Communications Video”, Feb. 21, 2017, https://www.youtube.com/watch?v=FTPxUGRGpnA, 3 pages. |
Number | Date | Country |
---|---|---|
20180374577 A1 | Dec 2018 | US |
Number | Date | Country |
---|---|---|
62525363 | Jun 2017 | US |