Needle procedures (such as injections, IV placements, and venipunctures) are among the most frequently performed clinical interventions, with blood draws occurring in nearly 40% of emergency department visits. Needle sticks cause pain in nearly 76% of pediatric patients and are described by children as one of the most traumatic aspects of hospital and doctor visits. These procedures can thus be painful and distressing, leading to anxiety, loss of sleep, and resistance to medical visits. In view of this, effective needle pain management is an important driver of whether a patient will receive adequate healthcare as a child and throughout their adult life. Indeed, needle pain and the accompanying distress are well-documented contributors to healthcare avoidance behaviors and vaccine refusal in children and their guardians, as well as later in life when the child becomes an adult. The efficacy of conventional needle pain management strategies, which may include topical anesthesia, tactile stimulation, and conversational distraction techniques, remains debated, and the challenges surrounding needle pain remain a pervasive clinical issue.
Separately, seizure disorders (e.g., epilepsy) chronically affect tens of millions of people. Individual seizures are caused by abnormal neuronal activity in the brain, and can manifest as decreased consciousness, automatisms, tonic-clonic convulsions, and the like. Because of these outward symptoms, seizure disorders can affect an individual's ability to work and perform daily functions, and can carry social stigma.
According to one example of the present disclosure a method comprises: operating a first sensory actuator with respect to an extremity of a user; operating a second sensory actuator with respect to a proxy extremity, the proxy extremity corresponding to the extremity of the user; and performing a medical procedure on the extremity of the user, wherein operation of the first sensory actuator is visually hidden from the user and induces a non-visual sensory response in the extremity of the user, and wherein operation of the second sensory actuator is visible to the user and does not induce the non-visual sensory response in the user.
In various embodiments of the above example, the extremity is a finger or hand; the extremity is a toe or foot; the first sensory actuator and the second sensory actuator forming a sensory actuator pair and the method comprising operating a plurality of sensory actuator pairs, wherein each sensory actuator pair corresponds to a different extremity of the user, and wherein the sensory actuators of each sensory actuator pair are operated simultaneously; the first sensory actuator and the second sensory actuator forming a sensory actuator pair and the method comprising operating a plurality of sensory actuator pairs, wherein each sensory actuator pair corresponds to a different extremity of the user, and wherein the sensory actuators of each sensory actuator pair are operated asynchronously; the first sensory actuator and the second sensory actuator are operated until the user at least partially disembodies the extremity; the first sensory actuator is not visible to the user during operation of the first and second sensory actuators, and the second sensory actuator is visible to the user during operation of the first and second sensory actuators; and/or the first sensory actuator is a rack and pinion actuator, and operating the first sensory actuator comprises controlling a servo motor of the rack and pinion.
According to another example of the present disclosure, a system comprises: a lower frame; an upper frame rotatably attached to the lower frame; a real hand plate attached to the lower frame and configured to hold a hand of a user; a proxy hand plate attached to the upper frame; and a proxy hand held on the proxy hand plate, wherein the real hand plate comprises a plurality of first sensory actuators and the proxy hand plate comprises a plurality of second sensory actuators, and wherein each one of the second sensory actuators is at a location of the proxy hand corresponding to a location of one of the first sensory actuators of the user's hand.
In various embodiments of the above example, each of the first sensory actuators is at a different finger of the user's hand, and each of the second sensory actuators is at a different finger of the proxy hand; the real hand plate and the proxy hand plate each comprise a plurality of finger clamps attached to the plate, each of the plurality of finger clamps housing a different one of the first and second sensory actuators and being configured to hold a finger of the user's hand or the proxy hand; each of the first and second sensory actuators is a rack and pinion actuator comprising a servo configured to rotate the pinion; the system further comprises a processor configured to: simultaneously operate one of the first sensory actuators and one of the second sensory actuators at the corresponding location of the proxy hand; the system further comprises a processor configured to: operate one of the first sensory actuators and one of the second sensory actuators at the corresponding location of the proxy hand at different times, or simultaneously operate one of the first sensory actuators at a location of the hand of the user and one of the second sensory actuators at a different location of the proxy hand; an angle between the lower frame and the upper frame is greater than zero degrees such that when the sensory actuators are operated, the user's hand and the first sensory actuators are not visible to the user, and the proxy hand and the second sensory actuators are visible to the user; and/or the proxy hand is a digital representation of the hand of the user on a display, the display being held on the proxy hand plate.
According to still another example of the present disclosure, a system comprises: a touch feedback device comprising a plurality of first sensory actuators and configured to hold an extremity of a user; a proxy extremity; and a processor configured to operate one of the first sensory actuators at a location of the extremity of the user corresponding to an identified location of the proxy extremity, wherein the extremity of the user and the proxy extremity are in visually separated fields of view, and wherein the plurality of first sensory actuators are configured to induce a sensory response of the user when operated.
In various embodiments of the above example, the system further comprises: a wand, wherein the processor is configured to determine the identified location of the proxy extremity based on a proximity or touch of the wand to the proxy extremity; the processor is further configured to: predict the identified location of the proxy extremity based on a movement of the wand or the proximity of the wand to the proxy extremity, and operate the one of the first sensory actuators prior to the wand touching the proxy extremity; and/or the proxy extremity is a digital representation of the extremity of the user.
The systems and methods of the present disclosure induce a ‘cognitive/perceptual illusion’ that can be used to reduce pain during needle procedures and diagnose and treat seizure disorders, such as epilepsy. The systems and methods are also applicable to other outpatient procedures including, but not limited to, grafts, minor excisions, trauma/wound treatments, stitching, and the like, which can cause pain to the user.
The cognitive illusion is based on the relationship between a person's internally modeled reality, an externally applied reality (e.g., an illusory reality), and the actual reality. This relationship is modeled in FIG. 1, in which biological sensors 102 of an individual sample the actual reality 100, and biological actuators 104 of the individual act upon it.
The reality sampled by biological sensors 102 may be modified by an externally applied reality 106, where valve 108 represents the combination of the actual reality 100 and the externally applied reality 106. The actual reality 100 and the externally applied reality 106, combined at valve 108, are together sampled by the biological sensors 102. Modifications of the sampled reality can include, but are not limited to, inputs such as tactile, visual, auditory, olfactory, thermal, nociceptive, proprioceptive, and gustatory, either alone or in relation to anticipated outcomes.
The output of the biological sensors 102 can again be modified by the externally applied reality 106 (at valve 110) before being sent through the individual's nervous system to comparator 112. Modifications through valve 110 can include, but are not limited to, stimulatory approaches such as electrical, magnetic, chemical, mechanical, illusory, ultrasound, optical, thermal, anesthetic, and nerve redirection. In other words, these modifications are not those that would be sampled by biological sensors 102. Rather, the modifications introduced at valve 110 are to the outputs of the biological sensors 102 themselves (e.g., to the electrophysiological signals generated by the sensors 102).
The internally modeled reality 114 is then the individual's ideas, assumptions, predictions, and the like about the individual's place in their reality. This internally modeled reality 114 is created from the senses, observations, and learned experiences, and provides the individual with a framework by which to anticipate, predict, expect, and/or modulate their engagement with their reality.
The comparator 112 continuously monitors sensory information for discrepancies from the expectations of the internally modeled reality 114. Resulting differences between observations (from biological sensors 102) and expectations (of the internally modeled reality 114) are used to update the internally modeled reality 114. In other words, the internally modeled reality 114 is updated by the differences between observations and expectations arising from the comparator 112, and in turn generates and/or modifies expectations or intent.
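By way of a non-limiting illustration, this comparator loop can be sketched in a few lines of Python (all names and values are hypothetical, chosen only for illustration): repeated discrepancies between observation and expectation pull the internally modeled reality toward what the sensors report.

```python
# Sketch of the comparator model of FIG. 1: the internal model's
# expectation is pulled toward what the biological sensors report.
def comparator_update(expectation: float, observation: float,
                      rate: float = 0.1) -> float:
    """Return an updated expectation after one pass through comparator 112."""
    discrepancy = observation - expectation   # difference found by comparator 112
    return expectation + rate * discrepancy   # update to internally modeled reality 114

# Example: a touch repeatedly observed at one location while the model
# expects another; the expectation realigns toward the observed location.
expectation = 0.0
for _ in range(20):
    expectation = comparator_update(expectation, observation=1.0)
print(round(expectation, 3))  # approaches 1.0 as the model realigns
```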
The output of the individual's internally modeled reality 114 is supplied to the biological actuators 104, as a person intends to interact with the actual reality 100. The output of the internally modeled reality 114 may be combined with (and thus modify) the externally applied reality 106 at valve 116, prior to being supplied through the individual's nervous system to the biological actuators 104. Similar to internal modifications of the biological sensors at valve 110, modifications through valve 116 can include, but are not limited to, stimulatory approaches such as electrical, magnetic, chemical, mechanical, illusory, ultrasound, optical, thermal, anesthetic, and nerve redirection.
Similarly, the output of biological actuators 104 may modify the externally applied reality 106 at valve 118. Such modifications can alter the individual's movements, and change physical outcomes. These changes affect the actual reality 100, which is then detected by biological sensors 102 as described above.
As this relates to pain induced by medical procedures (e.g., needle procedures), the model of FIG. 1 can be applied to generate a cognitive illusion of ownership over a proxy extremity, such as a proxy hand.
In generating the cognitive illusion of ownership over a proxy hand, a user feels a touch on their real hand, but sees the touch occurring on the proxy hand. This mismatch enters the model of FIG. 1 at valve 108, where the externally applied reality 106 (the seen touch on the proxy hand) modifies the actual reality 100 (the felt touch on the real hand) as sampled by the biological sensors 102.
According to some embodiments, the user's hand is placed into a disembodiment device that holds both the user's hand and fingertips, and the hand and fingertips of a proxy hand. More particularly, the user's hand remains flat on a table with the palm up, while the proxy hand is also arranged palm up and oriented at an angle such that the proxy hand is visible to the user while obscuring the real hand. In this way, the proxy hand represents an externally applied reality 106 that replaces the user's real hand as detected by the biological sensors 102 (the user's eyes). In other words, as detected by the biological sensors 102, the actual reality 100 (the user's real hand) is modified by the externally applied reality (the proxy hand) at valve 108.
The disembodiment device has a first set of sensory actuators associated with the real hand, and a second set of sensory actuators associated with the proxy hand. The first set of sensory actuators is operated to cause a tactile sensory response on the user's real hand, but cannot be seen by the user. Conversely, the second set of sensory actuators is operated to be seen by the user to be engaging with the proxy hand, but does not cause a tactile sensory response. In other words, operation of the sensory actuators results in activation of the biological sensors 102 in the real hand (e.g., to produce a haptic feeling). In this regard, the sensory actuators may take any form that would cause a sensory response. For example, each sensory actuator may be a device that directly touches the real hand, or one that results in the sensation of a touch (e.g., by releasing pressurized air, or increasing air pressure at the hand, or the like).
Again operating at valve 108 to modify the actual reality 100, the first set of sensory actuators is synchronously (at the same time and relative place on the corresponding hand) operated with the second set of sensory actuators in the case of reducing pain during a medical procedure (e.g., needle procedures). The timing and location of sensory actuator operation may be random and last on the order of a few seconds to a few minutes. This induces a sensory mismatch between the visual and tactile sensory information streams from biological sensors 102 that are read and compared at the user's brain (the comparator 112). The sensory mismatch between what is seen and what is felt induces conflict in the user's internally modeled reality 114.
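A minimal control-program sketch for this synchronous mode follows (Python; the pulse() driver stub, digit labels, and timing ranges are hypothetical placeholders rather than details of the disclosure):

```python
import random
import time

DIGITS = ["D1", "D2", "D3", "D4", "D5"]  # thumb through little finger

def pulse(actuator_set: str, digit: str) -> None:
    # Placeholder for a hardware driver call (e.g., commanding the servo
    # of a rack and pinion actuator for the given digit).
    print(f"{actuator_set}: actuate {digit}")

def run_synchronous_session(duration_s: float = 60.0) -> None:
    """Operate actuator pairs at a random digit and random intervals,
    always touching the real and proxy hands at the same time and the
    same relative place, per the synchronous mode described above."""
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        digit = random.choice(DIGITS)
        pulse("real", digit)   # hidden: felt by the user but not seen
        pulse("proxy", digit)  # visible: seen by the user but not felt
        time.sleep(random.uniform(0.5, 3.0))  # random inter-touch interval
```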
To resolve the conflict between what is seen and what is felt, the expectations generated by the internally modeled reality 114 are realigned with respect to the observed sensory mismatch to minimize the discrepancy identified by comparator 112. This cognitive realignment of the user's internally modeled reality 114 shifts attribution of the sensation of the felt touch on the user's real hand to the location of the proxy hand that is in view. The realignment of felt touch to the proxy hand causes the cognitive illusion that the proxy hand is part of the user's body.
Simultaneously, the realignment of felt touch to the proxy hand leads to cognitive neglect of the user's real hand and disembodiment of the user's real hand from their body image. As a result, pain caused by a needle (or other procedure) in the user's real hand is reduced.
As this relates to the diagnosis and treatment of seizure disorders such as epilepsy, the comparator 112 of an epileptic brain is less stringent when resolving conflict between what is seen and what is felt. In this case, the requirement of specific context related to the simultaneous presentation of both seen and felt sensations for cognitive realignment is lost. As such, simple correlations in time (such as seen touches on the proxy hand that follow felt touches to the real hand by a brief yet equidistant interval) or space (such as seen touches on a digit of the proxy hand that are simultaneously felt on a different digit of the real hand) result in maladaptive cognitive realignment of the user's internally modeled reality 114, which shifts attribution of the sensation of the felt touch on the user's real hand to the location of the proxy hand in view.
In the case of diagnosing and/or treating epilepsy, the first set of actuators is asynchronously operated with the second set of actuators. Particularly, operation of the first and second sets of actuators is offset either temporally or spatially. The shift of attribution of the sensation of the felt touch on the user's real hand to the location of the proxy hand in view under temporal or spatial mismatch thus reveals the maladaptive propensity for updating the user's internally modeled reality 114 based on non-contextual simple visual and temporal correlations.
Considering this, when diagnosing and/or treating epilepsy, the first and second sets of sensory actuators would not operate synchronously as described above. Rather, the sensory actuators are operated asynchronously, where operation of a pair of sensory actuators (e.g., one of the first set of sensory actuators and one of the second set of sensory actuators) is separated in time or space. For example, a pair of sensory actuators associated with the same digit are operated at different times. Alternatively, a pair of sensory actuators associated with different digits are operated at the same time. Because spatial distance can affect the comparator 112 of the epileptic brain, the digits associated with a pair of sensory actuators operated at the same time may be selected based on their relative locations.
In one example embodiment, digits D1-D3 (thumb, index, and middle, respectively) may be characterized as a first group and digits D4-D5 (ring and little/pinky, respectively) may be characterized as a second group. In such an embodiment, the sensory actuators of each pair may be associated with digits in different groups. For example, the sensory actuator from the first set of sensory actuators may be associated with digit D2 and the sensory actuator from the second set of sensory actuators may be associated with digit D4. In this case, the sensory actuator associated with the user's real digit D2 is operated at the same time the sensory actuator associated with the proxy digit D4 is operated.
According to another example embodiment, a first digit is selected (e.g., at random) and a second digit is then determined as the digit that is farthest away. For example, if digit D1 is selected on the user's real hand, then digit D5 is selected on the proxy hand. Thus, the sensory actuator associated with digit D1 of the first set of sensory actuators is operated at the same time as the sensory actuator associated with digit D5 of the second set of sensory actuators. In still another embodiment, there is at least one digit between the digits associated with the sensory actuator pair. In other words, sensory actuators are not operated at the same time for the same or adjacent digits. For example, the sensory actuator pairs may be associated with digits D1 and D3, D1 and D4, D1 and D5, D2 and D4, D2 and D5, or D3 and D5.
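These digit-pairing rules lend themselves to a short selection routine; the following Python sketch (function names hypothetical) encodes both the at-least-one-digit-between rule and the farthest-digit rule:

```python
import itertools
import random

DIGITS = ["D1", "D2", "D3", "D4", "D5"]

def spatially_offset_pairs(min_separation: int = 2) -> list:
    """All (real digit, proxy digit) pairs with at least one digit between
    them, e.g., (D1, D3) but not (D2, D3) or (D3, D3)."""
    return [(a, b) for a, b in itertools.permutations(DIGITS, 2)
            if abs(DIGITS.index(a) - DIGITS.index(b)) >= min_separation]

def pick_spatial_mismatch() -> tuple:
    """Select a pair of digits for simultaneous, spatially offset operation."""
    return random.choice(spatially_offset_pairs())

def pick_farthest(real_digit: str) -> str:
    """Alternative rule: the proxy digit farthest from a randomly selected
    real digit (e.g., D1 -> D5)."""
    i = DIGITS.index(real_digit)
    return max(DIGITS, key=lambda d: abs(DIGITS.index(d) - i))
```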
An example disembodiment device 200 according to the above embodiments is illustrated in FIG. 2. The device 200 comprises a lower frame 202 (formed of frame portions 300, 302), an upper frame rotatably attached to the lower frame 202, a real hand plate 208, a proxy hand plate 210, and an electronics enclosure 212.
The frame portions 300, 302 preferably have tracks (e.g., extrusions, cutouts, grooves, rails, or the like) to which the hand plate 208 and brackets 304 can be attached. Accordingly, the attachment points of the hand plate 208 and brackets 304 can be adjusted to any point along the frame portions 300, 302 to suitably accommodate a user's hand and arm. In other words, a total length L of the lower frame 202, and the relative position of the hand plate 208, can be adjusted according to a user's arm and hand length/size.
The hand plates 208, 210 further include finger clips 504 attached to the plates 208, 210 in a portion of the cutout 502 corresponding to at least one finger. Although the example of FIG. 5 includes a finger clip 504 for each finger, finger clips 504 may be provided for any subset of the fingers.
The finger clips 504 are illustrated in more detail in FIG. 6.
The upper portion 600 of the finger clip 504 may include a slot or like aperture through which the sensory actuator may engage with the finger. In the example of FIG. 6, the sensory actuator is a rack and pinion actuator, in which a servo rotates the pinion to advance the rack through the slot and into contact with the finger.
Operation of the disembodiment device 200 may be wholly or partially controlled by electronics in the electronics enclosure 212. The electronics enclosure 212 may house at least a processor, memory, and a power source. The power source may be an internally housed battery or a power supply connectable to mains power (e.g., a wall outlet). The processor is configured to output control signals to each of the sensory actuators of the device 200. These control signals may be supplied to the sensory actuators, for example, via communication cables that removably attach, through communication ports, to the electronics enclosure 212 at one end and to the sensory actuators of the corresponding hand plates 208, 210 (or to the plates 208, 210 themselves) at the other end.
The processor may further be configured to communicate with external or otherwise remote devices through wired or wireless connections. For example, the processor may be further configured to control operation of lights associated with each sensory actuator. Control signals for the lights may then be supplied from the electronics enclosure 212 via expansion ports of the electronics enclosure. In other examples, the processor may be configured to communicate with a remote control processor (e.g., a remote server) via a wired connection or wireless network. In this manner, the disembodiment device 200 may be remotely operated or monitored.
The electronics enclosure 212 may further include human-machine-interface (HMI) elements. For example, power switches may control the supply of power to the processor (and thus the whole device) and/or to the hand plates 208, 210 individually. Further, the position of each sensory actuator may be adjusted by knobs or the like, for example, by adjusting potentiometers. The sensory actuator position adjustment may allow for accommodation of different finger diameters and placements within the finger clips 504, and for close-proximity placement of the sensory actuator to the finger. The processor may be configured to adjust control of each sensory actuator (e.g., a power, speed, magnitude, or the like). Still further, one or more mode selection switches may be used to select an operation program executed by the processor. Such programs may control the sequence, order, timing, and like operation of each of the sensory actuators. Changing the sequence, order, and timing of actuation may allow for modifications to the experience of the ownership illusion, for adjusting experimental controls, and for exploring comparator function during research. For example, such modes can include a needle stick (or like) procedure mode in which the sensory actuators of each plate 208, 210 are operated synchronously, an epilepsy diagnostic or treatment mode in which the sensory actuators of each plate 208, 210 are operated asynchronously, and/or a research mode in which the sensory actuators of each plate 208, 210 can be specially controlled according to a research protocol.
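Reusing the helper sketches above, a hypothetical mode-dispatch routine for such operation programs could look like the following (again a non-limiting Python sketch, not a definitive implementation):

```python
from enum import Enum, auto

class Mode(Enum):
    NEEDLE_PROCEDURE = auto()  # synchronous: same digit, same time
    EPILEPSY = auto()          # asynchronous: temporal or spatial offset
    RESEARCH = auto()          # protocol-specific control

def run_mode(mode: Mode) -> None:
    """Dispatch to an operation program based on the mode selection switch."""
    if mode is Mode.NEEDLE_PROCEDURE:
        run_synchronous_session()              # defined in the sketch above
    elif mode is Mode.EPILEPSY:
        real, proxy = pick_spatial_mismatch()  # defined in the sketch above
        pulse("real", real)
        pulse("proxy", proxy)
    else:
        raise NotImplementedError("research protocols are study-specific")
```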
With reference to FIG. 7, the disembodiment device 200 is arranged for use with the upper frame rotated away from the lower frame 202. In the configuration of FIG. 7, an angle between the lower frame 202 and the upper frame is greater than zero degrees such that, when the sensory actuators are operated, the user's real hand and the first sensory actuators are not visible to the user, while the proxy hand and the second sensory actuators are visible to the user.
When not in use, the disembodiment device 200 may be folded for carrying. As illustrated in FIG. 8, the upper frame may be rotated down against the lower frame 202 to place the device 200 in a compact form for transport and storage.
Turning now to FIG. 9, the model of FIG. 1 is extended to two agents (Agent 1 and Agent 2) that each sense and act upon a shared reality 900.
The sensed shared reality of each agent is then compared at comparator 912-1, 912-2 (analogous to comparator 112 of FIG. 1) with the expectations of that agent's internally modeled reality 914-1, 914-2, and the resulting differences are used to update the respective internally modeled reality 914-1, 914-2.
Because both agents interact with the same shared reality 900, the internally modeled reality for each agent is affected by the internally modeled reality of the other agent. For example, the internally modeled reality 914-1 of Agent 1 corresponds to shared reality 900 as modified by Agent 2 (based on the internally modeled reality 914-2 of Agent 2) and sensed by Agent 1. In this sense, action of each agent on the shared reality has a similar effect to the externally applied reality 106 of FIG. 1.
Based on the model of FIG. 9, a system according to some embodiments induces the cognitive illusion through an interactive game. As illustrated in FIG. 10, the system includes a proxy limb 1000, a wand 1002, a touch feedback device 1004 configured to hold an extremity of the user, and a controller 1020.
In operation, a user 1006 (corresponding to Agent 1 of FIG. 9) touches the proxy limb 1000 with the wand 1002 while the user's real limb 1012 is held by the touch feedback device 1004.
In some embodiments such as that in FIG. 10, the proxy limb 1000 includes sensors 1008 at locations corresponding to the locations of the sensory actuators 1010 of the touch feedback device 1004.
For example, if the proxy limb 1000 includes sensors 1008 at each fingertip, the touch feedback device 1004 preferably has actuators 1010 at locations in contact with the user's fingertips. In some embodiments, the touch feedback device 1004 may include finger clips or like devices for holding the user's real hand in the touch feedback device. Those finger clips may house the sensory actuators 1010, similar to the finger clips 504 discussed above. In other embodiments, the touch feedback device 1004 may be embodied as a glove.
Accordingly, upon touching the proxy limb 1000 with the wand 1002, the touch feedback device 1004 synchronously touches the user's real limb 1012 at the corresponding location. For example, if the user 1006 touches the wand 1002 to an index fingertip of the proxy limb 1000, the touch feedback device 1004 preferably actuates a sensory actuator 1010 in contact with the user's real index finger. The cooperation between the detection of the wand 1002 by the proxy limb 1000 and actuation of the sensory actuators 1010 in the touch feedback device 1004 is mediated by the controller 1020. While the controller 1020 is shown in FIG. 10 as a separate element, the controller 1020 may alternatively be integrated with the touch feedback device 1004, the proxy limb 1000, or a remote device.
The user's use of the wand 1002 may be part of an interactive game. In one embodiment, the user 1006 places their real hand 1012 (of the limb on which the procedure is to be performed), palm up, into the touch feedback device 1004, or into a glove of the touch feedback device 1004. The touch feedback device 1004 may be hidden behind a screen 1014 affixed to a procedure table 1016, or otherwise placed in a visually separate field of view from the proxy hand 1000. The sensory actuators 1010 are located at each fingertip position of the feedback device 1004. The proxy hand/arm 1000 is provided on a side of the screen 1014 that the user can see, and is posed in the same way that the hidden real hand 1012 is posed in the feedback device 1004.
In use, as noted above, when the user 1006 touches the wand 1002 to one of the fingertips of the proxy hand 1000, the wand 1002 is detected by the sensor 1008 of the proxy fingertip. In some embodiments discussed in more detail below, the wand 1002 only needs to be within a predetermined proximity of the fingertip to be sensed by the sensors 1008. The sensor 1008 then transmits a signal to the controller 1020, which identifies the location of the detection and causes actuation of the corresponding sensory actuator 1010 of the feedback device 1004. Depending on the sensors used, a pressure, proximity, or the like may also be detected and analyzed by the controller 1020 so that the haptic sensation caused by the sensory actuator 1010 more closely corresponds to the actual touch by the wand 1002. The wand 1002, in turn, may contain actuators that use the predictive nature of the proximity sensing to prepare the system to provide low (or negligible) latency feedback at the feedback device 1004, giving the user 1006 the impression of fingertip compliance. Such realistic latency can provide a more convincing perceptual illusion of game and hand naturalness to facilitate a cooperative engagement with the cognitive mechanism (the shared reality 900).
For example, the controller 1020 may cause a high-intensity actuation upon a pressure sensor's detection of a high-pressure touch by the wand 1002. Preferably, the controller 1020 causes actuation of the sensory actuators 1010 simultaneously (or nearly simultaneously, if realistic latency based on predictive touch is employed) with the touch of the wand 1002 to the proxy 1000. As discussed above, such simultaneous touch and feedback further strengthens the illusion and the ability of the user to cognitively neglect the real procedure hand.
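One possible shape for the controller callback implied here is sketched below (Python; TouchEvent, the 0-to-1 pressure scale, and actuate() are hypothetical illustrations):

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    finger: str      # which proxy fingertip sensor 1008 fired
    pressure: float  # normalized 0..1 reading, if a pressure sensor is used

def actuate(finger: str, intensity: float) -> None:
    # Placeholder for the haptic driver of the touch feedback device 1004.
    print(f"actuator at {finger}: intensity {intensity:.2f}")

def on_proxy_touch(event: TouchEvent) -> None:
    """Mirror a touch on the proxy hand onto the corresponding sensory
    actuator 1010 of the hidden real hand, scaling intensity to pressure."""
    intensity = min(max(event.pressure, 0.0), 1.0)  # clamp to valid range
    actuate(event.finger, intensity)

on_proxy_touch(TouchEvent(finger="index", pressure=0.8))  # harder press, stronger pulse
```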
In some embodiments, the proxy hand 1000 may further include LEDs or like lights 1018 to indicate particular locations of the proxy hand 1000 for the user 1006 to touch with the wand 1002. For example, lights 1018 in each finger may be illuminated (as controlled by the controller 1020) according to a pattern in which the user is to touch each finger of the proxy hand 1000. The pattern may show only one light at a time—for example, thus waiting for a user 1006 to touch the finger corresponding to the illuminated light before illuminating another light—or may show a plurality of lights at a time—for example, requiring the user 1006 to remember a series of touches to perform after the illumination pattern is shown. Characteristics of the lights (e.g., color and intensity) may also be controlled to indicate a desired pressure of touch, speed of touch, or the like, or to help the user 1006 distinguish between the different fingers. The lights 1018 may be illuminated randomly or according to a predetermined pattern.
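A cue generator for such a light-guided game might be sketched as follows (Python; the one-light and remembered-sequence variants follow the description above, with hypothetical names):

```python
import random

FINGERS = ["thumb", "index", "middle", "ring", "little"]

def next_cue(sequence_mode: bool, history: list) -> list:
    """Return the finger(s) whose lights 1018 to illuminate next: either a
    single random cue, or a growing series the user must reproduce."""
    if sequence_mode:
        history.append(random.choice(FINGERS))
        return list(history)           # show the whole series to remember
    return [random.choice(FINGERS)]    # show only one light at a time
```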
Performing dissociation with the above or a like game causes the user 1006 to further focus cognitive attention on the dissociation process. Further, this limits the interruption of clinical workflow because the user is in charge of establishing the cognitive disconnection while waiting for a clinician to prepare a workspace. Still further, presenting the dissociation as a game can be fun and distract the user from the impending procedure.
Referring back to FIG. 10, electrical, mechanical, and processing delays may exist between the detection of the wand 1002 by a sensor 1008 and the actuation of the corresponding sensory actuator 1010, such that the felt touch could otherwise lag the seen touch.
In view of this, the controller 1020 may control the sensory actuators by projecting anticipated touch events into the future in order to compensate for electrical, mechanical, and processing lag in the illusion game. For example, because proximity sensors can recognize the presence of the wand 1002 in the proximity of the fingertip of the proxy 1000 prior to an actual touch, the sensor 1008 may signal the controller 1020 of the impending touch. The controller 1020 can then properly time its output signal actuating the appropriate sensory actuator 1010 so that the actuation begins at the time of the actual touch.
By knowing the location of each sensor 1008, the controller 1020 may identify the location of the wand 1002 based on which sensor 1008 senses the presence of the wand 1002. In some embodiments, the wand 1002 may be detected by multiple sensors 1008. In these cases, the controller 1020 may predict the location of the touch by identifying the sensor 1008 for which the proximity changes at the greatest rate, thus suggesting the sensor 1008 toward which the wand 1002 is moving most directly.
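This greatest-rate-of-approach heuristic is simple to express; the sketch below (Python, with hypothetical names and distances) picks the fingertip whose measured distance to the wand shrinks fastest between two sensor readings:

```python
def most_likely_target(prev_mm: dict, curr_mm: dict, dt_s: float) -> str:
    """When several proximity sensors 1008 detect the wand 1002, predict the
    touch location as the fingertip being approached at the greatest rate."""
    approach_rate = {finger: (prev_mm[finger] - curr_mm[finger]) / dt_s
                     for finger in curr_mm}
    return max(approach_rate, key=approach_rate.get)

# Example: the wand closes 15 mm on the index finger but only 2 mm on the
# middle finger over the same 100 ms interval.
print(most_likely_target({"index": 40.0, "middle": 35.0},
                         {"index": 25.0, "middle": 33.0}, dt_s=0.1))  # "index"
```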
In some embodiments, velocity, acceleration, and/or motion information of the wand 1002 may also be used by the controller 1020 to predict its location by recognizing that the user 1006 may naturally (at least temporarily) slow movement of the wand 1002 as the wand 1002 approaches the desired touch location, in order to increase accuracy of the touch. Thus, a sensor 1008 detecting proximity at a time corresponding to a slowing of the wand 1002 may be predictive of the location of the future touch.
In some embodiments, the controller 1020 may predict the touch location based on activation of the above-discussed LEDs 1018. In other words, because the activated lights 1018 serve as instructions for body part locations to be touched by the user 1006, the controller 1020 may assume the user will follow the instructions and thus predict the location of a future touch based on the light instructions.
In still other embodiments, the controller 1020 may analyze sensor signals from the proxy limb 1000 and/or the wand 1002 to determine both a velocity/acceleration and directionality of the wand's movement. The controller 1020 may use the directionality information to predict the touch location, and the velocity/acceleration to predict the touch time. In short, the controller may predict when, and on which finger, an actual touch on the proxy hand 1000 will occur. The controller 1020 may then cause the sensory actuator 1010 for the corresponding finger on the real hand 1012 to actuate at the predicted time of touch.
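The when-to-fire portion of this prediction reduces to a time-to-contact estimate minus the known system lag; a minimal sketch follows (Python, with the 30 ms lag figure used purely as an assumed example):

```python
def actuation_delay(distance_mm: float, speed_mm_s: float,
                    system_lag_s: float = 0.03) -> float:
    """Estimate how long to wait before commanding the actuator so the felt
    touch begins at the moment of the actual wand touch."""
    if speed_mm_s <= 0.0:
        return float("inf")  # wand stationary or retreating: do not schedule
    return max(distance_mm / speed_mm_s - system_lag_s, 0.0)

# Example: wand 20 mm away at 100 mm/s with 30 ms of lag -> fire in ~170 ms.
print(f"{actuation_delay(20.0, 100.0):.3f} s")
```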
In some embodiments, lag compensation is not limited to compensating either the full system lag or none of it. Rather, the effective lag may be reduced, zero, negative, dithered (e.g., zero-mean but deliberately noisy), and the like.
The controller 1020 may also be configured to learn the behavior of the user 1006, and to adjust in real time to make more accurate and effective predictions about the timing of future touch events. For example, by considering data from accelerometers or like sensors in the wand 1002 (e.g., as recorded in a memory), the controller 1020 may determine an average rate of movement of the wand 1002. By knowing the detection proximity of the sensors 1008 in the proxy hand 1000 and the speed of the wand 1002, the controller 1020 may more accurately determine when the wand 1002 will touch the proxy. In some embodiments, the controller 1020 may be or include a learned controller including, for example, a machine learning system that is continually trained by the user's wand movement. Accordingly, the controller's predictions may improve during the illusion game. These predictions by the controller 1020 may also be unique to each user.
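As one hypothetical form of such learning, a running (exponential moving average) estimate of the user's typical wand speed could personalize the touch-time prediction; all constants below are illustrative assumptions, not parameters of the disclosure:

```python
class WandSpeedEstimator:
    """Exponential moving average of observed wand speed, used to
    personalize predictions of when the wand will touch the proxy."""
    def __init__(self, alpha: float = 0.2, initial_mm_s: float = 80.0):
        self.alpha = alpha
        self.speed_mm_s = initial_mm_s

    def update(self, observed_mm_s: float) -> None:
        # Blend each new observation into the running estimate.
        self.speed_mm_s += self.alpha * (observed_mm_s - self.speed_mm_s)

    def predict_touch_delay(self, distance_mm: float) -> float:
        return distance_mm / self.speed_mm_s

est = WandSpeedEstimator()
for s in (90.0, 110.0, 95.0):   # speeds derived from wand accelerometer data
    est.update(s)
print(f"{est.predict_touch_delay(30.0):.3f} s")  # user-specific prediction
```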
While various features are presented above, it should be understood that the features may be used singly or in any combination thereof. Further, it should be understood that variations and modifications may occur to those skilled in the art to which the claimed examples pertain.
This application claims priority to U.S. Provisional Patent Application No. 63/276,068 filed on Nov. 5, 2021, the entirety of which is incorporated herein by reference.
This invention was made with government support under NS081710 awarded by the National Institutes of Health and the Department of Defense (DARPA) P-1108-114403 and (CDMMRP) W81XWH-15-1-0575. The government has certain rights in the invention.