The subject disclosure is related generally to a robotic surgical system, and particularly to a system for decoupling vibrations from an end effector at a robotic arm.
This section provides background information related to the present disclosure which is not necessarily prior art.
An instrument can be navigated relative to a subject for performing various procedures. For example, the subject can include a patient on which a surgical procedure is being performed. During a surgical procedure, an instrument can be tracked in an object or subject space. The location of the instrument that is tracked can be displayed on a display device relative to an image of the patient.
The position of the patient can be determined with a tracking system. Generally, a patient is registered to the image, via tracking an instrument relative to the patient to generate a translation map between the subject or object space (e.g., patient space) and the image space.
After registration, the position of the instrument can be appropriately displayed on the display device while tracking the instrument. The position of the instrument relative to the subject can be displayed as a graphical representation, sometimes referred to as an icon on the display device.
Guided surgery, particularly spinal surgery, presents a particular challenge in holding the position of the instrument in a rigid system. A tool, such as a drill, is used with a guide at an end effector at the end of a robotic arm. Typically, a robotic arm has high rigidity relative to the patient. The high rigidity may result in undesired system dynamics. The system dynamics may manifest as natural vibrations at a high frequency due to the high stiffness in the system.
This section provides a general summary of the disclosure and is not a comprehensive disclosure of its full scope or all of its features.
The present disclosure provides a variable stiffness end effector that allows quick decoupling of a rigid mechanism and coupling using a compliant mechanism between the parts of the end effector support. In particular, the present disclosure allows the end effector support to be quickly decoupled from the tool insertion device located within the end effector support. This allows the natural frequency of the system to rapidly decrease and prevents natural frequency vibrations.
In one aspect of the disclosure, a variable stiffness end effector assembly for a medical robotic guidance system includes an end effector support and a tool insertion device disposed within and spaced apart from the end effector support. The assembly further includes a coupler coupling the tool insertion device within the end effector support. The coupler includes a first mode wherein the tool insertion device is rigidly coupled within the end effector support and a second mode wherein the tool insertion device is compliantly coupled within the end effector support.
In another aspect of the disclosure, a system has a robotic arm with a variable stiffness end effector assembly comprising an end effector support and a tool insertion device disposed within the end effector support and coupled thereto with a coupler. The coupler comprises a first mode wherein the tool insertion device is rigidly coupled within the end effector support and a second mode wherein the tool insertion device is compliantly coupled within the end effector support. A robotic control system is configured to control movement of the end effector assembly. An input system is configured to receive input from a user and send a signal to the robotic control system to move the end effector support relative to a subject.
According to various embodiments, the end effector assembly may be moved relative to a subject, such as by a user and/or with a robotic system. The robotic system may include any appropriate robotic system, such as a Mazor X™ Robotic Guidance System, sold by Mazor Robotics Ltd. having a place of business in Israel and/or Medtronic, Inc. having a place of business in Minnesota, USA and/or as disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference.
The tracking or navigation system, which may include the end effector assembly, may also be used with various techniques and may include an imaging system for following selected portions during a procedure. For example, the end effector assembly may be associated with the robotic system. The robotic system may move the end effector in a selected manner during the procedure. During the procedure, the end effector assembly may be moved based upon predetermined characteristics to follow a selected portion during the procedure, such as during drilling for implanting certain implantable devices such as a spinal implant.
Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
Example embodiments will now be described more fully with reference to the accompanying drawings.
The subject disclosure is directed to an exemplary embodiment of a surgical procedure on a subject, such as a human patient. It is understood, however, that the system and methods described herein are merely exemplary and not intended to limit the scope of the claims included herein. In various embodiments, it is understood that the systems and methods may be incorporated into and/or used on non-animate objects. The systems may be used, for example, to register coordinate systems between two systems for use on manufacturing systems, maintenance systems, and the like. For example, automotive assembly may use one or more robotic systems including individual coordinate systems that may be registered together for coordinated or concerted actions. Accordingly, the exemplary illustration of a surgical procedure herein is not intended to limit the scope of the appended claims.
Discussed herein, according to various embodiments, are processes and systems for allowing the use of and navigation and registration between various coordinate systems, including robotic coordinate systems. In various embodiments, a first coordinate system may be registered to a second coordinate system, such as a robotic coordinate system to an image coordinate system or space. A navigation space or coordinate system may then be registered to the robotic or first coordinate system and, therefore, be registered to the image coordinate system without being separately or independently registered to the image space. Similarly, the navigation space or coordinate system may be registered to the image coordinate system or space directly or independently. The robotic or first coordinate system may then be registered to the navigation space and, therefore, be registered to the image coordinate system or space without being separately or independently registered to the image space.
In various embodiments, the different systems used relative to the subject may include different coordinate systems (e.g., locating systems). For example, a robotic system that includes a robotic coordinate system may be moved relative to a subject. The robot may be fixed, including removably fixed, at a position relative to the subject. Thus, movement of a portion of the robot relative to the base of the robot (i.e., the fixed portion of the robot) may be known due to various features of the robot. For example, encoders (e.g., optical encoders, potentiometer encoders, or the like) may be used to determine movement or amount of movement of various joints (e.g., pivots) of a robot. A position of an end effector assembly (e.g., a terminal end) of the robot may be known relative to the base of the robot. Given the known position of the end effector assembly relative to the base and the known position of the base relative to the subject, the position of the end effector assembly relative to the subject may be known during movement of the robot and/or during a stationary period of the end effector assembly. Thus, the robot may define a coordinate system relative to the subject.
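The chain of known joint positions described above can be illustrated with a simple forward-kinematics sketch. This is a minimal, hypothetical two-link planar example, not the kinematics of any particular robotic system; the link lengths and encoder angles are placeholders chosen only for illustration.

```python
import math

def end_effector_pose(joint_angles, link_lengths):
    """Forward kinematics for a planar serial arm: accumulate each
    encoder-reported joint rotation along the link chain to find
    the end effector position relative to the robot base."""
    x, y, theta = 0.0, 0.0, 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle                 # encoder-reported joint rotation
        x += length * math.cos(theta)  # advance along the link
        y += length * math.sin(theta)
    return x, y, theta

# Hypothetical two-link arm: 0.4 m and 0.3 m links.
pose = end_effector_pose([math.pi / 4, -math.pi / 4], [0.4, 0.3])
```

Because the base is fixed relative to the subject, composing this base-relative pose with the base-to-subject position yields the end effector position in the subject space, as described above.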
Various other portions may also be tracked relative to the subject. For example, a tracking system may be incorporated into a navigation system that includes one or more instruments that may be tracked relative to the subject. The navigation system may include one or more tracking systems that track various portions, such as tracking devices, associated with instruments. The tracking system may include a localizer that is configured to determine the position of the tracking device in a navigation system coordinate system. Determination of the navigation system coordinate system may include techniques described in various references including U.S. Pat. Nos. 8,737,708; 9,737,235; 8,503,745; and 8,175,681; all incorporated herein by reference. In particular, a localizer may be able to track an object within a volume relative to the subject. The navigation volume, in which a device may be tracked, may include or be referred to as the navigation coordinate system or navigation space. A determination or correlation between the two coordinate systems may allow for or also be referred to as a registration between two coordinate systems.
In various embodiments, the first coordinate system, which may be a robotic coordinate system, may be registered to a second coordinate system, which may be a navigation coordinate system. Accordingly, coordinates in one coordinate system may then be transformed to a different or second coordinate system due to a registration. Registration may allow for the use of two coordinate systems and/or the switching between two coordinate systems. For example, during a procedure, a first coordinate system may be used for a first portion or a selected portion of a procedure and a second coordinate system may be used during a second portion of a procedure. Further, two coordinate systems may be used to perform or track a single portion of a procedure, such as for verification and/or collection of additional information.
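The chaining of registrations described above (a navigation-to-robot registration composed with a robot-to-image registration yields a navigation-to-image transform without an independent registration) can be sketched with 2D homogeneous transforms. The matrices below are hypothetical placeholders, not measured registrations.

```python
import math

def mat_mul(a, b):
    """Multiply two 3x3 homogeneous transforms (row-major lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rigid_2d(angle, tx, ty):
    """Build a 2D rigid transform (rotation plus translation)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

# Hypothetical registrations: robot space -> image space, and
# navigation space -> robot space.
robot_to_image = rigid_2d(math.pi / 2, 10.0, 0.0)
nav_to_robot = rigid_2d(0.0, 2.0, 3.0)

# Composing the two gives navigation -> image directly, so a tracked
# point can be transformed between coordinate systems as needed.
nav_to_image = mat_mul(robot_to_image, nav_to_robot)
```

The same composition works in either direction: registering the robot to a navigation space that is itself registered to the image space likewise places the robot in the image coordinate system.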
Furthermore, images may be acquired of selected portions of a subject. The images may be displayed for viewing by a user, such as a surgeon. The images may have superimposed on a portion of the image a graphical representation of a tracked portion or member, such as an instrument. According to various embodiments, the graphical representation may be superimposed on the image at an appropriate position due to registration of an image space (also referred to as an image coordinate system) to a subject space. A method to register a subject space defined by a subject to an image space may include those disclosed in U.S. Pat. Nos. 8,737,708; 9,737,235; 8,503,745; and 8,175,681; all incorporated herein by reference.
During a selected procedure, the first coordinate system may be registered to the subject space or subject coordinate system due to a selected procedure, such as imaging of the subject. In various embodiments, the first coordinate system may be registered to the subject by imaging the subject with a fiducial portion that is fixed relative to the first member or system, such as the robotic system. The known position of the fiducial relative to the robotic system may be used to register the subject space relative to the robotic system due to the image of the subject including the fiducial portion. Thus, the position of the robotic system or a portion thereof, such as the end effector assembly, may be known or determined relative to the subject. Registration of a second coordinate system to the robotic coordinate system may allow for tracking of additional elements not fixed to the robot relative to a position determined or tracked by the robot.
The tracking of an instrument during a procedure, such as a surgical or operative procedure, allows for navigation of a procedure. When image data is used to define an image space it can be correlated or registered to a physical space defined by a subject, such as a patient. According to various embodiments, therefore, the patient defines a patient space in which an instrument can be tracked and navigated. The image space defined by the image data can be registered to the patient space defined by the patient. The registration can occur with the use of fiducials that can be identified in the image data and in the patient space.
The navigation system 26 can be used to track the location of one or more tracking devices. Tracking devices may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, and/or a tool tracking device 66. A tool 68 or moveable member may be any appropriate tool such as a drill, forceps, or other tool operated by a user 72. The tool 68 may also include an implant, such as a spinal implant or orthopedic implant. It should further be noted that the navigation system 26 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc. Moreover, the instruments may be used to navigate or map any region of the body. The navigation system 26 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
An additional or alternative imaging system 80 may be used to acquire pre-, intra-, or post-operative or real-time image data of a subject, such as the subject 30. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject. In the example shown, the imaging device 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado, USA. The imaging device 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed. The imaging device 80 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference, or any appropriate portions thereof. It is further appreciated that the imaging device 80 may include in addition or alternatively a fluoroscopic C-arm. Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc. Other appropriate imaging devices can also include MRI, CT, ultrasound, etc.
The position of the imaging system 33, 80, and/or portions therein such as the image capturing portion, can be precisely known relative to any other portion of the imaging device 33, 80. The imaging device 33, 80, according to various embodiments, can know and/or recall precise coordinates relative to a fixed or selected coordinate system. For example, the robotic system 20 may know or determine its position and position the effector assembly 44 at a selected pose. Similarly, the imaging system 80 may also position the imaging portions at a selected pose. This can allow the imaging system 80 to know its position relative to the patient 30 or other references. In addition, as discussed herein, the precise knowledge of the position of the image capturing portion can be used in conjunction with a tracking system to determine the position of the image capturing portion and the image data relative to the tracked subject, such as the patient 30.
Herein, reference to the imaging system 80 may refer to any appropriate imaging system, unless stated otherwise.
The imaging device 80 can be tracked with a tracking device 62. Also, a tracking device 81 can be associated directly with the effector assembly 44. The effector assembly 44 may, therefore, be directly tracked with a navigation system as discussed herein. In addition or alternatively, the effector assembly 44 may be positioned and tracked with the robotic system 20. Regardless, image data defining an image space acquired of the patient 30 can, according to various embodiments, be registered (e.g., manually, inherently, or automatically) relative to an object space. The object space can be the space defined by a patient 30 in the navigation system 26.
The patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or tracker 58. Alternatively, or in addition thereto, the patient 30 may be fixed within the navigation space defined by the navigation system 26 to allow for registration. As discussed further herein, registration of the image space to the patient space or subject space allows for navigation of the instrument 68 with the image data. When navigating the instrument 68, a position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84. An additional and/or alternative display device 84′ may also be present to display an image. Various tracking systems, such as one including an optical localizer 88 or an electromagnetic (EM) localizer 94, can be used to track the instrument 68.
More than one tracking system can be used to track the instrument 68 in the navigation system 26. According to various embodiments, these can include an electromagnetic (EM) tracking system having the EM localizer 94 and/or an optical tracking system having the optical localizer 88. Either or both of the tracking systems can be used to track selected tracking devices, as discussed herein. It will be understood, unless discussed otherwise, that a tracking device can be a portion trackable with a selected tracking system. A tracking device need not refer to the entire member or structure to which the tracking device is affixed or associated.
The position of the patient 30 relative to the imaging device 33 can be determined by the navigation system 26. The position of the imaging system 33 may be determined, as discussed herein. The patient 30 can be tracked with the dynamic reference frame 58, as discussed further herein. Accordingly, the position of the patient 30 relative to the imaging device 33 can be determined.
Image data acquired from the imaging system 33, or any appropriate imaging system, can be acquired at and/or forwarded from an image device controller 96, which may include a processor module, to the navigation computer and/or processor system 102 that can be a part of a controller or workstation 98 having the display 84 and a user interface 106. It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the workstation 98. The workstation 98 can provide facilities for displaying the image data as an image 108 on the display 84, saving, digitally manipulating, or printing a hard copy image of the received image data. The user interface 106, which may be a keyboard, mouse, touch pen, touch screen or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, via the image device controller 96, or adjust the display settings of the display 84. The workstation 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and three-dimensional image data.
With continuing reference to
Wired or physical connections can interconnect the tracking systems, imaging device 80, etc. Alternatively, various portions, such as the instrument 68 may employ a wireless communications channel, such as that disclosed in U.S. Pat. No. 6,474,341, entitled “Surgical Communication Power System,” issued Nov. 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the controller 110. Also, the tracking devices 62, 66, 54 can generate a field and/or signal that is sensed by the localizer(s) 88, 94.
Various portions of the navigation system 26, such as the instrument 68, and others as will be described in detail below, can be equipped with at least one, and generally multiple, of the tracking devices 66. The instrument can also include more than one type or modality of tracking device 66, such as an EM tracking device and/or an optical tracking device. The instrument 68 can include a graspable or manipulable portion at a proximal end and the tracking devices may be fixed near the manipulable portion of the instrument 68.
An additional representative or alternative localization and tracking system is set forth in U.S. Pat. No. 5,983,126, entitled “Catheter Location System and Method,” issued Nov. 9, 1999, which is hereby incorporated by reference. The navigation system 26 may be a hybrid system that includes components from various tracking systems.
According to various embodiments, the navigation system 26 can be used to track the instrument 68 relative to the patient 30. The instrument 68 can be tracked with the tracking system, as discussed above. Image data of the patient 30, or an appropriate subject, can be used to assist the user 72 in guiding the instrument 68. The image data, however, is registered to the patient 30. The image data defines an image space that is registered to the patient space defined by the patient 30. The registration can be performed as discussed herein, automatically, manually, or combinations thereof.
Generally, registration allows a translation map to be generated of the physical location of the instrument 68 relative to the image space of the image data. The translation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image data 108. A graphical representation 68i, also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
With continuing reference to
Referring now to
The desired position of the effector assembly 44 for the procedure is located by movement of the robotic arm 40. The variable stiffness end effector 44 is rigidly coupled to the robotic arm 40 during a portion of the procedure. When a drill is used as the tool, it may rotate at a high speed, such as over 70,000 rpm. Both the robotic arm 40 and the end effector assembly 44 have a resonant or natural frequency. The motion of the tool may cause the end effector assembly 44 to vibrate at a frequency that corresponds to a resonant or natural frequency of the end effector assembly 44, which may be deemed undesirable because it can be felt by the surgeon. Further, the harmonics of the vibration at the robot arm 40 may align with the vibration frequency at the end effector assembly 44, which may cause the rigidly coupled robotic arm to vibrate as well, which may be undesired. The variable stiffness end effector refers to an assembly that allows a tool holder of the end effector 44 (as described below) to absorb vibration.
A representation of the robotic arm 40 and elbow joint 50 of the effector assembly 44 are illustrated. The effector assembly 44 has the coupling to the arm 40 represented by a spring 210 through which vibration may be coupled. The natural or resonant frequency of the system is the square root of the quotient of the stiffness and mass of the robotic arm. One example of a suitable robotic system is found in U.S. Pat. No. 11,135,035, the disclosure of which is incorporated by reference herein.
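The stiffness-over-mass relationship stated above can be sketched numerically. The stiffness and mass values below are hypothetical, chosen only to show how softening the coupling (switching to the compliant mode) lowers the resonant frequency of the spring-mass approximation.

```python
import math

def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    """Natural frequency f = sqrt(k / m) / (2*pi) of a spring-mass
    approximation of the arm/end-effector coupling."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2.0 * math.pi)

# Hypothetical values: a rigid coupling vs. a compliant coupling
# for the same effector mass.
f_rigid = natural_frequency_hz(1.0e6, 2.0)      # stiff: high frequency
f_compliant = natural_frequency_hz(1.0e4, 2.0)  # 100x softer: 10x lower
```

A hundredfold reduction in stiffness lowers the natural frequency by a factor of ten, which is the mechanism by which decoupling the rigid mechanism rapidly decreases the system's natural frequency.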
Referring now to
The work of the rotation by the drill is the product of T, the drill torque, ω, the rotation speed of the drill, and t, the time over which the work is done. The drill longitudinal work is the product of the force, F, of the drill and the distance, d. The rotational spring work uses the spring constant G and the angular distance θ through which the drill travels. The linear spring energy uses the spring constant, k, and the distance in the X direction to determine the spring work. The work also manifests in the energy U and heat Q at the vertebra.
The work may be balanced in an equation summing the work being done at the different components. The energy balance at the vertebra 212 of the spine is simplified to be:

Tωt + Fd = U + Q

where the work of the rotation by the drill is the product of T, the drill torque, ω, the rotation speed of the drill, and t, the time over which the work is done. The drill longitudinal work is the product of the force, F, of the drill and the distance, d. The work also manifests in the energy U and heat Q at the vertebra.
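The simplified balance described above can be checked numerically. The torque, speed, force, and duration values below are hypothetical, intended only to show the bookkeeping of the drill work Tωt + Fd against the energy U and heat Q appearing at the vertebra.

```python
def drill_work(torque, omega, duration, force, distance):
    """Total work input by the drill: rotational work T*omega*t
    plus longitudinal (feed) work F*d."""
    return torque * omega * duration + force * distance

# Hypothetical drilling parameters: 0.5 N*m torque, 100 rad/s,
# 2 s of drilling, 20 N feed force over 10 mm of travel.
work_in = drill_work(torque=0.5, omega=100.0, duration=2.0,
                     force=20.0, distance=0.01)

# By the simplified balance, this input work equals U + Q at the
# vertebra when the spring and resistance terms are zero.
```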
Because the patient rotation and the Schanz arm force resistance are zero, the formula for the energy balance is simplified as above. The lateral forces are due to the surgeon and surgeon tool lateral forces at 216. In this example, the tool slides freely in the arm axially along the longitudinal axis 314 of the end effector assembly 44. There are no vertical forces on the arm 40 and there is no rotational torque on the tool insertion device described below.
Referring now to
All of the elements are described above except the patient translation energy, which is half of the product of kptn, the stiffness spring constant due to the deflection of the patient's anatomy under load, and the square of the lateral distance moved by the patient.
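The patient translation term described above is the familiar spring-energy form, one half the stiffness times the squared deflection. The stiffness constant and lateral deflection below are hypothetical values used only to illustrate the calculation.

```python
def patient_translation_energy(k_ptn, lateral_distance):
    """Energy stored in the patient's anatomy modeled as a spring:
    0.5 * k_ptn * x^2, where x is the lateral distance moved."""
    return 0.5 * k_ptn * lateral_distance ** 2

# Hypothetical anatomy stiffness of 2000 N/m and 5 mm deflection.
energy = patient_translation_energy(k_ptn=2000.0, lateral_distance=0.005)
```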
Referring now to
Referring now to
Referring now to
Referring now to
Referring now to
Referring now to
Referring now to
A rigid coupling mechanism, such as a plurality of radially extending pins 316A, 316B and 316C disposed axially, holds the tool insertion device 312 fixed relative to the end effector support 310. As mentioned above, a tool being inserted into the tool insertion device 312 may generate a plurality of forces that are transmitted to the end effector support 310. The end effector support 310 may then transmit the forces or vibrations to the robotic arm 40 illustrated above. The end effector support 310 is affixed rigidly to the robotic arm 40 and therefore the vibrations travel from the effector support 310 to the robotic arm 40. The pins 316A, 316B and 316C are retractably coupled to the end effector support 310 so as to allow compliant coupling.
Referring now to
In this manner, the amount of force or vibration from the effector assembly 44 is reduced or eliminated relative to the robot arm 40. In the compliant mode, the tool insertion device 312 moves within and relative to the end effector support 310. The movement may be caused by the vibration of the tool and thus the vibration at the tool insertion device 312. The compliant mechanism locally expands and contracts to absorb the vibrations and prevents or reduces the vibration at the end effector support 310. In the compliant mode, the tool insertion device 312 may not be concentric relative to the end effector support 310. The tool insertion device 312 is held within the end effector support 310 with the compliant mechanism.
Referring now to
In
Referring now to
As illustrated in
Referring now to
Referring now to
The example of
Referring now to
In
As illustrated best in
Referring now to
The system may also be used in a manual mode. A user interface 820 may be located external to the effector assembly 44 to manually initiate decoupling of the end effector support 310 from the tool insertion device 312. A foot pedal, button, dial or the like may be used by the surgeon as the user interface 820. Upon sensing vibration, the effector assembly 44 is switched from the rigid mode, in which the tool insertion device 312 is rigidly coupled, to the compliant mode. The surgeon may do this by feel or in response to a warning indicator 822 such as a buzzer or a light. That is, when a vibration at the natural frequency greater than a predetermined amplitude is sensed at the robotic arm or the effector assembly, a warning signal (light and/or sound) may be generated. The warning signal may be generated at the display 84 of the workstation 98. After the vibration or movement ceases, the user interface 820 may be used to re-engage the actuator 812 to re-enter the rigid mode. Of course, the controller 810 may be part of the surgery system and integrated in the controller 110.
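The threshold-based warning and mode selection described above can be sketched as a small decision function. The function name, threshold, and amplitude values are hypothetical; the sketch only illustrates the rule that a sensed amplitude above a predetermined level triggers the compliant mode and a warning.

```python
def select_mode(vibration_amplitude, threshold, current_mode):
    """Decide the coupling mode from a sensed vibration amplitude:
    switch to the compliant mode when the amplitude exceeds the
    warning threshold, and report whether a warning indicator
    (light and/or buzzer) should be raised."""
    if vibration_amplitude > threshold:
        return "compliant", True   # decouple and warn the surgeon
    return current_mode, False     # no change, no warning

# Hypothetical reading well above the warning threshold.
mode, warn = select_mode(vibration_amplitude=0.8, threshold=0.5,
                         current_mode="rigid")
```

In the manual mode, the same warning output would prompt the surgeon to actuate the user interface 820; in an automatic mode, the controller would apply the returned mode directly.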
With reference to
The process 430 may start in start Block 440.
In Block 442, the patient position is registered into the system. The robotic arm, in Block 444, is also registered and located in the robotic surgery system.
In Block 450, the robotic system 20 may then move the effector assembly 44 in a manner based on known anatomy. As illustrated above, in
In Block 452, the tool is inserted into the tool insertion device. In Block 454, the tool, such as a drill, is activated or operated.
In Block 456, the vibration sensors are monitored to determine whether unwanted drill vibration or skiving is detected. As mentioned above, vibration or the vibration associated with skiving may be detected at a vibration sensor that detects a certain frequency or a vibration that is above a certain amplitude threshold. The surgeon may also detect the unwanted vibration visually, audibly, or through a tactile feel at the effector assembly 44, the robotic arm or the tool.
In response to the vibrations sensed by the vibration sensor 814A or 814B, or to the user interface 820, Block 462 may generate a warning signal and the tool insertion device 312 is automatically decoupled from its rigid coupling to the effector support 310. That is, the tool insertion device is placed into the compliant mode in Block 464. The surgeon may then continue operating the tool within the tool insertion device in Block 466.
In Block 468, if the vibration is still present, the tool insertion device remains decoupled from the effector support in Block 464. In Block 468, if the vibration has ended, Block 454 continues the process of operating the tool.
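The flow of process 430 through Blocks 454 to 468 can be sketched as a monitoring loop. The function and the sample amplitudes are hypothetical, shown only to trace the mode transitions: the device enters the compliant mode while excessive vibration is sensed and re-enters the rigid mode once it ends.

```python
def run_procedure(vibration_samples, threshold):
    """Sketch of process 430's monitoring loop: for each sensed
    vibration sample (Block 456), decouple to the compliant mode
    while vibration exceeds the threshold (Blocks 462-464) and
    return to the rigid mode once it ends (Block 468 -> 454)."""
    mode_log = []
    mode = "rigid"
    for amplitude in vibration_samples:
        if amplitude > threshold:
            mode = "compliant"  # Block 464: decoupled, absorbing vibration
        else:
            mode = "rigid"      # Block 468 -> 454: vibration ended
        mode_log.append(mode)
    return mode_log

# Hypothetical amplitude trace: quiet, two noisy samples, quiet again.
log = run_procedure([0.1, 0.9, 0.7, 0.2], threshold=0.5)
```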
Example embodiments are provided so that this disclosure will be thorough and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. The term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules. The term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above. The term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules. The term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
The apparatuses and methods described in this application may be partially or fully implemented by a processor (also referred to as a processor module) that may include a special purpose computer (i.e., created by configuring a processor) and/or a general purpose computer to execute one or more particular functions embodied in computer programs. The computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may include a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services and applications, etc.
The computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler; or (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language). As examples only, source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, ASP (active server pages), Perl, Javascript®, HTML5, Ada, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
The wireless communications described in the present disclosure can be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008. In various implementations, IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
The terms ‘processor,’ ‘processor module,’ ‘module,’ and ‘controller’ may be used interchangeably herein (unless specifically noted otherwise), and each may be replaced with the term ‘circuit.’ Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
Instructions may be executed by one or more processors or processor modules, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” or “processor module” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the invention, and all such modifications are intended to be included within the scope of the invention.
This application claims the benefit of and priority to U.S. Provisional Patent Application No. 63/462,330 filed Apr. 27, 2023 and U.S. Provisional Patent Application No. 63/462,301 filed Apr. 27, 2023, and the disclosures of each of the above-identified applications are hereby incorporated by reference in their entirety.
| Number | Date | Country |
|---|---|---|
| 63/462,330 | Apr 2023 | US |
| 63/462,301 | Apr 2023 | US |