A computer-assisted surgical system that employs robotic and/or teleoperation technology typically includes a stereoscopic image viewer configured to provide, for display to a surgeon, images of an imaging space (e.g., a surgical space) as captured by an imaging device such as an endoscope. While the surgeon's eyes are positioned in front of viewing lenses of the stereoscopic image viewer, the surgeon may view the images of the surgical space while remotely manipulating one or more surgical instruments located within the surgical space. The surgical instruments are attached to one or more manipulator arms of a surgical instrument manipulating system included as part of the computer-assisted surgical system.
The stereoscopic image viewer of a computer-assisted surgical system must display a pair of images (one to each eye) that are precisely overlapped. In addition, a compact stereoscopic image viewer typically involves using one or more optical elements that apply optical power to locate an image-render plane at a specified distance from the user. However, such optical elements may create distortion (e.g., barrel distortion, pincushion distortion, etc.) of the displayed images for each eye. This distortion inhibits precise overlap of the pair of images. If not corrected, such distortion can degrade the user's experience (e.g., by causing nausea, headaches, etc.) while the user operates the computer-assisted surgical system. To correct this distortion, some computer-assisted surgical systems include distortion-correction lens elements placed close to a flat panel display of the stereoscopic image viewer. However, such distortion-correction lens elements are complex, large, costly, and difficult to manufacture and source. Accordingly, there remains room to improve distortion correction in an image viewer of a computer-assisted surgical system.
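Barrel and pincushion distortion of the kind described above are commonly approximated by a radial (Brown-Conrady) model, in which each image point is scaled by a polynomial in its squared distance from the optical center. The following Python sketch is purely illustrative (it is not a description of any particular implementation) and assumes a single-coefficient radial model; it shows how a pre-distortion of opposite sign approximately cancels the distortion imparted by the optics:

```python
def radial_distort(x, y, k1):
    """Apply single-coefficient radial distortion (Brown-Conrady model)
    to normalized image coordinates. k1 > 0 models pincushion distortion;
    k1 < 0 models barrel distortion."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

# A point pre-distorted with -k1 and then passed through optics that add
# +k1 distortion lands approximately back at its original location
# (the cancellation is exact only to first order in k1).
xd, yd = radial_distort(0.5, 0.5, k1=-0.1)  # pre-distortion applied to the display
xv, yv = radial_distort(xd, yd, k1=0.1)     # distortion added by the optics
```

Real correction pipelines typically use several radial coefficients and tangential terms, but the opposite-sign cancellation shown here is the core idea.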
An example system comprises one or more processors configured to perform a process comprising: obtaining a distortion factor representative of an amount of distortion imparted by one or more optical elements of an image viewer of a computer-assisted surgical system on images displayed by way of the image viewer; obtaining image data representative of an image that is displayable by way of the image viewer; modifying, based on the distortion factor, the image data to correct for the distortion imparted by the one or more optical elements of the image viewer, the modifying resulting in modified image data representative of the image; and directing the computer-assisted surgical system to display, based on the modified image data, the image by way of the image viewer.
An example computer-assisted surgical system comprises an image viewer including: a display device; and an optical assembly through which a user views images displayed by the display device, the optical assembly provided along an optical path between the display device and an eye of the user, the optical assembly imparting a distortion on images displayed by the display device when viewed through the optical assembly; wherein the display device is configured to: receive images to be displayed by way of the image viewer; and display the images based on a distortion factor that corrects for the distortion imparted by the optical assembly.
An additional example computer-assisted surgical system comprises one or more manipulator arms; and one or more processors configured to perform a process comprising: obtaining a distortion factor representative of an amount of distortion imparted by one or more optical elements of an image viewer; obtaining image data representative of an image that is displayable by way of the image viewer; modifying, based on the distortion factor, the image data to correct for the distortion imparted by the one or more optical elements of the image viewer, the modifying resulting in modified image data representative of the image; and presenting, based on the modified image data, the image by way of the image viewer.
An example method comprises obtaining, by an image processing system, a distortion factor representative of an amount of distortion imparted by one or more optical elements of an image viewer of a computer-assisted surgical system on images displayed by way of the image viewer; obtaining, by the image processing system, image data representative of an image that is displayable by way of the image viewer; modifying, by the image processing system and based on the distortion factor, the image data to correct for the distortion imparted by the one or more optical elements of the image viewer, the modifying resulting in modified image data representative of the image; and directing, by the image processing system, the computer-assisted surgical system to display, based on the modified image data, the image by way of the image viewer.
The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, identical or similar reference numbers designate identical or similar elements.
Systems and methods for correcting distortion caused by optical elements of an image viewer of a computer-assisted surgical system are described herein. As will be described in more detail below, an illustrative system includes one or more processors configured to perform a process. The process may comprise obtaining a distortion factor representative of an amount of distortion imparted by one or more optical elements of an image viewer of a computer-assisted surgical system on images displayed by way of the image viewer, obtaining image data representative of an image that is displayable by way of the image viewer, modifying, based on the distortion factor, the image data to correct for the distortion imparted by the one or more optical elements of the image viewer, the modifying resulting in modified image data representative of the image, and directing the computer-assisted surgical system to display, based on the modified image data, the image by way of the image viewer.
Various advantages and benefits are associated with systems and methods described herein. For example, systems and methods such as those described herein may facilitate correcting distortion caused by one or more optical elements of an image viewer in a manner that improves user experience, is more cost effective, and/or is more efficient than conventional image viewers. For example, systems and methods such as those described herein do not require the use of costly and difficult-to-manufacture distortion-correction lenses. In addition, by eliminating the need for such distortion-correction lenses, it is possible to reduce or eliminate undesired stray light paths in the image viewer. These and other benefits that may be realized by the systems and methods described herein will be evident from the disclosure that follows.
Example systems described herein may be configured to operate as part of or in conjunction with a plurality of different types of computer-assisted surgical systems. The different types of computer-assisted surgical systems may include any type of computer-assisted surgical system as may serve a particular implementation, such as a computer-assisted surgical system designed for use in minimally-invasive medical procedures, for example. In certain examples, a type of computer-assisted surgical system may include a system in which one or more surgical devices (e.g., surgical instruments) are manually (e.g., laparoscopically) controlled by a user. In certain examples, a type of computer-assisted surgical system may include a robotic surgical system configured to facilitate operation of one or more smart instruments (e.g., smart sub-surface imaging devices) that may be manually and/or robotically controlled by a user. In certain implementations, the plurality of different types of computer-assisted surgical systems may be of different types at least because they include different types of surgical instrument manipulating systems. For example, a first computer-assisted surgical system may include a first type of surgical instrument manipulating system, a second computer-assisted surgical system may include a second type of surgical instrument manipulating system, and a third computer-assisted surgical system may include a third type of surgical instrument manipulating system.
Each type of surgical instrument manipulating system may have a different architecture (e.g., a manipulator arm architecture), have a different kinematic profile, and/or operate according to different configuration parameters. An illustrative computer-assisted surgical system with a first type of surgical instrument manipulating system will now be described with reference to
Surgical system 100 may be utilized by a surgical team to perform a computer-assisted surgical procedure on a patient 108. As shown, the surgical team may include a surgeon 110-1, an assistant 110-2, a nurse 110-3, and an anesthesiologist 110-4, all of whom may be collectively referred to as “surgical team members 110.” Additional or alternative surgical team members may be present during a surgical session as may serve a particular implementation.
While
As shown in
In the example shown in
Manipulator arms 112 and/or robotic instruments attached to manipulator arms 112 may include one or more displacement transducers, orientational sensors, and/or positional sensors (hereinafter “surgical system sensors”) used to generate raw (e.g., uncorrected) kinematics information. One or more components of surgical system 100 may be configured to use the kinematics information to track (e.g., determine positions of) and/or control the robotic instruments.
In addition, manipulator arms 112 may each include or otherwise be associated with a plurality of motors or actuators that control movement of manipulator arms 112 and/or the surgical instruments attached thereto. For example, manipulator arm 112-1 may include or otherwise be associated with a first internal motor (not explicitly shown) configured to yaw manipulator arm 112-1 about a yaw axis. In like manner, manipulator arm 112-1 may be associated with a second internal motor (not explicitly shown) configured to pitch manipulator arm 112-1 about a pitch axis. Likewise, manipulator arm 112-1 may be associated with a third internal motor (not explicitly shown) configured to slide manipulator arm 112-1 along an insertion axis. Manipulator arms 112 may each include a drive train system driven by one or more of these motors in order to control the pivoting of manipulator arms 112 in any manner as may serve a particular implementation. As such, if a robotic instrument attached, for example, to manipulator arm 112-1 is to be mechanically moved, one or more of the motors coupled to the drive train may be energized to move manipulator arm 112-1.
Robotic instruments attached to manipulator arms 112 may each be positioned in an imaging space. An “imaging space” as used herein may refer to any space or location where an imaging operation may be performed by an imaging device such as described herein. In certain examples, an imaging space may correspond to a surgical space. A “surgical space” may, in certain examples, be entirely disposed within a patient and may include an area within the patient at or near where a surgical procedure is planned to be performed, is being performed, or has been performed. For example, for a minimally invasive surgical procedure being performed on tissue internal to a patient, the surgical space may include the tissue, anatomy underlying the tissue, as well as space around the tissue where, for example, robotic instruments and/or other instruments being used to perform the surgical procedure are located. In other examples, a surgical space may be at least partially disposed external to the patient at or near where a surgical procedure is planned to be performed, is being performed, or has been performed on the patient. For instance, surgical system 100 may be used to perform an open surgical procedure such that part of the surgical space (e.g., tissue being operated on) is internal to the patient while another part of the surgical space (e.g., a space around the tissue where one or more instruments may be disposed) is external to the patient. A robotic instrument may be referred to as being positioned or located at or within a surgical space when at least a portion of the robotic instrument (e.g., a distal portion of the robotic instrument) is located within the surgical space. Example imaging spaces and/or images of imaging spaces will be described herein.
User control system 104 is configured to facilitate control by surgeon 110-1 of manipulator arms 112 and robotic instruments attached to manipulator arms 112. For example, surgeon 110-1 may interact with user control system 104 to remotely move, manipulate, or otherwise teleoperate manipulator arms 112 and the robotic instruments. To this end, user control system 104 may provide surgeon 110-1 with one or more images of a surgical space associated with patient 108 as captured by an imaging device. In certain examples, user control system 104 may include a stereoscopic image viewer having two displays where stereoscopic image pairs of a surgical space associated with patient 108 and generated by a stereoscopic imaging system may be viewed by surgeon 110-1. Surgeon 110-1 may utilize the image or images to perform one or more procedures with one or more robotic instruments attached to manipulator arms 112.
To facilitate control of robotic instruments, user control system 104 may include a set of master controls (not shown). These master controls may be manipulated by surgeon 110-1 to control movement of robotic instruments (e.g., by utilizing robotic and/or teleoperation technology). The master controls may be configured to detect a wide variety of hand, wrist, and finger movements by surgeon 110-1. In this manner, surgeon 110-1 may intuitively perform a surgical procedure using one or more robotic instruments.
In some examples, user control system 104 may be further configured to facilitate control by surgeon 110-1 of other components of surgical system 100. For example, surgeon 110-1 may interact with user control system 104 to change a configuration or operating mode of surgical system 100, to change a display mode of surgical system 100, to generate additional control signals used to control surgical instruments attached to manipulator arms 112, to facilitate switching control from one robotic instrument to another, to facilitate interaction with other instruments and/or objects within the surgical space, or to perform any other suitable operation. To this end, user control system 104 may also include one or more input devices (e.g., foot pedals, buttons, switches, etc.) configured to receive input from surgeon 110-1.
In some examples, auxiliary system 106 includes one or more computing devices configured to perform primary processing operations of surgical system 100. The one or more computing devices included in auxiliary system 106 may control and/or coordinate operations performed by various other components (e.g., manipulating system 102 and/or user control system 104) of surgical system 100. For example, a computing device included in user control system 104 may transmit instructions to manipulating system 102 by way of the one or more computing devices included in auxiliary system 106. As another example, auxiliary system 106 may receive, from manipulating system 102, and process image data representative of images captured by an imaging device attached to one of manipulator arms 112.
In some examples, auxiliary system 106 is configured to present visual content to surgical team members 110 who may not have access to the images provided to surgeon 110-1 at user control system 104. To this end, as shown, auxiliary system 106 includes a display monitor 114 configured to display one or more user interfaces, such as images of the surgical space, information associated with patient 108 and/or the surgical procedure, and/or any other visual content as may serve a particular implementation. For example, display monitor 114 may display images of the surgical space together with additional content (e.g., representations of target objects, graphical content, contextual information, etc.) concurrently displayed with the images. In some embodiments, display monitor 114 is implemented by a touchscreen display with which surgical team members 110 may interact (e.g., by way of touch gestures) to provide user input to surgical system 100.
Manipulating system 102, user control system 104, and auxiliary system 106 may be communicatively coupled one to another in any suitable manner. For example, as shown in
As shown in
The depiction of image viewer 202 in
Image viewer 202 may be implemented in any suitable manner as part of the computer-assisted surgical system. For example, image viewer 202 may be part of a surgeon console (e.g., user control system 104) that is communicatively coupled to one or more manipulator arms (e.g., manipulator arms 112) and to an imaging device 210 of the computer-assisted surgical system.
As shown in
Display device 206 may include any suitable display device as may serve a particular implementation. For example, in certain implementations display device 206 may correspond to a flat panel light-emitting diode (LED) display device, a liquid crystal display (LCD) device, a liquid crystal on silicon (LCOS) display device, a micro-electromechanical systems (MEMS) display device, a digital light processing (DLP) display device, an organic light-emitting diode (OLED) display device, and/or any other suitable type of display device. In certain examples, display device 206 may correspond to a rectilinear display device. In certain alternative implementations, display device 206 may correspond to a non-rectilinear display device (e.g., a display device having a curved display architecture with a non-rectilinear pixel grid). Implementations using a non-rectilinear display device are described further herein.
Display device 206 is configured to receive images captured by imaging device 210 and display the images such that they are viewable along an optical path through optical assembly 204. Imaging device 210 may include any suitable type of imaging device as may serve a particular implementation. For example, imaging device 210 may be implemented by an endoscope that is engaged with a manipulator arm (e.g., manipulator arm 112-2) of the computer-assisted surgical system.
As shown in
As shown in
Image viewer 202 may be configured in any suitable manner to correct for the distortion imparted by optical assembly 204. In certain examples, image viewer 202 may further include or otherwise be communicatively coupled to one or more processors configured to modify images captured by imaging device 210 based on the distortion factor. In such examples, distortion correction processing may be performed on an image to be displayed before the image is displayed by way of display device 206. Such distortion correction processing is configured to counteract the distortion caused by the optical assembly 204 such that, when the image is viewed by way of optical assembly 204, it does not appear distorted to the user.
System 300 may include additional or alternative elements as may serve a particular implementation. For example, system 300 may further include a memory selectively and communicatively coupled to processor 302. Such a memory and processor 302 may each include or be implemented by hardware and/or software components (e.g., processors, memories, communication interfaces, instructions stored in memory for execution by the processors, etc.). In some examples, such a memory and processor 302 may be implemented by a single device (e.g., a single computing device). In certain alternate examples, a memory and processor 302 may be distributed between multiple devices and/or multiple locations as may serve a particular implementation.
In certain examples, a memory may maintain (e.g., store) executable data used by processor 302 to perform any of the operations described herein. For example, a memory may store instructions that may be executed by processor 302 to perform any of the operations described herein. Such instructions may be implemented by any suitable application, software, code, computer program, and/or other executable data instance.
A computer program may be written in any form of programming language, including compiled and/or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program may be deployed to be executed by processor 302 at one site or distributed across multiple sites interconnected by a network.
In certain examples, a memory may also maintain any data received, generated, managed, used, and/or transmitted by processor 302. For example, a memory may maintain any suitable data associated with correcting a distortion caused by one or more optical elements. Such data may include, but is not limited to, data associated with distortion factors, factors associated with use of an image viewer, image data (e.g., endoscopic images) of an imaging space, user interface content (e.g., graphical objects, notifications, etc.), and/or any other suitable data.
Processor 302 may be configured to perform (e.g., execute instructions stored in a memory) various processing operations associated with correcting a distortion caused by one or more optical elements. For example, processor 302 may modify, based on a distortion factor, image data to correct for distortion imparted by one or more optical elements of an image viewer, the modifying resulting in modified image data representative of the image. Processor 302 may further direct a computer-assisted surgical system to display, based on the modified image data, the image by way of the image viewer. These and other operations that may be performed by processor 302 are described herein.
At operation 402, system 300 obtains a distortion factor. This may be performed in any suitable manner. For example, system 300 may access the distortion factor from any suitable memory or storage device communicatively coupled to system 300. In certain examples, system 300 may additionally or alternatively obtain the distortion factor by accessing a pre-programmed distortion factor from logic of an FPGA.
In certain examples, system 300 may obtain the distortion factor by generating the distortion factor based on one or more factors or parameters associated with operation of an image viewer. For example, system 300 may generate the distortion factor based on the focal length of optical assembly 204, a distance between display device 206 and optical assembly 204, a distance between eye 208 of the user and optical assembly 204, an interpupillary distance between eye 208 and the other eye of the user, a distortion caused by imaging device 210, and/or any other suitable factor or parameter.
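To illustrate how such parameters might be combined, the following hypothetical Python sketch derives a single radial distortion coefficient from a focal length, a display distance, and an eye relief. The functional form, the design eye relief, and the base coefficient are all assumptions of this sketch and not a description of any actual computation performed by system 300; a real system would use a calibrated optical model of optical assembly 204.

```python
def estimate_distortion_factor(focal_length_mm, display_dist_mm,
                               eye_relief_mm, base_k1=-0.08):
    """Illustrative heuristic: scale a nominal radial coefficient by how
    far the actual viewing geometry departs from an assumed design
    geometry. All constants here are hypothetical."""
    design_eye_relief_mm = 25.0  # assumed design eye relief
    # Crude proxy for angular magnification of the viewing optics.
    magnification = focal_length_mm / max(display_dist_mm, 1e-6)
    relief_ratio = eye_relief_mm / design_eye_relief_mm
    return base_k1 * magnification * relief_ratio
```

At the assumed design geometry the heuristic returns the base coefficient unchanged; moving the eye closer than the design eye relief reduces the magnitude of the correction.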
At operation 404, system 300 obtains image data representative of an image that is displayable by way of the image viewer. In certain examples, the image may be captured by an imaging device associated with a computer-assisted surgical system. This may be accomplished in any suitable manner. For example, system 300 may receive image data captured by imaging device 210 during a surgical procedure performed by the computer-assisted surgical system.
At operation 406, system 300 modifies, based on the distortion factor, the image data to correct for the distortion imparted by one or more optical elements of the image viewer. This may be accomplished in any suitable manner. For example, system 300 may pre-distort image data utilizing at least some pixels included in a plurality of pixels of the image. To illustrate,
In certain examples, system 300 may modify the image data line by line for each pixel included in the image data.
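A per-pixel pre-distortion such as the one described above can be sketched as a destination-to-source remap over the whole pixel grid. The sketch below assumes a single-coefficient radial model and nearest-neighbor sampling; a real implementation would typically interpolate between source pixels and may process the data line by line in hardware.

```python
import numpy as np

def predistort(image, k1):
    """Pre-distort an image so that viewing it through optics imparting
    radial distortion of opposite sign yields an undistorted view.
    Destination-to-source mapping with nearest-neighbor sampling;
    a sketch only."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    # Normalize coordinates so radial distance is dimensionless.
    norm = np.hypot(cx, cy)
    xn, yn = (xs - cx) / norm, (ys - cy) / norm
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2
    # For each destination pixel, fetch the radially displaced source pixel.
    src_x = np.clip((xn * scale) * norm + cx, 0, w - 1).round().astype(int)
    src_y = np.clip((yn * scale) * norm + cy, 0, h - 1).round().astype(int)
    return image[src_y, src_x]
```

With a coefficient of zero the remap is the identity, and the center pixel is always unmoved, consistent with distortion being weakest at the optical center.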
In certain alternative examples, system 300 may modify the image data for only a subset of pixels included in a plurality of pixels of the image data. In such examples, there may be less distortion imparted on certain portions of an image viewed through an optical assembly than other portions. For example, pixels along a vertical center line of an image may either not appear distorted or may appear less distorted than pixels at the corners of an image. In such examples, system 300 may be configured to pre-distort pixel values for only a subset of pixels included in a plurality of pixels of an image. For example, the pre-distortion may only be applied to pixels that are positioned more than a pre-defined threshold distance from the vertical center line of an image.
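The subset-based approach described above can be illustrated with a mask that selects only pixels beyond a threshold horizontal distance from the vertical center line; the threshold value and its expression as a fraction of the half-width are purely illustrative assumptions of this sketch.

```python
import numpy as np

def partial_correction_mask(width, height, threshold_frac=0.2):
    """Boolean mask selecting pixels more than a threshold horizontal
    distance from the vertical center line (where distortion is most
    visible). threshold_frac is a fraction of the half-width and is an
    assumed, illustrative parameter."""
    cx = (width - 1) / 2.0
    xs = np.abs(np.arange(width) - cx) / cx  # 0 at center line, 1 at edges
    row_mask = xs > threshold_frac
    return np.tile(row_mask, (height, 1))    # same mask for every row
```

Pre-distortion would then be applied only where the mask is true, leaving pixels near the vertical center line unmodified.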
In certain examples, system 300 may automatically switch from modifying each pixel to modifying only a subset of the pixels based on one or more factors associated with operation of a computer-assisted surgical system. For example, system 300 may automatically switch to modifying only a subset of the pixels based on bandwidth constraints, the type of surgical procedure (e.g., some types of surgical procedures may be less intricate and require less precise distortion correction), and/or any other suitable factor.
Returning to
In certain examples, one or more factors associated with operation of an image viewer may change in a manner that affects the amount and/or the type of distortion imparted on images viewed through an image viewer. For example, a distance between eye 208 of the user and optical assembly 204 may change during use if the user physically moves eye 208 away from image viewer 202. Such a change may result in more or less distortion being visible on images viewed through image viewer 202. Additionally or alternatively, the presence or absence of certain lenses (e.g., in a system with lenses swapped in and out) may result in a change that affects the amount of distortion. Additionally or alternatively, an optical element changing shape (e.g., a deformable mirror, a tunable lens, etc.) may result in a change that affects the amount of distortion. Additionally or alternatively, a change in a thermal state of an image viewer may affect the amount of distortion. Additionally or alternatively, relative distances of elements in an optical assembly may change, which may affect the amount of distortion. Additionally or alternatively, where the user is looking in a surgical scene while using an image viewer may affect the amount of distortion that needs to be corrected. Accordingly, at operation 410, system 300 determines whether there has been a change in factors associated with operation of the image viewer. If the answer at operation 410 is “No,” the flow may return to operation 408. If the answer at operation 410 is “Yes,” system 300 may obtain an updated distortion factor at operation 412 that takes into consideration the change in the factors. The flow may then return to operation 406 where system 300 may modify the image data based on the updated distortion factor.
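The monitoring-and-update behavior described above amounts to a loop that re-derives the distortion factor whenever the monitored factors change and otherwise reuses the last factor. In the sketch below, the callables get_factors, estimate_k1, and correct are hypothetical stand-ins for internals of system 300; only the control flow is being illustrated.

```python
def run_correction_loop(get_factors, estimate_k1, correct, frames):
    """Re-derive the distortion factor only when viewer factors change;
    apply the current factor to every frame. All three callables are
    hypothetical stand-ins, not real APIs."""
    last_factors, k1 = None, None
    out = []
    for frame in frames:
        factors = get_factors()
        if factors != last_factors:    # change in factors detected
            k1 = estimate_k1(factors)  # obtain an updated distortion factor
            last_factors = factors
        out.append(correct(frame, k1))  # modify the frame with current factor
    return out
```

Caching the factor this way avoids recomputing it per frame, which matters when the correction must fit within a tight display-latency budget.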
Operations 406-412 may be repeated any suitable number of times to ensure that distortion caused by one or more optical elements of an image viewer is dynamically corrected during operation of the image viewer. In certain examples, system 300 may continually monitor factors associated with operation of image viewer 202 to ensure that the distortion is corrected as the factors change during use.
In certain examples, system 300 may be configured to minimize the amount of time used to process the image data to correct for the distortion. For example, system 300 may process the image data such that a latency from image capture to displaying the image by way of the image viewer is less than 50 milliseconds.
In certain additional or alternative examples, the distortion caused by an optical assembly of an image viewer and/or other components associated with an image viewer may be corrected by a hardware configuration of a display device (e.g., display device 206). In such examples, the display device may include pixels that are arranged in an array based on the distortion factor to correct for the distortion imparted by the optical assembly. This may be accomplished in any suitable manner. For example, an arrangement of the pixels of the display device may be non-rectilinear such that, when an image displayed by the display device is viewed by way of the optical elements of an image viewer, the image does not appear distorted to a user. To illustrate,
The number, shape, and/or arrangement of pixels 602 shown in
In certain additional or alternative examples, the distortion caused by an optical assembly of an image viewer and/or other components associated with an image viewer may be corrected in hardware by using a deformable optical device (e.g., a deformable mirror, a deformable lens, an element that adjusts how a laser is scanned, etc.). System 300 may be configured in any suitable manner to control deformation of such a deformable device to correct for distortion in images viewable by way of an image viewer. For example, system 300 may dynamically control the deformation of a deformable element as one or more factors associated with an image viewer change during operation of the image viewer.
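Of the hardware approaches above, the non-rectilinear pixel arrangement can be sketched by displacing each pixel of a nominal rectilinear grid radially with the opposite sign of the optics' distortion, so that the optics restore a rectilinear appearance. The single-coefficient radial model and normalized coordinates below are assumptions of this sketch, not a description of any actual display architecture.

```python
def pixel_layout(rows, cols, k1):
    """Illustrative non-rectilinear pixel layout: each pixel of a nominal
    rectilinear grid (normalized to [-1, 1] in both axes) is displaced
    radially with the opposite sign of the optics' radial coefficient k1."""
    layout = []
    for r in range(rows):
        for c in range(cols):
            y = -1.0 + 2.0 * r / (rows - 1)
            x = -1.0 + 2.0 * c / (cols - 1)
            s = 1.0 + (-k1) * (x * x + y * y)  # opposite-sign pre-distortion
            layout.append((x * s, y * s))
    return layout
```

The center pixel stays put while corner pixels are pulled inward (for pincushion optics) or pushed outward (for barrel optics), which is the physical analogue of the software pre-distortion described earlier.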
At operation 702, an image processing system (e.g., image processing system 300) obtains a distortion factor representative of an amount of distortion imparted by one or more optical elements of an image viewer of a computer-assisted surgical system on images displayed by way of the image viewer. Operation 702 may be performed in any of the ways described herein.
At operation 704, the image processing system obtains image data representative of an image that is displayable by way of the image viewer. Operation 704 may be performed in any of the ways described herein.
At operation 706, the image processing system modifies, based on the distortion factor, the image data to correct for the distortion imparted by the one or more optical elements of the image viewer, the modifying resulting in modified image data representative of the image. Operation 706 may be performed in any of the ways described herein.
At operation 708, the image processing system directs the computer-assisted surgical system to display, based on the modified image data, the image by way of the image viewer. Operation 708 may be performed in any of the ways described herein.
Although the preceding disclosure describes correcting for distortion imparted by one or more optical elements of an image viewer on images captured by an imaging device, it is understood that the distortion correction concepts described herein may be applied to any type of content or combination thereof that is displayable by way of an image viewer. For example, distortion correction may be applied to one or more images and/or other content that are not captured by an imaging device such as computer-generated renderings, augmented reality content, virtual reality content, user interface elements, text, graphics, images and/or video supplied by a third party (e.g., a customer), and/or any other type of content.
In some examples, a non-transitory computer-readable medium storing computer-readable instructions may be provided in accordance with the principles described herein. The instructions, when executed by a processor of a computing device, may direct the processor and/or computing device to perform one or more operations, including one or more of the operations described herein. Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
A non-transitory computer-readable medium as referred to herein may include any non-transitory storage medium that participates in providing data (e.g., instructions) that may be read and/or executed by a computing device (e.g., by a processor of a computing device). For example, a non-transitory computer-readable medium may include, but is not limited to, any combination of non-volatile storage media and/or volatile storage media. Illustrative non-volatile storage media include, but are not limited to, read-only memory, flash memory, a solid-state drive, a magnetic storage device (e.g., a hard disk, a floppy disk, magnetic tape, etc.), ferroelectric random-access memory (“RAM”), and an optical disc (e.g., a compact disc, a digital video disc, a Blu-ray disc, etc.). Illustrative volatile storage media include, but are not limited to, RAM (e.g., dynamic RAM).
Communication interface 802 may be configured to communicate with one or more computing devices. Examples of communication interface 802 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, an audio/video connection, and any other suitable interface.
Processor 804 generally represents any type or form of processing unit capable of processing data and/or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein. Processor 804 may perform operations by executing computer-executable instructions 812 (e.g., an application, software, code, and/or other executable data instance) stored in storage device 806.
Storage device 806 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of data storage media and/or device. For example, storage device 806 may include, but is not limited to, any combination of the non-volatile media and/or volatile media described herein. Electronic data, including data described herein, may be temporarily and/or permanently stored in storage device 806. For example, data representative of computer-executable instructions 812 configured to direct processor 804 to perform any of the operations described herein may be stored within storage device 806. In some examples, data may be arranged in one or more databases residing within storage device 806.
I/O module 808 may include one or more I/O modules configured to receive user input and provide user output. I/O module 808 may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities. For example, I/O module 808 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touchscreen component (e.g., touchscreen display), a receiver (e.g., an RF or infrared receiver), motion sensors, and/or one or more input buttons.
I/O module 808 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, I/O module 808 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
In some examples, any of the systems, computing devices, and/or other components described herein may be implemented by computing device 800. For example, processor 302 may be implemented by processor 804.
In the preceding description, various example embodiments have been described with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the scope of the invention as set forth in the claims that follow. For example, certain features of one embodiment described herein may be combined with or substituted for features of another embodiment described herein. The description and drawings are accordingly to be regarded in an illustrative rather than a restrictive sense.
The present application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/623,656, filed on Jan. 22, 2024, which is incorporated herein by reference in its entirety.