This disclosure relates to systems and methods for interfacing with a medical imaging system. Specifically, this disclosure relates to systems and methods for interfacing with an ultrasound imaging system that utilize a touchpad displayed on a touch screen interface.
Systems and methods are presented for enabling a user to interact with a medical imaging system using a touch screen display. In certain embodiments, a primary imaging area displaying images captured by the medical imaging system may be displayed in a first area of the touch screen display. A touchpad configured to receive input from a user may be displayed in a second area of the touch screen display that is different than the first area. Input provided by the user using the touchpad may be used to control and/or interact with the medical imaging system. By utilizing a touchpad displayed separate from the primary imaging area, a user may precisely position a cursor, annotation, and/or measurement marker point within the primary imaging area without obscuring the primary imaging area with their body.
A detailed description of systems and methods consistent with embodiments of the present disclosure is provided below. While several embodiments are described, it should be understood that the disclosure is not limited to any one embodiment, but instead encompasses numerous alternatives, modifications, and equivalents. In addition, while numerous specific details are set forth in the following description in order to provide a thorough understanding of the embodiments disclosed herein, some embodiments can be practiced without some or all of these details. Moreover, for the purpose of clarity, certain technical material that is known in the related art has not been described in detail in order to avoid unnecessarily obscuring the disclosure.
As illustrated, the exemplary interface 100 may include a primary imaging area 102. The primary imaging area 102 may display images (e.g., real time or near-real time images) captured by the ultrasound imaging system. For example, images may be displayed in the primary imaging area 102 taken during an abdominal examination, a kidney examination, an early obstetrical examination, a late obstetrical examination, a gynecological examination, a thyroid examination, a breast examination, a testicular examination, an adult or pediatric cardiac examination, an upper or lower extremity arterial or venous vascular examination, a carotid vascular examination, and/or any other type of ultrasound imaging examination.
In certain embodiments, the interface 100 may be displayed on a touch screen panel that may be capable of detecting the presence and location of a touch (e.g., by a finger, hand, stylus, and/or the like) within the display area. The touch screen panel may implement any suitable type of touch screen technology including, for example, resistive touch screen technology, surface acoustic wave touch screen technology, capacitive touch screen technology, and/or the like. In certain embodiments, the touch screen panel may be a customized touch screen panel for the ultrasound imaging system. In further embodiments, the touch screen panel may be part of a discrete computing system incorporating a touch screen panel (e.g., an iPad or other suitable tablet computing device) configured to operate with the ultrasound imaging system.
A user may interact (i.e., provide input) with the touch screen panel and captured ultrasound images by touching the touch screen panel in relevant areas. For example, a user may touch the interface 100 within the primary imaging area 102 to interact with and/or control a displayed image. In certain embodiments, the interface 100 may include a touchpad 104. In some embodiments, a user's ability to interact with the interface 100 may be bounded within an area defined by the touchpad 104 and/or one or more function menus and buttons displayed on the interface 100. For example, a user may interact with the interface 100 within areas defined by the touchpad 104 and/or one or more function menus and not within other areas of the interface 100. Accordingly, if a user's finger crosses outside the area defined by the touchpad 104, the motion of the user's finger may not be utilized to interact with the primary imaging area 102 until the user's finger returns to the area defined by the touchpad 104. The touchpad 104 may further be configured to interact with and/or control any other area displayed on the interface 100.
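By way of a non-limiting illustration, the following sketch shows one way the bounding behavior described above might be implemented; the Rect helper, the coordinate values, and the move_cursor callback are assumptions introduced for illustration and are not part of the disclosed system:

    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: float
        y: float
        width: float
        height: float

        def contains(self, px: float, py: float) -> bool:
            return (self.x <= px <= self.x + self.width and
                    self.y <= py <= self.y + self.height)

    # Hypothetical touchpad region in screen coordinates.
    TOUCHPAD = Rect(x=1040, y=420, width=240, height=180)

    def handle_touch_move(px: float, py: float, move_cursor) -> bool:
        """Forward motion to the cursor only while the contact point
        remains inside the touchpad; motion outside its bounds is
        ignored until the finger returns to the touchpad area."""
        if TOUCHPAD.contains(px, py):
            move_cursor(px - TOUCHPAD.x, py - TOUCHPAD.y)
            return True
        return False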
A set button 106 may be disposed on the interface 100 proximate to the touchpad 104. The set button 106 may be used in conjunction with the touchpad 104 to interact with and/or control the ultrasound system. For example, a user may utilize the touchpad 104 to position a cursor over a particular area of the interface 100 and utilize the set button 106 to perform a certain function involving the area (e.g., selecting a particular function button and/or menu, placing a particular annotation and/or measurement marker, etc.). Alternatively, or in addition, a user may utilize the touchpad 104 to both position a cursor and to perform a certain function involving the cursor. For example, a user may utilize the touchpad 104 to position a cursor over a particular area of the interface 100 and also utilize the touchpad 104 (e.g., by tapping the touchpad twice or the like) to perform a certain function involving the area.
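The "tap twice" selection described above might be detected as in the following non-limiting sketch; the time and distance thresholds are illustrative assumptions:

    import time

    DOUBLE_TAP_WINDOW = 0.35   # seconds between taps (illustrative)
    DOUBLE_TAP_RADIUS = 20.0   # maximum drift in pixels (illustrative)

    class DoubleTapDetector:
        """Reports two taps close together in time and space, which the
        interface could treat like a press of the set button 106."""
        def __init__(self):
            self._last_tap = None  # (timestamp, x, y)

        def on_tap(self, x: float, y: float) -> bool:
            now = time.monotonic()
            if self._last_tap is not None:
                t, lx, ly = self._last_tap
                close_in_time = now - t <= DOUBLE_TAP_WINDOW
                close_in_space = ((x - lx) ** 2 + (y - ly) ** 2
                                  <= DOUBLE_TAP_RADIUS ** 2)
                if close_in_time and close_in_space:
                    self._last_tap = None
                    return True  # double tap: perform the "set" action
            self._last_tap = (now, x, y)
            return False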
When interacting with the primary imaging area 102 and/or other areas displayed on the interface 100 using the touchpad 104, the user may utilize one or more functional tools. For example, a user may utilize the touchpad 104 to operate one or more marker tools, measurement tools, annotation tools, region of interest tools, and/or any other functional tools while interacting with the primary imaging area 102. Certain exemplary functional tools are described in more detail below.
In some embodiments, interacting with the touch screen panel via the touchpad 104 and/or one or more function menus and buttons may help to keep the primary imaging area 102 substantially free of fingerprints, smudges, and/or any other materials deposited by a user's fingers and hands. Interacting with the discrete touchpad 104 may also allow the user to interact with the primary imaging area 102 with a high degree of precision and without obscuring the primary imaging area 102. Further, utilizing a touch screen panel system may reduce mechanical malfunctions due to broken moving parts and may reduce the areas where contaminants may be deposited, thereby preserving the cleanliness of medical examination, operating, and/or hospital rooms.
The interface 100 may include one or more system status indicators 108. In certain embodiments, the system status indicators 108 may include a power status indicator, a system configuration indicator, a network connectivity indicator, and/or any other type of system status indicator. The power status indicator may indicate whether the ultrasound system is coupled to AC power or, alternatively, powered by a battery. The system configuration indicator may indicate the status of certain system configurations. The network connectivity indicator may indicate the network connectivity status of the ultrasound system (e.g., connected via Wi-Fi). In certain embodiments, a user may access system status indicator sub-menus by touching any of the system status indicators 108 on the interface 100. For example, a user may touch the system configuration indicator and be presented with a sub-menu allowing the user to modify the configuration of the ultrasound system. Similarly, a user may touch the network connectivity indicator and be presented with a sub-menu allowing the user to view and/or modify the network connectivity of the ultrasound system.
The interface 100 may also display examination and probe type indicators 110. The examination indicator may indicate a type of examination being performed using the ultrasound system. For example, as illustrated, the examination indicator may indicate that the ultrasound system is being used to perform an abdominal examination. The probe type indicator may indicate a type of probe being used with the ultrasound system. In certain embodiments, a user may adjust the examination and/or probe type indicators 110 by touching the examination and/or probe type indicators 110 on the interface 100 and selecting an examination and/or probe type from a sub-menu displayed in response to the user's touch. In further embodiments, the ultrasound system may automatically detect an examination and/or probe type, and update the examination and probe type indicators 110 accordingly.
The interface 100 may further display patient identification information 112. In some embodiments, the patient identification information 112 may comprise a patient's name, gender, assigned identification number, and/or any other information that may be used to identify the patient. A user may adjust the patient identification information 112 by touching the patient identification information 112 on the interface 100 and entering appropriate patient identification information 112 into a sub-menu displayed in response to the user's touch. In certain embodiments, the patient identification information may be utilized to identify and access certain images captured by the ultrasound system.
A date and time indication 114 may further be displayed on the interface 100. In certain embodiments, the date and time indication 114 may be utilized to identify and access certain images captured by the ultrasound system (e.g., time-stamped images). A user may adjust the date and time information displayed in the date and time indication 114 by touching the date and time indication 114 on the interface 100 and entering appropriate date and time information into a sub-menu displayed in response to the user's touch.
Display scaling information 116 may be displayed on the interface 100 that provides information useful in viewing and/or interpreting ultrasound images displayed in the primary imaging area 102. For example, when ultrasound images displayed in the primary imaging area 102 are displayed in a grey scale format, the display scaling information 116 may provide an indication as to relative measurement degrees represented by each shade in the grey scale format. In embodiments where images displayed in the primary imaging area 102 are displayed in a color format, the display scaling information 116 may provide an indication as to relative measurement degrees represented by each color in the color format. In certain embodiments, a user may adjust the display format of the images displayed in the primary imaging area 102 by touching the display scaling information 116 on the interface 100 and selecting an appropriate display format in a sub-menu displayed in response to the user's touch.
The interface 100 may further display measurement parameter information 118. In certain embodiments, the measurement parameter information 118 may display measurement parameters associated with ultrasound images displayed in the primary imaging area 102. In some embodiments, the measurement parameter information 118 may be updated in real time or near real time with updates to the ultrasound images displayed in the primary imaging area 102. As illustrated, the measurement parameter information 118 may include an indication of acoustic power (“AP”), an indication of mechanical index (“MI”), an indication of the soft tissue thermal index (“TIS”), an indication of gain, an indication of frequency, and/or any other relevant measurement parameter information.
Primary imaging area scale information 120 may be displayed on the interface 100 proximate to the primary imaging area 102. In certain embodiments, the primary imaging area scale information 120 may display a measurement scale that may assist a user in interpreting ultrasound images displayed in the primary imaging area 102. For example, using the primary imaging area scale information 120, a user may be able to determine a relative distance between two or more points included in an ultrasound image displayed in the primary imaging area 102. In certain embodiments, a user may adjust the relative scaling of the primary imaging area scale information 120 and/or the primary imaging area 102 by touching the primary imaging area scale information 120 on the interface 100 and selecting an appropriate relative scaling in a sub-menu displayed in response to the user's touch.
The interface 100 may include one or more top-level function menus 122. The top-level function menus 122 may provide one or more menu buttons defining one or more top-level functions a user may utilize to interact with and/or control the ultrasound imaging system. For example, as illustrated, the top-level function menus 122 may include a patient information menu button, an examination type menu button, a measure menu button, an annotate menu button, a review menu button, and/or menu buttons corresponding to any other type of top-level functions a user may wish to utilize.
In response to a user touching the patient information menu button, the user may be presented with a menu showing relevant patient information including, for example, patient identification information. Other relevant patient information may include patient history information, diagnosis information, and/or the like. In the patient information menu, the user may enter and/or adjust patient information as required. In response to the user touching the examination type menu button, the user may be presented with a menu relating to the particular examination type. In this menu, the user may enter and/or adjust examination type information. In certain embodiments, adjusting examination type information may result in a corresponding adjustment of operating parameters and/or settings for the ultrasound imaging system to optimize system performance for a particular examination type.
In response to a user touching the review menu button, the user may be presented with a menu allowing the user to review, organize, and/or interact with previously captured images. In certain embodiments, these previously captured images may be still ultrasound images. In further embodiments, these previously captured images may be moving ultrasound images. In response to touching the measure menu button, the user may be presented with a menu related to certain measurement functions, described in more detail below. Similarly, in response to touching the annotate menu button, the user may be presented with a menu relating to certain annotation functions, also described in more detail below.
After touching one of the top-level function menus 122, a user may be presented with a sub-menu that, in certain embodiments, may include one or more sub-level function menus 124. In certain embodiments, the one or more sub-level function menus 124 may relate to one or more sub-level functions associated with a selected top-level function menu 122. For example, as illustrated, when a user touches the measure menu button, a sub-menu that includes a library sub-level function menu and a caliper sub-level function menu may be presented. In certain embodiments, the library sub-level function menu may include one or more predefined measurement functional tools that a user may utilize to interact with and/or interpret images displayed in the primary imaging area 102.
In certain embodiments, after touching one of the sub-level function menus 124, the user may be presented with one or more associated function buttons 126 allowing the user to perform certain functions associated with the function buttons 126. For example, as illustrated, when a user touches the caliper sub-level function menu, associated function buttons 126 including a zoom button, an edit button, a delete button, a delete all button, a linear button, a trace button, and/or any other related function button may be presented. When the zoom button is touched, a user may perform zooming operations on the images displayed in the primary imaging area 102. In certain embodiments, zooming operations may be performed using the touchpad 104. For example, a user may utilize a “spread” gesture (i.e., drawing two fingers on the touchpad 104 apart) to perform a zooming operation on an image displayed in the primary imaging area 102. Any other suitable gesture using one or more contact points on the touchpad 104 may also be utilized to perform zooming operations.
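A non-limiting sketch of how a two-finger spread might be translated into a zoom factor follows; the function and its coordinate arguments are illustrative assumptions:

    import math

    def zoom_factor(p1_start, p2_start, p1_end, p2_end) -> float:
        """Ratio of the final finger separation to the initial
        separation: above 1.0 for a "spread" (zoom in), below 1.0
        for a "pinch" (zoom out)."""
        d0 = math.dist(p1_start, p2_start)
        d1 = math.dist(p1_end, p2_end)
        return d1 / d0 if d0 > 0 else 1.0

    # Example: fingers move from 80 px apart to 160 px apart -> 2.0.
    assert zoom_factor((0, 0), (80, 0), (-40, 0), (120, 0)) == 2.0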
When the linear button is touched, a user may be presented with a cursor that may be used to perform linear measurement of the image displayed in the primary imaging area 102. Similarly, when the trace button is touched, a user may be presented with a tracing cursor for performing a multi-segment measurement of the image displayed in the primary imaging area 102. If a user wishes to change certain markers utilized in measurements, the user may touch the edit button, thereby allowing the user to reposition the markers relative to the image displayed in the primary imaging area 102 using, for example, the touchpad 104. If a user wishes to delete a particular marker utilized in measurements, the user may touch the delete button, thereby allowing the user to delete the particular marker using, in some instances, the touchpad 104. Similarly, if a user wishes to delete all markers utilized in measurements, the user may touch the delete all button.
Depending on the selected top-level function menu 122, the touchpad 104 may be displayed as part of the sub-menu associated with the top-level function menu 122. For example, as illustrated in
The interface 100 may further include one or more image capture buttons 130 that may be utilized to capture certain still and/or moving images displayed in the primary imaging area 102. As illustrated, the one or more image capture buttons 130 may include a print button, a save button, and a freeze button. Touching the print button may print a copy of one or more images displayed in the primary imaging area 102. In certain embodiments, touching the print button may open a print sub-menu that the user may utilize to control printer settings and print a copy of the one or more images. Touching the save button may save a copy of one or more moving and/or still images displayed in the primary imaging area 102. In certain embodiments, touching the save button may open up a save sub-menu that the user may utilize to control image saving properties. Touching the freeze button may cause a certain still image or frame of a moving image displayed in the primary imaging area 102 to freeze, thereby allowing a user to study the frozen image in more detail.
One or more display function buttons 132 may be included on the interface 100. For example, as illustrated, an adjust image button, a quick function button, a depth function button, a gain button, and/or a mode button may be included on the interface. Touching the adjust image button may open up a menu allowing the user to make one or more adjustments to images displayed in the primary imaging area 102. Touching the quick function button may open up a menu allowing the user to select one or more functions and/or operations that may be used in controlling, viewing, and/or interpreting images displayed in the primary imaging area 102. Touching the depth button may allow a user to adjust a depth of view within a 3-dimensional image displayed in the primary imaging area 102. For example, in certain embodiments a “pinch” gesture using two fingers on the touchpad 104 may adjust a depth of view within a 3-dimensional medical image displayed in the primary imaging area 102. Touching the gain button may open up a menu that allows a user to adjust a gain of the ultrasound imaging system. Finally, touching the mode button may open up a menu that allows a user to adjust an operating mode of the ultrasound imaging system.
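The depth adjustment gesture described above might map the change in finger separation to a change in displayed depth, as in the following non-limiting sketch; the scaling constant, depth limits, and gesture direction are illustrative assumptions:

    import math

    CM_PER_PIXEL = 0.02                     # depth change per pixel (assumed)
    MIN_DEPTH_CM, MAX_DEPTH_CM = 2.0, 30.0  # clamping range (assumed)

    def adjust_depth(depth_cm, p1_start, p2_start, p1_end, p2_end):
        """Map a two-finger gesture on the touchpad to a new depth of
        view: here a pinch (fingers closing) reduces the displayed
        depth and a spread increases it."""
        delta_px = math.dist(p1_end, p2_end) - math.dist(p1_start, p2_start)
        new_depth = depth_cm + delta_px * CM_PER_PIXEL
        return max(MIN_DEPTH_CM, min(MAX_DEPTH_CM, new_depth))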
In certain embodiments, a user may wish to prevent inadvertent input from being provided to the interface 100. Accordingly, a user may touch a screen lock button 134 configured to cause the interface 100 to lock, thereby preventing a user from providing input by inadvertently touching the interface 100. If a user wishes to restore functionality to the interface 100, the user may touch the screen lock button again, thereby unlocking the interface 100.
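A minimal, non-limiting sketch of the lock behavior follows; the class and method names are assumptions introduced for illustration:

    class LockableInterface:
        """While locked, all touch input other than the screen lock
        button itself is discarded."""
        def __init__(self):
            self.locked = False

        def on_lock_button(self):
            self.locked = not self.locked  # same button locks and unlocks

        def dispatch(self, event, handler):
            if self.locked:
                return  # inadvertent touches are ignored while locked
            handler(event)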
It will be appreciated that a number of variations can be made to the architecture, relationships, and functions presented in connection with
In certain embodiments, the cursor 200 may be utilized in certain annotation functions and/or operations associated with the aforementioned annotate menu button of the top-level function menus 122. As illustrated, the annotate menu button may be associated with one or more function buttons 126 including a comment button, an arrow button, a delete button, and an edit button. When a user 202 touches the comment button, a menu may be displayed that allows the user 202 to enter a comment associated with the image displayed in the primary imaging area 102. In certain embodiments, the menu may include a touch screen keyboard allowing the user 202 to enter the comment. The comment may be associated with a particular portion of the image displayed in the primary imaging area 102 or, alternatively, the entire image. In embodiments where the comment is associated with a portion of the image, a flag, cross, arrow, or similar annotation may be placed on the particular portion of the image. In embodiments where the comment is associated with the entire image, an indication that there is a comment associated with the image may be displayed on the interface 100. Further, the comment and/or any other annotations disclosed herein may be included in any saved copy of the image.
When a user 202 touches the arrow button, the user 202 may annotate the image displayed in the primary imaging area 102 by placing an arrow or other marker over the image. For example, after touching the arrow button, the user 202 may position an arrow over the image displayed in the primary imaging area 102 by touching the primary imaging area 102 and/or by utilizing the touchpad 104. After positioning the arrow in a desired location, the user 202 may place the arrow over the image by touching the set button 106 and/or touching the primary imaging area 102 in a manner that places the arrow in the particular location (e.g., double tapping the primary imaging area 102 at the desired location).
When a user 202 touches the delete button, the user 202 may position the cursor 200 over an annotation or comment made in the primary imaging area 102 by touching the primary imaging area 102 at the annotation or comment and/or by utilizing the touchpad 104. The user 202 may delete the annotation by either touching the set button 106 or by touching the primary imaging area 102 in a manner that deletes the annotation (e.g., double tapping the primary imaging area 102 at the location of the annotation).
When a user 202 touches the edit button, the user 202 may position the cursor 200 over an annotation or comment made in the primary imaging area 102 by touching the primary imaging area 102 at the annotation or comment and/or by utilizing the touchpad 104. The user 202 may then select the annotation or comment for editing by either touching the set button 106 to open up an editing menu or by touching the primary imaging area 102 in a manner that opens up an editing menu for the selected annotation or comment. In certain embodiments, the editing menu may include a touch screen keyboard allowing the user 202 to edit the comment and/or annotation as desired.
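The place, delete, and edit operations described above might be backed by a structure such as the following non-limiting sketch; the AnnotationLayer class, its hit-test radius, and its method names are assumptions introduced for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class Annotation:
        x: float
        y: float
        text: str

    @dataclass
    class AnnotationLayer:
        """Holds annotations placed over the primary imaging area and
        finds the one nearest the cursor for delete/edit operations."""
        items: list = field(default_factory=list)
        hit_radius: float = 24.0  # selection tolerance in pixels (assumed)

        def place(self, x, y, text):
            self.items.append(Annotation(x, y, text))

        def _nearest(self, x, y):
            best, best_d2 = None, self.hit_radius ** 2
            for a in self.items:
                d2 = (a.x - x) ** 2 + (a.y - y) ** 2
                if d2 <= best_d2:
                    best, best_d2 = a, d2
            return best

        def delete_at(self, x, y) -> bool:
            a = self._nearest(x, y)
            if a is not None:
                self.items.remove(a)
            return a is not None

        def edit_at(self, x, y, new_text) -> bool:
            a = self._nearest(x, y)
            if a is not None:
                a.text = new_text
            return a is not None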
A menu button may be provided for certain common functions and/or annotation operations that, in certain embodiments, may be dependent on a selected examination type. For example, as illustrated, marking an area of the image displayed in the primary imaging area 102 for a future biopsy may be common. Accordingly, a menu button for a biopsy annotation may be displayed in the interface 100, thereby streamlining the ability of a user 202 to make such an annotation.
A user 202 may touch the interface 100 at the touch area 302, which in certain embodiments, may be positioned anywhere on the interface 100. At a particular distance and orientation from the touch area 302, an off-set cursor 300 may appear. When the user 202 moves the position of where they are touching the interface 100 (i.e., the touch area 302), their movements may be translated into a corresponding movement in the off-set cursor 300. In this manner, a user 202 may precisely move the off-set cursor 300 as desired while maintaining a clear view of the interface 100 and/or primary imaging area 102.
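A non-limiting sketch of the off-set translation follows; the offset distance and direction are illustrative assumptions:

    # Fixed displacement between the touch area and the off-set cursor;
    # 120 px directly above the finger is an assumed value.
    OFFSET_DX, OFFSET_DY = 0.0, -120.0

    def offset_cursor_position(touch_x: float, touch_y: float):
        """The cursor tracks the finger one-to-one but is displaced so
        the finger never covers what the cursor points at."""
        return touch_x + OFFSET_DX, touch_y + OFFSET_DY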
As illustrated, in some embodiments, a line (e.g., a dotted line) may be displayed between the touch area 302 and the off-set cursor 300, thereby aiding a user in identifying the relative position of the off-set cursor 300 with respect to the touch area 302. Moreover, a user may utilize the touch area 302 to interact with the interface 100 using single-point touch screen commands. Further, in certain embodiments, a user may utilize a plurality of touch areas 302 and/or off-set cursors 300 to interact with the interface 100 using any number of multi-point gesture commands. For example, a user may zoom into an image displayed in the primary imaging area 102 defined by two off-set cursors 300 by moving the two respective touch areas 302 associated with the off-set cursors 300 apart in a “spread” gesture. The touch area 302 may be similarly utilized to select an item displayed on the interface under an off-set cursor (e.g., by tapping the touch area 302 twice or the like).
To define a region of interest 600, the user 202 may touch the touchpad 104 at a plurality of contact points. For example, as illustrated, the user 202 may touch the touchpad 104 at two contact points. The user 202 may then define a region of interest 600 by utilizing a “spread” gesture on the touchpad 104 (i.e., by drawing two fingers on the touchpad 104 apart to points “A” and “B” as illustrated). In embodiments where two contact points are utilized, the region of interest 600 may be defined by a square or rectangle having opposing corners at the two contact points. Any other suitable number of contact points, region of interest shapes, and/or gestures may also be utilized to define a region of interest 600.
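In the two-contact-point case, the region of interest might be derived as in the following non-limiting sketch; the function name and the (left, top, width, height) return convention are assumptions:

    def region_of_interest(p1, p2):
        """Rectangle whose opposing corners are the two contact
        points, regardless of which corner is touched first."""
        (x1, y1), (x2, y2) = p1, p2
        return (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))

    # Example: spreading two fingers to points "A" and "B".
    assert region_of_interest((40, 100), (10, 20)) == (10, 20, 30, 80)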
In certain embodiments, a user 202 may measure images displayed in the primary imaging area 102 by defining one or more measurement marker points within the displayed images. For example, as illustrated, a user 202 may define a first measurement marker point “C” within the primary imaging area 102. In certain embodiments, the first measurement marker point may be defined by positioning the measurement marker point “C” in a particular location in the primary imaging area 102 using the touchpad 104 and/or by touching the primary imaging area 102 directly. The user 202 may place the measurement marker point “C” by touching the set button 106 and/or by using an appropriate gesture (e.g., a double tap at the location) on the primary imaging area 102. The user 202 may then define a second measurement marker point “D” within the primary imaging area 102 by positioning the measurement marker point “D” in a particular location in the primary imaging area 102 using the touchpad 104 and/or by touching the primary imaging area 102 directly. The user 202 may place the measurement marker point “D” by touching the set button 106 and/or by using an appropriate gesture on the primary imaging area 102. The interface 100 may then display a measurement “E” indicating the relative distance between the measurement marker point “C” and measurement marker point “D.”
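The measurement “E” might be computed from the two marker points and the current display scale as in the following non-limiting sketch; the cm_per_pixel scale argument is an illustrative assumption:

    import math

    def linear_measurement(point_c, point_d, cm_per_pixel: float) -> float:
        """Distance between marker points "C" and "D", converted from
        screen pixels to centimeters using the display scale."""
        return math.dist(point_c, point_d) * cm_per_pixel

    # Example: markers 300 px apart at 0.01 cm/px -> E = 3.0 cm.
    print(f"E = {linear_measurement((100, 200), (400, 200), 0.01):.1f} cm")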
In some embodiments, the multi-segment trace path may be used for measurement purposes. For example, a measurement length of the multi-segment trace path may be displayed in the interface. In further embodiments, the multi-segment trace path may be utilized in zooming operations, in annotation operations, and/or the like.
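The measurement length of a multi-segment trace path might be computed as the sum of its segment lengths, as in the following non-limiting sketch; the cm_per_pixel scale argument is again an illustrative assumption:

    import math

    def trace_length(points, cm_per_pixel: float) -> float:
        """Total length of a multi-segment trace path: the sum of the
        lengths of consecutive segments, scaled to physical units."""
        return sum(math.dist(a, b)
                   for a, b in zip(points, points[1:])) * cm_per_pixel

    # Example: an L-shaped trace of 30 px + 40 px at 0.01 cm/px -> 0.7 cm.
    assert abs(trace_length([(0, 0), (30, 0), (30, 40)], 0.01) - 0.7) < 1e-9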
The system 900 may include a processor 902, a random access memory (“RAM”) 904, a communications interface 906, a touch screen panel interface 908, other user interfaces 914, and/or a non-transitory computer-readable storage medium 910. The processor 902, RAM 904, communications interface 906, touch screen panel interface 908, other user interfaces 914, and computer-readable storage medium 910 may be communicatively coupled to each other via a common data bus 912. In some embodiments, the various components of the system 900 may be implemented using hardware, software, firmware, and/or any combination thereof.
The touch screen panel interface 908 may be used to display an interactive interface to a user such as, for example, the interface 100 described in reference to and illustrated in
The processor 902 may include one or more general purpose processors, application specific processors, microcontrollers, digital signal processors, FPGAs, or any other customizable or programmable processing device. The processor 902 may be configured to execute computer-readable instructions stored on the non-transitory computer-readable storage medium 910. In some embodiments, the computer-readable instructions may be computer executable functional modules. For example, the computer-readable instructions may include one or more functional modules configured to implement all or part of the functionality of the systems, methods, and interfaces described above in reference to
Some of the infrastructure that can be used with embodiments disclosed herein is already available, such as general-purpose computers, ultrasound imaging systems, touch screen panels, computer programming tools and techniques, digital storage media, and communications networks. A computing device may include a processor such as a microprocessor, microcontroller, logic circuitry, or the like. The processor may include a special purpose processing device such as an ASIC, PAL, PLA, PLD, FPGA, or other customized or programmable device. The computing device may also include a computer-readable storage device such as non-volatile memory, static RAM, dynamic RAM, ROM, CD-ROM, magnetic or optical disk, tape, flash memory, or other computer-readable storage medium.
Various aspects of certain embodiments may be implemented using hardware, software, firmware, or a combination thereof. As used herein, a software module or component may include any type of computer instruction or computer executable code located within or on a non-transitory computer-readable storage medium. A software module may, for instance, comprise one or more physical or logical blocks of computer instructions, which may be organized as a routine, program, object, component, data structure, etc., that performs one or more tasks or implements particular abstract data types.
In certain embodiments, a particular software module may comprise disparate instructions stored in different locations of a computer-readable storage medium, which together implement the described functionality of the module. Indeed, a module may comprise a single instruction or many instructions, and may be distributed over several different code segments, among different programs, and across several computer-readable storage media. Some embodiments may be practiced in a distributed computing environment where tasks are performed by a remote processing device linked through a communications network.
The systems and methods disclosed herein are not inherently related to any particular computer or other apparatus and may be implemented by a suitable combination of hardware, software, and/or firmware. Software implementations may include one or more computer programs comprising executable code/instructions that, when executed by a processor, may cause the processor to perform a method defined at least in part by the executable instructions. The computer program can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a standalone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. Further, a computer program can be deployed to be executed on one computer or on multiple computers at one site, or distributed across multiple sites and interconnected by a communication network. Software embodiments may be implemented as a computer program product that comprises a non-transitory storage medium configured to store computer programs and instructions that, when executed by a processor, are configured to cause the processor to perform a method according to the instructions. In certain embodiments, the non-transitory storage medium may take any form capable of storing processor-readable instructions. A non-transitory storage medium may be embodied by a compact disk, a digital video disk, a magnetic tape, a Bernoulli drive, a magnetic disk, a punch card, flash memory, integrated circuits, or any other non-transitory digital processing apparatus memory device.
Although the foregoing has been described in some detail for purposes of clarity, it will be apparent that certain changes and modifications may be made without departing from the principles thereof. It should be noted that there are many alternative ways of implementing both the processes and apparatuses described herein. Accordingly, the present embodiments are to be considered illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
The foregoing specification has been described with reference to various embodiments. However, one of ordinary skill in the art will appreciate that various modifications and changes can be made without departing from the scope of the present disclosure. For example, various operational steps, as well as components for carrying out operational steps, may be implemented in alternate ways depending upon the particular application or in consideration of any number of cost functions associated with the operation of the system. Accordingly, any one or more of the steps may be deleted, modified, or combined with other steps. Further, this disclosure is to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope thereof. Likewise, benefits, other advantages, and solutions to problems have been described above with regard to various embodiments. However, benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced, are not to be construed as a critical, a required, or an essential feature or element. As used herein, the terms “comprises,” “comprising,” and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, a method, a system, an article, or an apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Also, as used herein, the terms “coupled,” “coupling,” and any other variation thereof are intended to cover a physical connection, an electrical connection, a magnetic connection, an optical connection, a communicative connection, a functional connection, and/or any other connection.
Those having skill in the art will appreciate that many changes may be made to the details of the above-described embodiments without departing from the underlying principles of the invention. The scope of the present invention should, therefore, be determined only by the following claims.