The present disclosure generally relates to electronic devices having camera sensors and in particular to a method for fusing media captured by multiple camera sensors to create a composite media.
Modern image capturing devices, such as cellular phones, are equipped with cameras that can be used to capture images and/or video. These devices use one or more dedicated cameras within the device to focus on a scene and capture an image and/or video of the scene. However, movement during capture of the scene may cause the captured image/video to be blurry and/or out of focus.
The description of the illustrative embodiments is to be read in conjunction with the accompanying drawings. It will be appreciated that for simplicity and clarity of illustration, elements illustrated in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements are exaggerated relative to other elements. Embodiments incorporating teachings of the present disclosure are shown and described with respect to the figures presented herein, in which:
The illustrative embodiments provide a method, a system, and a computer program product for repositioning lenses of at least one camera of a plurality of cameras during capture of media and creating a fused media from media captured by the plurality of camera sensors. The method includes receiving, via at least one input device, a request to capture a media of a current scene. The method further includes capturing a primary media, via a primary camera sensor that includes an optical image stabilization (OIS) sensor, and simultaneously capturing at least one secondary media via at least one secondary camera sensor. The method further comprises, during capture of the primary media and the at least one secondary media, repositioning at least one lens of at least one of the primary camera sensor and the at least one secondary camera sensor to compensate for a detected movement of at least one of the primary camera sensor and the at least one secondary camera sensor. The method further includes automatically fusing the primary media and the at least one secondary media to create a fused media. The method further includes providing the fused media to at least one output device.
The above contains simplifications, generalizations and omissions of detail and is not intended as a comprehensive description of the claimed subject matter but, rather, is intended to provide a brief overview of some of the functionality associated therewith. Other systems, methods, functionality, features, and advantages of the claimed subject matter will be or will become apparent to one with skill in the art upon examination of the following figures and the remaining detailed written description. The above as well as additional objectives, features, and advantages of the present disclosure will become apparent in the following detailed description.
In the following description, specific example embodiments in which the disclosure may be practiced are described in sufficient detail to enable those skilled in the art to practice the disclosed embodiments. For example, specific details such as specific method orders, structures, elements, and connections have been presented herein. However, it is to be understood that the specific details presented need not be utilized to practice embodiments of the present disclosure. It is also to be understood that other embodiments may be utilized and that logical, architectural, programmatic, mechanical, electrical and other changes may be made without departing from the general scope of the disclosure. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present disclosure is defined by the appended claims and equivalents thereof.
References within the specification to “one embodiment,” “an embodiment,” “embodiments,” or “one or more embodiments” are intended to indicate that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearances of such phrases in various places within the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, various features are described which may be exhibited by some embodiments and not by others. Similarly, various aspects are described which may be aspects for some embodiments but not other embodiments.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Moreover, the use of the terms first, second, etc. do not denote any order or importance, but rather the terms first, second, etc. are used to distinguish one element from another.
It is understood that the use of specific component, device and/or parameter names and/or corresponding acronyms thereof, such as those of the executing utility, logic, and/or firmware described herein, are for example only and not meant to imply any limitations on the described embodiments. The embodiments may thus be described with different nomenclature and/or terminology utilized to describe the components, devices, parameters, methods and/or functions herein, without limitation. References to any specific protocol or proprietary name in describing one or more elements, features or concepts of the embodiments are provided solely as examples of one implementation, and such references do not limit the extension of the claimed embodiments to embodiments in which different element, feature, protocol, or concept names are utilized. Thus, each term utilized herein is to be provided its broadest interpretation given the context in which that term is utilized.
Those of ordinary skill in the art will appreciate that the hardware components and basic configuration depicted in the following figures may vary. For example, the illustrative components within image capturing device 100 are not intended to be exhaustive, but rather are representative to highlight components that can be utilized to implement the present disclosure. For example, other devices/components may be used in addition to, or in place of, the hardware depicted. The depicted example is not meant to imply architectural or other limitations with respect to the presently described embodiments and/or the general disclosure.
Within the descriptions of the different views of the figures, the use of the same reference numerals and/or symbols in different drawings indicates similar or identical items, and similar elements can be provided similar names and reference numerals throughout the figure(s). The specific identifiers/names and reference numerals assigned to the elements are provided solely to aid in the description and are not meant to imply any limitations (structural or functional or otherwise) on the described embodiments.
Now turning to
As shown, image capturing device 100 may include input devices and output devices that enable a user to interface with image capturing device 100. In the illustrated embodiment, image capturing device 100 includes at least two camera sensors 142a-n, camera flash(es) 146, display 145, hardware buttons 106a-n, microphone(s) 108, and speaker(s) 144. While two camera sensors (camera sensors 142a-n) are illustrated, image capturing device 100 may, in other embodiments, include additional camera sensors. Hardware buttons 106a-n are selectable buttons which are used to receive manual/tactile input from a user to control specific operations of image capturing device 100 and/or of applications executing thereon. In one embodiment, hardware buttons 106a-n may also include, or may be connected to, one or more sensors (e.g., a fingerprint scanner) and/or may be pressure sensitive. Hardware buttons 106a-n may also be directly associated with one or more functions of a graphical user interface (not pictured) and/or functions of an OS, application, or hardware of image capturing device 100. In one embodiment, hardware buttons 106a-n may include a keyboard. Microphone(s) 108 may be used to receive spoken input/commands from a user. Speaker(s) 144 is used to output audio.
CPU(s) 104 is also coupled to sensors 122a-n and display 145. Sensors 122a-n can include, but are not limited to, at least one of: infrared (IR) sensors, thermal sensors, light sensors, motion sensors and/or accelerometers, proximity sensors, and camera/image sensors. Display 145 is capable of displaying text, media content, and/or a graphical user interface (GUI) associated with or generated by firmware and/or one or more applications executing on image capturing device 100. The GUI can be rendered by CPU(s) 104 for viewing on display 145, in one embodiment, or can be rendered by a graphics processing unit (GPU), in another embodiment. In one embodiment, display 145 is a touch screen that is also capable of receiving touch/tactile input from a user of image capturing device 100, when the user is interfacing with a displayed GUI. In at least one embodiment, image capturing device 100 can include a plurality of virtual buttons or affordances that operate in addition to, or in lieu of, hardware buttons 106a-n. For example, image capturing device 100 can be equipped with a touch screen interface and provide, via a GUI, a virtual keyboard or other virtual icons for user interfacing therewith.
Image capturing device 100 also includes serial port 132 (e.g., a universal serial bus (USB) port), battery 134, and charging circuitry 136. Serial port 132 can operate as a charging port that receives power via an external charging device (not pictured) for charging battery 134 via charging circuitry 136. Battery 134 may include a single battery or multiple batteries for providing power to components of image capturing device 100. Serial port 132 may also function as one of an input port, an output port, and a combination input/output port. In one embodiment, battery 134 may include at least one battery that is removable and/or replaceable by an end user. In another embodiment, battery 134 may include at least one battery that is permanently secured within/to image capturing device 100.
Image capturing device 100 may also include one or more wireless radios 140a-n and can include one or more antenna(s) 148a-n that enable image capturing device 100 to wirelessly connect to, and transmit and receive voice and/or data communication to/from, one or more other devices, such as devices 152a-n and server 154. As a wireless device, image capturing device 100 can transmit data over a wireless network 150 (e.g., a Wi-Fi network, cellular network, Bluetooth® network (including Bluetooth® low energy (BLE) networks), a wireless ad hoc network (WANET), or personal area network (PAN)). In one embodiment, image capturing device 100 may be further equipped with an infrared (IR) device (not pictured) for communicating with other devices using an IR connection. In another embodiment, wireless radios 140a-n may include a short-range wireless device, including, but not limited to, a near field communication (NFC) device. In still another embodiment, image capturing device 100 may communicate with one or more other device(s) using a wired or wireless USB connection.
In the various embodiments described herein, at least one of camera sensors 142a-n includes an optical image stabilization (OIS) sensor 224a-n and is identified as a primary camera sensor (e.g., primary camera sensor 142a). In one embodiment, a particular camera sensor having an OIS sensor may be pre-identified as primary camera sensor 142a. In another embodiment, prior to capturing media 202a-n, CPU(s) 104 identifies a particular camera sensor having an OIS sensor 224a-n as primary camera sensor 142a. It should be noted that camera sensors 142a-n may include color camera sensors (e.g., Red Green Blue (RGB) and/or Bayer sensors), monochromatic camera sensors, or any combination thereof. For example, primary camera sensor 142a may be a monochromatic camera sensor and secondary camera sensor 142n may be a color camera sensor. In one embodiment, each of camera sensors 142a-n has a same pixel size (e.g., 13 megapixels). In another embodiment, camera sensors 142a-n have different pixel sizes. For example, primary camera sensor 142a has a pixel size of 13 megapixels and secondary camera sensor 142n has a pixel size of 8 megapixels. In one embodiment, each of camera sensors 142a-n has a same lens angle (e.g., 55 millimeters). In another embodiment, camera sensors 142a-n have different lens angles. For example, primary camera sensor 142a has a lens angle of 55 millimeters and secondary camera sensor 142n has a lens angle of 24 millimeters. Additionally, while two camera sensors (primary camera sensor 142a and secondary camera sensor 142n) are illustrated in
OIS sensors 224a-n include one or more sensors (e.g., gyroscopes) that detect a movement of a corresponding camera sensor. OIS sensors 224a-n also include a plurality of actuators/motors that are used to manipulate a position of a lens of a corresponding camera sensor based on the detected movement. In one or more embodiments, OIS sensors 224a-n may directly manipulate an X, Y, and/or Z-axis position of a lens of a corresponding camera sensor during capture of a media, based on a detected movement of the camera sensor. The manipulation of an X, Y, and/or Z-axis position of a lens is also referred to herein as self-correction. By manipulating a position of a lens based on a detected movement, OIS sensors 224a-n ensure the lens is properly aligned with an imaging sensor of the camera sensor so that light passing through the lens is properly projected and/or focused on the imaging sensor. This ensures that a clear and focused media is captured by the camera sensor. In one or more embodiments, at least one of OIS sensors 224a-n does not self-correct based on a detected movement, but instead provides the detected movement to transformation module 214 as movement data 204a-n. Movement data 204a-n identifies a movement of a corresponding camera sensor in each of the X, Y, and Z axes during recording of media. In response to transmitting movement data 204a-n to transformation module 214, OIS sensors 224a-n may subsequently receive, from transformation module 214, correction ratio(s) 208a-n, which identify corrections to apply to a position of a lens of a corresponding camera sensor in each of the X, Y, and Z directions, as described in greater detail below.
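As a hedged illustration of the two OIS behaviors described above (self-correction versus deferring to transformation module 214), consider the following sketch; the class and method names are invented for illustration and are not taken from any actual device firmware:

```python
class OISSensor:
    """Hypothetical model of an OIS sensor 224; illustration only."""

    def __init__(self, transformation_module=None):
        # With no transformation module attached, the sensor
        # self-corrects; otherwise it defers to correction ratios.
        self.transformation_module = transformation_module

    def on_movement(self, dx, dy, dz):
        """Handle gyroscope-detected movement on the X, Y, and Z axes."""
        if self.transformation_module is None:
            # Self-correction: directly counter the detected movement.
            self.move_lens(-dx, -dy, -dz)
        else:
            # Forward movement data (204) and apply the correction
            # ratio (208) returned by the transformation module (214).
            cx, cy, cz = self.transformation_module.correction_for(
                (dx, dy, dz))
            self.move_lens(cx, cy, cz)

    def move_lens(self, x, y, z):
        ...  # drive the OIS actuators (hardware-specific)
```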
CPU(s) 104 receives, from at least one of input devices 216a-n, request 218 to capture media 202a-n including images and/or video. Input devices 216a-n may include, but are not limited to, hardware buttons (e.g., buttons 106a-n), microphones (e.g., microphone 108), and touch screen displays. For example, in response to detecting depression/selection of a shutter button (e.g., input device 216a) of image capturing device 100, request 218 may be generated and automatically transmitted to CPU(s) 104. In another embodiment, request 218 may be received from a software application executing on CPU(s) 104. In still another embodiment, request 218 may be received from another device (e.g., device 152a) that is communicatively coupled to image capturing device 100.
In response to receiving request 218, CPU(s) 104 automatically initializes a capture of media 202a-n by camera sensors 142a-n. During capture of media 202a-n by camera sensors 142a-n, transformation module 214 may receive movement data 204a-n from select camera sensors 142a-n that are equipped with an OIS sensor (e.g., OIS sensors 224a-n). Transformation module 214 calculates correction ratio(s) 208a-n for at least one of camera sensors 142a-n based on movement data 204a-n and calibration data 212a-n. Calibration data 212a-n specifies a distance between primary camera sensor 142a and secondary camera sensor(s) 142n. Calibration data 212a-n may further include geometry data that identifies an angle, alignment, and/or sensor flex of a corresponding camera sensor 142a-n relative to a chassis of image capturing device 100 and/or a particular reference point (e.g., a center point or another camera) on image capturing device 100. In one embodiment, calibration data 212a-n for camera sensors 142a-n is stored in memory (e.g., memory 110) that is accessible to transformation module 214. In another embodiment, calibration data 212a-n of camera sensors 142a-n is stored within a read-only memory (e.g., an electrically erasable programmable read-only memory (EEPROM)) at each camera sensor 142a-n that is accessible to transformation module 214. Correction ratios 208a-n identify corrections to apply to a position of a lens of a corresponding camera sensor 142a-n in each of the X, Y, and Z directions to counteract a detected movement of the corresponding camera sensor 142a-n during capture of media 202a-n. That is, correction ratios 208a-n, when applied to at least one camera sensor of a plurality of camera sensors, correct a pitch, roll, and yaw of a lens of the at least one camera sensor based on (1) a movement of at least one of the plurality of camera sensors and (2) calibration data associated with the plurality of camera sensors.
In one embodiment, in response to calculating correction ratio(s) 208a-n for at least one of camera sensors 142a-n, transformation module 214 directly applies correction ratio(s) 208a-n to the corresponding camera sensors 142a-n, as described in greater detail in
In response to completion of the capture of media 202a-n, CPU(s) 104 receives media 202a-n from camera sensors 142a-n and performs a fusion of the received media 202a-n to create a fused media 210 that corrects for the movement of camera sensors 142a-n. In one or more embodiments, CPU(s) 104 removes and/or corrects common artifacts in media 202a-n and generates a single optimized composite media (fused media 210) by fusing the corrected media 202a-n. For example, CPU(s) 104 may correct white balance and/or shading, reduce or eliminate camera sensor noise, and/or remove bad pixels in media 202a-n prior to fusing media 202a-n. In one embodiment, CPU(s) 104 performs a pre-processing on media 202a-n prior to fusing media 202a-n to create fused media 210. For example, prior to fusing media 202a-n, CPU(s) 104 analyzes conditions in media 202a-n and optimizes detail, sharpness, brightness, and/or light conditions in media 202a-n. In one or more embodiments, CPU(s) 104 analyzes a difference in point-of-view between media 202a-n. In fusing media 202a-n, CPU(s) 104 utilizes geometry data (not illustrated) within calibration data 212a-n to locate and associate the same objects within media 202a-n. In response to identifying the same objects within media 202a-n, CPU(s) 104 aligns media 202a-n based on the identified objects. CPU(s) 104 then combines/fuses media 202a-n to create fused media 210. It should be noted that when media 202a-n includes multiple frames (e.g., a burst image or video), the fusion of media 202a-n is performed for each corresponding frame captured by primary camera sensor 142a and secondary camera sensor 142n. Fused media 210 generated by CPU(s) 104 minimizes and/or eliminates adverse artifacts detected in media 202a-n and/or enhances image quality over the image quality of media 202a-n. In response to generating fused media 210, CPU(s) 104 provides fused media 210 to an output device (e.g., display 145), stores fused media 210 in a memory (e.g., memory 110), and/or provides fused media 210 to another device that is communicatively connected to image capturing device 100.
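The disclosure does not mandate a particular fusion algorithm; purely as a minimal sketch, assuming same-size frames held as NumPy arrays and an alignment that reduces to a fixed pixel offset derived from the calibration geometry (the function name and offset convention are hypothetical), the per-frame step might look like:

```python
import numpy as np

def fuse_frames(primary, secondary, offset):
    """Shift the secondary frame toward the primary frame's point of
    view, then average the two frames; a stand-in for the richer
    alignment and artifact correction described above."""
    dy, dx = offset  # pixel offset derived from geometry/calibration data
    aligned = np.roll(secondary, shift=(dy, dx), axis=(0, 1))
    fused = (primary.astype(np.float32) + aligned.astype(np.float32)) / 2
    return fused.astype(primary.dtype)

# For burst images or video, fusion runs once per corresponding frame
# pair captured by the primary and secondary camera sensors:
#   fused_media = [fuse_frames(p, s, offset)
#                  for p, s in zip(primary_frames, secondary_frames)]
```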
Referring now to
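The referenced equation does not survive in the text; a reconstruction consistent with the description immediately below, in which movement mean 302 is the average of the per-axis movement data reported by the primary camera (PC) and secondary camera (SC) sensors, is:

$$\text{movement mean } 302 = \frac{(X, Y, Z)_{PC} + (X, Y, Z)_{SC}}{n}$$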
It should be noted that the n in the denominator of the equation above represents the number of camera sensors 142a-n for which transformation module 214 has received movement data 204a-n. Additionally, n represents the number of individual X-Y-Z data sets added together in the numerator of the above equation. It should be noted that the PC movement and the SC movement in the above equation represent movement data 204a and movement data 204n, respectively. Thus, in the illustrated example of
In response to calculating movement mean 302, transformation module 214 calculates correction ratios 208a-n for each of camera sensors 142a-n by multiplying movement mean 302 by calibration data 212a-n. More precisely, in calculating a correction ratio (e.g., correction ratio 208a), transformation module 214 performs the below calculation:
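The calculation itself likewise does not survive in the text; based on the description that follows, one plausible reconstruction (the homogeneous padding of the position matrix is an assumption) is:

$$\text{correction ratio } 208a = \text{movement mean } 302 \times \begin{bmatrix} R \mid T \end{bmatrix} \begin{bmatrix} L_x \\ L_y \\ 0 \\ 1 \end{bmatrix}$$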
As shown in the calculation above, calibration data 212a-n includes, for a particular camera sensor (e.g., camera sensor 142a), a rotation and translation matrix which is multiplied by a position matrix. In the rotation and translation matrix, R represents a rotation matrix and T represents a translation vector. In the position matrix, Lx represents a distance on an x-axis between primary camera sensor 142a and secondary camera sensor 142n, and Ly represents a distance on a y-axis between primary camera sensor 142a and secondary camera sensor 142n.
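In code, the multiplication described above might be sketched as follows; this is an illustrative reconstruction, with array shapes and the homogeneous padding assumed rather than taken from the disclosure:

```python
import numpy as np

def correction_ratio(movement_mean, R, T, Lx, Ly):
    """Per-axis lens corrections (X, Y, Z) from movement mean 302 and
    calibration data 212 (rotation R, translation T, baseline Lx/Ly)."""
    RT = np.hstack([R, T.reshape(3, 1)])      # 3x4 [R | T] matrix
    position = np.array([Lx, Ly, 0.0, 1.0])   # homogeneous position matrix
    return movement_mean * (RT @ position)    # element-wise scaling

# Example with an identity rotation, zero translation, and a 10 mm
# horizontal baseline between the primary and secondary camera sensors:
ratio = correction_ratio(np.array([0.2, -0.1, 0.05]),
                         np.eye(3), np.zeros(3), Lx=10.0, Ly=0.0)
```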
In response to calculating correction ratios 208a-n for each of camera sensors 142a-n, transformation module 214 applies correction ratio 208a to primary camera sensor 142a and correction ratio 208n to secondary camera sensor 142n. By applying correction ratios 208a-n to camera sensors 142a-n, the positions of the lenses of camera sensors 142a-n are adjusted to compensate for a movement of primary camera sensor 142a and secondary camera sensor 142n. Primary camera sensor 142a and secondary camera sensor 142n thus capture media 202a and media 202n using the corrected lens positions provided by correction ratios 208a-n.
In response to completion of the capture of media 202a-n, CPU(s) 104 receives media 202a from primary camera sensor 142a and media 202n from secondary camera sensor 142n and fuses media 202a-n to generate a single optimized composite media (fused media 210). In response to generating fused media 210, CPU(s) 104 provides fused media 210 to at least one output device 222a-n. Fused media 210 may also be stored to memory 110 and/or another storage that is accessible to image capturing device 100.
Referring now to
In response to calculating correction ratio 208n based on calibration data 212n and movement data 204n, transformation module 214 applies correction ratio 208n to only secondary camera sensor 142n. The application of correction ratio 208n to secondary camera sensor 142n corrects a position of a lens of secondary camera sensor 142n and compensates for a movement of secondary camera sensor 142n. Thus, primary camera sensor 142a captures media 202a in conjunction with any self-correction applied by OIS sensor 224a while secondary camera sensor 142n captures media 202n using the corrected lens position provided by correction ratio 208n.
In response to completion of the capture of media 202a-n, CPU(s) 104 receives media 202a from primary camera sensor 142a and media 202n from secondary camera sensor 142n. CPU(s) 104 fuses media 202a-n to create fused media 210, which is provided to at least one output device 222a-n. Fused media 210 may also be stored to memory 110 and/or another storage that is accessible to image capturing device 100.
Referring now to
In response to calculating correction ratio 208n, transformation module 214 applies correction ratio 208n to only secondary camera sensor 142n to correct a position of a lens of secondary camera sensor 142n based on a movement of primary camera sensor 142a identified within movement data 204a.
Primary camera sensor 142a captures media 202a in conjunction with any self-correction applied by OIS sensor 224a while secondary camera sensor 142n captures media 202n using the corrected lens position provided by correction ratio 208n. In response to completion of the capture of media 202a-n, CPU(s) 104 receives media 202a from primary camera sensor 142a and media 202n from secondary camera sensor 142n. CPU(s) 104 fuses media 202a-n to create fused media 210, which is provided to at least one output device 222a-n. Fused media 210 may also be stored to memory 110 and/or another storage that is accessible to image capturing device 100.
Referring now to
OIS sensor 224a self-corrects a position of a lens of primary camera sensor 142a. Secondary camera sensor 142n does not include an OIS sensor and thus cannot generate secondary movement data 204n. As illustrated, primary camera sensor 142a is directly connected to secondary camera sensor 142n. During capture of media 202a by primary camera sensor 142a, secondary camera sensor 142n receives primary movement data 204a from primary camera sensor 142a in real time and automatically routes primary movement data 204a to transformation module 214.
In response to receiving movement data 204a from secondary camera sensor 142n, transformation module 214 calculates correction ratio 208n for secondary camera sensor 142n by multiplying calibration data 212n by movement data 204a, as shown in the equation below:
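The referenced equation does not survive in the text; consistent with the earlier calculation, a reconstruction (with calibration data 212n again expressed as a rotation and translation matrix applied to a position matrix, an assumed form) is:

$$\text{correction ratio } 208n = \begin{bmatrix} R \mid T \end{bmatrix} \begin{bmatrix} L_x \\ L_y \\ 0 \\ 1 \end{bmatrix} \times (X, Y, Z)_{PC}$$

where $(X, Y, Z)_{PC}$ denotes movement data 204a.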
In response to calculating correction ratio 208n, transformation module 214 applies correction ratio 208n to only secondary camera sensor 142n to correct a position of a lens of secondary camera sensor 142n based on a movement of primary camera sensor 142a identified within movement data 204a. Primary camera sensor 142a captures media 202a in conjunction with any self-correction applied by OIS sensor 224a while secondary camera sensor 142n captures media 202n using the corrected lens position provided by correction ratio 208n.
In response to completion of the capture of media 202a-n, CPU(s) 104 receives media 202a from primary camera sensor 142a and media 202n from secondary camera sensor 142n. CPU(s) 104 fuses media 202a-n to create fused media 210, which is provided to at least one output device 222a-n. Fused media 210 may also be stored to memory 110 and/or another storage that is accessible to image capturing device 100. It should also be noted that while transformation module 214 is depicted in
Referring now to
Method 700 commences at initiator block 701 and then proceeds to block 702. At block 702, CPU(s) 104 receives/detects request 218 to capture media of current scene 230. In response to receiving request 218, CPU(s) 104 initializes the capture of media 202a-n by camera sensors 142a-n (block 704). At block 706, a transformation module (e.g., transformation module 214) receives movement data (e.g., movement data 204a-n) from at least one of camera sensors 142a-n. At block 708, transformation module 214 calculates correction ratio(s) 208a-n based on received movement data 204a-n and calibration data 212a-n. At block 710, transformation module 214 provides/applies correction ratio(s) 208a-n to at least one of camera sensors 142a-n. CPU(s) 104 receives primary media 202a from primary camera sensor 142a (block 712) and contemporaneously receives secondary media 202n from secondary camera sensor(s) 142n (block 714). At block 716, CPU(s) 104 automatically fuses media 202a-n to create fused media 210. At block 718, fused media 210 is provided to at least one output device (e.g., display 145). Method 700 then terminates at end block 720.
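As a hedged, end-to-end illustration of this flow, the toy sketch below maps one operation per flowchart block; every class and value is invented for illustration, and the hardware interactions of blocks 702 through 710 are elided:

```python
import numpy as np

class ToyCamera:
    """Stand-in for a camera sensor that returns a fixed frame."""
    def __init__(self, frame):
        self.frame = frame

    def read(self):
        return self.frame  # blocks 712/714: receive captured media

def fuse(frames):
    """Block 716: average corresponding frames into a fused media."""
    return np.mean(np.stack(frames), axis=0).astype(frames[0].dtype)

primary_cam = ToyCamera(np.full((4, 4), 100, dtype=np.uint8))
secondary_cam = ToyCamera(np.full((4, 4), 120, dtype=np.uint8))

# Blocks 702-710 (request 218, capture start, movement data 204, and
# correction ratios 208) involve hardware and are elided here.
fused_media = fuse([primary_cam.read(), secondary_cam.read()])
print(fused_media)  # block 718: provide fused media to an output device
```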
The methods presented in
Referring now to
Referring now to
Referring now to
Referring now to
In the above-described flow charts, one or more of the method processes may be embodied in a computer readable device containing computer readable code such that a series of steps are performed when the computer readable code is executed on a computing device. In some implementations, certain steps of the methods are combined, performed simultaneously or in a different order, or perhaps omitted, without deviating from the scope of the disclosure. Thus, while the method steps are described and illustrated in a particular sequence, use of a specific sequence of steps is not meant to imply any limitations on the disclosure. Changes may be made with regard to the sequence of steps without departing from the spirit or scope of the present disclosure. Use of a particular sequence is therefore, not to be taken in a limiting sense, and the scope of the present disclosure is defined only by the appended claims.
Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language, without limitation. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine that performs the method for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. The methods are implemented when the instructions are executed via the processor of the computer or other programmable data processing apparatus.
As will be further appreciated, the processes in embodiments of the present disclosure may be implemented using any combination of software, firmware, or hardware. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment or an embodiment combining software (including firmware, resident software, micro-code, etc.) and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable storage device(s) having computer readable program code embodied thereon. Any combination of one or more computer readable storage device(s) may be utilized. The computer readable storage device may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage device can include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage device may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Where utilized herein, the terms “tangible” and “non-transitory” are intended to describe a computer-readable storage medium (or “memory”) excluding propagating electromagnetic signals; but are not intended to otherwise limit the type of physical computer-readable storage device that is encompassed by the phrase “computer-readable medium” or memory. For instance, the terms “non-transitory computer readable medium” or “tangible memory” are intended to encompass types of storage devices that do not necessarily store information permanently, including, for example, RAM. Program instructions and data stored on a tangible computer-accessible storage medium in non-transitory form may afterwards be transmitted by transmission media or signals such as electrical, electromagnetic, or digital signals, which may be conveyed via a communication medium such as a network and/or a wireless link.
While the disclosure has been described with reference to example embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the disclosure. In addition, many modifications may be made to adapt a particular system, device, or component thereof to the teachings of the disclosure without departing from the scope thereof. Therefore, it is intended that the disclosure not be limited to the particular embodiments disclosed for carrying out this disclosure, but that the disclosure will include all embodiments falling within the scope of the appended claims.
The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope of the disclosure. The described embodiments were chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.