A user may use a peripheral device to interface with a computing device. The computing device can control ambient lighting emitted from the peripheral device.
The accompanying drawings, which are incorporated in and form a part of this specification, illustrate examples of the disclosure and, together with the description, explain principles of the examples. In the drawings, like reference symbols and numerals indicate the same or similar components.
Embodiments of the disclosure are described in detail below with reference to the accompanying figures. Unless otherwise indicated, like parts and method steps are referred to with like reference numerals.
A peripheral device may electronically connect to a processing device so as to permit a user, when operating the peripheral device, to interact with the processing device. While connected to the peripheral device, the processing device may control the peripheral device in a manner that causes the peripheral device to create lighting effects. Lighting effects may include ambient lighting that the peripheral device emits for aesthetic purposes. Processing devices may restrict the types of lighting effects created by the peripheral device to a predetermined number of lighting effects, for example, as a result of a limited amount of lighting effects that can be produced by software of the processing device.
Described herein is a computing device that is electronically connectable to a peripheral device. The computing device may convert an image into a lighting control map and output the lighting control map to the peripheral device. The computing device may obtain the image for converting into the lighting control map from various sources, including displayed content, computer-generated content, or recorded content. In some examples, the computing device may convert the image into the lighting control map while the image is on the display screen. The image may be a still image. Likewise, the image may be an image frame of a video stream having a plurality of image frames. The peripheral device may create lighting effects by illuminating lights on the peripheral device according to the lighting control map so as to cause the lights on the peripheral device to irradiate in accordance with the image.
Accordingly, in some examples, systems, apparatuses, methods, and computer readable media storing instructions for execution are provided herein for a computing device that enables users to customize the lighting effects emitted from a peripheral device based on the images that may appear on the display screen of the computing device. This and other features described herein provide unique lighting features for users to further enhance their experience through peripheral lighting. For example, by controlling lighting effects on the peripheral device to track or mirror content of an image on a display, the system may provide a more immersive experience for a user. As another example, by controlling lighting effects on the peripheral device according to an image from various sources, the system provides a more customized experience. By outputting the lighting control map to the peripheral device, the computing device may control the peripheral device to create a wide variety of lighting effects.
The following describes technical solutions with reference to accompanying drawings. Example embodiments are described in detail with reference to the accompanying drawings. For the sake of clarity and conciseness, matters related to the present embodiments that are well known in the art have not been described.
The computing device controller 113 may control the computing device 11. The computing device controller 113 may be implemented as any suitable processing circuitry including, but not limited to at least one of a microcontroller, a microprocessor, a single processor, and a multiprocessor. The computing device controller 113 may include at least one of a video scaler integrated circuit (IC), an embedded controller (EC), a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), an application specific integrated circuit (ASIC), field programmable gate arrays (FPGA), or the like, and may have a plurality of processing cores.
Computing device memory 115 may be a non-transitory processor readable or computer readable storage medium. Computing device memory 115 may comprise read-only memory (“ROM”), random access memory (“RAM”), other non-transitory computer-readable media, or a combination thereof. In some examples, computing device memory 115 may store firmware. Computing device memory 115 may store software for the computing device 11. The software for the computing device 11 may include program code. The program code includes program instructions that are readable and executable by the computing device controller 113, also referred to as machine-readable instructions. Computing device memory 115 may store filters, rules, data, or a combination thereof.
As illustrated in
The peripheral device interface 131 may communicate by wire or wirelessly with the computing device interface 111 in the computing device 11 such that the computing device 11 and the peripheral device 13(X) are in electronic communication. The computing device interface 111 and the peripheral device interface 131 may employ communication protocols such as Universal Serial Bus (USB), USB-C, Bluetooth, infrared technology and/or other connectivity protocols. While the peripheral device 13(X) is in communication with the computing device 11, the peripheral device controller 133 may control the peripheral device interface 131 to exchange configuration information between the computing device interface 111 and the peripheral device interface 131.
In some examples, only one peripheral device 13(X) may be in electronic communication with the computing device interface 111. In other examples, the computing device interface 111 may be in electronic communication with any number of the peripheral devices 13(1)-13(Z).
The peripheral device controller 133 may control the peripheral device 13(X). The peripheral device controller 133 may include a central processing unit (CPU), a graphic processing unit (GPU), a microprocessor, an application specific integrated circuit (ASIC), field programmable gate arrays (FPGA), or the like, and may have a plurality of cores.
Peripheral device memory 135 may be a non-transitory processor readable or computer readable storage medium. Peripheral device memory 135 may comprise read-only memory (“ROM”), random access memory (“RAM”), other non-transitory computer-readable media, or a combination thereof. In some examples, peripheral device memory 135 may store firmware. Peripheral device memory 135 may store software for the peripheral device 13(X). The software for the peripheral device 13(X) may include program code. The program code includes program instructions that are readable and executable by the peripheral device controller 133. Peripheral device memory 135 may store filters, rules, data, or a combination thereof.
The power module 137 may supply power or electrical energy to the peripheral device interface 131, the peripheral device controller 133, the peripheral device memory 135, and the light array 139. The power module 137 may wirelessly receive the power from the computing device 11. The power module 137 may receive the power from the computing device 11 by a wired connection with the computing device 11. The power module 137 may receive the power from the computing device 11 through the peripheral device interface 131.
The power module 137 may include a battery 138. The battery 138 may be removable from the power module 137. The battery 138 may store power or electrical energy as potential energy when the power module 137 receives such power from the computing device 11. The battery 138 may be a rechargeable battery. As a rechargeable battery, the battery 138 may be repeatedly charged with the power when some or all of the potential energy stored in the battery 138 has been discharged from the battery 138.
The pixel array 119a of the display screen 119, when operating, may present an image for viewing. When the pixel array 119a presents the image for viewing, the display screen 119 may display the image. The image is viewable when the display screen 119 displays the image. The image may be a still image. The image may be a single image. Likewise, the image may be an image frame of a video stream having a plurality of image frames. The plurality of image frames of the video stream may be a sequence of images, or consecutive images, that, during playback, are displayed in succession by the display screen 119. The plurality of image frames of the video stream may be displayed in succession at a frame rate, for example, at 10 frames per second (fps), 24 fps, 30 fps, 60 fps, or another rate. The computing device controller 113 may control the pixel array 119a to display the image or images. The display screen 119 may be a liquid crystal display. The display screen 119 may be a light-emitting diode (LED) display. The light-emitting diode display may be an organic light-emitting diode (OLED) display.
A light array aspect ratio is an aspect ratio of the light array 139 in the peripheral device 13(X). The light array 139 may have the light array aspect ratio of M:N. The display screen aspect ratio for the display screen 119 may differ from the light array aspect ratio for the peripheral device 13(X). Likewise, the light array aspect ratio for another of the peripheral devices 13(1)-13(Z) may differ from the light array aspect ratio for the peripheral device 13(X). In some examples, the pixel array 119a of the display screen 119 has more pixels than the light array 139 has lights. For example, the pixel array 119a of the display screen 119 may have a 720×480 matrix (where 720 is the number of columns and 480 is the number of rows), 720×576 matrix, 1280×720 matrix, 1920×1080 matrix, 3840×2160 matrix, 7680×4320 matrix, among other matrix sizes. The light array 139 of the peripheral device 13(X), in some examples, may have a matrix of lights with fewer than 200, 100, 50, or 25 rows and columns. In some examples, the pixel array 119a has more or fewer pixels than these examples. In some examples, the light array 139 has more or fewer lights than these examples.
A light in the light array 139 may be a light-emitting diode (LED), an organic light-emitting diode (OLED), and/or any other light source that is capable of emitting multiple colors of light. Each light in the light array 139 may emit multiple colors of light. For example, each light may include a red-green-blue (RGB) pixel controllable (e.g., by the peripheral device controller 133) to emit a particular color at a given moment (e.g., based on a control signal thereto). Each RGB pixel, in some examples, may include a red sub-pixel, green sub-pixel, and blue sub-pixel. The peripheral device controller 133 may control the sub-pixels to emit various combinations and levels of red, green, and blue light to produce various colors.
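As a simple illustration of the RGB sub-pixel control described above, the sketch below shows how a target color might be decomposed into red, green, and blue sub-pixel levels. The helper name and hex-string input format are hypothetical conveniences, not part of any device firmware:

```python
def rgb_levels(color_hex: str) -> tuple:
    """Split a hex color such as '#FFA500' into (red, green, blue)
    sub-pixel levels, each on a 0-255 scale, which a controller
    could then drive onto the corresponding sub-pixels."""
    value = color_hex.lstrip("#")
    return tuple(int(value[i:i + 2], 16) for i in (0, 2, 4))
```

For example, `rgb_levels("#FFA500")` yields `(255, 165, 0)`, i.e., full red, partial green, and no blue, combining to an orange light.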
In the flow diagram of
The lighting effects processing in
Thereafter, the lighting effects processing in
In block 21 of
Block 21 of
In block 22 of
The brightness parameter may indicate a relative light intensity for the lighting effects emitted by the light array 139 of the peripheral device 13(X). The brightness parameter may be indicated numerically on a scale between a lowest brightness level and a highest brightness level. For example, the brightness parameter may be a value between 1 and 100, with 1 representing the lowest brightness level, and 100 representing the highest brightness level. The peripheral device 13(X), when emitting the lighting effects with the brightness parameter at a higher number, presents the lighting effects in a manner that is brighter (e.g., with a higher intensity) than when the peripheral device 13(X) emits the lighting effects with the brightness parameter at a lower number.
The contrast parameter may indicate an amount of relative difference in luminance between dark and bright areas of the lighting effects emitted by the light array 139. The contrast parameter may be indicated numerically on a scale between a lowest contrast level and a highest contrast level. For example, the contrast parameter may be a value between 1 and 100, with 1 representing the lowest contrast level, and 100 representing the highest contrast level. The peripheral device 13(X), when emitting the lighting effects with the contrast parameter at a higher number, presents the lighting effects in a manner that has a larger difference in luminance between dark and bright areas of the lighting effects than when the peripheral device 13(X) emits the lighting effects with a contrast parameter at a lower number.
The color temperature parameter may indicate the color of the light of the lighting effects emitted by the light array 139. The color temperature parameter may be indicated using a numerical value on a scale between a lowest color temperature level and a highest color temperature level of the peripheral device 13(X) and may refer to the Kelvin scale. For example, the color temperature parameter may be a value between 2500K (or another Kelvin value) and 6500K (or another Kelvin value), with 2500K representing the lowest color temperature level, and 6500K representing the highest color temperature level. The peripheral device 13(X), when emitting the lighting effects with the color temperature parameter at a higher number, may present the lighting effects in a manner that is viewable as being cooler or more blue-like than when the peripheral device 13(X) emits the lighting effects with a color temperature parameter at a lower number, which may be viewable as being warmer or more yellow-like.
The sharpness parameter may indicate an amount of clarity or edge contrast with which the lighting effects are created on the peripheral device 13(X). The sharpness parameter may be indicated numerically on a scale between a lowest sharpness level and a highest sharpness level. For example, the sharpness parameter may be a value between 1 and 10, with 1 representing the lowest sharpness level, and 10 representing the highest sharpness level. The light array 139, when emitting the lighting effects with the sharpness parameter at a higher number, presents the lighting effects in a manner that appears clearer with higher edge contrast between displayed objects (e.g., more distinct contours) than when the peripheral device 13(X) emits the lighting effects with a sharpness parameter at a lower number.
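One plausible way the brightness and contrast parameters (on the 1-100 scales described above) might be applied to a light's RGB output is sketched below. The function name, the choice of 50 as the neutral setting, and the mid-gray contrast pivot are all assumptions for illustration, not the actual control algorithm:

```python
def apply_brightness_contrast(rgb, brightness=50, contrast=50):
    """Scale an (r, g, b) color by brightness and contrast parameters,
    each on a 1-100 scale where 50 is treated as neutral (no change)."""
    b_gain = brightness / 50.0   # >1 brightens, <1 dims
    c_gain = contrast / 50.0     # >1 widens the dark-to-bright spread
    out = []
    for channel in rgb:
        v = (channel - 128) * c_gain + 128   # contrast pivots around mid-gray
        v = v * b_gain                       # brightness scales overall intensity
        out.append(max(0, min(255, round(v))))  # clamp to the 0-255 channel range
    return tuple(out)
```

With both parameters at the neutral 50, the color passes through unchanged; raising brightness to 100 doubles each channel (subject to clamping), matching the described behavior that a higher brightness number yields a higher-intensity effect.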
Prior to performing the lighting effects processing of
In block 23, the computing device controller 113 may determine whether or not the image source selection includes information that identifies (i) computer-generated content as the image source, (ii) displayed content as the image source, or (iii) recorded content as the image source. Computer-generated content is computer-generated imagery that is created electronically by an electronic device and/or with the aid of software. The lighting effects processing in
In block 24, the computing device controller 113 may electronically create the computer-generated content. Upon electronically creating the computer-generated content, the computing device controller 113 may store the computer-generated content into the computing device memory 115. Alternatively, the computing device controller 113 may store the computer-generated content into the computing device memory 115 prior to advancing the lighting effects processing from block 22 to block 23. In block 24, the computing device controller 113 may retrieve the computer-generated content from the computing device memory 115. The computer-generated content may be a single image. The single image may be a still image. The computer-generated content may be a video stream having a plurality of image frames, which each may individually be referred to as an image or image frame. For example, the video stream may include a series of consecutive image frames. While in the computing device memory 115, the computer-generated content may be referred to as convertible image content, discussed in further detail below. Thereafter, the lighting effects processing in
Returning to block 23, the computing device controller 113 may determine that the image source selection includes information that identifies content on the display screen 119 (displayed content) as the image source. The content on the display screen 119 may be an image. The image may be a single image in the form of a still image. Alternatively, the image on the display screen 119 may be an image frame of a video stream having a plurality of image frames. For example, the video stream may include a series of consecutive image frames. In some examples, the image source selection indicates a portion of the display screen 119 to serve as the image. For example, via the user interface, the computing device controller 113 may receive an area selection of the display screen 119. For example, the area selection may be received in response to a user dragging a cursor (e.g., using a computer mouse) to identify a rectangular sub-section of the display screen area, to a user selecting a window on the display screen 119 (e.g., a window associated with a particular software application currently executing on the computing device controller 113), or to use of other user interface techniques. The area selection may indicate the portion of the display screen 119 that is to serve as the image. Accordingly, in some examples, the area selection may select a window on the display screen 119 that is displaying a video stream or a still image, and the content in the window (e.g., an image frame of the video stream or still image) may serve as the image. The lighting effects processing in
In block 25, the computing device controller 113 may control the display 118 to cause the image on the display screen 119 to appear on the display screen 119 in real-time. The computing device interface 111 may receive the image on the display screen 119 from the computing device memory 115. The computing device interface 111 may receive the image on the display screen 119 from a source external to the computing device 11. The display 118 may display the image on the display screen 119 simultaneously with the computing device interface 111 receiving the image from the computing device memory 115 and/or from the source external to the computing device 11. The computing device controller 113 may store the image on the display screen 119 into the computing device memory 115. The computing device controller 113 may continuously update the computing device memory 115 in real-time to store the image that appears on the display screen 119. While in the computing device memory 115, the image on the display screen 119 may be referred to as the convertible image content, discussed in further detail below. Thereafter, the lighting effects processing in
In block 23, the computing device controller 113 may determine that the image source selection includes information that identifies recorded content as the image source. The recorded content may be previously stored in the computing device memory 115 or another computer readable medium, and may be retrieved by the computing device controller 113. For example, the information that identifies the recorded content as the image source may include or indicate a file name and/or memory address at which the recorded content is stored. The computing device controller 113 may retrieve a file that includes the recorded content from a memory based on the file name and/or memory address. The recorded content may be a previously-recorded image and/or a previously-recorded video stream. The previously-recorded image may be a single image. The single image may be a still image. The image may be a Graphics Interchange Format (GIF) image file, a Joint Photographic Experts Group (JPEG) image file, and/or any other image file. The previously-recorded video stream may be a series of consecutive image frames. The previously-recorded video stream may be from (e.g., encoded or saved as) a Moving Picture Experts Group (MPEG) video file, and/or any other video file. In some examples, the computing device 11 provides a capture function that enables a user to capture content (e.g., an image or video stream) from the display screen 119 to serve as the recorded content. For example, via the user interface, the computing device controller 113 may receive an area selection of the display screen 119.
For example, the area selection may be received in response to a user dragging a cursor (e.g., using a computer mouse) to identify a rectangular sub-section of the display screen area, to a user selecting a window on the display screen 119 (e.g., a window associated with a particular software application currently executing on the computing device controller 113), or using other user interface techniques. The area selection may indicate the portion of the display screen from which to capture an image or video stream. Accordingly, in some examples, the area selection may select a window on the display screen 119 that is displaying a video stream or a still image, and the content in the window (e.g., an image frame of the video stream or still image) may be captured and stored (e.g., in the computing device memory 115) as the recorded content.
While in the computing device memory 115, the recorded content may be referred to as the convertible image content, discussed in further detail below. The computing device controller 113 may store the recorded content into the computing device memory 115 prior to advancing the lighting effects processing from block 22 to block 23. The lighting effects processing in
Accordingly, in some examples, regardless of the image source selection and the path taken from block 23 to block 26, convertible image content may be identified and/or present in the computing device memory 115. For example, the convertible image content may include an image or video stream from computer-generated content, may include an image or video stream from recorded content, or may include an image from displayed content (e.g., being currently displayed on the display screen 119). As noted, the image from the displayed content may be or include an image frame of a plurality of image frames of a video stream being displayed on the display screen 119.
In block 26 of
In some examples, to convert the convertible image content into the lighting control map for block 26, a process as illustrated in
In block 263, the computing device controller 113 may process the convertible image content to generate the color array. The computing device controller 113 in block 263 may process the convertible image content to pixelate the convertible image content. The convertible image content in pixelated form is an example of the color array. In the case of the convertible image content including a video stream, each image frame of the video stream may be pixelated to generate a plurality of color arrays (e.g., one color array for each image frame).
Each color array is an array of individual color swatches. Each color swatch in the color array is respectively associated with a portion of the convertible image content. For example, the color array associates the portion of the convertible image content with a color swatch in the color array and associates another portion of the convertible image content with another color swatch in the color array. The matrix of the pixels in the display screen 119 is composed of multiple pixel groups. Each pixel group is a subset of the matrix of the pixels. The pixel group indicates a size for each portion of the convertible image content. The computing device controller 113 may select a dominant color for each portion or pixel group of the convertible image content as the color of the color swatch corresponding to that portion. The dominant color that is selected may be an average color of the pixel group, a most common color in the pixel group, or another color representative of a most dominant color in the pixel group. Thereafter, the lighting effects processing in
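The pixelation described above can be sketched as follows. The sketch assumes the convertible image content is already available as a 2D grid of (r, g, b) tuples, and uses the average color of each pixel group as its dominant color; the function and parameter names are hypothetical:

```python
def build_color_array(pixels, group_h, group_w):
    """Pixelate an image (2D list of (r, g, b) tuples) into a color array:
    one color swatch per pixel group of size group_h x group_w, where the
    swatch color is the average (here standing in for 'dominant') color."""
    rows, cols = len(pixels), len(pixels[0])
    color_array = []
    for r0 in range(0, rows, group_h):
        swatch_row = []
        for c0 in range(0, cols, group_w):
            group = [pixels[r][c]
                     for r in range(r0, min(r0 + group_h, rows))
                     for c in range(c0, min(c0 + group_w, cols))]
            n = len(group)
            # Average each channel across the group to pick the swatch color.
            swatch_row.append(tuple(sum(ch) // n for ch in zip(*group)))
        color_array.append(swatch_row)
    return color_array
```

A most-common-color or other dominant-color heuristic could be substituted for the per-channel average without changing the surrounding structure.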
In block 264 of
In block 265, the computing device controller 113 may convert the color array into the lighting control map for the peripheral device 13(X). In the case of a video stream resulting in a plurality of color arrays, each color array may be converted into a respective lighting control map, forming a plurality or stream of lighting control maps. When converting each color array into a lighting control map for the peripheral device 13(X), the computing device controller 113 may, to produce the lighting control map, map the color array to the light array 139 of the peripheral device 13(X). When mapping the color array to the light array 139, the computing device controller 113 may transpose the color array to the lighting control map by adjusting the aspect ratio for the color array from the display screen aspect ratio to the light array aspect ratio, with the display screen aspect ratio being for the display screen 119 and the light array aspect ratio being for the peripheral device 13(X). The light array aspect ratio for the peripheral device 13(X) may become the aspect ratio of the lighting control map. In some examples, the lighting control map may include an array of values in which the values in the array are indicative of a color for each light of the light array 139. For example, each value in the array of values may correspond to a light of the light array 139, and may be a numerical value indicative of a color for that light to emit. Thereafter, the computing device controller 113 may advance the lighting effects processing in
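One way to realize the mapping from a color array to the light array's dimensions is a nearest-neighbor resample, sketched below. The function name is hypothetical, and nearest-neighbor is only one of several resampling choices (area averaging or bilinear interpolation would also fit the description):

```python
def to_lighting_control_map(color_array, light_rows, light_cols):
    """Resample a color array to the light array's dimensions so each
    entry of the result is the color value for exactly one light.
    Uses nearest-neighbor selection: each light takes the swatch whose
    position proportionally matches its own."""
    src_rows, src_cols = len(color_array), len(color_array[0])
    return [[color_array[r * src_rows // light_rows][c * src_cols // light_cols]
             for c in range(light_cols)]
            for r in range(light_rows)]
```

Resampling a 4×2 color array down to a 2×2 light array, for instance, keeps the first and third source rows, so the lighting control map preserves the coarse spatial layout of the image at the light array's own aspect ratio.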
In block 266, the computing device controller 113 may attach lighting information to the lighting control map (or maps). The lighting information may include sequencing parameters, a device identifier, and the collection of display parameters. The sequencing parameters may instruct the sequence and color of light emissions from the lights in the light array 139 of the peripheral device 13(X). The device identifier may uniquely identify the peripheral device 13(X) as the particular one of the peripheral devices 13(1)-13(Z) to which the lighting control map is applied. The collection of display parameters may include settings for brightness, contrast, color temperature, and sharpness of the lighting effects created by the peripheral device 13(X).
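A minimal sketch of attaching the lighting information, assuming a simple dictionary payload (all field names here are hypothetical; an actual implementation might use a packed binary format defined by the peripheral's protocol):

```python
def attach_lighting_information(control_map, device_id,
                                display_params, sequencing=None):
    """Bundle a lighting control map with its lighting information
    before output to the peripheral device."""
    return {
        "device_id": device_id,            # uniquely identifies peripheral 13(X)
        "display_params": display_params,  # brightness, contrast, color temp, sharpness
        "sequencing": sequencing or {},    # order/timing of light emissions
        "map": control_map,                # per-light color values
    }
```

The device identifier lets the computing device address one of several connected peripherals, while the display parameters travel with the map so the peripheral can apply them when illuminating its lights.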
In some examples of the processing of
From block 266, the computing device controller 113 may advance the lighting effects processing in
In block 27, the computing device controller 113 may control the computing device interface 111 to output the lighting control map for the peripheral device 13(X) from the computing device interface 111. In the case of the convertible image content including a video stream and being converted into a stream of lighting control maps, in block 27, the computing device interface 111 may output the stream of lighting control maps. The lighting control map(s) that are output may be received by the peripheral device 13(X) via peripheral device interface 131. In response to receipt of the lighting control map(s), the peripheral device 13(X) may control the light array 139 to emit light in accordance with the lighting control map(s), as described in further detail with respect to
Although the conversion in block 26 and the output in block 27 are illustrated as discrete blocks in
In some examples, in the case of the image source being displayed content (see path from block 23 to block 25 of
The lighting effects processing in
In block 28, the computing device controller 113 may determine whether or not the user interface 117 has received a subsequent peripheral device selection. The subsequent peripheral device selection may include information that uniquely identifies another of the peripheral devices 13(1)-13(Z) to which the lighting effects processing is applied. The user may input the subsequent peripheral device selection manually to the computing device 11 by navigating and manipulating the user interface 117.
When the computing device controller 113 determines in block 28 that the user interface 117 has not received the subsequent peripheral device selection, the lighting effects processing in
When the computing device controller 113 determines in block 28 that the user interface 117 has received the subsequent peripheral device selection, the computing device controller 113 retains the subsequent peripheral device selection in the computing device memory 115. While in the computing device memory 115, the subsequent peripheral device selection may become the peripheral device selection in the lighting effects processing of
In some examples of the lighting effects processing of
The lighting effects processing in
In block 31 of
In block 32 of
In block 33 of
In block 34 of
In block 35 of
In some examples, by looping back to block 32 (and proceeding again through blocks 33, 34, and 35), the peripheral device 13(X) may receive a stream of lighting control maps from the computing device 11 and control the lights of the light array 139 to illuminate in accordance with the stream of lighting control maps. In some examples, such controlling of the light array 139 in accordance with the stream of lighting control maps causes, effectively, streaming of a converted version of the video stream used to generate the stream of lighting control maps (e.g., via lighting effects processing of
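The receive-and-illuminate loop described above can be sketched as follows on the peripheral side. The `set_light(row, col, color)` driver hook and the function name are hypothetical stand-ins for whatever interface the peripheral device controller 133 exposes to its light array:

```python
import time

def drive_light_array(map_stream, set_light, fps=30):
    """Consume a stream of lighting control maps and illuminate each
    light accordingly, holding each map for one frame interval so the
    light array effectively plays back a low-resolution version of the
    source video stream."""
    frame_time = 1.0 / fps
    for control_map in map_stream:
        for r, row in enumerate(control_map):
            for c, color in enumerate(row):
                set_light(r, c, color)  # drive one light to its mapped color
        time.sleep(frame_time)          # pace playback at the frame rate
```

In practice the loop would be paced by arrival of maps over the peripheral device interface 131 rather than a fixed timer, but the structure is the same: each incoming map is applied light-by-light before the next is consumed.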
An aspect of the lighting effects processing of
An aspect of the lighting effects processing of
Another aspect of the lighting effects processing in
In some examples, aspects of the technology, including computerized implementations of methods according to the technology, may be implemented as a system, method, apparatus, or article of manufacture using standard programming or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a processor, also referred to as an electronic processor (e.g., a serial or parallel processor chip or specialized processor chip, a single- or multi-core chip, a microprocessor, a field programmable gate array, any variety of combinations of a control unit, arithmetic logic unit, and processor register, and so on), a computer (e.g., a processor operatively coupled to a memory), or another electronically operated controller to implement aspects detailed herein.
Accordingly, for example, examples of the technology may be implemented as a set of instructions, tangibly embodied on a non-transitory computer-readable media, such that a processor may implement the instructions based upon reading the instructions from the computer-readable media. Some examples of the technology may include (or utilize) a control device such as, e.g., an automation device, a special purpose or programmable computer including various computer hardware, software, firmware, and so on, consistent with the discussion herein. As specific examples, a control device may include a processor, a microcontroller, a field-programmable gate array, a programmable logic controller, logic gates, etc., and other typical components that are known in the art for implementation of appropriate functionality (e.g., memory, communication systems, power sources, user interfaces and other inputs, etc.).
Certain operations of methods according to the technology, or of systems executing those methods, may be represented schematically in the figures or otherwise discussed herein. Unless otherwise specified or limited, representation in the figures of particular operations in particular spatial order may not necessarily require those operations to be executed in a particular sequence corresponding to the particular spatial order. Correspondingly, certain operations represented in the figures, or otherwise disclosed herein, may be executed in different orders than are expressly illustrated or described, as appropriate for particular examples of the technology. Further, in some examples, certain operations may be executed in parallel or partially in parallel, including by dedicated parallel processing devices, or separate computing devices configured to interoperate as part of a large system.
As used herein in the context of computer implementation, unless otherwise specified or limited, the terms “component,” “system,” “module,” “block,” and the like are intended to encompass part or all of computer-related systems that include hardware, software, a combination of hardware and software, or software in execution. For example, a component may be, but is not limited to being, a processor device, a process being executed (or executable) by a processor device, an object, an executable, a thread of execution, a computer program, or a computer. By way of illustration, both an application running on a computer and the computer may be a component. A component (or system, module, and so on) may reside within a process or thread of execution, may be localized on one computer, may be distributed between two or more computers or other processor devices, or may be included within another component (or system, module, and so on).
Also as used herein, unless otherwise limited or defined, “or” indicates a non-exclusive list of components or operations that may be present in any variety of combinations, rather than an exclusive list of components that may be present only as alternatives to each other. For example, a list of “A, B, or C” indicates options of: A; B; C; A and B; A and C; B and C; and A, B, and C. Correspondingly, the term “or” as used herein is intended to indicate exclusive alternatives (e.g., “one or the other but not both”) only when preceded by terms of exclusivity, such as, e.g., “either,” “only one of,” or “exactly one of.” Further, a list preceded by “one or more” (and variations thereon) and including “or” to separate listed elements indicates options of one or more of any or all of the listed elements. For example, the phrases “one or more of A, B, or C” and “at least one of A, B, or C” indicate options of: one or more A; one or more B; one or more C; one or more A and one or more B; one or more B and one or more C; one or more A and one or more C; and one or more of each of A, B, and C. Similarly, a list preceded by “a plurality of” (and variations thereon) and including “or” to separate listed elements indicates options of multiple instances of any or all of the listed elements. For example, the phrases “a plurality of A, B, or C” and “two or more of A, B, or C” indicate options of: A and B; B and C; A and C; and A, B, and C.
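The list interpretations described above amount to enumerating subsets of the listed elements. As an illustrative sketch only (the variable names are hypothetical and not part of this disclosure), the option counts for a three-element list can be checked as follows:

```python
from itertools import combinations

elements = ["A", "B", "C"]

# "A, B, or C" (non-exclusive): every non-empty subset is an option.
inclusive_or_options = [
    set(combo)
    for size in range(1, len(elements) + 1)
    for combo in combinations(elements, size)
]
print(len(inclusive_or_options))  # 7: A; B; C; A,B; A,C; B,C; A,B,C

# "either A, B, or C" (exclusive): exactly one of the listed elements.
exclusive_or_options = [{e} for e in elements]
print(len(exclusive_or_options))  # 3

# "a plurality of A, B, or C": two or more distinct listed elements
# (multiple instances of a single element are not modeled here).
plurality_options = [
    set(combo)
    for size in range(2, len(elements) + 1)
    for combo in combinations(elements, size)
]
print(len(plurality_options))  # 4: A,B; B,C; A,C; A,B,C
```

The counts match the options enumerated in the paragraph above: seven for the non-exclusive list, three for the exclusive list, and four for the plurality list.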
In the description above and the claims below, the term “connected” may refer to a physical connection or a logical connection. A physical connection indicates that at least two devices or systems co-operate, communicate, or interact with each other and are in direct physical or electrical contact with each other. For example, two devices may be physically connected via an electrical cable. A logical connection indicates that at least two devices or systems co-operate, communicate, or interact with each other, but may or may not be in direct physical or electrical contact with each other. Throughout the description and claims, the term “coupled” may be used to indicate a logical connection that is not necessarily a physical connection. “Co-operation,” “communication,” “interaction,” and their variations include at least one of: (i) transmitting of information to a device or system; or (ii) receiving of information by a device or system.
Any mark, if referenced herein, may be a common law or registered trademark of a third party affiliated or unaffiliated with the applicant or the assignee. Use of such marks is by way of example and shall not be construed as descriptive or to limit the scope of the disclosed or claimed embodiments to material associated only with such marks.
The terminology used herein is for describing various examples only, and is not to be used to limit the disclosure. The articles “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms “comprises,” “includes,” and “has” specify the presence of stated features, numbers, operations, members, elements, and/or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, operations, members, elements, and/or combinations thereof.
Throughout the application, ordinal numbers (e.g., first, second, third, etc.) may be used as an adjective for an element (i.e., any noun in the application). Although terms such as “first,” “second,” and “third” may be used herein to describe various members, components, regions, layers, or sections, these members, components, regions, layers, or sections are not to be limited by these terms. Rather, these terms are only used to distinguish one member, component, region, layer, or section from another member, component, region, layer, or section.
The use of ordinal numbers is not to imply or create any particular ordering of the elements nor to limit any element to being only a single element unless expressly disclosed, such as by the use of the terms “before,” “after,” “single,” and other such terminology. Rather, the use of ordinal numbers is to distinguish between the elements. By way of an example, a first element is distinct from a second element, and the first element may encompass more than one element and succeed (or precede) the second element in an ordering of elements. Thus, a first member, component, region, layer, or section referred to in examples described herein may also be referred to as a second member, component, region, layer, or section without departing from the teachings of the examples.
Unless otherwise defined, all terms, including technical and scientific terms, used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains and after an understanding of the disclosure of this application. Terms, such as those defined in commonly used dictionaries, are to be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the disclosure of this application.
Although the present technology has been described by referring to certain examples, workers skilled in the art will recognize that changes may be made in form and detail without departing from the scope of the discussion.