This disclosure relates to eyeglasses having a capability of displaying information electronically as well as allowing a user to see through.
Eyewear (i.e., glasses, also known as eyeglasses or spectacles) is a vision aid consisting of glass or hard plastic lenses mounted in a frame that holds them in front of a person's eyes, typically using a nose bridge over the nose and legs (known as temples or temple pieces) that rest over the ears.
Smart eyewear (or smart glasses) are glasses that add information alongside what the wearer sees through the glasses. Superimposing information (e.g., digital images) onto a field of view may be achieved through smart optics such as an optical head-mounted display (OHMD), embedded wireless glasses with a transparent heads-up display (HUD), or an augmented reality (AR) overlay. Modern smart eyewear are effectively wearable computers that can run self-contained mobile apps. Some may be hands-free and can communicate with the Internet via natural-language voice commands, while others use touch buttons.
Consideration is now being given to simplifying assembly and custom fitting of smart eyewear to individual wearers.
In a general aspect, an eyeglasses frame includes two half-frames. An adjustable nose bridge is disposed between the two half-frames. The two half-frames are adapted to hold a pair of see-through lenses. A virtual display is associated with at least one of the two half-frames, the virtual display having an associated eye box for full viewing of the virtual display. The adjustable nose bridge includes one or more adjustable bridge-to-frame fastener arrangements adapted to provide independent adjustments of one or more geometrical parameters of the eyeglasses frame for aligning a position of the eye box relative to the eyeglasses frame for full viewing of the virtual display, wherein the independent adjustment of each geometrical parameter of the eyeglasses frame is independent of adjustments of the other geometrical parameters of the eyeglasses frame. The independent adjustments of the geometrical parameters are carried out one-by-one.
Example embodiments will become more fully understood from the detailed description herein and the accompanying drawings, wherein like elements are represented by like reference numerals, which are given by way of illustration only and thus are not limiting of the example embodiments.
It should be noted that these FIGS. are intended to illustrate the general characteristics of methods, structure and/or materials utilized in certain example embodiments and to supplement the written description provided below. These drawings are not, however, to scale and may not precisely reflect the structural or performance characteristics of any given embodiment, and should not be interpreted as defining or limiting the range of values or properties encompassed by example embodiments. For example, the relative thicknesses and positioning of components of the described eyeglasses may be reduced or exaggerated for clarity. The use of similar or identical reference numbers in the various drawings is intended to indicate the presence of a similar or identical element or feature.
Smart glasses can have virtual image displays (hereinafter “virtual display”) that add information to what a wearer sees through the glasses. A virtual display may, for example, be an in-lens micro display, or a display projected on a lens surface, or a display projected on a plane of a lens-less frame, etc. Traditional smart glasses do not allow the wearer to adjust the position of the virtual display independently from an overall anatomical fitting of the smart glasses on the wearer. In many instances, smart glasses that have virtual displays (e.g., in-lens micro displays, etc.) are available only in one-size-fits-all versions without regard to variations in the wearers' anatomies. Yet other smart glasses that have virtual displays are custom fabricated at a remote factory for each individual wearer based on previously obtained anthropometric measurements (e.g., by a custom 3D-image scanning) of the individual wearer.
Disclosed herein is eyewear (e.g., smart eyewear) that can be custom fit to a wearer during on-site assembly of the eyewear on the wearer's face, in accordance with the principles of the present disclosure.
In an example implementation, the smart eyewear may be assembled from components that are interchangeable or are locally adjustable on-site. For example, the smart eyewear may be assembled by joining two half-frames using wires, bands, and/or other joining means to form a spectacle frame (hereinafter “frame”, or “eyeglasses frame”) and hold a pair of see-through lenses in front of a person's eyes. The joining means can include a locally adjustable nose bridge. A virtual display may be overlaid on, or embedded in, at least one of the pair of see-through lenses.
The eyeglasses frame may utilize the adjustable nose bridge over the person's nose, and utilize temple pieces that rest over the person's ears, to hold the pair of see-through lenses (including the virtual display) in position in front of the person's eyes.
The temple pieces attached to the eyeglasses frame may include electronics to prepare and send data (e.g., captions, images) for display on the virtual display. The virtual display may have an associated eye box for full viewing of the virtual display.
The adjustable nose bridge may include mechanisms that can provide independent local adjustments of geometrical parameters of the eyeglasses frame (e.g., interpupillary distance, wrap angle, cyclo-rotation, etc.) for aligning optics of the eyeglasses to the person's face and eye geometry. Aligning optics of the eyeglasses may, for example, include aligning a position of the eye box relative to the eyeglasses frame for full viewing of the virtual display. The example adjustable nose bridges described herein enable at least three degrees of freedom (DOF) of independent adjustments (e.g., interpupillary distance, wrap angle, cyclo-rotation, etc.) for aligning the position of the eye box relative to the eyeglasses frame for full viewing of the virtual display.
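As a minimal illustration of these independently adjustable degrees of freedom, the frame geometry can be represented as a simple parameter record. The sketch below is illustrative only; the field names, units, and bounds are assumptions and are not part of this disclosure.

```python
# Minimal sketch (assumptions only): the three independently adjustable
# geometrical parameters of the eyeglasses frame, each settable one-by-one
# without disturbing the others.
from dataclasses import dataclass


@dataclass
class FrameGeometry:
    inter_frame_separation_mm: float  # controls effective interpupillary distance
    wrap_angle_deg: float             # face form (wrap) angle of the frame front
    cyclo_rotation_deg: float         # per-half-frame rotation in the x-y plane

    def validate(self) -> None:
        # Example bounds loosely based on ranges mentioned in this disclosure
        # (e.g., wrap angle and cyclo-rotation on the order of 0 to 20 degrees).
        assert 0.0 <= self.wrap_angle_deg <= 20.0
        assert abs(self.cyclo_rotation_deg) <= 20.0


# Each parameter can be adjusted independently, one at a time:
geometry = FrameGeometry(inter_frame_separation_mm=6.0,
                         wrap_angle_deg=8.0,
                         cyclo_rotation_deg=2.0)
geometry.wrap_angle_deg = 10.0  # leaves the other two parameters unchanged
geometry.validate()
```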
The interchangeable or locally adjustable components used for assembling smart eyewear may also include nose pads that can be attached to the two half-frames. The nose pads may be locally adjustable to make a comfortable fit to the person's nose and to set a height of the eyeglasses frame on the person's nose for aligning optics of the eyeglasses to the person's face and eye geometry.
As shown in
A nose bridge may be adapted to use one or more bridge-to-frame fastener arrangements to interconnect right half-frame 10R and a left half-frame 10L to form an eyeglasses frame.
Strip 10C may include one or more longitudinal or horizontal openings (e.g., slot 10D1, slot 10D2, extending, e.g., from about a middle M of the strip, up to or nearly up to, the left and right side edges of the strip, respectively) as part of the bridge-to-frame fastener arrangements for attachment of adjustable nose bridge 10NB to right half-frame 10R (
Screws passing through the longitudinal or horizontal openings (e.g., narrow horizontal slots 10D1 and 10D2) may be used to affix adjustable nose bridge 10NB to right half-frame 10R and left half-frame 10L. The screws may be used at different positions along the length of narrow horizontal slots 10D1 and 10D2 (e.g., along the x axis) to set different inter-frame separation distances between right half-frame 10R and left half-frame 10L.
In example implementations, positions P1 and P2 may be located symmetrically with respect to middle axis M. Thus, inter-frame separation distance w1 may include symmetric or equal contributions (e.g., da, db, da=db) from the distance of right half-frame 10R from middle axis M and the distance of left half-frame 10L from middle axis M.
In other implementations, positions P1 and P2 may be located asymmetrically with respect to middle axis M (i.e., at unequal distances from middle axis M). Thus, inter-frame separation distance w1 may include asymmetric or unequal contributions (e.g., da, db, da≠db) from the distance of right half-frame 10R from middle axis M and the distance of left half-frame 10L from middle axis M.
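Expressed as a relation (symbols follow the labels above), the inter-frame separation is simply the sum of the two half-frame offsets from middle axis M:

```latex
w_1 = d_a + d_b, \qquad
\text{symmetric placement: } d_a = d_b = \tfrac{w_1}{2}, \qquad
\text{asymmetric placement: } d_a \neq d_b .
```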
In an example implementation, eyeglasses frame 200B may be obtained by modifying or adjusting adjustable nose bridge 10NB in place (i.e. in situ) in eyeglasses frame 200A to obtain a different inter-frame separation distance. The adjusting may involve loosening one or both of the screws (i.e., screw S1 at position P1 and screw S2 at position P2,
In addition to right half-frame 10R (
In addition to the height of the eyeglasses frame, another geometrical frame parameter that may be important for properly aligning optics of an eyeglasses frame (e.g., frame 200B) to the person's face and eye geometry is a so-called face form angle (also known as bow, wrap angle, or dihedral angle (herein “wrap angle”)). The wrap angle may be defined as the angle between the plane of the eyeglasses front and the plane of the right lens shape or the left lens shape.
As noted above, in example implementations, adjustable nose bridge 10NB may be made of material that is bendable, for example, at or about a middle M of strip 10C. The bendable material may be an elastic metal, metal alloy, a plastic, a resin, a polymer or a metal-polymer composite.
Further, in example implementations, the adjustable nose bridge (e.g., nose bridge 10NB) may be configured to adjustably hold each of right half-frame 10R and/or left half-frame 10L in a frame (e.g., frame 200B) at an angle with respect to the vertical (x axis) in the x-y plane. When fitting the frame to a person's face and eye geometry, right half-frame 10R and/or left half-frame 10L may be rotated by angles (e.g., angles θ1 and θ2, respectively) with respect to the vertical (x axis) to match or compensate for the cyclo-rotations (e.g., involuntary cyclovergence) of the person's eyes. In example implementations, the cyclo-rotation angle (θ1 or θ2) may be an angle in the range of 0° to 20°.
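To make the cyclo-rotation geometry concrete, the sketch below applies a standard 2D rotation to a point on a half-frame within the x-y plane. It is an illustrative example only; the function name and values are assumptions, not part of the disclosure.

```python
# Illustrative sketch: rotating a half-frame point by a cyclo-rotation angle
# theta (degrees) about the viewing (z) axis, i.e., within the x-y plane.
import numpy as np


def cyclo_rotate(point_xy: np.ndarray, theta_deg: float) -> np.ndarray:
    t = np.radians(theta_deg)
    rotation = np.array([[np.cos(t), -np.sin(t)],
                         [np.sin(t),  np.cos(t)]])
    return rotation @ point_xy


# Example: a lens-edge point rotated by a 5 degree setting (the disclosure
# contemplates cyclo-rotation angles in the range of 0 to 20 degrees).
rotated_point = cyclo_rotate(np.array([10.0, 0.0]), theta_deg=5.0)
```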
Eyewear (e.g., frame 200A, frame 200B) includes an adjustable nose bridge, two half-frames interconnected by the adjustable nose bridge to form an eyeglasses frame, and a virtual display associated with at least one of the two half-frames. The virtual display has an associated eye box for full viewing of the virtual display. The adjustable nose bridge includes one or more adjustable bridge-to-frame fastener arrangements adapted to provide independent adjustments of one or more geometrical parameters of the eyeglasses frame for aligning a position of the eye box relative to the eyeglasses frame for full viewing of the virtual display. The independent adjustment of each geometrical parameter (e.g., interpupillary distance) of the eyeglasses frame is independent of adjustments of the other geometrical parameters (e.g., wrap angle, cyclo-rotation, etc.) of the eyeglasses frame.
The adjustable nose bridge may thus include one or more adjustable bridge-to-frame fastener arrangements for adjusting one or more geometrical parameters of the eyeglasses frame in order to align a position of the eye box relative to the eyeglasses frame for full viewing of the virtual display.
The one or more adjustable bridge-to-frame fastener arrangements may, for example, allow for adapting one or more of interpupillary distance, nose-to-eye pupil distance, face wrap, head width, and ear position of the eyeglasses frame, thereby adjusting at least one of an interpupillary distance parameter, a wrap angle parameter, and a cyclo-rotation parameter of the eyeglasses frame.
A kit for assembling smart eyewear fitted to a person includes two half-frames, at least one of the two half-frames being associated with a virtual display, and a set of interchangeable nose bridges (e.g. nose bridge 10NB, nose bridge 900NB, nose bridge 902NB, nose bridge 1001, nose bridge 1001-1, etc.). Each nose bridge is adapted to couple the two half-frames together as a single frame. The nose bridges in the set of interchangeable nose bridges differ in defining one or more geometrical parameters of the eyeglasses frame with respect to a position of an eye box for viewing the virtual display relative to the eyeglasses frame.
In the fitting procedure, optical see-through lenses 10A and 10B may be placed in front of the person's eyes by eyeglasses frame 200B as worn by the person. In eyeglasses frame 200B, lens 10A and lens 10B may be held by half-frames 10L and 10R that are interconnected by adjustable nose bridge 10NB. Lens 10A and lens 10B may be prescription or neutral lenses. Lens 10A may include a virtual display 10MD. Virtual display 10MD may define a volume (i.e., an eye box 10EB, a trapezoidal-shaped frustum volume) in which an eye pupil must be present to have a full view of the micro display. Wrap angle and inter-frame distance adjustments of adjustable nose bridge 10NB may control the orientation and position of the eye box 10EB relative to the person's eye.
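The "full view" condition can be illustrated with a simplified containment test: the pupil center must lie inside the eye-box frustum. The frustum model and all dimensions below are assumptions chosen only for illustration; they do not reflect actual eye-box geometry from this disclosure.

```python
# Illustrative sketch (assumed geometry): a simplified test of whether the
# pupil center lies inside a rectangular frustum standing in for eye box 10EB.
from dataclasses import dataclass


@dataclass
class EyeBoxFrustum:
    near_z: float        # distance from display to near face of the eye box (mm)
    far_z: float         # distance to far face (mm)
    near_half_w: float   # half-width of near face (mm)
    far_half_w: float    # half-width of far face (mm)
    near_half_h: float   # half-height of near face (mm)
    far_half_h: float    # half-height of far face (mm)

    def contains(self, x: float, y: float, z: float) -> bool:
        if not (self.near_z <= z <= self.far_z):
            return False
        # Linearly interpolate the allowed half-extents at depth z.
        f = (z - self.near_z) / (self.far_z - self.near_z)
        half_w = self.near_half_w + f * (self.far_half_w - self.near_half_w)
        half_h = self.near_half_h + f * (self.far_half_h - self.near_half_h)
        return abs(x) <= half_w and abs(y) <= half_h


eye_box = EyeBoxFrustum(near_z=15.0, far_z=25.0,
                        near_half_w=4.0, far_half_w=6.0,
                        near_half_h=3.0, far_half_h=5.0)
# If the pupil center is outside the frustum, the wearer sees only a partial,
# smeared view of the virtual display (as in stage 1 of the fitting procedure).
full_view = eye_box.contains(x=5.5, y=0.0, z=20.0)  # False at this offset
```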
At stage 1 of the fitting procedure, adjustable nose bridge 10NB may be set to have an inter-frame separation distance Ls1, and bent to have a wrap angle α1. At the parameter settings Ls1 and α1 of frame 200B, as shown in the figure, the pupil of the right eye 60RE may not be fully in eye box 10EB. Inter-frame separation Ls1 may be too wide for the person to have a full view of eye box 10EB/virtual display 10MD. At the inter-frame separation of Ls1, the person may have only a partial view of virtual display 10MD with a right edge of eye box 10EB/virtual display 10MD appearing smeared as viewed by right eye 60RE. For purposes of illustration, the person's partial view of virtual display 10MD, at stage 1 of the fitting procedure, is pictorially represented in the figure by dashed lines as a rectangular box 10PV. As shown in the figure, the person's partial view (e.g., rectangular box 10PV) does not fully cover the right side (e.g., side 10EBR) of eye box 10EB. Because of this, the right side of virtual display 10MD may appear to be smeared in the person's view.
To bring virtual display 10MD in full view, the adjustable nose bridge 10NB may need to be adjusted to reduce the inter-frame separation distance. As shown in the figure, at stage 2 of the fitting procedure, adjustable nose bridge 10NB may be adjusted to set the inter-frame separation distance to a value Ls2 (less than Ls1) bringing virtual display 10MD in full view. The adjustment of the inter-frame separation distance may be independent of any changes to the wrap angle. As shown in the figure, the wrap angle has an unchanged value of α1 through stage 1 and stage 2 of the fitting procedure.
While the adjustment of the inter-frame separation distance at stage 2 of the fitting procedure may have brought eye box 10EB/virtual display 10MD in full view, as shown in the figure, micro display 10MD may still be positioned to the right in lens 10A so that the person has to look to the right to see the micro display.
Adjustable nose bridge 10NB may be further adjusted to position micro display 10MD in the front and center of lens 10A. As shown in
Method 800A (with reference to the example shown in
Method 800B (with reference to the example shown in
Method 800B may further include determining if the micro display needs to be relocated or shifted to the left (850) and adjusting the nose bridge to a wider inter-frame separation distance setting to shift the micro display to the left (852); and determining if the micro display needs to be more centered on the right lens (860), and adjusting the nose bridge to a narrower inter-frame separation distance setting and a smaller wrap angle α setting to shift the micro display toward the center (862).
As described previously (e.g., with reference to
Method 800C may include determining that the micro display in the right eye lens of the eyeglasses is not fully visible (870). Method 800C may further include determining if a top edge of the micro display appears smeared (880) and adjusting the nose pad assemblies for the frame to sit higher up on the person's face (882); and determining if a bottom edge of the micro display appears smeared (890) and adjusting the nose pad assemblies for the frame to sit lower down on the person's face (892).
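The decision steps of methods 800B and 800C can be summarized as a small lookup of observation-to-adjustment rules. The sketch below paraphrases those steps; the observation labels and function name are hypothetical placeholders.

```python
# Illustrative sketch paraphrasing the decision steps of methods 800B/800C.
# Observation labels and the function name are hypothetical placeholders.
def suggest_adjustment(observation: str) -> str:
    rules = {
        "display should shift left":
            "widen the inter-frame separation distance setting (852)",
        "display should be more centered":
            "narrow the inter-frame separation and reduce the wrap angle (862)",
        "top edge appears smeared":
            "adjust the nose pads so the frame sits higher on the face (882)",
        "bottom edge appears smeared":
            "adjust the nose pads so the frame sits lower on the face (892)",
    }
    return rules.get(observation, "no adjustment suggested")


print(suggest_adjustment("top edge appears smeared"))
```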
The flexible and slidable adjustable nose bridges (e.g., nose bridge 10NB) disposed in-between the eyeglasses' lenses can be adjusted during the fitting of the eyeglasses frame for one person, and then readjusted for a different person using simple tools (e.g., a screwdriver). The adjustable nose bridge (e.g., nose bridge 10NB) allows users to continuously change the frame distance to the left eye and the frame distance to the right eye independently. The adjustable nose bridge also allows users to adjust the wrap of the eyeglasses frame around each person's face. Further, nose pad assemblies 300 allow users to adjust a height of the eyeglasses frame above the nose. In this manner, a single smart eyeglasses unit assembled with the adjustable nose bridge can be fit to most adult persons using only a simple tool (e.g., a screwdriver). A cosmetic cover can be applied on top of the adjustable nose bridge in the eyeglasses frame.
In other example implementations, a nose bridge having a fixed geometry (i.e., a non-adjustable or a non-sliding geometry) with a predetermined inter-frame separation distance may be used to couple right half-frame 10R and left half-frame 10L to form a fitted eyeglasses frame for a person.
In addition to having a predetermined inter-frame separation distance, a nose bridge with a fixed geometry may be configured to couple right half-frame 10R and left half-frame 10L at a predetermined wrap angle.
In an example implementation, eyeglasses component kit 1000 may include a set of interchangeable nose bridges having various combinations of respective predetermined inter-frame separation distance values (e.g., ranging from about 3 mm to 10 mm) and the respective wrap angle values (e.g., ranging from about 0° to 20°). As shown in
In an example procedure for fitting an eyeglasses frame to a person, different nose bridges (e.g., from kit 1000) with different predetermined inter-frame separation and predetermined wrap angle values may be used to couple the two half-frames (e.g., right half-frame 10R and left half-frame 10L) together as a single frame. Appropriate right half-frame 10R and left half-frame 10L may be selected based on a measurement of the person's anthropometric characteristics (e.g., interpupillary distance and head width). Appropriate parameters (e.g., inter-frame separation distance and wrap angle) for aligning optics of the eyeglasses frame to the person's face and eye geometry may be determined, for example, by trial-and-error use of the various nose bridges in kit 1000 to interconnect the half-frames.
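As a sketch of such a measurement- or trial-and-error-driven selection, the snippet below picks the bridge from a hypothetical kit whose predetermined separation and wrap values are closest to a target fit. The kit contents and target values are assumptions chosen within the ranges mentioned above.

```python
# Illustrative sketch: choosing, from a kit of fixed-geometry nose bridges,
# the bridge whose predetermined inter-frame separation and wrap angle are
# closest to a target fit. Kit entries below are hypothetical examples.
kit = [
    {"id": "1001",   "separation_mm": 4.0, "wrap_deg": 5.0},
    {"id": "1001-1", "separation_mm": 6.0, "wrap_deg": 5.0},
    {"id": "1002",   "separation_mm": 6.0, "wrap_deg": 10.0},
    {"id": "1003",   "separation_mm": 8.0, "wrap_deg": 15.0},
]


def closest_bridge(target_sep_mm: float, target_wrap_deg: float) -> dict:
    return min(kit, key=lambda b: (b["separation_mm"] - target_sep_mm) ** 2
                                  + (b["wrap_deg"] - target_wrap_deg) ** 2)


best = closest_bridge(target_sep_mm=6.5, target_wrap_deg=9.0)  # -> bridge "1002"
```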
In example implementations, example nose bridge 110NB may be made of bendable material. In addition to having a predetermined inter-frame separation distance, nose bridge 110NB may be bent (e.g., into the plane or out of the plane of the figure on either side of a center axis CC) to couple right half-frame 20R and left half-frame 20L at a predetermined wrap angle (e.g., in a manner similar to nose bridge 902NB as shown in
In example implementations, eyeglasses component kits (e.g., kit 1000,
In example implementations, the eyeglasses component kits (e.g., kit 1000) described herein may include covers for the adjustable nose bridges used to interconnect the right half and left half-frames. The covers (e.g., cover 110C,
An optometric tool for capturing anthropometric or biometric characteristics of a person may include an eyeglasses component kit (e.g., kit 1000) that can be used to assemble and fit a test eyeglasses frame to a person, in accordance with the principles of the present disclosure. The anthropometric or biometric characteristics may, for example, include interpupillary distance, vertical distance from the nose to the pupils, face wrap, head width, ear position, etc.
The eyeglasses component kit may, for example, include two half-frames that can be interconnected to form the test eyeglasses frame and hold a pair of see-through lenses in front of the person's eyes. A virtual display may be embedded in, or overlaid on, at least one of the pair of see-through lenses.
The eyeglasses component kit may include adjustable or interchangeable nose bridges (e.g., nose bridge 10NB, nose bridge 900NB, nose bridge 902NB, nose bridges 1001, 1002, 1003, 1001-1, 1001-2 and 1003-1, etc.) that can interconnect the two half-frames to form the test eyeglasses frame. The test eyeglasses frame may utilize the nose bridge over the person's nose, and utilize temple pieces that rest over the person's ears to hold the pair of see-through lenses in position in front of the person's eyes. The temple pieces may include electronics to prepare and send stimuli (e.g., captions, images) for display on the virtual display. The electronics in the temple pieces may be coupled to the virtual display (e.g., over wires or wirelessly) and configured to display optical stimuli (e.g., images and patterns) on the virtual display.
The nose bridges may include mechanisms that can provide independent adjustments (i.e., one-by-one adjustments) of geometrical parameters of the test eyeglasses frame (e.g., interpupillary distance, wrap angle, cyclo-rotation, etc.) for aligning optics of the test eyeglasses frame to the person's face and eye geometry.
The eyeglasses component kit may also include a pair of nose pads that can be attached to the two half-frames. The nose pads may be adjustable to make a comfortable fit to the person's nose and to set a height of the test eyeglasses frame on the person's face for aligning optics of the test eyeglasses frame to the person's face and eye geometry.
The optometric tool may be configured (using, e.g., the electronics in the temple pieces) to display test stimuli on the virtual display during the fitting process. The test stimuli may, for example, include an optometric test pattern or image. The optometric test patterns or images may, for example, include color patterns to assess color uniformity, line patterns to assess distortion, or dot patterns to assess focus. Multiple test stimuli that capture perception of different qualities may be used.
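For illustration, the line and dot patterns mentioned above could be generated as simple images, as in the sketch below. The resolutions and spacings are arbitrary placeholder values, not parameters from this disclosure.

```python
# Illustrative sketch: generating simple optometric-style test patterns of the
# kinds described above (line patterns for distortion, dot patterns for focus).
import numpy as np


def line_pattern(height: int = 480, width: int = 640, period: int = 16) -> np.ndarray:
    """Vertical line grating, useful for judging geometric distortion."""
    img = np.zeros((height, width), dtype=np.uint8)
    img[:, ::period] = 255
    return img


def dot_pattern(height: int = 480, width: int = 640, spacing: int = 32) -> np.ndarray:
    """Regular dot grid, useful for judging focus uniformity across the display."""
    img = np.zeros((height, width), dtype=np.uint8)
    img[::spacing, ::spacing] = 255
    return img
```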
The optometric tool may record the person's responses to the test stimuli displayed on the virtual display of the test eyeglasses frame, at each stage of the fitting process. The responses may include, for example, one or more of: an ability to fully see the micro display (e.g., seeing all four corners of the display); uniformity, brightness, color, sharpness, contrast, and other image characteristics of the whole image as perceived by the person; uniformity or non-uniformity of focus of the image across the micro display as perceived by the person; vertical and horizontal alignment, keystone, and other geometric image distortions as perceived by the person; and other aspects related to image quality and geometry as perceived by the person.
The optometric tool may be provisioned with readout mechanisms to capture data on the test eyeglasses frame configurations (e.g., dimensions, orientations, spacings of the frame components, etc.) at each stage of the fitting process. The readout mechanisms may include encoders/sensors (e.g., mechanical, optical, or electrical sensors, etc.). The encoders/sensors may respond to mechanical changes (or a state of the frame configuration) to capture the test eyeglasses frame geometry. Other example readout mechanisms may involve computer vision. For example, camera-based observations may be used to capture a current state of the test eyeglasses frame configuration. In some instances, the readout mechanisms may involve manual measurements and observations (e.g., human observation of distances and angles of the test eyeglasses frame geometry).
A best-fit configuration of the test eyeglasses frame may correspond to a desired or target perception of the stimuli (e.g., a full view of the virtual display) by the person. The optometric tool may capture data on the best-fit test eyeglasses frame configuration (e.g., dimensions, orientations, spacings of the frame components) that corresponds to the desired or target perception of the stimuli.
Data on the test eyeglasses frame configurations (e.g., dimensions, orientations, spacings of the frame components, etc.) and data on the person's responses to the various display stimuli presented by the optometric tool during the fitting process may be collected as fitting results data. The fitting results data can be used to determine the geometry of a new eyeglasses frame that matches, for example, the best-fit configuration of the test eyeglasses frame. The new eyeglasses frame may be fabricated as a single piece having a fixed geometry (e.g., a non-modifiable geometry).
The fitting results data obtained by the optometric tool while fitting eyewear may include anthropometric or biometric data that can be used for customization of eyeglasses and also devices other than eyewear for the person. The anthropometric or biometric data may, for example, include one or more of interpupillary distance, wrap angle, cyclo-rotation, etc. The anthropometric data may be used to customize and fit, for example, swimming goggles, or a motorcycle helmet, to the person.
Method 1200 may be implemented using an optometric tool that includes a glasses component kit (e.g., kit 1000) that can be used to assemble and fit a test frame to the person. The test frame may include two half-frames adapted to hold a pair of see-through lenses in front of the person's eyes. A virtual display (e.g., an in-lens micro display) may be embedded in at least one of the pair of see-through lenses, or the virtual display may be a projected display overlaid on at least one of the pair of see-through lenses. The test frame may include electronics (e.g., in the temple pieces) for displaying stimuli (e.g., images and patterns) on the virtual display.
Method 1200 includes assembling a test adjustable eyeglasses frame in a respective geometric configuration with components from the kit (1210).
Method 1200 further includes, in an iterative fitting procedure, placing the test adjustable eyeglasses frame on the person to hold a virtual display associated with the eyeglasses frame in front of the person's eyes (1220), displaying one or more test stimuli on the virtual display (1230), and recording the person's perception of the displayed one or more test stimuli (1240).
When the person's perceptions of the displayed one or more test stimuli match target perceptions of the one or more test stimuli, method 1200 includes determining the geometric configuration of the test adjustable frame as being a best-fit geometric configuration of the test frame (1250). When the person's perceptions of the displayed one or more test stimuli do not match the target perceptions of the one or more test stimuli, method 1200 includes readjusting the respective geometric configuration of the adjustable test eyeglasses frame, and repeating the iterative fitting procedure (1260).
Method 1200 further includes recording results of the iterative fitting procedure (1270), and building an eyewear device with a fixed geometry for the person based on anthropometric or biometric data extracted from the results of the iterative fitting procedure (1280).
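The loop structure of method 1200 can be sketched as follows. The "perception" model (a full view is reported once the inter-frame separation is within 0.5 mm of an ideal value) is a toy assumption used only to show how steps 1210-1280 iterate; it is not part of the disclosure.

```python
# Self-contained toy sketch of the iterative fitting loop in method 1200.
# The perception model and all numeric values are assumptions for illustration.
def perceives_full_view(separation_mm: float, ideal_mm: float = 6.0) -> bool:
    # Stand-in for recording the person's perception (step 1240).
    return abs(separation_mm - ideal_mm) <= 0.5


def iterative_fit(initial_separation_mm: float = 9.0) -> float:
    separation = initial_separation_mm          # assemble test frame (1210)
    while not perceives_full_view(separation):  # place, display, record (1220-1240)
        separation -= 0.5                       # readjust and repeat (1260)
    return separation                           # best-fit configuration (1250)


best_fit_separation_mm = iterative_fit()
# Steps 1270/1280 would record this result and build a fixed-geometry
# eyewear device that matches the best-fit configuration.
```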
The adjustable components from the kit used in method 1200 may include components that are adjustable to set one or more of interpupillary distance, vertical distance from the nose to the pupils, face wrap, head width, and ear position of the test adjustable eyeglasses frame. The adjustable components from the kit may include nose bridges and nose pads adjustable to set one or more of interpupillary distance, vertical distance from the nose to the pupils, wrap angles, and cyclo-rotations of the test adjustable eyeglasses frame. Alternatively or additionally, the adjustable components from the kit may include a replaceable nose bridge corresponding to a fixed inter-lens distance and a fixed wrap angle of the test adjustable eyeglasses frame.
In method 1200, displaying one or more test stimuli on the virtual display may include displaying one or more optometric test images and test patterns. The optometric test images and test patterns may include one or more of color patterns to assess color uniformity, line patterns to assess distortion, or dot patterns to assess focus. Displaying the one or more optometric test images and test patterns may include displaying industry-standard optometric test images and test patterns on the virtual display.
In method 1200, the target perceptions of the one or more test stimuli may include one or more of:
In method 1200, building an eyewear device with a fixed geometry for the person based on anthropometric or biometric data extracted from the results of the iterative fitting procedure includes building a fixed-geometry eyeglasses frame that matches the best-fit geometric configuration of the test frame.
While example embodiments may include various modifications and alternative forms, embodiments thereof are shown by way of example in the drawings and description herein. It should be understood, however, that there is no intent to limit example embodiments to the particular forms disclosed, but on the contrary, example embodiments are to cover all modifications, equivalents, and alternatives falling within the scope of the claims. Like numbers refer to like elements throughout the description of the figures.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which can be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device. Various implementations of the systems and techniques described here can be realized as and/or generally be referred to herein as a circuit, a module, a block, or a system that can combine software and hardware aspects. For example, a module may include the functions/acts/computer program instructions executing on a processor (e.g., a processor formed on a silicon substrate, a GaAs substrate, and the like) or some other programmable data processing apparatus.
Some of the above example embodiments are described as processes or methods depicted as flowcharts. Although the flowcharts describe the operations as sequential processes, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of operations can be re-arranged. The processes can be terminated when their operations are completed, but may also have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, subprograms, etc.
Methods discussed above, some of which are illustrated by the flow charts, can be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks can be stored in a machine or computer readable medium such as a storage medium. A processor(s) may perform the necessary tasks.
Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Example embodiments may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term and/or includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being connected or coupled to another element, it can be directly connected or coupled to the other element or intervening elements can be present. In contrast, when an element is referred to as being directly connected or directly coupled to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., between versus directly between, adjacent versus directly adjacent, etc.).
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms a, an and the are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms comprises, comprising, includes and/or including, when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two operations shown in succession may in fact be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Portions of the above example embodiments and corresponding detailed description are presented in terms of software, or algorithms and symbolic representations of operation on data bits within a computer memory. These descriptions and representations are the ones by which those of ordinary skill in the art effectively convey the substance of their work to others of ordinary skill in the art. An algorithm, as the term is used here, and as it is used generally, is conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of optical, electrical, or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
In the above illustrative embodiments, reference to acts and symbolic representations of operations (e.g., in the form of flowcharts) that can be implemented as program modules or functional processes include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types and may be described and/or implemented using existing hardware at existing structural elements. Such existing hardware may include one or more Central Processing Units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), computers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, or as is apparent from the discussion, terms such as processing or computing or calculating or determining or displaying or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Note also that the software implemented aspects of the example embodiments are typically encoded on some form of non-transitory program storage medium or implemented over some type of transmission medium. The program storage medium can be magnetic (e.g., a floppy disk or a hard drive) or optical (e.g., a compact disk read only memory, or CD ROM), and can be read only or random access. Similarly, the transmission medium can be twisted wire pairs, coaxial cable, optical fiber, or some other suitable transmission medium known to the art. The example embodiments are not limited by these aspects of any given implementation.
Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present disclosure is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or embodiments herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.
Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/US2020/070593 | 9/29/2020 | WO |