Aspects described herein relate to human-computer interface devices. More specifically, aspects described herein relate to input devices usable to provide locomotion input to a computer system, e.g., for use with virtual reality, augmented reality, and/or mixed reality environments, or other applications or systems that may benefit from locomotion input.
Virtual Reality (“VR”), Augmented Reality (“AR”), and Mixed Reality (“MR”) describe computer systems used to allow a user to view and interact with a virtual environment. Such virtual environments may comprise simulations of light, sound, touch, and movement, such that a user may navigate and manipulate the virtual environment as if it were real, or may augment the user's environment, e.g., using a system such as HOLOLENS™ products sold by Microsoft Corp. of Redmond, Wash. The degree of immersion in a virtual environment is often correlated with a computer system's ability to replicate, for example, the natural motions of the user. There is thus an ever-present need to improve ways in which user locomotion is translated into virtual environments.
Known input systems relating to locomotion include, for example, floor mat game controllers such as the Power Pad sold by Nintendo Co., Ltd. of Kyoto, Japan, and various dance pad controllers sold by Konami Corp. of Tokyo, Japan. Such a device is disclosed in U.S. Pat. No. 6,786,821 to Nobe et al. These floor mat game controllers are designed to emulate button presses on a video game controller using a user's feet and thereby ultimately fail to emulate natural locomotion.
Known systems further include treadmill devices that track the user's movement on the treadmill. Such a device is disclosed in U.S. Pat. No. 5,562,572 to Carmein. These treadmill devices are often mechanically complicated, and are thus encumbered by the inherent lag times and momentum problems associated with moving mechanical masses.
Other known systems include a concave base that the user steps into and walks in place, as gravity brings the user back into the center of the concave base. Such a device is disclosed in U.S. Pat. App. PCT/US2015/021614. This system is cumbersome: the user must wear a harness and special shoes to use it, adding cost and complexity.
Yet another system uses a pressure-sensitive mat that the user walks around on. This system is disclosed in U.S. Pat. No. 7,520,836 to Couvillion et al. While this system reduces the complexity of other systems, the user has only a finite physical area in which to move, limiting travel in the virtual environment.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosure. The summary is not an extensive overview of the disclosure. It is neither intended to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the description below.
Aspects described herein provide a system which uses an array of sensors to determine the natural motion of a user and implement such motions in a virtual environment.
A sensor array may be configured with a plurality of sensors, such as pressure-sensitive sensors, which detect the motion of a user. The sensor array may be configured to transmit signals from the sensors to a computing device. A computing device may, based on these signals and, if available, other information from other additional sensors, determine a vector corresponding to both a magnitude and direction of a user's movement in real space. The computing device may translate this vector into the virtual environment such that, for example, the user's motion in real space is substantially replicated in the virtual environment.
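The vector determination described above can be illustrated with a brief sketch. This is an assumed, simplified model — not the claimed implementation — in which each sensor has a known (x, y) position in array coordinates (origin at the array's center) and reports a non-negative pressure value; the movement vector is then the pressure-weighted sum of sensor positions.

```python
def movement_vector(readings):
    """Compute a pressure-weighted direction and magnitude of user movement.

    readings: list of ((x, y), pressure) tuples in array coordinates,
    with the origin at the center of the sensor array.
    """
    vx = sum(x * p for (x, y), p in readings)
    vy = sum(y * p for (x, y), p in readings)
    magnitude = (vx ** 2 + vy ** 2) ** 0.5
    return (vx, vy), magnitude

# A firm step on a sensor ahead of center yields a forward-pointing vector.
vec, mag = movement_vector([((0.0, 1.0), 5.0), ((0.0, -0.2), 1.0)])
```

In this sketch, a heavier reading farther from center produces a larger vector, which a program could then map onto walking or running speed in the virtual environment.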
The sensor array described herein may be comprised of one or more modular elements, each comprising one or more sensors. For example, the sensor array may be substantially octagonal, formed from eight modular elements connected together. Each modular element may contain one or more sensors to detect, for example, pressure, and may additionally include other sensors and/or a microcontroller for sending signals from the sensors to the computing device. The modularity of the sensor array allows the array to be as large or as small as a user may desire.
The vector determined by the computing device may be based on not only signals from the sensor array, but from other information available to the computing device, thereby enhancing accuracy. For example, the positional tracking used in modern virtual reality headsets, such as the VIVE™ virtual reality headset sold by HTC Corporation of Taoyuan, Taiwan, may be used by the computing device to better process signals received from the sensor array.
The details of these and other features are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
These and other features, aspects, and advantages of the present disclosure will become better understood with regard to the following description, claims, and drawings. The present disclosure is illustrated by way of example, and not limited by, the accompanying drawings in which like numerals indicate similar elements.
In the following description of the various features, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration, various features of the disclosure that may be practiced. It is to be understood that other features or combinations of features may be utilized.
Sensor array 100 may be any appropriate shape for capturing user locomotion. For example, sensor array 100 may substantially comprise a circle, oval, square, or octagon. Indeed, because sensor array 100 may be comprised of modular elements 101-105, further discussed below, sensor array 100 may be configured into a wide variety of shapes. Sensor array 100 may be shaped such that certain, more frequently used portions of sensor array 100 are larger than less commonly used portions of sensor array 100. For example, a portion of sensor array 100 corresponding to forward motion may be larger than a portion of sensor array 100 corresponding to backwards motion.
Sensor array 100 may have one or more modular elements 101-105. Like the sensor array 100, modular elements may be any shape. Modular elements may, for example, be square tiles similar to floor tiles to allow for a limitless number to be connected together, or may be one or more portions of a fixed shape, such as the circular shape depicted in
Each modular element 101-105 may contain one or more layers. Each modular element may comprise a top layer 106, a sensor layer 107, and a bottom layer 108. The bottom layer 108 may be designed for impact resistance and/or cushioning, whereas the top layer 106 may be designed to protect sensor layer 107 from the user. These layers may be any material or combination of materials. Materials for each layer may be selected based on a variety of considerations, including structural integrity, comfort, durability, cost, and cleanliness. For example, bottom layer 108 may be plastic and rubber, sensor layer 107 may be one or more sensors 109 and foam, and top layer 106 may be plastic. As another example, a more expensive modular element may be built entirely from metal and rubber.
Each modular element 101-105 may have more or fewer layers as necessary. For instance, some modular elements might not have a bottom layer 108. A modular element may comprise only a single sensor layer 107 containing one or more sensors. Such a barebones modular element may be useful where, for example, a user wishes to place the sensors underneath a floor covering, such as underneath floor tiles or a rug.
The one or more sensors 109 may be any sensor(s) appropriate for detecting user locomotion and transmitting a signal based on the locomotion. The one or more sensors 109 may include one or more pressure-sensitive sensors, such as resistive type pressure sensors and/or load cells. The one or more sensors 109 may be configured to detect any information relating to user locomotion, such as pressure, weight, impact, or the like. The one or more sensors may be different types of sensors in any configuration such that, for example, a plurality of sensor types may be on the same modular element 101-105. Where more than one sensor of the one or more sensors 109 is in a modular element 101-105, the more than one sensor may be disposed in the modular element 101-105 in any appropriate manner that best detects user locomotion. For example, if modular element 104 contained one hundred pressure-sensitive sensors, the pressure-sensitive sensors may be arranged to substantially form a one-hundred-element grid across sensor layer 107 to best capture user locomotion across the entirety of, e.g., modular element 104.
Sensors may substantially overlap such that redundancies may exist. For example, one or more sensors 109 may be configured such that pressure in an area of modular element 104 causes a plurality of the sensors to transmit signals. Such redundancies may be useful to increase the accuracy of the signal transmitted by the one or more sensors.
Modular elements 101-105 may contain additional hardware or circuitry to transmit signals from sensor(s) contained in the modular element. Each modular element may contain a microcontroller (not pictured) used to transmit signals from one or more sensors to another modular element or to computing device 111.
Modular elements 101-105 may also contain circuitry which enhances VR/AR/MR tracking, such as circuitry which facilitates positional tracking of a user. For example, the VIVE™ infrared lighthouses sold by HTC Corporation of Taoyuan, Taiwan currently use infrared light signals to allow a headset to track its position in real space. Infrared LEDs may be implemented into one or more modular elements 101-105 for similar purposes.
One or more modular elements 101-105 may be designated as a neutral zone such that the user may stand on it when movement in the virtual environment is not desired. In
Sensor array 100 may have one or more wired or wireless interfaces 110a-110b configured to transmit signals associated with the one or more sensors associated with modular elements 101-105 to computing device 111 or to other modular elements 101-105. Such one or more interfaces 110a-110b may be wired or wireless, and may comprise one or a plurality of interface types. For instance,
As noted above, one or more modular elements 101-105 may receive signals from other modular elements 101-105. Modular element 105, for example, may receive signals over a first interface from modular elements 101-104 and transmit them, using the same or a second interface, to computing device 111. In this manner, only one modular element need be tasked with transmitting to computing device 111. These communications may also reduce the need for interfaces between sensor array 100 and computing device 111: if sensor array 100 were the size of an entire room with tens of modular elements, then intra-modular element communications would prevent the need for tens of wires running from each modular element to computing device 111.
As will be detailed further below, computing device 111 may be any computing device ultimately connected to sensor array 100 through one or more interfaces 110a-110b. Computing device 111 may be physically attached to sensor array 100 as well. For example, computing device 111 may be located underneath modular element 105. Computing device 111 may additionally or alternatively be, as explained in more detail below, a personal computer, laptop, video game console, or the like physically separated from sensor array 100.
Computing device 111 may be configured with one or more additional sensors 112 relating to user motion. The one or more additional sensors 112 may include, for example, one or more motion trackers, such as the aforementioned VIVE™ infrared lighthouses sold by HTC Corp. of Taoyuan, Taiwan, the OCULUS™ sensor devices sold by Oculus VR of Irvine, Calif., or the KINECT™ motion sensing input device sold by Microsoft Corp. of Redmond, Wash. The additional sensors 112 may transmit signals to computing device 111 over one or more wireless or wired interfaces 113a-113b, which may be separate to or the same as the one or more interfaces 110a-110b. For example, computing device 111 may receive signals from both sensor array 100 and the one or more additional sensors 112 over Bluetooth.
But if pressure changes the resistivity of the second sensor to 10 ohms, then the output voltage rises accordingly. This increase in voltage may be detected and associated with a change in pressure.
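The resistive sensing described above can be sketched as a simple voltage divider. The circuit below is an assumption for illustration — a fixed reference resistor in series with the pressure-sensitive sensor across a supply voltage, with the output taken across the fixed resistor, and an assumed 200-ohm unpressed sensor resistance — since the specific circuit values are not given here.

```python
def divider_output(v_supply, r_fixed, r_sensor):
    """Voltage across the fixed resistor in a series divider with the sensor."""
    return v_supply * r_fixed / (r_fixed + r_sensor)

# Lower sensor resistance under pressure raises the voltage across the
# fixed resistor; the computing device maps that increase back to pressure.
v_rest = divider_output(5.0, 100.0, 200.0)  # sensor unpressed (assumed 200 ohms)
v_step = divider_output(5.0, 100.0, 10.0)   # sensor pressed down to 10 ohms
```

A microcontroller's analog-to-digital converter could sample this output voltage and transmit the digitized value to computing device 111.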
In step 601, sensor array 100 is configured. Configuration may include establishing one or more wired or wireless interfaces 110a-110b. Sensor array 100 may, for example, be a Universal Serial Bus Human Interface Device (“USB HID”) (e.g., a USB HID compliant device) and use one or more device drivers to establish communications with computing device 111. The manner in which one or more wired or wireless interfaces 110a-110b may be established depends on the nature of the interface.
Configuration in step 601 may include determining one or more baseline measurements for the one or more sensors 109, such as a value of the sensor when the user is not moving. In this step, computing device 111 may determine a threshold sensitivity level associated with one or more sensors 109. If, for example, sensor array 100 comprises a plurality of pressure-sensitive sensors, the baseline measurements may allow the computing device 111 to determine readings to ignore from the sensor array 100. In this manner, if two sensors' baselines are different, computing device 111 may account for this difference in subsequent calculations. For instance, computing device 111 may detect that, when a user is not moving, a sensor erroneously reports motion, and thus computing device 111 may ignore or reduce its consideration of signals from that sensor. Such a baseline may change over time. For example, the layers of one or more modular elements 101-105 may wear down with use such that a sensor reads more pressure at baseline after thirty minutes of use than it did when initially configured. Computing device 111 may thus reset the baseline measurements to account for such changes.
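The baseline step above can be sketched as follows. This is a minimal, assumed model — sensor identifiers, the margin value, and the averaging scheme are all hypothetical — in which the array is sampled while the user stands still and later readings within a margin of that baseline are ignored.

```python
class Baseline:
    """Per-sensor baseline measurements with a noise margin."""

    def __init__(self, margin=2.0):
        self.margin = margin
        self.zero = {}

    def calibrate(self, samples):
        """samples: {sensor_id: [readings taken while the user is still]}."""
        self.zero = {sid: sum(vals) / len(vals) for sid, vals in samples.items()}

    def significant(self, sensor_id, reading):
        """True only if the reading departs meaningfully from its baseline."""
        return abs(reading - self.zero.get(sensor_id, 0.0)) > self.margin

# A sensor that erroneously reads high at rest ("s2") gets its own baseline,
# so its idle readings are ignored while a real press on "s1" registers.
b = Baseline(margin=2.0)
b.calibrate({"s1": [0.1, 0.2, 0.1], "s2": [5.0, 5.1, 4.9]})
```

Re-running `calibrate` periodically would account for the wear-related baseline drift described above.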
In configuration step 601, one or more portions of sensor array 100 may be disabled or ignored based on, for example, whether all or part of sensor array 100 is broken or not needed for a particular program executing on computing device 111. For example, if sensor array 100 covers the entire floor of a large room but a program executing on computing device 100 does not require that a user move around the entire room, a portion of sensor array 100 may be disabled to maximize the computational efficiency of computing device 111, to conserve power, and/or to reduce confusion by a user inadvertently stepping on an unused sensor. Computing device 111 may instruct a microcontroller in sensor array 100 to cease transmission of signals associated with one or more sensors on this basis.
In configuration step 601, modular elements 101-105 may be classified or ranked, including classifying one or more modular elements as a neutral zone. A neutral zone may be classified where a portion of sensor array 100 is associated with a lack of motion, that is, standing. Computing device 111 may instruct microcontrollers in sensor array 100 to cease transmission of signals associated with sensors in a neutral zone. Additionally or alternatively, computing device 111 may simply ignore and/or discard signals received that are associated with sensors in a neutral zone. The neutral zone may change over time based on, for example, the nature of a program executing on computing device 111 such that this configuration step may be repeated by computing device 111 as needed.
Configuration step 601 may further entail determining the locations of one or more sensors. As will be described in more detail below, the location of the one or more sensors may have different implications based on, for example, the orientation of the user. Sensors may be identified based on their location within a modular element or based on their location in sensor array 100. For instance, modular elements 101-105 may be configured using microcontrollers to communicate and determine the way in which they have been attached to ultimately determine the locational relationship between sensors. Computing device 111 may additionally or alternatively require a user to implement one or more configuration steps to identify positions on sensor array 100. Computing device 111 may, for example, prompt a user to step on portions of sensor array 100 corresponding to cardinal directions and associate sensors actuated during such steps with the cardinal directions.
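The cardinal-direction prompt above might be sketched like this. The callback interface and sensor identifiers are hypothetical; the idea is only that whichever sensors actuate during each prompted step are recorded for that direction, yielding a sensor-to-direction map.

```python
def calibrate_directions(prompt_step):
    """Build a direction-to-sensors map from prompted calibration steps.

    prompt_step(direction) -> set of sensor ids actuated during that step.
    """
    mapping = {}
    for direction in ("north", "east", "south", "west"):
        mapping[direction] = prompt_step(direction)
    return mapping

# Simulated user responses standing in for real sensor events.
fake_steps = {"north": {"s1", "s2"}, "east": {"s3"},
              "south": {"s4"}, "west": {"s5"}}
directions = calibrate_directions(lambda d: fake_steps[d])
```

Inverting this map would let computing device 111 later translate an actuated sensor into a direction of travel.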
In step 602, one or more signals are received from sensor array 100. These signals may be on one or more interfaces. For instance, the one or more signals received from sensor array 100 may comprise quantized values of sensor readouts transmitted over a wireless connection, such as Bluetooth.
The one or more signals transmitted by sensor array 100 and received in step 602 may be formatted in a variety of ways. Sensor array 100 may use one or more microcontrollers to process and transmit digitized forms of analog signals transmitted by sensors in sensor array 100. Sensor array 100 may additionally or alternatively multiplex signals received from the one or more sensors and transmit the multiplexed signals as a single signal to the computing device 111. Because input lag is undesirable, minimal processing may be preferable before signals are received by computing device 111. As such, the format of the signals received in step 602 may be very basic and require processing by computing device 111.
Receipt of signals in step 602 may be responsive to a request by computing device 111. Some standards, including the USB HID specification, entail a process whereby computing device 111 polls sensor array 100 at a frequency, such as 250 Hz. This process advantageously allows computing device 111 to vary the polling frequency based on, for example, the desired fidelity of information from sensor array 100.
Processing of the signal may be required upon receipt to decrypt or reduce errors in the signal. For example, if the signal received in step 602 contains a number of errors, any number of known algorithms may be used to correct such errors. If the signal received from sensor array 100 is multiplexed, computing device 111 may demultiplex the signal.
In step 603, computing device 111 may determine the locations of the signals received with respect to sensor array 100. The signal retrieved in step 602 may contain sufficient information to allow computing device 111 to determine the location of each sensor as well as one or more readings associated with the sensor. Such locational information may be based on the locations determined in step 601.
In step 604, computing device 111 may receive one or more second signals from one or more additional sensors 112 over one or more interfaces 113a-113b. Sensors may come from a variety of sources, including one or more positional trackers, cameras, microphones, accelerometers, gyroscopes, or the like. The particular manner in which a signal is received by computing device 111 will differ based on the additional sensor 112 in question. For example, a positional tracker may be connected to computing device 111 over USB, but an accelerometer may be included in a wireless video game controller. The second signals may be processed by a driver or an API executing on computing device 111. For instance, signals received by computing device 111 from sensor array 100 may come via a first API executing on a video game engine, whereas signals from a microphone additional sensor 112 may be received by computing device 111 using a second API.
In step 605a, if one or more second signals are not received by computing device 111, then computing device 111 interprets the one or more signals received from sensor array 100. In step 605b, if one or more second signals are received by computing device 111, then computing device 111 interprets both the one or more signals received from sensor array 100 and the second signals. Interpretation of either or both signals may be based on any of the considerations discussed below.
Interpretation of signals received from sensor array 100 may be based on one or more values derived from the signals. Computing device 111 may process received signals to derive one or more values, such as a fixed reading, a rate of change, or the like. A sensor of the sensor array 100 may provide a range of values corresponding to, for example, a range of pressures on a pressure sensor. These values may be used to interpret user motion by, for example, associating certain values with walking and certain values with running. Such values may depend on the type of sensor in sensor array 100. For instance, a sensor of sensor array 100 may be configured to transmit signals indicating weight, but not pressure.
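The value-to-gait association above can be sketched with assumed thresholds. The threshold numbers and units are hypothetical — in practice they would depend on the sensor type and the baseline configuration from step 601.

```python
WALK_THRESHOLD = 10.0   # assumed units from the sensor readout
RUN_THRESHOLD = 40.0

def classify_gait(value):
    """Map a derived sensor value to a gait classification."""
    if value < WALK_THRESHOLD:
        return "standing"
    if value < RUN_THRESHOLD:
        return "walking"
    return "running"
```

A rate-of-change value could be classified the same way, e.g., distinguishing a stomp from a gradual weight shift.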
Interpretation of signals received from sensor array 100 may comprise determining a vector associated with signals received from sensor array 100. A large step towards the top of sensor array 100 may generate a different vector than, for example, a small step towards the bottom of sensor array 100.
Interpretation of signals received from sensor array 100 may comprise determining a direction in which a user faces. A user may, for instance, rotate their body 180 degrees and step forward. In this case, a signal that would formerly have represented a backwards step would represent a forwards step. The direction in which the user faces may be determined by current or historical signals received from sensor array 100 or by the one or more second signals received from additional sensors 112.
Interpretation of signals received from sensor array 100 may comprise associating the direction in which the user faces and the vector associated with sensor array 100. A large step towards the top of sensor array 100, for example, may travel in different directions (and therefore involve different sensors on sensor array 100) based upon the direction in which the user faces. For example, if the user is facing towards the top of the sensor array, the step may indicate forward travel, whereas if the user is facing away from the top of the sensor array, the step may indicate backwards travel. Computing device 111 may determine an offset angle based on a comparison of the direction which the user is facing and the vector associated with user motion. This association may be used by computing device 111 to avoid forcing the user to re-orient their physical orientation with their virtual orientation.
The association of the direction in which the user faces and the vector associated with sensor array 100 may comprise determination of a second vector. An example of such calculations is provided in the following paragraphs.
For this example, let the sensor array 100 be a circle which is divided into n slices, where n is a multiple of 4. Let q be the number of slices per quadrant of the sensor array 100, such that q=n/4.
Let θ be the central angle (in degrees) of the arc of each slice, such that θ=360/n.
Let i be the ordering of each slice within each quadrant where, counting in a clockwise fashion, i ranges from 0 to q−1. Let r be the vector in a single slice, where
Then, let R be the vector of an entire quadrant composed of q slices:
Since the magnitude of r is always positive, a sign must be placed on each vector R according to the quadrant it represents. Let the x-components of the first and second quadrants be positive, the x-components of the third and fourth quadrants be negative, the y-components of the first and fourth quadrants be positive, and the y-components of the second and third quadrants be negative. Finally, let the vector V be the summation of the quadrant vectors R such that Vx²+Vy²=V², Vx=Rx1+Rx2−Rx3−Rx4, and Vy=Ry1−Ry2−Ry3+Ry4. The vector V is the absolute directional vector generated by the absolute vector calculation algorithm.
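The quadrant summation can be sketched directly from the sign convention stated above (x positive in quadrants 1 and 2, y positive in quadrants 1 and 4). The tuple-based interface is an assumption for illustration.

```python
def absolute_vector(quadrants):
    """Sum four unsigned quadrant vectors into the absolute directional vector.

    quadrants: list of four (Rx, Ry) pairs, ordered quadrant 1 through 4,
    each with non-negative components (magnitudes of r are always positive).
    """
    (r1x, r1y), (r2x, r2y), (r3x, r3y), (r4x, r4y) = quadrants
    vx = r1x + r2x - r3x - r4x   # x positive in quadrants 1 and 2
    vy = r1y - r2y - r3y + r4y   # y positive in quadrants 1 and 4
    return vx, vy

# Equal pressure in all quadrants cancels out: the user is standing still.
vx, vy = absolute_vector([(1.0, 1.0), (1.0, 1.0), (1.0, 1.0), (1.0, 1.0)])
```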
The rotation of the user is presented through many VR, AR, and MR systems' APIs as a quaternion. The resultant vector representing the user's motion is a multiplication of the VR system's rotation of the user and the absolute directional vector. Let the quaternion of the VR system's rotation be Q, and the absolute vector be V. The velocity vector of the user in a VR, AR, or MR virtual environment (Vuser) is the product of the quaternion Q and the absolute vector V: Vuser=Q×V
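One common way to apply a quaternion rotation to a vector is the sandwich product q·v·q*, sketched below for a unit quaternion in (w, x, y, z) order. The component ordering and handedness are assumptions; real VR APIs differ in both and should be checked.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate 3-D vector v by unit quaternion q via q * v * conj(q)."""
    qw, qx, qy, qz = q
    conj = (qw, -qx, -qy, -qz)
    w, x, y, z = quat_mul(quat_mul(q, (0.0,) + tuple(v)), conj)
    return (x, y, z)

# A 90-degree rotation about the vertical (z) axis turns +x motion into +y,
# as when the user turns a quarter turn before stepping.
q90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
```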
The rotation angle and quadrant calculation will generate an angle α which represents the offset of the absolute directional vector from the x or y axis. The angle α will enable the pad to apply a rotation to the absolute directional vector to generate a relative directional vector. The calculation of angle α is dependent on the magnitude of the x component of the absolute directional vector, the magnitude of the y component of the absolute directional vector, and the quadrant in which the absolute directional vector lies, as follows:
Let the absolute directional vector components Vx, Vy, and p represent the x component, y component, and quadrant of the absolute directional vector. Let α represent the angle by which the absolute directional vector is offset from the nearest axis in the counterclockwise direction, such that: in the first quadrant, the x and y components are positive, and
in the second quadrant, the x component is positive, the y component is negative, and
in the third quadrant, the x and y components are negative, and
in the fourth quadrant, the x component is negative, the y component is positive, and
Let the rotation quadrant t equal the current quadrant of the absolute directional vector p such that t=p.
The application of the rotation angle and quadrant offset algorithm uses the current quadrant of the absolute directional vector p, the offset quadrant t, and the rotation angle α to rotate the absolute directional vector by α and by quadrant t to create a new relative directional vector V′, as follows:
Let the absolute directional vector components Vx, Vy, and p represent the x component, y component, and quadrant of the absolute directional vector. Then, to apply the rotation angle α, let V′x=Vx cos α−Vy sin α, and let V′y=Vy cos α+Vx sin α. Then, to apply the offset quadrant, the following logic rotates the offset vector into the appropriate quadrant:
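The in-plane rotation above can be sketched as follows. The quadrant bookkeeping is omitted here as an assumed simplification: with a signed angle covering the full circle (in radians), a single rotation suffices.

```python
import math

def to_relative(vx, vy, alpha):
    """Rotate the absolute directional vector by alpha (radians, CCW)."""
    rx = vx * math.cos(alpha) - vy * math.sin(alpha)
    ry = vy * math.cos(alpha) + vx * math.sin(alpha)
    return rx, ry

# A rightward absolute vector becomes forward after a quarter-turn offset.
rx, ry = to_relative(1.0, 0.0, math.pi / 2)
```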
Discussion will now return to various methods of interpreting signals in steps 605a-605b.
Signals received by computing device 111 from sensor array 100 and/or signals from the one or more additional sensors 112 may be consistent or may conflict. For instance, signals received from sensor array 100 may indicate that a user desires to step forward, whereas signals received from the one or more additional sensors 112 may indicate that a user remains still. Computing device 111 may take any appropriate steps, including the considerations detailed below, to interpret signals in view of such conflict. For example, algorithms such as a Kalman filter may be used to resolve such conflicts.
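The conflict resolution above can be illustrated with the single measurement-update step of a Kalman-style filter. This is a deliberately minimal sketch assuming each source reports a scalar speed estimate plus a variance expressing its confidence; a full Kalman filter would also model state dynamics over time.

```python
def fuse(est_a, var_a, est_b, var_b):
    """Variance-weighted combination of two noisy estimates."""
    k = var_a / (var_a + var_b)      # trust est_b more when var_b is small
    fused = est_a + k * (est_b - est_a)
    fused_var = (1.0 - k) * var_a
    return fused, fused_var

# Sensor array says the user steps forward; headset tracking says still.
# With equal confidence, the fused estimate splits the difference.
speed, var = fuse(1.0, 0.25, 0.0, 0.25)
```

Lowering `var_b` relative to `var_a` would shift the fused estimate toward the additional sensors, modeling a case where headset tracking is considered more reliable.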
Interpretation of signals received from sensor array 100 may depend on one or more classifications associated with either or both modular elements 101-105 or the one or more sensors 109. For instance, modular element 105 may be classified as a neutral zone such that sensor information from that modular element is interpreted as a desire to not move (e.g. standing or sitting). Such a classification may be analogous to the dead zone on an analog control stick, such as those found on video game controllers. Portions of modular elements 101-105 may be classified in other ways. For instance, the proximal portions of modular elements 101-105 may be associated with slow motion, whereas the distal portions of modular elements 101-105 may be associated with fast motion.
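The classification scheme above might be sketched like this, with assumed zone labels and scale factors: a neutral-zone element suppresses motion entirely, like an analog stick's dead zone, while proximal and distal elements scale speed.

```python
ZONE_SPEED = {"neutral": 0.0, "proximal": 0.5, "distal": 1.0}

def zone_speed(element_zone, pressure):
    """Scale a raw pressure reading by the element's zone classification."""
    return pressure * ZONE_SPEED.get(element_zone, 1.0)

# Even heavy pressure on a neutral-zone element produces no motion.
standing = zone_speed("neutral", 9.9)
```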
Interpretation of signals from sensor array 100 may further be based on a history of previous signals received from sensor array 100. For instance, a user playing an exciting action game may regularly fail to return to a classified neutral zone on one or more modular elements 101-105. In that case, computing device 111 may subsequently determine that stepping near, but not on, the classified neutral zone is still indicative of a lack of motion. As another example, a user may rest their foot in one or more areas of sensor array 100 but no longer wish to travel in the virtual environment.
Interpretation of signals from sensor array 100 may further be based on one or more physical limitations associated with a user. Computing device 111 may use information associated with the user, such as the user's height, athletic ability, or previous movements, to interpret motion by the user. Computing device 111 may further interpret signals from sensor array 100 in a manner to avoid or mitigate nausea.
Interpretation of signals from sensor array 100 may further be based on one or more programs executing on computing device 111. If, for example, a user were playing a video game wherein the user was facing a virtual wall, computing device 111 may interpret signals received from sensor array 100 based on an assumption that the user does not desire to walk into the wall.
Interpretation of signals from sensor array 100 may further be based on one or more algorithms associated with the signals. Computing device 111 may apply one or more weights to received signals or conflicting interpretations. Computing device 111 may also use one or more machine learning algorithms to best determine user motion.
Interpretation of signals from sensor array 100 may further be based on one or more user preferences. A user may provide preferences to computing device 111 indicating that certain motions are to be associated with certain actions in the virtual environment. A user of sensor array 100 may, for example, prefer that all steps on a certain modular element 101-105 be interpreted as running, rather than walking.
In step 606, based on the interpretation of the one or more signals from sensor array 100 and, if applicable, the one or more signals received from additional sensors 112, computing device 111 may determine a corresponding motion in the virtual environment. The corresponding motion and the interpreted signals in steps 605a-605b may be the same or different. For example, a user playing a video game may attempt to move forwards, but the video game might not permit forwards movement due to, for example, a virtual wall. In such an example, the computing device 111 may determine that a second motion, such as a bouncing action off the virtual wall, may be appropriate. The corresponding motion may be no motion at all. One or more rules or algorithms, such as a Kalman filter, may be used to resolve such conflicts.
In step 607, computing device 111 causes the corresponding motion in the virtual environment. As noted above, the corresponding motion may be different from the interpreted motion and, in some instances, may be no motion at all. Because computing device 111 may be operating an API to manage signals received from sensor array 100, the program interpreting signals from sensor array 100 need not be the same program presenting the virtual environment to the user. For instance, the actions performed by computing device 111 in steps 601-606 may be performed by a driver executing on computing device 111 which ultimately transmits information to a second program, such as a video game, executing on computing device 111.
In step 607, causing motion may comprise sending information relating to the corresponding motion to a second computing device, program executing on the computing device 111, or the like. Computing device 111 may transmit, to another program or second computing device, data indicating a vector associated with the corresponding motion in the virtual environment determined in step 606. The second computing device or other program may further process the received vector. As such, computing device 111 need only ultimately cause motion in the virtual environment through another program, computer, or the like.
While illustrative reference has been made in various places to virtual reality and virtual environments, the teachings herein also apply to augmented reality and mixed reality environments. References to virtual reality should not be viewed as excluding augmented and/or mixed reality unless one is explicitly stated to the exclusion of the other(s).
One or more aspects of the disclosure may be embodied in a computer-usable media and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other data processing device. The computer executable instructions may be stored on one or more computer readable media such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. As will be appreciated by one of skill in the art, the functionality of the program modules may be combined or distributed as desired in various embodiments. In addition, the functionality may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (“FPGA”), and the like. Particular data structures may be used to more effectively implement one or more aspects of the disclosure, and such data structures are contemplated within the scope of computer executable instructions and computer-usable data described herein.
Aspects of the disclosure have been described in terms of examples. While illustrative systems and methods as described herein embodying various aspects of the present disclosure are shown, it will be understood by those skilled in the art, that the disclosure is not limited to these embodiments. Modifications may be made by those skilled in the art, particularly in light of the foregoing teachings. For example, each of the features of the aforementioned examples may be utilized alone or in combination or sub-combination with elements of the other examples. For example, any of the above described systems and methods or parts thereof may be combined with the other methods and systems or parts thereof described above. For example, the steps shown in the figures may be performed in other than the recited order, and one or more steps shown may be optional in accordance with aspects of the disclosure. It will also be appreciated and understood that modifications may be made without departing from the true spirit and scope of the present disclosure. The description is thus to be regarded as illustrative instead of restrictive on the present disclosure.
This application claims benefit to provisional Application No. 62/406,457, filed Oct. 11, 2016, herein incorporated by reference in its entirety, for all purposes.
Number | Date | Country
62/406,457 | Oct. 11, 2016 | US