The subject disclosure relates generally to wearable devices and more specifically to proximity and/or liveness detection using an ultrasonic transceiver.
Wearable devices (WD) or wearable computing devices comprise a class of devices that are typically battery powered and comprise a central processing unit. They may comprise further features, such as communications, display, user interface (UI), and input-output (IO) capabilities, and the like, and they may be provided with an array of sensors to enable feature-rich applications. Moreover, wearable devices or wearable computing devices are typically designed to be as small and unobtrusive as possible so as to facilitate comfort and usability without overly hindering a user's activities.
As a result, maintaining battery life is a primary concern, which is at odds with maximizing attractive features while maintaining the small size of wearable devices or wearable computing devices. Thus, ensuring that unnecessary battery drain is kept to a minimum for wearable devices or wearable computing devices is a challenge.
For example, classes of wearable devices or wearable computing devices can include augmented reality (AR) glasses, virtual reality (VR) headsets, or head mounted displays (HMDs). They typically comprise a headset with a display and communications functionality that allows receipt of visual information to be displayed to a user. In addition, to improve user experience, for example, to avoid spoiling a VR experience, such wearable devices or wearable computing devices are both lightweight and battery powered, which places battery life at a premium. Accordingly, it is of paramount importance in such devices that they are powered off or powered down to a low power state when not in use and power on quickly when use is desired.
For instance, VR headsets can comprise an optical proximity sensor that shines an infrared (IR) beam and analyzes the reflected infrared energy to determine whether a user is currently using the device. Typically, when a user wears the VR headset, it turns on, consuming battery power on the order of 10 Watts (W). When the user is not wearing the VR headset, it should remain in a low power state, as governed by the proximity sensor. However, such proximity detection schemes can be thwarted, and the resultant power saving opportunities lost, by inanimate obstructions such as the strap of the VR headset or by high storage temperatures. As a result, battery capacity can be inadvertently drained while the VR headset is not in use.
It is thus desired to provide improved wearable devices, designs and processes that address these and other deficiencies. The above-described deficiencies are merely intended to provide an overview of some of the problems of conventional implementations and are not intended to be exhaustive. Other problems with conventional implementations and techniques and corresponding benefits of the various aspects described herein may become further apparent upon review of the following description.
The following presents a simplified summary of the specification to provide a basic understanding of some aspects of the specification. This summary is not an extensive overview of the specification. It is intended to neither identify key or critical elements of the specification nor delineate any scope particular to any embodiments of the specification, or any scope of the claims. Its sole purpose is to present some concepts of the specification in a simplified form as a prelude to the more detailed description that is presented later.
Devices and methods are provided that facilitate proximity or liveness detection of a user of a wearable device, or of a user interacting with a device, based on ultrasonic information from an ultrasonic transceiver. In various embodiments, machine learning classifier models can be employed to generate classification predictions of a donned or a doffed state of a wearable device. In various aspects, a gated recurrent unit (GRU) recurrent neural network (RNN) can be employed as a machine learning classifier model. In other aspects, a liveness detection classifier decision tree can be employed as a machine learning classifier model. Power states or operating modes can be selected for associated devices based on the proximity or liveness of the user of the wearable device, as an example.
These and other embodiments are described in more detail below.
Various non-limiting embodiments are further described with reference to the accompanying drawings in which:
While a brief overview is provided, certain aspects of the subject disclosure are described or depicted herein for the purposes of illustration and not limitation. Thus, variations of the disclosed embodiments as suggested by the disclosed apparatuses, systems, and methodologies are intended to be encompassed within the scope of the subject matter disclosed herein.
As described above, it is of paramount importance in battery powered wearable devices (WD) or wearable computing devices that they are powered off or powered down to a low power state when not in use in addition to powering on quickly when device use is desired. As a result, improvements in proximity and/or liveness detection of human use or non-use of such devices are desired. It can be understood that long battery life is a user concern regardless of the type of battery-powered WD or wearable computing device that is encountered. Thus, while various embodiments are described herein in the context of virtual reality (VR) headsets or head mounted displays (HMDs), it can be understood that the various embodiments described herein are not limited to such end-user device applications.
As used herein, the term “proximity” is used to describe a condition of nearness between two or more objects, such as in the sense of an ultrasonic transceiver and a user of interest for detection of a user using a device (e.g., VR headsets, HMDs, other devices for which proximity detection is of interest) or in proximity to a device. Thus, the detection of “proximity” is of concern where a sensor can sense an object, such as a user, within the sensor's detection range. In addition, as further used herein, the term “liveness” is used to refer to indications that set an inanimate object apart from a living object or portion thereof, such as a body part, intended or caused to be in proximity to a device, which can typically include some indication in addition to and/or other than proximity. Accordingly, as used herein, determination of “liveness” can be based on proximity (e.g., detection of a user within the detection range of the ultrasonic transceiver 104/204) or a change in the proximity (e.g., variations in the detection of a user within the detection range of the ultrasonic transceiver), a physiological event detection (e.g., in the ultrasonic information provided by the ultrasonic transceiver), a predetermined time delay (e.g., elapsed time), or a predetermined classification cycle delay (e.g., number of elapsed cycles of the ultrasonic transceiver or of the application of the liveness classification algorithm). Note here that an exemplary determination of liveness can either be a positive liveness indication (e.g., that a human is wearing or is interacting with an appropriately configured device) or a negative indication (e.g., that a human has ceased wearing or interacting with an appropriately configured device). Thus, as used herein, “liveness” detection based on a physiological event detection can comprise one or more of a pulse rate detection, a circulatory flow detection, a respiratory event detection, an eye movement detection, an eye blink detection, a facial movement detection, a facial expression change detection, a muscular movement detection, and the like, without limitation. Accordingly, while the various embodiments are described herein in terms of exemplary liveness detection and power management schemes, it can be appreciated that the various disclosed embodiments can leverage the ultrasonic transceiver “liveness” indication detections in further applications, including, but not limited to, controlling a user interface, issuing device commands, rendering audio or video data to a user based on the user's perceived focus, and so on, based on the detection of the “liveness” indication.
For instance, the various embodiments described herein that facilitate proximity and/or liveness detection can find applicability in other battery-powered WD or wearable computing device contexts such as for augmented reality (AR) glasses, headphones, smart watches, fitness trackers, wearable monitors such as heart monitors, and similar devices. Moreover, the various embodiments described herein that facilitate proximity and/or liveness detection can find applicability in other battery-powered computing device contexts that are closely associated with active human use interspersed with periods of inactivity, regardless of whether the devices can be strictly defined as “wearable,” such as smart phones, video game controllers, and the like. Accordingly, it can be understood that, while the various embodiments described herein that facilitate proximity and/or liveness detection are described in terms of VR headsets or HMDs, the herein appended claims are not strictly limited to such applications.
One or more embodiments are now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a more thorough understanding of the one or more embodiments. It is evident, however, in various cases, that the one or more embodiments can be practiced without these specific details.
In a further non-limiting aspect, exemplary ultrasonic transceiver 104/204 can be positioned within the wearable device such that the field of view of the ultrasonic transceiver 104/204 will cover one or more body parts or portions thereof of the user in wearing or operation of the device. As described above, liveness and/or proximity detection for battery-powered devices in accordance with one or more embodiments described herein may be useful in device contexts other than wearable devices, regardless of whether the devices can be strictly defined as “wearable,” for example, such as for smart phones, video game controllers, and the like. Thus, exemplary ultrasonic transceiver 104/204 can be positioned within the device such that the field of view of the ultrasonic transceiver 104/204 will interact with one or more body parts or portions thereof of the user in expected operation of the device. In either case, exemplary ultrasonic transceiver 104/204 can be positioned within the device such that the one or more body parts or portions thereof can be expected to experience movement relative to the exemplary ultrasonic transceiver 104/204 when the wearable device is being worn by the user or when the user is interacting with or operating the device.
In various embodiments, exemplary UTSIP 102 or IC 202 can comprise or be associated with one or more computer-readable memory, such as memory 106. In further embodiments, exemplary UTSIP 102 or IC 202 can comprise or be associated with one or more processors, such as, for example, processor 108, that can be configured to execute computer-executable components stored in a computer-readable memory (e.g., memory 106).
As a non-limiting example, computer-executable components that can be executed by processor 108 can comprise a liveness detection classifier component 110 that facilitates liveness and/or proximity detection in accordance with one or more embodiments described herein. For instance, an exemplary liveness detection classifier component 110 can be configured as described herein to generate a classification of ultrasonic information (e.g., received from ultrasonic transceiver 104) as indicative of proximity and/or liveness of a user of a wearable device comprising exemplary UTSIP 102 or IC 202 via a liveness classification algorithm, for example, as further described herein.
As another non-limiting example, computer-executable components that can be executed by processor 108 can comprise a determination component 112 that facilitates liveness and/or proximity detection in accordance with one or more embodiments described herein. For instance, an exemplary determination component 112 can be configured to generate a determination of proximity and/or liveness of the user of the wearable device based on the classification (e.g., via exemplary liveness detection classifier component 110), for example, as further described herein.
In a non-limiting aspect, an exemplary determination of liveness can be based on proximity (e.g., detection of a user within the detection range of the ultrasonic transceiver 104/204) or a change in the proximity (e.g., variations in the detection of a user within the detection range of the ultrasonic transceiver 104/204), a physiological event detection (e.g., in the ultrasonic information provided by ultrasonic transceiver 104/204), a predetermined time delay (e.g., elapsed time), or a predetermined classification cycle delay (e.g., number of elapsed cycles of the ultrasonic transceiver 104/204 or of the application of the liveness classification algorithm). Note here that an exemplary determination of liveness can either be a positive liveness indication (e.g., that a human is wearing or is interacting with an appropriately configured device) or a negative indication (e.g., that a human has ceased wearing or interacting with an appropriately configured device). In a further non-limiting aspect, an exemplary physiological event detection can comprise one or more of a pulse rate detection, a circulatory flow detection, a respiratory event detection, an eye movement detection, an eye blink detection, a facial movement detection, a facial expression change detection, a muscular movement detection, and the like, without limitation. In still further non-limiting aspects, an exemplary determination component 112 can be configured to generate a determination of proximity and/or liveness of the user of the wearable device based on the classification (e.g., via exemplary liveness detection classifier component 110) and on one or more other inputs, whether available from one or more sensors (not shown) associated with exemplary determination component 112, or otherwise. For example, further non-limiting embodiments can include or be associated with one or more sensors (not shown) other than an ultrasonic sensor, e.g., an infrared sensor, temperature sensor, accelerometer, gyroscope, environmental sensor, acoustic sensor, and so on, without limitation, and exemplary determination component 112 can be configured to process such inputs in addition to the classification (e.g., via exemplary liveness detection classifier component 110) to facilitate generating a determination of proximity and/or liveness of the user of the wearable device.
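As a hedged, non-limiting sketch of how such a determination might combine the classification with an auxiliary sensor input, consider the following; the thresholds, field names, and fusion rule are illustrative assumptions rather than a description of determination component 112 itself.

```python
# Hedged, illustrative sketch of a determination component that fuses the
# classifier output with an optional auxiliary sensor input (e.g., an
# accelerometer activity level). All thresholds and names are assumptions
# for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Classification:
    label: int         # 1 = donned, 0 = doffed
    confidence: float  # 0.0 .. 1.0

def determine_liveness(result: Classification,
                       accel_activity: Optional[float] = None,
                       conf_threshold: float = 0.7,
                       activity_threshold: float = 0.05) -> bool:
    """Return True for a positive liveness indication."""
    if result.confidence < conf_threshold:
        return False  # ignore low-confidence classifications
    if accel_activity is not None and accel_activity < activity_threshold:
        # A worn device is expected to show at least slight motion relative
        # to the transceiver; prolonged stillness argues against liveness,
        # so require a higher-confidence donned classification.
        return result.label == 1 and result.confidence > 0.9
    return result.label == 1
```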
For instance, exemplary ultrasonic transceiver 104/204 can be configured to emit ultrasonic pulses and receive echoes as ultrasonic information, as further described herein regarding
In other non-limiting embodiments, computer-executable components that can be executed by processor 108 can comprise a power management component 114 that facilitates controlling power states of the wearable device based on the liveness and/or proximity detection in accordance with one or more embodiments described herein. For instance, an exemplary power management component 114 can be configured as described herein to select a power state or an operating mode of the wearable device based on the determination (e.g., via exemplary determination component 112) of the proximity and/or liveness of the user of the wearable device, for example, as further described herein. While not shown in
In various embodiments described herein, an exemplary liveness classification algorithm can be based on a machine learning (ML) model trained on captured and classified ultrasonic information received during a number of states of use of a test wearable device (or a test device for non-wearable devices). For instance,
ML frameworks provide software libraries and application programming interfaces (APIs) for tasks like data preprocessing, model building, and the like to facilitate development of ML models such as exemplary ML model 122. One popular ML framework is TensorFlow, which can be used to develop an exemplary ML model 122, for example, as further described herein. As a non-limiting example, exemplary ML model training system 116 can be employed to produce an exemplary ML model 122 for the classification of ultrasonic information as indicative of proximity or liveness of a user of a wearable device (or a device in use). Thereafter, specific implementations of the exemplary ML model 122 can be compiled for specific hardware implementations such as exemplary UTSIP 102 or IC 202 in conjunction with ultrasonic transceiver 104/204, for example to produce an exemplary liveness detection classifier component 110, exemplary determination component 112, and the like that facilitate liveness and/or proximity detection in accordance with one or more embodiments described herein.
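As a hedged, non-limiting illustration only, a small TensorFlow/Keras classifier over labeled ultrasonic frames might be assembled as follows; the tensor shapes, layer sizes, and data names (e.g., train_frames, train_labels) are assumptions for illustration and do not describe any particular implementation of ML model 122.

```python
# Minimal, illustrative TensorFlow/Keras sketch of building and training a
# binary donned/doffed classifier on labeled ultrasonic frames. All shapes,
# layer widths, and variable names are assumptions for illustration only.
import numpy as np
import tensorflow as tf

NUM_SAMPLES_PER_FRAME = 64   # assumed number of range samples per frame

# Placeholder training data; in practice, frames and labels would come from
# a captured and annotated ultrasonic database.
train_frames = np.random.rand(1000, NUM_SAMPLES_PER_FRAME).astype("float32")
train_labels = np.random.randint(0, 2, size=(1000,))  # 0 = doffed, 1 = donned

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_SAMPLES_PER_FRAME,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2),  # two logits: doffed, donned
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(train_frames, train_labels, epochs=5, batch_size=32)
```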
These hardware specific ML model components can be stored in one or more computer-readable memory, such as memory 106, associated with exemplary UTSIP 102 or IC 202, and one or more processors such as exemplary processor 108 can be configured to execute the computer-executable ML model components (e.g., exemplary liveness detection classifier component 110, exemplary determination component 112, and the like) stored in the computer-readable memory (e.g., memory 106) to facilitate liveness and/or proximity detection in accordance with one or more embodiments described herein. Further description of training and execution of various embodiments that employ ML models to facilitate liveness and/or proximity detection are provided regarding specific non-limiting implementations, for example, regarding
Exemplary UTSIP apparatus or device 302 can communicate with a host processor (not shown) via serial peripheral interface component (SPI) 316, facilitated by SPI chip select CS_B 318 (from an external SPI host (not shown)), SPI interface clock SCLK 320 (e.g., from the external SPI host (not shown)), MCU Out Sensor In serial data MOSI 322 (from the external SPI host (not shown)), and MCU In Sensor Out serial data MISO 324 (to the external SPI host (not shown)). Exemplary UTSIP apparatus or device 302 can further support general purpose IO (GPIO), for example, via interrupt requests (INT1 326, INT2 328), which can be used as a system wake-up source when a measurement is ready. Exemplary UTSIP apparatus or device 302 can be supplied by digital logic supply VDD 330, analog power supply AVDD 332, and I/O power supply VDDIO 334, with exemplary UTSIP apparatus or device 302 tied to ground GND 336. Exemplary UTSIP apparatus or device 302 can receive clock signals via external I/O low frequency reference clock LFCLK 338 and external I/O reference clock MUTCLK 340, which is controlled to 16 times the operating frequency (transmit and receive) (Fop).
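As a hedged, non-limiting illustration of a host-side transaction over such an SPI interface, the following Python sketch uses the Linux spidev binding; the read opcode, frame length, and bus numbering are hypothetical placeholders, since the actual UTSIP register map is not given here.

```python
# Illustrative host-side SPI read of a measurement frame via Linux spidev.
# The opcode and frame length are hypothetical; the real command set is
# device-specific.
import spidev

READ_FRAME_OPCODE = 0x80   # hypothetical "read measurement frame" command
FRAME_LEN_BYTES = 128      # hypothetical frame length

spi = spidev.SpiDev()
spi.open(0, 0)             # bus 0, chip select 0 (drives CS_B 318)
spi.max_speed_hz = 1_000_000
spi.mode = 0

def read_frame():
    # Full-duplex transfer: a command byte followed by dummy bytes clocked
    # out on MOSI 322 while the frame is returned on MISO 324.
    tx = [READ_FRAME_OPCODE] + [0x00] * FRAME_LEN_BYTES
    rx = spi.xfer2(tx)
    return rx[1:]          # drop the byte clocked in during the command

# Typically called after INT1 326 signals "measurement ready", e.g., from a
# GPIO edge callback used as a system wake-up source.
frame = read_frame()
```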
In the non-limiting example of a wearable device such as a VR headset with an exemplary UTSIP apparatus or device (e.g., exemplary UTSIP 102, exemplary IC 202 with ultrasonic transceiver 204, exemplary UTSIP apparatus or device 302) placed so that the ultrasonic transceiver is facing the user's forehead, the distance between the exemplary UTSIP apparatus or device (e.g., exemplary UTSIP 102, exemplary IC 202 with ultrasonic transceiver 204, exemplary UTSIP apparatus or device 302) and the user's forehead is typically constrained by the VR headset construction. As a result, in situations where the VR headset is not worn, there should be no primary echo 504; however, as depicted in
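By way of a hedged illustration of the underlying pulse-echo geometry, the following sketch converts an echo's arrival sample index to a round-trip distance; the sample rate and example range values are assumptions for illustration and are not taken from the figures.

```python
# Illustrative pulse-echo range calculation: a primary echo arriving after a
# round-trip time t corresponds to a target at distance c * t / 2. The
# sample rate is an assumed value for illustration.
SPEED_OF_SOUND_M_S = 343.0      # speed of sound in air at ~20 degrees C
SAMPLE_RATE_HZ = 50_000.0       # assumed A-scan sample rate

def echo_sample_to_distance_m(sample_index: int) -> float:
    round_trip_s = sample_index / SAMPLE_RATE_HZ
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# E.g., if headset construction constrains the forehead to roughly 2-4 cm,
# a worn headset's primary echo should fall within the correspondingly
# early samples; an echo well outside that gate suggests a non-worn state.
print(echo_sample_to_distance_m(10))  # ~0.034 m at the assumed sample rate
```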
Accordingly, due to the variety and size of the data that must be analyzed to develop a successful classification algorithm, various disclosed embodiments can employ a machine learning algorithm to facilitate liveness and/or proximity detection and classification thereof, as further described herein. Thus, the various disclosed embodiments can detect when a user is putting the headset on (or other wearable device, or other device interaction), e.g., “donned,” and turn the device on, where a donning event can result in a “worn” state being set to 1 in a particular non-limiting embodiment. In addition, the various disclosed embodiments provide robust detection, regardless of wearable device or device usage (e.g., user moving, user not moving, user interactions in addition to the detected interaction), resulting in “worn” remaining equal to 1 in a particular non-limiting embodiment. Moreover, the various disclosed embodiments can provide low latency, e.g., less than about a second from liveness and/or proximity detection and classification to device power on.
In addition, the various disclosed embodiments can detect when a user is removing the headset (or other wearable device, or other device interaction), e.g., “doffed,” and turn the device off, where a doffing event can result in a “worn” state being set to 0 in a particular non-limiting embodiment. In addition, the various disclosed embodiments provide robust detection, regardless of wearable device or device usage, resulting in “worn” remaining equal to 0 in a particular non-limiting embodiment, without false triggers while the headset (or other device) is not being used (e.g., headset/device stored in its storage box, or headset/device put in a bag while the user is walking). Moreover, the various disclosed embodiments can provide low latency, e.g., less than about five seconds from liveness and/or proximity detection and classification to device power off, to provide improved battery life.
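As a hedged, non-limiting sketch of the don/doff debouncing implied by these latency targets, a simple state machine might look as follows; the consecutive-frame counts are assumptions chosen to fit a roughly 7 to 25 frames/second classification rate and the approximately one-second don and five-second doff latency targets.

```python
# Hedged sketch of don/doff debouncing: a "worn" flag set to 1 on a donning
# event and to 0 only after several consecutive doff frames, limiting false
# doffs while keeping don latency low. Frame counts are assumptions.
DON_FRAMES_REQUIRED = 3    # ~0.1-0.4 s at 7-25 frames/second
DOFF_FRAMES_REQUIRED = 20  # ~1-3 s at 7-25 frames/second

class WornStateMachine:
    def __init__(self):
        self.worn = 0
        self._streak = 0

    def update(self, frame_label: int) -> int:
        """frame_label: 1 = donned frame, 0 = doffed frame."""
        if frame_label != self.worn:
            self._streak += 1
            required = (DON_FRAMES_REQUIRED if frame_label == 1
                        else DOFF_FRAMES_REQUIRED)
            if self._streak >= required:
                self.worn = frame_label  # commit the state change
                self._streak = 0
        else:
            self._streak = 0             # agreement resets the counter
        return self.worn
```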
In further embodiments, exemplary ML framework 120 can facilitate ML model generation 806, as further described herein regarding
In a non-limiting aspect, an exemplary liveness classification algorithm in an ML model 122 (e.g., ML classifier model 808) can comprise a liveness detection classifier decision tree trained on captured and classified ultrasonic information received during a set of states of use of a test wearable device (or other device). As a non-limiting example, the exemplary liveness classification algorithm can be executed on every new frame of pulse-echo ultrasonic information (e.g., at roughly 7 to 25 frames/second). As further described herein regarding
Accordingly, exemplary liveness classification algorithm can be configured to compute magnitude of samples of ultrasonic information at 1202. In addition, exemplary liveness classification algorithm can be further configured to normalize the magnitude of samples of ultrasonic information as a function of ultrasonic transceiver (e.g., ultrasonic transceiver 104/204/304) operating frequency (Fop) at 1204 to generate normalized magnitude ultrasonic information. Furthermore, exemplary liveness classification algorithm can be further configured to compute a set of feature vectors in the normalized magnitude ultrasonic information at 1206.
Moreover, exemplary liveness classification algorithm can be further configured to determine a set of feature classification labels (e.g., for the computed feature vectors) and corresponding confidence factors using the liveness detection classifier decision tree at 1208 to generate an overall classification label and a confidence factor. Accordingly, exemplary determination component 112 can be configured to generate the determination of the proximity or liveness based on the set of feature classification labels and corresponding confidence factors.
In addition, exemplary liveness classification algorithm can comprise classifier post processing at 1210. If the confidence factor for the classification is greater than a threshold for the classification, then the liveness detection classifier decision tree can facilitate generating the donned or doffed state (e.g., based on a recursive average of a number of samples of confidence factors (e.g., 2 confidence factors)), the averaged confidence factor, and the classifier detection as an output.
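A hedged, non-limiting sketch of this per-frame pipeline follows; the zone count, confidence threshold, feature choice (mean magnitude per zone), and the use of a scikit-learn-style decision tree as a stand-in for the trained liveness detection classifier decision tree are all assumptions for illustration.

```python
# Hedged sketch of the per-frame pipeline described above: magnitude (1202),
# Fop normalization (1204), feature extraction (1206), decision-tree
# classification (1208), and confidence post-processing (1210). The "tree"
# argument is a stand-in (e.g., a trained sklearn DecisionTreeClassifier).
import numpy as np

NUM_ZONES = 8          # e.g., 8 feature extraction zones
CONF_THRESHOLD = 0.7   # assumed confidence threshold

def classify_frame(iq_samples: np.ndarray, fop_hz: float, tree, state: dict):
    # 1202: compute magnitude of the complex (I/Q) samples
    mag = np.abs(iq_samples)
    # 1204: normalize magnitude as a function of operating frequency Fop
    norm = mag / fop_hz
    # 1206: compute feature vectors (here, mean magnitude per zone)
    zones = np.array_split(norm, NUM_ZONES)
    features = np.array([z.mean() for z in zones]).reshape(1, -1)
    # 1208: decision tree yields a label and a confidence factor
    label = int(tree.predict(features)[0])           # 1 = donned, 0 = doffed
    conf = float(tree.predict_proba(features)[0].max())
    # 1210: post-processing - recursive average over recent confidences;
    # only commit the state when the averaged confidence clears the threshold
    state["avg_conf"] = (state.get("avg_conf", conf) + conf) / 2.0
    if state["avg_conf"] > CONF_THRESHOLD:
        state["worn"] = label
    return state.get("worn"), state["avg_conf"]
```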
In another non-limiting aspect, if it is determined that the don-doff latency is excessive, various embodiments described herein can further include an eye blink detection classification component (not shown), for example, as further described herein regarding
Accordingly, various embodiments as described herein can provide an exemplary UTSIP apparatus or device (e.g., exemplary UTSIP 102, exemplary IC 202 with ultrasonic transceiver 204, exemplary UTSIP apparatus or device 302) that can facilitate liveness and/or proximity detection based on a decision tree-based ML model 808, as further described herein. For instance, an exemplary decision tree-based ML model 808 having a depth of 6 and 8 feature extraction zones, when incorporating the decision tree classifier ML model portion of the exemplary liveness classification algorithm with optimization of the false negative (doff) rate by rejection of a doffing event until several consecutive doff frames are detected, and with optimization of the false positive (doff) rate with an exemplary eye blink detection classification component, can provide reliable nominal use case detection, short latency of a don 906 event, longer latency of a doff 908 event, false positives of the decision tree classifier ML model portion of the exemplary liveness classification algorithm having limited duration (due to eye blink detection), and a rare false positive (don) rate that will not drain the battery of wearable devices configured as described herein.
In further embodiments, exemplary ML framework 120 can facilitate ML model generation and training 1306, as further described herein regarding
Accordingly, with the pulse-echo data A-scan frames having the appropriate classification/annotation labels 1502 and sample weights 1504, exemplary ML model training system 116 can facilitate exemplary ML model 122 selection/generation and training. As previously discussed, the don/doff state cannot be accurately predicted by looking at only the most recent A-scan frame. As a result, a suitable ML model 122 can comprise a convolutional neural network (CNN) or a recurrent neural network (RNN). Various embodiments described herein can facilitate liveness and/or proximity detection based on an RNN ML model 122. In a non-limiting aspect, an exemplary RNN ML model 122 can have a smaller memory footprint, suitable for low-power, integrated devices such as described above regarding
In a further non-limiting aspect, each GRU 1706 comprises a recurrent layer (having memory, which enables decision-making based on past samples); the outputs of the three GRUs 1706 are then combined at concatenate 1710 and passed to another simple neural network comprising dense layer 1712, which provides a linear combination of the inputs and, in this case, produces two outputs (e.g., only two possible states at the output, don 906 and doff 908). The output of the dense layer 1712 is passed to logits 1714, which generates the log of probability, where large positive numbers indicate a highly likely condition (e.g., don 906 or doff 908) and large negative numbers indicate a very unlikely condition (e.g., don 906 or doff 908), resulting in two outputs, one of which indicates the likelihood of don 906 and the other of which indicates the likelihood of doff 908, to facilitate generation of the final classification prediction output 1716 of liveness and/or proximity detection in accordance with one or more embodiments described herein. Note that in the instance where both logits 1714 outputs indicate a very unlikely condition, the classification would result in neither a don 906 nor a doff 908 condition.
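A minimal Keras sketch of this topology, under the assumption that each of the three GRUs 1706 consumes a sequence of per-zone features across recent frames (the input shapes and unit counts are illustrative and not taken from the figures), could look as follows.

```python
# Hedged Keras sketch of the GRU RNN classifier topology described above:
# three GRU branches whose outputs are concatenated and passed to a dense
# layer producing two logits (don, doff). Shapes and units are assumptions.
import tensorflow as tf

TIME_STEPS = 8      # assumed number of past frames fed to each GRU
ZONE_FEATURES = 16  # assumed features per zone per frame

inputs = [tf.keras.layers.Input(shape=(TIME_STEPS, ZONE_FEATURES))
          for _ in range(3)]                              # three zone sequences
gru_outs = [tf.keras.layers.GRU(12)(x) for x in inputs]   # GRUs 1706
merged = tf.keras.layers.Concatenate()(gru_outs)          # concatenate 1710
logits = tf.keras.layers.Dense(2)(merged)                 # dense 1712 -> logits 1714

model = tf.keras.Model(inputs=inputs, outputs=logits)
# At inference, the larger (and sufficiently positive) of the two logits
# selects don 906 or doff 908; if both are strongly negative, neither state
# is asserted (classification prediction output 1716).
```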
Accordingly, various embodiments described herein can comprise an ML model 122 based on a liveness detection classifier recurrent neural network (RNN) trained 1306 on captured 1302 and classified/labeled 1304 ultrasonic information received during a set of states of use of a test wearable device (e.g., HMD mockup), for example, as further described herein regarding
Accordingly, in various embodiments described herein, an exemplary ML model training system 116 can be employed to produce an exemplary ML model 122 for the classification of ultrasonic information as indicative of proximity or liveness of a user of a wearable device (or a device in use) based on a TensorFlow ML framework 120, which exemplary ML model 122 (e.g., GRU RNN ML classifier model 1312) can be trained 1306 on a database collected using a test device (e.g., HMD mockup) comprising an exemplary UTSIP apparatus or device (e.g., exemplary UTSIP 102, exemplary IC 202 with ultrasonic transceiver 204, exemplary UTSIP apparatus or device 302). In a non-limiting aspect, an exemplary GRU RNN ML classifier model 1312 can be quantized to int32 using TensorFlow ML framework 120 and manually modified and optimized into int16 format for use on a 16-bit MCU such as can be employed in the exemplary UTSIP apparatus or device (e.g., exemplary UTSIP 102, exemplary IC 202 with ultrasonic transceiver 204, exemplary UTSIP apparatus or device 302). According to non-limiting aspects, an exemplary 16-bit integer GRU RNN ML classifier model 1312 approaches similar model accuracy as a 32-bit integer GRU RNN ML classifier model 1312 or a floating point GRU RNN ML classifier model 1312.
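TensorFlow's TFLite converter does not emit the int32-to-int16 flow described above directly; a hedged approximation using its built-in 16-bit-activation/8-bit-weight quantization mode is sketched below, with a stand-in model and placeholder representative data, and any manual int16 optimization would occur downstream of such a flow. Whether a given recurrent topology converts cleanly also depends on the TFLite operator support available.

```python
# Hedged sketch of post-training quantization with the TFLite converter,
# using a small stand-in model (see the GRU sketch above for the full
# topology) and a placeholder representative dataset.
import numpy as np
import tensorflow as tf

inp = tf.keras.layers.Input(shape=(8, 16))
out = tf.keras.layers.Dense(2)(tf.keras.layers.GRU(12)(inp))
model = tf.keras.Model(inp, out)

def representative_frames():
    # Placeholder: in practice, yield captured ultrasonic feature tensors.
    for _ in range(100):
        yield [np.random.rand(1, 8, 16).astype("float32")]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_frames
# 16-bit activations with 8-bit weights: the closest built-in analog to the
# reduced-precision integer deployment described above.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.EXPERIMENTAL_TFLITE_BUILTINS_ACTIVATIONS_INT16_WEIGHTS_INT8
]
tflite_model = converter.convert()
```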
In view of the subject matter described supra, methods that can be implemented in accordance with the subject disclosure will be better appreciated with reference to the flowcharts of
In another non-limiting embodiment, exemplary methods 1800 can comprise, at 1804, generating a classification, via a liveness detection classifier component (e.g., liveness detection classifier component 110) associated with a memory (e.g., memory 106, program memory 314) coupled to the processor (e.g., processor 108, MCU 312), of the ultrasonic information as indicative of one of proximity or liveness of the user of the wearable device (e.g., wearable device or other battery powered device) according to a liveness classification algorithm executed by the processor (e.g., processor 108, MCU 312), as further described herein regarding
In further non-limiting embodiments, exemplary methods 1800 can comprise, at 1806, generating the classification according to the liveness classification algorithm based on a machine learning model (e.g., machine learning model 122, 808, 1312) trained on captured and classified ultrasonic information received during a set of states of use of a test wearable device (e.g., wearable device or other battery powered device), as further described herein regarding
In still further non-limiting embodiments, exemplary methods 1800 can comprise, at 1808, generating a determination, via a determination component (e.g., determination component 112) associated with the memory (e.g., memory 106, program memory 314), of the proximity or the liveness of the user of the wearable device (e.g., wearable device or other battery powered device) based on the classification, as further described herein regarding
In addition, further embodiments of exemplary methods 1800 can comprise, at 1810, generating the determination of the proximity or the liveness of the user of the wearable device (e.g., wearable device or other battery powered device) as a donned 906 or doffed 908 state of the wearable device (e.g., wearable device or other battery powered device), as further described herein regarding
In other non-limiting embodiments, exemplary methods 1800 can comprise, at 1812, selecting, via a power management component (e.g., power management component 114) associated with the memory (e.g., memory 106, program memory 314), one of a power state or operating mode of the wearable device (e.g., wearable device or other battery powered device) based on the determination of the proximity or the liveness of the user of the wearable device (e.g., wearable device or other battery powered device), as further described herein regarding
In further non-limiting embodiments, exemplary methods 1900 can comprise, at 1904, generating the classification, via the liveness detection classifier component (e.g., liveness detection classifier component 110) configured to generate the classification of the ultrasonic information as indicative of the proximity or the liveness based on the liveness classification algorithm, including computing magnitude of samples of ultrasonic information and normalizing the magnitude of samples of ultrasonic information as a function of ultrasonic transceiver (e.g., ultrasonic transceiver 104/204/304) operating frequency (Fop) to generate normalized magnitude ultrasonic information, as further described herein regarding
In other non-limiting embodiments, exemplary methods 1900 can comprise, at 1906, generating the classification, via the liveness detection classifier component (e.g., liveness detection classifier component 110) configured to generate the classification of the ultrasonic information as indicative of the proximity or the liveness based on the liveness classification algorithm, including computing a set of feature vectors in the normalized magnitude ultrasonic information and determining a set of feature classification labels and corresponding confidence factors using the liveness detection classifier decision tree, as further described herein regarding
In addition, other non-limiting embodiments, exemplary methods 1900 can comprise, at 1908, generating the determination via the determination component (e.g., determination component 112) configured to generate the determination of the proximity or the liveness based on the set of feature classification labels and corresponding confidence factors, as further described herein regarding
In still other non-limiting embodiments, exemplary methods 1900 can comprise, at 1910, generating the determination via the determination component (e.g., determination component 112) configured to generate the determination of the proximity or the liveness of the user of the wearable device (e.g., wearable device or other battery powered device) as the donned 906 or the doffed 908 state of the wearable device (e.g., wearable device or other battery powered device), based on one of an eye blink detection classification component (not shown) determination of a set of eye blinks or repeated cycles of a determined doff 908 state, as further described herein regarding
In other non-limiting embodiments, exemplary methods 2000 can comprise, at 2004, generating the classification prediction of the donned 906 or the doffed 908 state of the wearable device (e.g., wearable device or other battery powered device) based on the one of the proximity or the liveness of the user of the wearable device (e.g., wearable device or other battery powered device), which is based on one of a blinking eye, a facial movement, or a facial expression change in an echo of the echoes in the ultrasonic information as processed by the GRU (e.g., GRU RNN ML model 122), as further described herein regarding
Various embodiments described herein can comprise or be associated with a computer-implemented system for creating and training of machine learning models for employment in devices or apparatuses that facilitate proximity and/or liveness detection.
Those having ordinary skill in the art will appreciate that the herein disclosure describes non-limiting examples of various embodiments of the invention. For ease of description and/or explanation, various portions of the herein disclosure utilize the term “each” when discussing various embodiments of the invention. Those having ordinary skill in the art will appreciate that such usages of the term “each” are non-limiting examples. In other words, when the herein disclosure provides a description that is applied to “each” of some particular computerized object and/or component, it should be understood that this is a non-limiting example of various embodiments of the invention, and it should be further understood that, in various other embodiments of the invention, it can be the case that such description applies to fewer than “each” of that particular computerized object.
Generally, program modules include routines, programs, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the inventive methods can be practiced with other computer system configurations, including single-processor or multiprocessor computer systems, minicomputers, mainframe computers, Internet of Things (IoT) devices, distributed computing systems, as well as personal computers, hand-held computing devices, microprocessor-based or programmable consumer electronics, and the like, each of which can be operatively coupled to one or more associated devices.
The illustrated embodiments of the embodiments herein can be also practiced in distributed computing environments where certain tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote memory storage devices.
Computing devices typically include a variety of media, which can include computer-readable storage media, machine-readable storage media, and/or communications media, which two terms are used herein differently from one another as follows. Computer-readable storage media or machine-readable storage media can be any available storage media that can be accessed by the computer and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer-readable storage media or machine-readable storage media can be implemented in connection with any method or technology for storage of information such as computer-readable or machine-readable instructions, program modules, structured data or unstructured data.
Computer-readable storage media can include, but are not limited to, random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disk read only memory (CD ROM), digital versatile disk (DVD), Blu-ray disc (BD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, solid state drives or other solid state storage devices, or other tangible and/or non-transitory media which can be used to store desired information. In this regard, the terms “tangible” or “non-transitory” herein as applied to storage, memory or computer-readable media, are to be understood to exclude only propagating transitory signals per se as modifiers and do not relinquish rights to all standard storage, memory or computer-readable media that are not only propagating transitory signals per se.
Computer-readable storage media can be accessed by one or more local or remote computing devices, e.g., via access requests, queries or other data retrieval protocols, for a variety of operations with respect to the information stored by the medium.
Communications media typically embody computer-readable instructions, data structures, program modules or other structured or unstructured data in a data signal such as a modulated data signal, e.g., a carrier wave or other transport mechanism, and includes any information delivery or transport media. The term “modulated data signal” or signals refers to a signal that has one or more of its characteristics set or changed in such a manner as to encode information in one or more signals. By way of example, and not limitation, communication media include wired media, such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
The system bus 2108 can be any of several types of bus structure that can further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. The system memory 2106 includes ROM 2110 and RAM 2112. A basic input/output system (BIOS) can be stored in a non-volatile memory such as ROM, erasable programmable read only memory (EPROM), EEPROM, which BIOS contains the basic routines that help to transfer information between elements within the computer 2102, such as during startup. The RAM 2112 can also include a high-speed RAM such as static RAM for caching data.
The computer 2102 further includes an internal hard disk drive (HDD) 2114 (e.g., EIDE, SATA), one or more external storage devices 2116 (e.g., a magnetic floppy disk drive (FDD) 2116, a memory stick or flash drive reader, a memory card reader, etc.) and a drive 2120, e.g., such as a solid state drive, an optical disk drive, which can read or write from a disk 2122, such as a CD-ROM disc, a DVD, a BD, etc. Alternatively, where a solid-state drive is involved, disk 2122 would not be included, unless separate. While the internal HDD 2114 is illustrated as located within the computer 2102, the internal HDD 2114 can also be configured for external use in a suitable chassis (not shown). Additionally, while not shown in environment 2100, a solid-state drive (SSD) could be used in addition to, or in place of, an HDD 2114. The HDD 2114, external storage device(s) 2116 and drive 2120 can be connected to the system bus 2108 by an HDD interface 2124, an external storage interface 2126 and a drive interface 2128, respectively. The interface 2124 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and Institute of Electrical and Electronics Engineers (IEEE) 1394 interface technologies. Other external drive connection technologies are within contemplation of the embodiments described herein.
The drives and their associated computer-readable storage media provide nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For the computer 2102, the drives and storage media accommodate the storage of any data in a suitable digital format. Although the description of computer-readable storage media above refers to respective types of storage devices, it should be appreciated by those skilled in the art that other types of storage media which are readable by a computer, whether presently existing or developed in the future, could also be used in the example operating environment, and further, that any such storage media can contain computer-executable instructions for performing the methods described herein.
A number of program modules can be stored in the drives and RAM 2112, including an operating system 2130, one or more application programs 2132, other program modules 2134 and program data 2136. All or portions of the operating system, applications, modules, and/or data can also be cached in the RAM 2112. The systems and methods described herein can be implemented utilizing various commercially available operating systems or combinations of operating systems.
Computer 2102 can optionally comprise emulation technologies. For example, a hypervisor (not shown) or other intermediary can emulate a hardware environment for operating system 2130, and the emulated hardware can optionally be different from the hardware illustrated in
Further, computer 2102 can be enabled with a security module, such as a trusted platform module (TPM). For instance, with a TPM, boot components hash next-in-time boot components and wait for a match of the results to secured values before loading a next boot component. This process can take place at any layer in the code execution stack of computer 2102, e.g., applied at the application execution level or at the operating system (OS) kernel level, thereby enabling security at any level of code execution.
A user can enter commands and information into the computer 2102 through one or more wired/wireless input devices, e.g., a keyboard 2138, a touch screen 2140, and a pointing device, such as a mouse 2142. Other input devices (not shown) can include a microphone, an infrared (IR) remote control, a radio frequency (RF) remote control, or other remote control, a joystick, a virtual reality controller and/or virtual reality headset, a game pad, a stylus pen, an image input device, e.g., camera(s), a gesture sensor input device, a vision movement sensor input device, an emotion or facial detection device, a biometric input device, e.g., fingerprint or iris scanner, or the like. These and other input devices are often connected to the processing unit 2104 through an input device interface 2144 that can be coupled to the system bus 2108, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an IR interface, a BLUETOOTH® interface, etc.
A monitor 2146 or other type of display device can be also connected to the system bus 2108 via an interface, such as a video adapter 2148. In addition to the monitor 2146, a computer typically includes other peripheral output devices (not shown), such as speakers, printers, etc.
The computer 2102 can operate in a networked environment using logical connections via wired and/or wireless communications to one or more remote computers, such as a remote computer(s) 2150. The remote computer(s) 2150 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 2102, although, for purposes of brevity, only a memory/storage device 2152 is illustrated. The logical connections depicted include wired/wireless connectivity to a local area network (LAN) 2154 and/or larger networks, e.g., a wide area network (WAN) 2156. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which can connect to a global communications network, e.g., the Internet.
When used in a LAN networking environment, the computer 2102 can be connected to the local network 2154 through a wired and/or wireless communication network interface or adapter 2158. The adapter 2158 can facilitate wired or wireless communication to the LAN 2154, which can also include a wireless access point (AP) disposed thereon for communicating with the adapter 2158 in a wireless mode.
When used in a WAN networking environment, the computer 2102 can include a modem 2160 or can be connected to a communications server on the WAN 2156 via other means for establishing communications over the WAN 2156, such as by way of the Internet. The modem 2160, which can be internal or external and a wired or wireless device, can be connected to the system bus 2108 via the input device interface 2144. In a networked environment, program modules depicted relative to the computer 2102 or portions thereof, can be stored in the remote memory/storage device 2152. It will be appreciated that the network connections shown are examples and that other means of establishing a communications link between the computers can be used.
When used in either a LAN or WAN networking environment, the computer 2102 can access cloud storage systems or other network-based storage systems in addition to, or in place of, external storage devices 2116 as described above, such as but not limited to a network virtual machine providing one or more aspects of storage or processing of information. Generally, a connection between the computer 2102 and a cloud storage system can be established over a LAN 2154 or WAN 2156, e.g., by the adapter 2158 or modem 2160, respectively. Upon connecting the computer 2102 to an associated cloud storage system, the external storage interface 2126 can, with the aid of the adapter 2158 and/or modem 2160, manage storage provided by the cloud storage system as it would other types of external storage. For instance, the external storage interface 2126 can be configured to provide access to cloud storage sources as if those sources were physically connected to the computer 2102.
The computer 2102 can be operable to communicate with any wireless devices or entities operatively disposed in wireless communication, e.g., a printer, scanner, desktop and/or portable computer, portable data assistant, communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, store shelf, etc.), and telephone. This can include Wireless Fidelity (Wi-Fi) and BLUETOOTH® wireless technologies. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
The present invention may be a system, a method, an apparatus and/or a computer program product at any possible technical detail level of integration. The computer program product can include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention. The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium can be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium can also include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network can comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device. Computer readable program instructions for carrying out operations of the present invention can be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions can execute entirely on the user's computer, partly on the user's computer, as a standalone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer can be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection can be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) can execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions. These computer readable program instructions can be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions can also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks. The computer readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational acts to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams can represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks can occur out of the order noted in the Figures. For example, two blocks shown in succession can, in fact, be executed substantially concurrently, or the blocks can sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
While the subject matter has been described above in the general context of computer-executable instructions of a computer program product that runs on a computer and/or computers, those skilled in the art will recognize that this disclosure also can or can be implemented in combination with other program modules. Generally, program modules include routines, programs, components, data structures, etc. that perform particular tasks and/or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the inventive computer-implemented methods can be practiced with other computer system configurations, including single-processor or multiprocessor computer systems, mini-computing devices, mainframe computers, as well as computers, hand-held computing devices (e.g., PDA, phone), microprocessor-based or programmable consumer or industrial electronics, and the like. The illustrated aspects can also be practiced in distributed computing environments in which tasks are performed by remote processing devices that are linked through a communications network. However, some, if not all aspects of this disclosure can be practiced on stand-alone computers. In a distributed computing environment, program modules can be located in both local and remote memory storage devices.
As used in this application, the terms “component,” “system,” “platform,” “interface,” and the like, can refer to and/or can include a computer-related entity or an entity related to an operational machine with one or more specific functionalities. The entities disclosed herein can be either hardware, a combination of hardware and software, software, or software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers. In another example, respective components can execute from various computer readable media having various data structures stored thereon. The components can communicate via local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal). As another example, a component can be an apparatus with specific functionality provided by mechanical parts operated by electric or electronic circuitry, which is operated by a software or firmware application executed by a processor. In such a case, the processor can be internal or external to the apparatus and can execute at least a part of the software or firmware application. As yet another example, a component can be an apparatus that provides specific functionality through electronic components without mechanical parts, wherein the electronic components can include a processor or other means to execute software or firmware that confers at least in part the functionality of the electronic components. In an aspect, a component can emulate an electronic component via a virtual machine, e.g., within a cloud computing system.
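As a non-limiting illustration of the foregoing, and not a description of any disclosed embodiment, the following Python sketch models two components as threads of execution within a single process that communicate via a signal carrying data packets; the names producer_component and consumer_component are hypothetical placeholders.

```python
# Hypothetical sketch: two "components" as threads of execution within
# one process, communicating via a signal carrying data packets.
import threading
import queue

packets = queue.Queue()  # local signal path between the two components

def producer_component():
    # One component emits data packets onto the shared signal path.
    for i in range(3):
        packets.put({"seq": i, "payload": f"data-{i}"})
    packets.put(None)  # sentinel marking the end of the signal

def consumer_component():
    # Another component consumes packets from the same signal path.
    while True:
        packet = packets.get()
        if packet is None:
            break
        print("received packet:", packet)

t1 = threading.Thread(target=producer_component)
t2 = threading.Thread(target=consumer_component)
t1.start(); t2.start()
t1.join(); t2.join()
```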
In addition, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances. Moreover, articles “a” and “an” as used in the subject specification and annexed drawings should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form. As used herein, the terms “example” and/or “exemplary” are utilized to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as an “example” and/or “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art.
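For the avoidance of doubt, the inclusive sense of "or" can be illustrated with the following non-limiting Python sketch, which enumerates the four possible cases and shows that "X employs A or B" is satisfied in every case except the one in which X employs neither A nor B.

```python
# Hypothetical sketch: the inclusive "or" used in this disclosure.
# "X employs A or B" is satisfied in three of the four cases below;
# only the case where X employs neither A nor B fails to satisfy it.
for employs_a in (False, True):
    for employs_b in (False, True):
        satisfied = employs_a or employs_b  # inclusive, not exclusive
        print(f"A={employs_a}, B={employs_b} -> satisfied={satisfied}")
```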
As it is employed in the subject specification, the term “processor” can refer to substantially any computing processing unit or device comprising, but not limited to, single-core processors; single-processors with software multithread execution capability; multi-core processors; multi-core processors with software multithread execution capability; multi-core processors with hardware multithread technology; parallel platforms; and parallel platforms with distributed shared memory. Additionally, a processor can refer to an integrated circuit, an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic controller (PLC), a complex programmable logic device (CPLD), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. Further, processors can exploit nano-scale architectures such as, but not limited to, molecular and quantum-dot based transistors, switches and gates, in order to optimize space usage or enhance performance of user equipment. A processor can also be implemented as a combination of computing processing units. In this disclosure, terms such as “store,” “storage,” “data store,” “data storage,” “database,” and substantially any other information storage component relevant to operation and functionality of a component are utilized to refer to “memory components,” entities embodied in a “memory,” or components comprising a memory. It is to be appreciated that memory and/or memory components described herein can be either volatile memory or nonvolatile memory or can include both volatile and nonvolatile memory. By way of illustration, and not limitation, nonvolatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable ROM (EEPROM), flash memory, or nonvolatile random-access memory (RAM) (e.g., ferroelectric RAM (FeRAM)). Volatile memory can include RAM, which can act as external cache memory, for example. By way of illustration and not limitation, RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), direct Rambus RAM (DRRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM). Additionally, the disclosed memory components of systems or computer-implemented methods herein are intended to include, without being limited to including, these and any other suitable types of memory.
What has been described above includes mere examples of systems and computer-implemented methods. It is, of course, not possible to describe every conceivable combination of components or computer-implemented methods for purposes of describing this disclosure, but one of ordinary skill in the art can recognize that many further combinations and permutations of this disclosure are possible. Furthermore, to the extent that the terms “includes,” “has,” “possesses,” and the like are used in the detailed description, claims, appendices and drawings, such terms are intended to be inclusive in a manner similar to the term “comprising” as “comprising” is interpreted when employed as a transitional word in a claim.
The descriptions of the various embodiments have been presented for purposes of illustration but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
This patent application is a Non-Provisional Application that claims priority to U.S. Provisional Patent Application Ser. No. 63/703,042, filed Oct. 3, 2024, entitled “PROXIMITY AND LIVENESS SENSING,” U.S. Provisional Patent Application Ser. No. 63/608,357, filed Dec. 11, 2023, entitled “DETECTION OF DONNING/DOFFING OF GOGGLES USING AN ULTRASOUND SENSOR USING MACHINE LEARNING,” and U.S. Provisional Patent Application Ser. No. 63/597,110, filed Nov. 8, 2023, entitled “PROXIMITY AND LIVENESS DETECTION USING AN ULTRASONIC TRANSCEIVER,” the entireties of which U.S. Provisional Patent Applications are incorporated by reference herein.
Number | Date | Country
---|---|---
63703042 | Oct 2024 | US
63608357 | Dec 2023 | US
63597110 | Nov 2023 | US