Visual content (e.g., what a person sees in his or her surroundings, by way of a display screen, etc.) can greatly affect the person's mental state and his or her ability to think clearly, perform physical tasks, exercise impulse control, interact with others, and/or otherwise function mentally and/or physically. Accordingly, it would be desirable to be able to determine this effect as the person experiences the visual content and to perform one or more operations that modify the visual content, help the person achieve a desired mental state, and/or optimize the person's ability to function mentally and/or physically.
The accompanying drawings illustrate various embodiments and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the disclosure. Throughout the drawings, identical or similar reference numbers designate identical or similar elements.
An illustrative wearable glasses assembly configured to be worn by a user may include a viewing lens assembly configured to provide the user with a visual experience in which the user sees a real-world environment through the viewing lens assembly and an optical measurement system configured to output optical measurement data representative of one or more optical measurements performed with respect to the user.
In some examples, a processor (e.g., a processor included in the wearable glasses assembly and/or in a device separate from the wearable glasses assembly) may be configured to perform, based on the optical measurement data, an operation with respect to the visual experience. For example, the processor may adjust, based on the optical measurement data, one or more attributes associated with the visual experience (e.g., by controlling an operation of the wearable glasses assembly and/or a computing device communicatively coupled to the wearable glasses assembly based on the optical measurement data).
The devices, systems, and methods described herein may provide a number of advantages and benefits over conventional wearable glasses assemblies. For example, by including an optical measurement system in a wearable glasses assembly, brain activity data and/or other types of optical measurement data may be acquired while the user is experiencing visual content by way of the wearable glasses assembly. Based on the optical measurement data, a processor may determine, e.g., in real time, an effect of the visual experience on the user and adjust one or more attributes associated with the visual experience accordingly. For example, as described herein, the visual experience itself may be adjusted, other content (e.g., audio content and/or one or more notifications) may be presented to the user, and/or one or more other operations may be performed to help the user achieve a desired mental state, exercise impulse control, and/or more effectively perform one or more mental and/or physical tasks. These and other advantages and benefits of the devices, systems, and methods described herein are described more fully herein.
Wearable glasses assembly 102 may be implemented by any suitable apparatus configured to provide a visual experience to a user. For example, wearable glasses assembly 102 may be implemented by prescription eyeglasses, reading glasses, sunglasses, smart glasses (e.g., a wearable glasses assembly comprising a processor configured to present augmented reality content to the user), and/or any other suitable eyepiece(s) as may serve a particular implementation.
As used herein, a visual experience is one in which the user sees a real-world environment through wearable glasses assembly 102 (e.g., through viewing lenses of wearable glasses assembly 102, as described herein). The visual experience may also include presenting augmented reality content and/or any other type of content to the user.
As shown, wearable glasses assembly 102 includes a viewing lens assembly 104. Viewing lens assembly 104 may be configured to provide the user with a visual experience in which the user sees a real-world environment through viewing lens assembly 104. For example, viewing lens assembly 104 may include a first viewing lens configured to be positioned in front of a first eye of the user and a second viewing lens configured to be positioned in front of a second eye of the user. In this configuration, the user may see the real-world environment (i.e., the physical surroundings) of the user through the first and second viewing lenses. In some alternative configurations, viewing lens assembly 104 may include only a single viewing lens through which both eyes can see the real-world environment or through which only a single eye can see the real-world environment. It will be assumed in the examples provided herein that viewing lens assembly 104 includes two viewing lenses, one for each eye.
The one or more viewing lenses included in viewing lens assembly 104 may be made out of any suitable material, such as glass, plastic, and/or any other at least semi-transparent material through which the user can see.
In some examples, viewing lens assembly 104 may be further configured to display augmented reality content associated with the real-world environment. For example, one or both of the viewing lenses included in viewing lens assembly 104 may include display capabilities such that the augmented reality content may be presented to the user by way of the viewing lens(es). The augmented reality content may add digital elements to the user's live view of the real-world environment. This is described more fully in U.S. Patent Application Nos. 63/139,469 and 63/139,478, the contents of which are incorporated herein by reference in their entirety.
Wearable glasses assembly 102 further includes an optical measurement system 106. Optical measurement system 106 may be included in wearable glasses assembly 102 in any suitable manner. Exemplary configurations in which optical measurement system 106 is included in (also referred to as “integrated into”) wearable glasses assembly 102 are described herein.
Optical measurement system 106 is configured to output optical measurement data, which may be generated using any suitable time domain-based optical measurement technique, such as time-correlated single-photon counting (TCSPC), time domain near infrared spectroscopy (TD-NIRS), time domain diffusive correlation spectroscopy (TD-DCS), and/or time domain digital optical tomography (TD-DOT). In some examples, the optical measurement data may include brain activity data representative of brain activity of the user. Additionally or alternatively, the optical measurement data may be representative of one or more non-invasive measurements of blood oxygen saturation (SaO2) through Time-Resolved Pulse Oximetry (TR-SpO2). Exemplary implementations of optical measurement system 106 are described herein.
Processor 202 may be implemented by any suitable processing or computing device. Moreover, processor 202 may be included in any suitable device.
Exemplary operations that may be performed by processor 202 based on optical measurement data output by optical measurement system 106 will now be described. The operations described herein are merely illustrative of the many different operations that may be performed by processor 202 in accordance with the principles described herein.
In some examples, processor 202 may adjust, based on the optical measurement data, one or more attributes associated with the visual experience. This may be performed in any suitable manner. For example, processor 202 may transmit (e.g., wirelessly or by way of a wired connection) one or more commands to wearable glasses assembly 102 and/or a computing device communicatively coupled to wearable glasses assembly 102.
In some examples, computing device 502 is not communicatively coupled to wearable glasses assembly 102 but is still configured to control one or more attributes of the visual experience provided by way of wearable glasses assembly 102. For example, computing device 502 may be configured to display (or control the display of) visual content included in the visual experience. Such visual content may include video content displayed, for example, by way of a screen of or communicatively coupled to computing device 502. In these examples, computing device 502 may adjust one or more attributes of the visual experience by controlling one or more attributes of the video content being displayed by way of the screen.
In either configuration 400 or configuration 500, processor 202 may determine, based on the optical measurement data, an effect of the visual experience on the user and adjust one or more attributes associated with the visual experience accordingly.
As used herein, “an effect” may be related to an effect on the user's mental state or physiological functions of the user. Mental states may include joy, excitement, relaxation, surprise, fear, stress, anxiety, sadness, anger, disgust, contempt, contentment, calmness, approval, focus, attention, creativity, cognitive assessment, positive or negative reflections/attitude on experiences or the use of objects, etc. The cognitive assessment may encompass intellectual functions and processes (e.g., memory retrieval, focus, attention, creativity, reasoning, problem solving, decision making, comprehension and production of language, etc.). Physiological functions of the user may include, e.g., heart rate, respiratory rate, body temperature, blood pressure, skin conductivity, and/or an increase or decrease in these functions. Furthermore, “an effect” may include a sensation of pain, an increase in the pain sensation, etc., as described more fully in U.S. Provisional Application No. 63/188,783, filed May 14, 2021, and incorporated herein by reference in its entirety.
As another example, processor 202 may determine, based on the optical measurement data, a current mental state of the user. Processor 202 may be further configured to obtain data representative of a desired mental state of the user (e.g., by way of user input provided by the user and/or based on an activity being performed by the user). Based on the current mental state and the desired mental state, processor 202 may adjust one or more attributes associated with the visual experience to change the current mental state of the user to the desired mental state of the user.
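To make this adjustment loop concrete, the following is a minimal sketch of one way such logic might be organized. It is an illustration only, assuming a hypothetical state classifier and command interface; the function names (read_optical_measurement, classify_mental_state, apply_adjustment), the state labels, and the attribute commands are placeholders rather than any implementation described herein.

```python
import random
import time

# (current state, desired state) -> illustrative attribute-adjustment command
ADJUSTMENTS = {
    ("anxious", "calm"): {"content": "calming_audio", "brightness": -0.1},
    ("distracted", "focused"): {"notifications": "mute", "tint": 0.2},
}

def read_optical_measurement() -> dict:
    # Stand-in for data output by the optical measurement system.
    return {"hbo2_delta": random.uniform(-1.0, 1.0)}

def classify_mental_state(data: dict) -> str:
    # Stand-in classifier; a real system would map optical measurement
    # data to a mental-state estimate, e.g., with a trained model.
    return "anxious" if data["hbo2_delta"] > 0.5 else "calm"

def apply_adjustment(command: dict) -> None:
    # Stand-in for transmitting a command to the glasses or a host device.
    print("adjusting visual experience:", command)

def run_feedback_loop(desired_state: str, iterations: int = 10) -> None:
    """Estimate the current mental state and, when it differs from the
    desired state, issue an attribute-adjustment command."""
    for _ in range(iterations):
        state = classify_mental_state(read_optical_measurement())
        if state != desired_state:
            command = ADJUSTMENTS.get((state, desired_state))
            if command:
                apply_adjustment(command)
        time.sleep(0.1)

run_feedback_loop("calm")
```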
As used herein, mental states may include joy, excitement, relaxation, surprise, fear, stress, anxiety, sadness, anger, disgust, contempt, contentment, calmness, approval, focus, attention, creativity, cognitive assessment, positive or negative reflections/attitude on experiences or the use of objects, etc. Further details on the methods and systems related to a predicted brain state, physiological functions, behavior, preferences, or attitude of the user, and the creation, training, and use of neuromes can be found in U.S. patent application Ser. No. 17/188,298, filed Mar. 1, 2021, issued as U.S. Pat. No. 11,132,625. Exemplary measurement systems and methods using biofeedback for awareness and modulation of mental state are described in more detail in U.S. patent application Ser. No. 16/364,338, filed Mar. 26, 2019, issued as U.S. Pat. No. 11,006,876. Exemplary measurement systems and methods used for detecting and modulating the mental state of a user using entertainment selections, e.g., music, film/video, are described in more detail in U.S. patent application Ser. No. 16/835,972, filed Mar. 31, 2020, issued as U.S. Pat. No. 11,006,878. Exemplary measurement systems and methods used for detecting and modulating the mental state of a user using product formulation from, e.g., beverages, food, selective food/drink ingredients, fragrances, and assessment based on product-elicited brain state measurements are described in more detail in U.S. patent application Ser. No. 16/853,614, filed Apr. 20, 2020, issued as U.S. Pat. No. 11,172,869. Exemplary measurement systems and methods used for detecting and modulating the mental state of a user through awareness of priming effects are described in more detail in U.S. patent application Ser. No. 16/885,596, filed May 28, 2020, published as US2020/0390358A1. Exemplary measurement systems and methods used for wellness therapy, such as pain management regime, are described more fully in U.S. Provisional Application No. 63/188,783, filed May 14, 2021. These applications and corresponding U.S. patents and publications are incorporated herein by reference in their entirety.
By way of example, while the user is viewing a particular video by way of wearable glasses assembly 102, the optical measurement data output by optical measurement system 106 may indicate that the video is making the user experience an effect (e.g., feeling anxious), tempting the user to indulge in an undesirable behavior (e.g., by seeing a visual trigger that makes the user want to smoke, drink alcohol, etc.), lessening the user's ability to exercise impulse control, and/or lessening the user's ability to think clearly about a task at hand. Based on this, processor 202 may adjust one or more attributes of the current video itself (e.g., by filtering out certain portions of the video, adjusting a volume level of the video, etc.), stop a presentation of the video, switch to presenting a different video in place of the current video, etc.
As another example, while a user is wearing wearable glasses assembly 102 outside in bright sunlight, the optical measurement data output by optical measurement system 106 may indicate that the sunlight is negatively affecting the user (e.g., making the user less attentive and/or unable to focus and think clearly). Based on this, processor 202 may increase a tinting of the viewing lenses included in viewing lens assembly 104.
As another example, while a user is viewing a particular real-world scene, the optical measurement data output by optical measurement system 106 may indicate that the user is experiencing a change in a physiological function, e.g., that the user's heart rate has increased above a threshold amount. This may be a sign that the user is nervous, anxious, and/or excited. Based on this, processor 202 may present calming audio and/or visual content to the user to help the user decrease the physiological response such that the user experiences a more relaxed mental state, e.g., calmness.
As shown, imaging device 602 may output imaging data, which may be representative of the one or more captured images. The imaging data may be provided as an input to processor 202, which may be configured to perform an operation based on both the optical measurement data and the imaging data.
To illustrate, processor 202 may be configured to identify, based on the one or more images, a real-world object in a field of view of the user. The operation performed by processor 202 may accordingly be based on the optical measurement data and the identification of the real-world object.
For example, processor 202 may identify, based on the one or more images, that the user sees an unhealthy food item (e.g., a cookie or a sweet dessert). Processor 202 may further determine, based on the optical measurement data, that the user is tempted to eat the food item. Processor 202 may accordingly provide the user with one or more notifications (e.g., audio and/or visual cues) that help the user overcome the temptation to eat the unhealthy food item.
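A hedged sketch of this kind of combined rule is shown below. The object labels, the craving_score field, and the threshold are illustrative assumptions about how an implementation might represent an identified object and a measurement-derived temptation estimate; none of them are specified by this disclosure.

```python
# Illustrative trigger objects; a real system would derive these from the
# user's goals (e.g., diet or smoking-cessation plans).
TRIGGER_OBJECTS = {"cookie", "dessert", "cigarette"}

def maybe_notify(identified_object: str, measurement: dict,
                 craving_threshold: float = 0.7) -> str | None:
    """Return a notification only when an identified trigger object
    coincides with a measurement-derived craving above threshold."""
    craving = measurement.get("craving_score", 0.0)
    if identified_object in TRIGGER_OBJECTS and craving >= craving_threshold:
        return f"Heads up: you seem tempted by the {identified_object}."
    return None

print(maybe_notify("cookie", {"craving_score": 0.85}))  # notification issued
print(maybe_notify("salad", {"craving_score": 0.9}))    # no notification
```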
As another example, processor 202 may be configured to identify, based on the one or more images, another person in a field of view of the user. Processor 202 may be further configured to determine one or more attributes of the other person. For example, processor 202 may determine one or more physical traits of the other person, whether the user has seen the other person before, an identity of the other person, whether the other person is in a contact list and/or friend list of the user, etc. Processor 202 may accordingly base the operation that it performs on the one or more attributes of the other person.
For example, the optical measurement data may indicate that the user is subconsciously attracted to the other person, e.g., the measurement data includes a mental state of joy or excitement when the user is in the presence of the other person. Processor 202 may accordingly include one or more attributes of the other person in an attribute profile of people that the user is attracted to. The attribute profile may then be used to identify potential dating candidates for the user. Processor 202 may additionally or alternatively notify the user that he or she is attracted subconsciously to the other person.
As another example, the optical measurement data may indicate that the user is subconsciously threatened by the other person, e.g., the measurement data indicates a mental state of being afraid. Processor 202 may accordingly warn the user that he or she should exercise caution in the presence of the other person.
As shown, microphone 702 may output sound data (e.g., music or audio from a movie, a theater show, a lecture, or a lesson), which may be representative of the detected sound. The sound data may be provided as an input to processor 202, which may be configured to perform an operation based on both the optical measurement data and the sound data.
In some examples, processor 202 may present, by way of a graphical user interface, content associated with the optical measurement data. For example, processor 202 may present one or more graphs, recommendations, directions, information, etc. based on the optical measurement data and the visual experience presented to the user.
Optical measurement system 106 may be included in wearable glasses assembly 102 in any suitable manner. For example, optical measurement system 106 may be included in and/or attached to a frame of wearable glasses assembly 102.
Various implementations of optical measurement system 106 will now be described.
In some examples, optical measurement system 106 may be implemented by any suitable wearable system configured to perform optical-based brain data acquisition operations, such as any of the wearable optical measurement systems described in U.S. patent application Ser. No. 17/176,315, filed Feb. 16, 2021 and published as US2021/0259638A1; U.S. patent application Ser. No. 17/176,309, filed Feb. 16, 2021 and published as US2021/0259614A1; U.S. patent application Ser. No. 17/176,460, filed Feb. 16, 2021 and issued as U.S. Pat. No. 11,096,620; U.S. patent application Ser. No. 17/176,470, filed Feb. 16, 2021 and published as US2021/0259619A1; U.S. patent application Ser. No. 17/176,487, filed Feb. 16, 2021 and published as US2021/0259632A1; U.S. patent application Ser. No. 17/176,539, filed Feb. 16, 2021 and published as US2021/0259620A1; U.S. patent application Ser. No. 17/176,560, filed Feb. 16, 2021 and published as US2021/0259597A1; U.S. patent application Ser. No. 17/176,466, filed Feb. 16, 2021 and published as US2021/0263320A1; Han Y. Ban, et al., “Kernel Flow: A High Channel Count Scalable TD-fNIRS System,” SPIE Photonics West Conference (Mar. 6, 2021); and Han Y. Ban, et al., “Kernel Flow: a high channel count scalable time-domain functional near-infrared spectroscopy system,” Journal of Biomedical Optics (Jan. 18, 2022), which applications and publications are incorporated herein by reference in their entirety.
Additionally or alternatively, optical measurement system 106 may be configured to non-invasively measure blood oxygen saturation (SaO2) (e.g., at the ear) through Time-Resolved Pulse Oximetry (TR-SpO2), such as one or more of the devices described in more detail in U.S. Provisional Patent Application No. 63/134,479, filed Jan. 6, 2021, U.S. Provisional Patent Application No. 63/154,116, filed Feb. 26, 2021, U.S. Provisional Patent Application No. 63/160,995, filed Mar. 15, 2021, and U.S. Provisional Patent Application No. 63/179,080, filed Apr. 23, 2021, which applications are incorporated herein by reference. Using time-resolved techniques, information that allows for determining the absolute coefficients of absorption (μa) and reduced scattering (μs′) can be determined. From these absolute tissue properties, tissue oxygenation may be determined through the Beer-Lambert Law.
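As one hedged illustration of the Beer-Lambert step, absolute absorption coefficients measured at two wavelengths can be decomposed into oxy- and deoxyhemoglobin contributions by solving a two-by-two linear system, from which oxygen saturation follows. The extinction coefficients and absorption values below are placeholders rather than tabulated data; a real implementation would use published hemoglobin spectra and would also handle units, the ln(10) convention for molar extinction, and background absorbers such as water.

```python
def oxygen_saturation(mu_a_1: float, mu_a_2: float,
                      eps_hbo2_1: float, eps_hb_1: float,
                      eps_hbo2_2: float, eps_hb_2: float) -> float:
    """Solve mu_a(lambda_i) = eps_hbo2_i * C_hbo2 + eps_hb_i * C_hb for the
    two chromophore concentrations (Cramer's rule), then return
    SO2 = C_hbo2 / (C_hbo2 + C_hb)."""
    det = eps_hbo2_1 * eps_hb_2 - eps_hbo2_2 * eps_hb_1
    c_hbo2 = (mu_a_1 * eps_hb_2 - mu_a_2 * eps_hb_1) / det
    c_hb = (eps_hbo2_1 * mu_a_2 - eps_hbo2_2 * mu_a_1) / det
    return c_hbo2 / (c_hbo2 + c_hb)

# Illustrative numbers only (not tabulated extinction values):
so2 = oxygen_saturation(mu_a_1=0.18, mu_a_2=0.20,
                        eps_hbo2_1=0.6, eps_hb_1=1.5,
                        eps_hbo2_2=1.1, eps_hb_2=0.8)
print(f"SO2 ~ {so2:.0%}")  # -> SO2 ~ 67%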
Additionally or alternatively, optical measurement system 106 may be configured to perform both optical-based brain data acquisition operations and electrical-based brain data acquisition operations, such as any of the wearable multimodal measurement systems described in U.S. patent application Ser. Nos. 17/176,315 and 17/176,309, which applications have already been incorporated herein by reference.
In some examples, optical measurement operations performed by optical measurement system 1000 are associated with a time domain-based optical measurement technique. Example time domain-based optical measurement techniques include, but are not limited to, TCSPC, TD-NIRS, TD-DCS, and TD-DOT.
Optical measurement system 1000 (e.g., an optical measurement system that is implemented by a wearable device or other configuration, and that employs a time domain-based (e.g., TD-NIRS) measurement technique) may detect blood oxygenation levels and/or blood volume levels by measuring the change in shape of laser pulses after they have passed through target tissue, e.g., brain, muscle, finger, etc. As used herein, a shape of laser pulses refers to a temporal shape, as represented for example by a histogram generated by a time-to-digital converter (TDC) coupled to an output of a photodetector, as will be described more fully below.
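One simple, purely illustrative metric for such a change in temporal shape is the mean time of flight (the first moment) of the TPSF histogram: later or broader photon arrivals shift this moment. The sketch below assumes a histogram keyed by bin index with made-up counts; it is one example metric, not the specific analysis performed by optical measurement system 1000.

```python
def mean_time_of_flight(tpsf: dict, bin_width_ps: float = 100.0) -> float:
    """First moment of a histogram {bin index: photon count}, in picoseconds."""
    total = sum(tpsf.values())
    return bin_width_ps * sum(i * n for i, n in tpsf.items()) / total

# Made-up histograms; the second is broader/later, so its moment is larger.
baseline = {1: 30, 2: 55, 3: 25, 4: 8}
later = {1: 22, 2: 48, 3: 33, 4: 16}
print(mean_time_of_flight(baseline), mean_time_of_flight(later))
```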
As shown, optical measurement system 1000 includes a detector 1004 that includes a plurality of individual photodetectors (e.g., photodetector 1006), a processor 1008 coupled to detector 1004, a light source 1010, a controller 1012, and optical conduits 1014 and 1016 (e.g., light pipes). However, one or more of these components may not, in certain embodiments, be considered to be a part of optical measurement system 1000. For example, in implementations where optical measurement system 1000 is wearable by a user, processor 1008 and/or controller 1012 may in some embodiments be separate from optical measurement system 1000 and not configured to be worn by the user.
Detector 1004 may include any number of photodetectors 1006 as may serve a particular implementation, such as 2^n photodetectors (e.g., 256, 512, . . . , 16384, etc.), where n is an integer greater than or equal to one (e.g., 4, 5, 8, 10, 11, 14, etc.). Photodetectors 1006 may be arranged in any suitable manner.
Photodetectors 1006 may each be implemented by any suitable circuit configured to detect individual photons of light incident upon photodetectors 1006. For example, each photodetector 1006 may be implemented by a single photon avalanche diode (SPAD) circuit and/or other circuitry as may serve a particular implementation. The SPAD circuit may be gated in any suitable manner or be configured to operate in a free-running mode with passive quenching. For example, photodetectors 1006 may be configured to operate in a free-running mode such that photodetectors 1006 are not actively armed and disarmed (e.g., at the end of each predetermined gated time window). Rather, while operating in the free-running mode, photodetectors 1006 may be configured to reset within a configurable time period after an occurrence of a photon detection event (i.e., after photodetector 1006 detects a photon) and immediately begin detecting new photons. However, only photons detected within a desired time window (e.g., during each gated time window) may be included in the histogram that represents a light pulse response of the target (e.g., a temporal point spread function (TPSF)). The terms histogram and TPSF are used interchangeably herein to refer to a light pulse response of a target.
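The gating behavior described above can be sketched as a simple filter over photon arrival times: detectors run free and reset after each detection, but only events inside the desired gated window contribute to the histogram. The timestamps and window bounds below are illustrative values, not parameters of any described implementation.

```python
def gate_timestamps(timestamps_ps: list, window_start_ps: int,
                    window_end_ps: int) -> list:
    """Keep only photon detection events inside the gated time window."""
    return [t for t in timestamps_ps if window_start_ps <= t < window_end_ps]

arrivals = [120, 480, 900, 1450, 2300, 3050]  # arrival times, picoseconds
print(gate_timestamps(arrivals, window_start_ps=400, window_end_ps=2500))
# -> [480, 900, 1450, 2300]
```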
Processor 1008 may be implemented by one or more physical processing (e.g., computing) devices. In some examples, processor 1008 may execute instructions (e.g., software) configured to perform one or more of the operations described herein.
Light source 1010 may be implemented by any suitable component configured to generate and emit light. For example, light source 1010 may be implemented by one or more laser diodes, distributed feedback (DFB) lasers, super luminescent diodes (SLDs), light emitting diodes (LEDs), diode-pumped solid-state (DPSS) lasers, super luminescent light emitting diodes (sLEDs), vertical-cavity surface-emitting lasers (VCSELs), titanium sapphire lasers, micro light emitting diodes (mLEDs), and/or any other suitable laser or light source. In some examples, the light emitted by light source 1010 is high coherence light (e.g., light that has a coherence length of at least 5 centimeters) at a predetermined center wavelength.
Light source 1010 is controlled by controller 1012, which may be implemented by any suitable computing device (e.g., processor 1008), integrated circuit, and/or combination of hardware and/or software as may serve a particular implementation. In some examples, controller 1012 is configured to control light source 1010 by turning light source 1010 on and off and/or setting an intensity of light generated by light source 1010. Controller 1012 may be manually operated by a user, or may be programmed to control light source 1010 automatically.
Light emitted by light source 1010 may travel via an optical conduit 1014 (e.g., a light pipe, a single-mode optical fiber, and/or a multi-mode optical fiber) to body 1002 of a subject. Body 1002 may include any suitable turbid medium. For example, in some implementations, body 1002 is a brain or any other body part of a human or other animal. Alternatively, body 1002 may be a non-living object. For illustrative purposes, it will be assumed in the examples provided herein that body 1002 is a human brain.
As indicated by arrow 1020, the light emitted by light source 1010 enters body 1002 at a first location 1022 on body 1002. Accordingly, a distal end of optical conduit 1014 may be positioned at (e.g., right above, in physical contact with, or physically attached to) first location 1022 (e.g., to a scalp of the subject). In some examples, the light may emerge from optical conduit 1014 and spread out to a certain spot size on body 1002 to fall under a predetermined safety limit. At least a portion of the light indicated by arrow 1020 may be scattered within body 1002.
As used herein, “distal” means nearer, along the optical path of the light emitted by light source 1010 or the light received by detector 1004, to the target (e.g., within body 1002) than to light source 1010 or detector 1004. Thus, the distal end of optical conduit 1014 is nearer to body 1002 than to light source 1010, and the distal end of optical conduit 1016 is nearer to body 1002 than to detector 1004. Additionally, as used herein, “proximal” means nearer, along the optical path of the light emitted by light source 1010 or the light received by detector 1004, to light source 1010 or detector 1004 than to body 1002. Thus, the proximal end of optical conduit 1014 is nearer to light source 1010 than to body 1002, and the proximal end of optical conduit 1016 is nearer to detector 1004 than to body 1002.
As shown, the distal end of optical conduit 1016 (e.g., a light pipe, a light guide, a waveguide, a single-mode optical fiber, and/or a multi-mode optical fiber) is positioned at (e.g., right above, in physical contact with, or physically attached to) output location 1026 on body 1002. In this manner, optical conduit 1016 may collect at least a portion of the scattered light (indicated as light 1024) as it exits body 1002 at location 1026 and carry light 1024 to detector 1004. Light 1024 may pass through one or more lenses and/or other optical elements (not shown) that direct light 1024 onto each of the photodetectors 1006 included in detector 1004. In cases where optical conduit 1016 is implemented by a light guide, the light guide may be spring loaded and/or have a cantilever mechanism to allow for conformably pressing the light guide firmly against body 1002.
Photodetectors 1006 may be connected in parallel in detector 1004. An output of each of photodetectors 1006 may be accumulated to generate an accumulated output of detector 1004. Processor 1008 may receive the accumulated output and determine, based on the accumulated output, a temporal distribution of photons detected by photodetectors 1006. Processor 1008 may then generate, based on the temporal distribution, a histogram representing a light pulse response of a target (e.g., brain tissue, blood flow, etc.) in body 1002. Such a histogram is illustrative of the various types of brain activity measurements that may be performed by optical measurement system 1000.
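The accumulation step just described may be sketched as binning the photon arrival times collected across the parallel photodetectors into a single temporal histogram. The arrival times and bin width below are made-up illustrative values, assuming timestamps in picoseconds.

```python
from collections import Counter

def build_tpsf(per_detector_timestamps_ps: list, bin_width_ps: int = 100) -> dict:
    """Accumulate photon arrival times from parallel photodetectors into a
    single temporal histogram (bin index -> photon count)."""
    counts = Counter()
    for timestamps in per_detector_timestamps_ps:
        for t in timestamps:
            counts[t // bin_width_ps] += 1
    return dict(sorted(counts.items()))

# Made-up arrival times (picoseconds) for three photodetectors.
detectors = [[130, 210, 250], [180, 220, 610], [140, 260, 300]]
print(build_tpsf(detectors))  # -> {1: 3, 2: 4, 3: 1, 6: 1}
```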
Light sources 1104 are each configured to emit light (e.g., a sequence of light pulses) and may be implemented by any of the light sources described herein. Detectors 1106 may each be configured to detect arrival times for photons of the light emitted by one or more light sources 1104 after the light is scattered by the target. For example, a detector 1106 may include a photodetector configured to generate a photodetector output pulse in response to detecting a photon of the light and a time-to-digital converter (TDC) configured to record a timestamp symbol in response to an occurrence of the photodetector output pulse, the timestamp symbol representative of an arrival time for the photon (i.e., when the photon is detected by the photodetector).
Wearable assembly 1102 may be implemented by any of the wearable devices, modular assemblies, and/or wearable units described herein. For example, wearable assembly 1102 may be integrated into one or more components of wearable glasses assembly 102.
Optical measurement system 1100 may be modular in that one or more components of optical measurement system 1100 may be removed, changed out, or otherwise modified as may serve a particular implementation. As such, optical measurement system 1100 may be configured to conform to three-dimensional surface geometries, such as a user's head. Exemplary modular optical measurement systems comprising a plurality of wearable modules are described in more detail in U.S. patent application Ser. No. 17/176,460, filed Feb. 16, 2021 and issued as U.S. Pat. No. 11,096,620, U.S. patent application Ser. No. 17/176,470, filed Feb. 16, 2021 and published as US2021/0259619A1, U.S. patent application Ser. No. 17/176,487, filed Feb. 16, 2021 and published as US2021/0259632A1, U.S. patent application Ser. No. 17/176,539, filed Feb. 16, 2021 and published as US2021/0259620A1, U.S. patent application Ser. No. 17/176,560, filed Feb. 16, 2021 and published as US2021/0259597A1, and U.S. patent application Ser. No. 17/176,466, filed Feb. 16, 2021 and published as US2021/0263320A1, which applications are incorporated herein by reference in their respective entireties.
Electrodes 1208 may be configured to detect electrical activity within a target (e.g., the brain). Such electrical activity may include electroencephalogram (EEG) activity and/or any other suitable type of electrical activity as may serve a particular implementation. In some examples, electrodes 1208 are all conductively coupled to one another to create a single channel that may be used to detect electrical activity. Alternatively, at least one electrode included in electrodes 1208 is conductively isolated from a remaining number of electrodes included in electrodes 1208 to create at least two channels that may be used to detect electrical activity.
Memory 1302 may maintain (e.g., store) executable data used by processor 1304 to perform one or more of the operations described herein. For example, memory 1302 may store instructions 1306 that may be executed by processor 1304 to perform one or more operations based on optical measurement data output by optical measurement system 106 and the visual experience provided by way of viewing lens assembly 104. Instructions 1306 may be implemented by any suitable application, program, software, code, and/or other executable data instance. Memory 1302 may also maintain any data received, generated, managed, used, and/or transmitted by processor 1304.
Processor 1304 may be configured to perform (e.g., execute instructions 1306 stored in memory 1302 to perform) various operations described herein.
In some examples, a non-transitory computer-readable medium storing computer-readable instructions may be provided in accordance with the principles described herein. The instructions, when executed by a processor of a computing device, may direct the processor and/or computing device to perform one or more operations, including one or more of the operations described herein. Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
A non-transitory computer-readable medium as referred to herein may include any non-transitory storage medium that participates in providing data (e.g., instructions) that may be read and/or executed by a computing device (e.g., by a processor of a computing device). For example, a non-transitory computer-readable medium may include, but is not limited to, any combination of non-volatile storage media and/or volatile storage media. Exemplary non-volatile storage media include, but are not limited to, read-only memory, flash memory, a solid-state drive, a magnetic storage device (e.g., a hard disk, a floppy disk, magnetic tape, etc.), ferroelectric random-access memory (“RAM”), and an optical disc (e.g., a compact disc, a digital video disc, a Blu-ray disc, etc.). Exemplary volatile storage media include, but are not limited to, RAM (e.g., dynamic RAM).
As shown, an illustrative computing device 1400 may include a communication interface 1402, a processor 1404, a storage device 1406, and an input/output (“I/O”) module 1408 communicatively connected to one another.
Communication interface 1402 may be configured to communicate with one or more computing devices. Examples of communication interface 1402 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, an audio/video connection, and any other suitable interface.
Processor 1404 generally represents any type or form of processing unit capable of processing data and/or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein. Processor 1404 may perform operations by executing computer-executable instructions 1412 (e.g., an application, software, code, and/or other executable data instance) stored in storage device 1406.
Storage device 1406 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of data storage media and/or device. For example, storage device 1406 may include, but is not limited to, any combination of the non-volatile media and/or volatile media described herein. Electronic data, including data described herein, may be temporarily and/or permanently stored in storage device 1406. For example, data representative of computer-executable instructions 1412 configured to direct processor 1404 to perform any of the operations described herein may be stored within storage device 1406. In some examples, data may be arranged in one or more databases residing within storage device 1406.
I/O module 1408 may include one or more I/O modules configured to receive user input and provide user output. I/O module 1408 may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities. For example, I/O module 1408 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touchscreen component (e.g., touchscreen display), a receiver (e.g., an RF or infrared receiver), motion sensors, and/or one or more input buttons.
I/O module 1408 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, I/O module 1408 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
At operation 1502, a processor obtains optical measurement data output by an optical measurement system included in a wearable glasses assembly being worn by a user, the wearable glasses assembly configured to provide the user with a visual experience in which the user sees a real-world environment.
At operation 1504, the processor performs, based on the optical measurement data, an operation with respect to the visual experience.
An illustrative system includes a wearable glasses assembly configured to be worn by a user and comprising: a viewing lens assembly configured to provide the user with a visual experience in which the user sees a real-world environment through the viewing lens assembly; and an optical measurement system configured to output optical measurement data representative of one or more optical measurements performed with respect to the user.
Another illustrative system includes a wearable glasses assembly configured to be worn by a user and configured to provide the user with a visual experience in which the user sees a real-world environment; an optical measurement system included in the wearable glasses assembly, the optical measurement system configured to perform one or more optical measurements with respect to the user and output optical measurement data representative of the one or more optical measurements; and a processor configured to perform, based on the optical measurement data, an operation with respect to the visual experience.
An illustrative method includes obtaining, by a processor, optical measurement data output by an optical measurement system included in a wearable glasses assembly being worn by a user, the wearable glasses assembly configured to provide the user with a visual experience in which the user sees a real-world environment; and performing, by the processor based on the optical measurement data, an operation with respect to the visual experience.
An illustrative non-transitory computer-readable medium stores instructions that, when executed, direct a processor of a computing device to: obtain optical measurement data output by an optical measurement system included in a wearable glasses assembly being worn by a user, the wearable glasses assembly configured to provide the user with a visual experience; and perform, based on the optical measurement data, an operation with respect to the visual experience.
In the preceding description, various exemplary embodiments have been described with reference to the accompanying drawings. It will, however, be evident that various modifications and changes may be made thereto, and additional embodiments may be implemented, without departing from the scope of the invention as set forth in the claims that follow. For example, certain features of one embodiment described herein may be combined with or substituted for features of another embodiment described herein. The description and drawings are accordingly to be regarded in an illustrative rather than a restrictive sense.
The present application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/196,917, filed on Jun. 4, 2021, and to U.S. Provisional Patent Application No. 63/154,131, filed Feb. 26, 2021, each of which is incorporated herein by reference in its entirety.