This disclosure relates generally to polarization sensors, and more specifically to a polarization multi-channel sensor with sparse polarization pixels.
Polarization of light can be used to determine useful information about the shape and material properties of objects. Conventional color polarization image sensors use a matrix of wire grid filters that covers the entire color sensor (e.g., a sensor covered with a Bayer filter). This structure is non-ideal because, e.g., it reduces the light power signal (wire grids reflect ~50% of incident light) and it reduces resolution.
Described herein is a polarization multi-channel sensor with sparse polarization pixels (sensor). The sensor has a plurality of macro-pixels. Light from a local area that includes an object may be incident on some or all of the macro-pixels. Each macro-pixel includes a corresponding plurality of pixels configured to detect intensity of light in different color channels (e.g., red, green, blue), and one or more polarization pixels. A polarization pixel may be configured to detect intensity of light in a same color channel (e.g., green) that is linearly polarized at a particular orientation. In cases where there are multiple polarization pixels within a macro-pixel, the polarization pixels may be configured to detect intensity of light in a same color channel, but at different orientations of linearly polarized light. Moreover, the number of polarization pixels within a macro-pixel is relatively sparse compared to the total number of pixels within the macro-pixel. A controller may use the detected intensity information from pixels that have the same color channel as that of the polarization pixels and the detected intensity information from the polarization pixels to determine one or more properties (e.g., location, shape, etc.) of the object.
In some embodiments, a sensor is described. The sensor includes a plurality of macro-pixels. Each macro-pixel includes a plurality of pixels configured to detect intensity of light in different color channels, and a first polarization pixel. The plurality of pixels includes a first pixel configured to detect intensity of light in a first color channel of the different color channels. The first polarization pixel is configured to detect intensity of light in the first color channel that is linearly polarized in a first direction. Light incident on a macro-pixel, of the plurality of macro-pixels, is from an object, and a controller may be configured to determine a property of the object based in part on a first intensity signal of a first pixel of the macro-pixel and a first polarization signal of a first polarization pixel of the macro-pixel.
In some embodiments, a system is described. The system may be a polarization camera assembly. The system includes a camera and a controller. The camera includes a plurality of macro-pixels that receive light from a local area that includes an object. A macro-pixel, of the plurality of macro-pixels, includes a plurality of pixels configured to detect intensity of light in different color channels and a first polarization pixel. The plurality of pixels includes a first pixel configured to detect intensity of light in a first color channel (of the different color channels) as a first intensity signal. The first polarization pixel may be configured to detect intensity of light in the first color channel that is linearly polarized in a first direction as a first polarization signal. The controller may be configured to determine a property of the object based in part on the first intensity signal and the first polarization signal.
In some embodiments, a sensor is described. The sensor includes a plurality of macro-pixels. Each macro-pixel includes a plurality of pixels and a first polarization pixel. The plurality of pixels may be configured to detect intensity of light in different color channels. The first polarization pixel may be configured to detect intensity of light in a first color channel of the different color channels that is linearly polarized in a first direction. Light incident on a macro-pixel, of the plurality of macro-pixels, is from an object, and a controller may be configured to determine a property of the object based in part on a first polarization signal of a first polarization pixel of the macro-pixel.
The figures depict various embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Described herein is a polarization multi-channel sensor with sparse polarization pixels (sensor). The sensor is part of a camera of a polarization camera assembly (PCA). The sensor has a plurality of macro-pixels. Each macro-pixel includes a corresponding plurality of pixels configured to detect intensity of light in different color channels (e.g., red, green, blue), and one or more polarization pixels (e.g., wire grid or metasurface). The polarization pixels of each macro-pixel may be configured to detect intensity of light in a same color channel (e.g., green) that is linearly polarized. And in cases where there are multiple polarization pixels within a macro-pixel, the polarization pixels may be configured to detect intensity of light in a same color channel, but at different orientations of linearly polarized light. For example, a macro-pixel may include two polarization pixels: one configured to detect linearly polarized light at a first orientation, and the other configured to detect linearly polarized light at a second orientation, where the angle between the first and second orientations is 45 degrees. A controller of the PCA may use the detected intensity information from pixels that have the same color channel as that of the polarization pixels and the detected intensity information from the polarization pixels to determine one or more properties (e.g., location, shape, etc.) of an object in a local area of the PCA.
Note that the number of polarization pixels within a macro-pixel is relatively sparse compared to the total number of pixels within the macro-pixel. Sparse as used herein means that no more than 50% of pixels within a macro-pixel are polarization pixels. Moreover, the green color channel typically includes the most pixels, as the human eye is most sensitive to green light, and for a given macro-pixel, the polarization pixels can be in lieu of some (or in some embodiments all) of the green pixels. This sparseness in the number of polarization pixels facilitates a sensor that has a higher signal-to-noise ratio relative to conventional polarization sensors where all pixels are polarization pixels (which automatically lose half of the incident unpolarized light).
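As an illustration only, the following sketch shows one hypothetical macro-pixel layout; the arrangement below is an assumption for illustration, and actual layouts are described with regard to the figures.

# A hypothetical 4x4 macro-pixel: a quad-Bayer-style tile in which four of the
# eight green pixels are replaced by green polarization pixels at 0, 45, 90,
# and -45 degrees ("G:theta"); plain "G" is a non-polarized green pixel.
MACRO_PIXEL = [
    ["R", "G:0",  "R", "G:45"],
    ["G", "B",    "G", "B"],
    ["R", "G:90", "R", "G:-45"],
    ["G", "B",    "G", "B"],
]

polar = sum(cell.startswith("G:") for row in MACRO_PIXEL for cell in row)
total = sum(len(row) for row in MACRO_PIXEL)
assert polar / total <= 0.5  # "sparse": no more than 50% are polarization pixels
print(f"{polar}/{total} pixels are polarization pixels")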
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to create content in an artificial reality and/or are otherwise used in an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a wearable device (e.g., headset) connected to a host computer system, a standalone wearable device (e.g., headset), a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
The frame 110 holds the other components of the headset 100. The frame 110 includes a front part that holds the one or more display elements 120 and end pieces (e.g., temples) to attach to a head of the user. The front part of the frame 110 bridges the top of a nose of the user. The length of the end pieces may be adjustable (e.g., adjustable temple length) to fit different users. The end pieces may also include a portion that curls behind the ear of the user (e.g., temple tip, earpiece).
The one or more display elements 120 provide light to a user wearing the headset 100. As illustrated, the headset includes a display element 120 for each eye of a user. In some embodiments, a display element 120 generates image light that is provided to an eyebox of the headset 100. The eyebox is a location in space that an eye of a user occupies while wearing the headset 100. For example, a display element 120 may be a waveguide display. A waveguide display includes a light source (e.g., a two-dimensional source, one or more line sources, one or more point sources, etc.) and one or more waveguides. Light from the light source is in-coupled into the one or more waveguides, which output the light in a manner such that there is pupil replication in an eyebox of the headset 100. In-coupling and/or out-coupling of light from the one or more waveguides may be done using one or more diffraction gratings. In some embodiments, the waveguide display includes a scanning element (e.g., waveguide, mirror, etc.) that scans light from the light source as it is in-coupled into the one or more waveguides. Note that in some embodiments, one or both of the display elements 120 are opaque and do not transmit light from a local area around the headset 100. The local area is the area surrounding the headset 100. For example, the local area may be a room that a user wearing the headset 100 is inside, or the user wearing the headset 100 may be outside and the local area is an outside area. In this context, the headset 100 generates VR content. Alternatively, in some embodiments, one or both of the display elements 120 are at least partially transparent, such that light from the local area may be combined with light from the one or more display elements to produce AR and/or MR content.
In some embodiments, a display element 120 does not generate image light, and instead is a lens that transmits light from the local area to the eyebox. For example, one or both of the display elements 120 may be a lens without correction (non-prescription) or a prescription lens (e.g., single vision, bifocal and trifocal, or progressive) to help correct for defects in a user's eyesight. In some embodiments, the display element 120 may be polarized and/or tinted to protect the user's eyes from the sun.
The PCA 140 determines one or more properties (e.g., location, shape, etc.) of objects within a local area of the PCA 140. The PCA 140 is described in detail below with regard to
The audio system provides audio content. The audio system includes a transducer array, a sensor array, and an audio controller. However, in other embodiments, the audio system may include different and/or additional components. Similarly, in some cases, functionality described with reference to the components of the audio system can be distributed among the components in a different manner than is described here. For example, some or all of the functions of the controller may be performed by a remote server.
The transducer array presents sound to the user. The transducer array includes a plurality of transducers. A transducer may be a speaker 160 or a tissue transducer (e.g., a bone conduction transducer or a cartilage conduction transducer). Although the speakers 160 are shown exterior to the frame 110, the speakers 160 may be enclosed in the frame 110. In some embodiments, instead of individual speakers for each ear, the headset 100 includes a speaker array comprising multiple speakers integrated into the frame 110 to improve directionality of presented audio content. The tissue transducer couples to the head of the user and directly vibrates tissue (e.g., bone or cartilage) of the user to generate sound. The number and/or locations of transducers may be different from what is shown in
The sensor array detects sounds within the local area of the headset 100. The sensor array includes a plurality of acoustic sensors 180. An acoustic sensor 180 captures sounds emitted from one or more sound sources in the local area (e.g., a room). Each acoustic sensor is configured to detect sound and convert the detected sound into an electronic format (analog or digital). The acoustic sensors 180 may be acoustic wave sensors, microphones, sound transducers, or similar sensors that are suitable for detecting sounds.
In some embodiments, one or more acoustic sensors 180 may be placed in an ear canal of each ear (e.g., acting as binaural microphones). In some embodiments, the acoustic sensors 180 may be placed on an exterior surface of the headset 100, placed on an interior surface of the headset 100, separate from the headset 100 (e.g., part of some other device), or some combination thereof. The number and/or locations of acoustic sensors 180 may be different from what is shown in
The audio controller processes information from the sensor array that describes sounds detected by the sensor array. The audio controller may comprise a processor and a computer-readable storage medium. The audio controller may be configured to generate direction of arrival (DOA) estimates, generate acoustic transfer functions (e.g., array transfer functions and/or head-related transfer functions), track the location of sound sources, form beams in the direction of sound sources, classify sound sources, generate sound filters for the speakers 160, or some combination thereof.
The position sensor 190 generates one or more measurement signals in response to motion of the headset 100. The position sensor 190 may be located on a portion of the frame 110 of the headset 100. The position sensor 190 may include an inertial measurement unit (IMU). Examples of position sensor 190 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, a type of sensor used for error correction of the IMU, or some combination thereof. The position sensor 190 may be located external to the IMU, internal to the IMU, or some combination thereof.
In some embodiments, the headset 100 may provide for simultaneous localization and mapping (SLAM) for a position of the headset 100 and updating of a model of the local area. For example, the headset 100 may include a passive camera assembly that generates color image data. The passive camera assembly may include one or more RGB cameras that capture images of some or all of the local area. The images captured by the passive camera assembly and the location information determined by the PCA 140 may be used to determine parameters of the local area, generate a model of the local area, update a model of the local area, or some combination thereof. Furthermore, the position sensor 190 tracks the position (e.g., location and pose) of the headset 100 within the room. Additional details regarding the components of the headset 100 are discussed below in connection with
The camera 220 detects light from the local area 205. The camera 220 includes a polarization multi-channel sensor with sparse polarization pixels (“sensor”), and may include one or more optical elements (e.g., lenses). The one or more optical elements may, e.g., focus light from the local area 205 onto the sensor.
The sensor detects intensity information of light from the local area 205. The sensor includes a plurality of macro-pixels. Each macro-pixel includes a corresponding plurality of pixels configured to detect intensity of light in different color channels and one or more polarization pixels. The color channels may be, e.g., red, green, and blue. In other embodiments, the color channels may include some other color and/or correspond to some other combination of colors.
The one or more polarization pixels of each macro-pixel are configured to detect intensity of light in a same color channel that is linearly polarized. For example, in some embodiments, the one or more polarization pixels of each macro-pixel are configured to detect intensity of green light that is linearly polarized. In embodiments where there are multiple polarization pixels within a macro-pixel, the polarization pixels may be configured to detect intensity of light in a same color channel, but at different orientations of linearly polarized light. For a given sensor layout, the orientations of linearly polarized light detected by the one or more polarization pixels of each macro-pixel are the same. The sensor may have different layouts, e.g., linear polarization pixels configured to detect 2, 3, or 4 different orientations of linearly polarized light. Examples of the various layouts are described below with regard to, e.g.,
The polarized light projector 240 outputs linearly polarized light. The polarized light projector 240 includes one or more illuminators that emit linearly polarized light. In some embodiments, the one or more illuminators emit linearly polarized light at a single orientation (e.g., 0 degrees) that is the same for all of the one or more illuminators and is aligned 90 degrees relative to polarization pixels on the camera 220. In other embodiments, the polarized light projector 240 emits, in a time multiplexed and/or sequential manner, linearly polarized light at a first orientation (e.g., 0 degrees) that is aligned 90 degrees relative to some polarization pixels on the camera 220, and linearly polarized light at a second orientation (e.g., 45 degrees) that is aligned 90 degrees relative to some polarization pixels on the camera 220. The polarized light projector 240 may include a first illuminator that emits light at the first orientation and a second illuminator that emits light at the second orientation. In other embodiments, a single illuminator can emit linearly polarized light in different orientations at different respective times.
The controller 230 determines one or more properties of objects (e.g., the object 250) within the local area 205 of the PCA 140. A property of the object may be, e.g., shape, position, one or more material properties (e.g., roughness, diffuse albedo, etc.), or some combination thereof. How the controller 230 processes information from the sensor and/or controls the polarized light projector 240 to determine the one or more properties of the object is based in part on the specific sensor layout, and is discussed below with regard to, e.g.,
A controller (e.g., the controller 230) may determine the S0, S1, and S2 Stokes parameters using the detected intensities from the polarization pixels 320. S0 is a Stokes parameter that refers to a total intensity of light. S1 is a Stokes parameter that refers to a horizontal versus vertical preference in linear polarization, e.g., a difference between the amount of light polarized along 0 degrees and the amount of light polarized along 90 degrees. S2 is a Stokes parameter that refers to a diagonal versus antidiagonal preference in linear polarization, e.g., a difference between an amount of light polarized along 45 degrees and an amount of light polarized along −45 degrees. The controller may determine the S0, S1, and S2 via:
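In one consistent form, assuming ideal linear polarizers (a sketch: an ideal polarizer at angle θ transmits I(θ) = (S0 + S1 cos 2θ + S2 sin 2θ)/2, from which these relations follow; the exact form used may differ):

S_0 = A_{G0} + A_{G\pi/2} = A_G, \qquad S_1 = A_{G0} - A_{G\pi/2}, \qquad S_2 = A_{G\pi/4} - A_{G-\pi/4}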
Where AG0 is the detected intensity (i.e., absorption) at the polarization pixel 320A, AG-π/4 is the detected intensity at the polarization pixel 320B, AGπ/2 is the detected intensity at the polarization pixel 320C, AGπ/4 is the detected intensity at the polarization pixel 320D, and AG is the detected intensity (i.e., absorption) at a green pixel in the macro-pixel 310 that is non-polarizing (i.e., not one of the polarization pixels 320). In this example, AG is available at four different pixels in each macro-pixel. In some embodiments, AG is determined based on values of the closest AG pixels (e.g., via interpolation). The controller may use the Stokes parameters with other factors (e.g., shape) to solve for a material property using methods found in the art.
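A minimal sketch of this computation in code, assuming ideal polarizers and a simple mean over the closest non-polarized green pixels for interpolating AG (function and variable names are illustrative):

import numpy as np

def stokes_from_macro_pixel(a_g0, a_g45, a_g90, a_g_neg45, a_g_neighbors):
    """Stokes parameters for one macro-pixel (sketch, ideal polarizers)."""
    a_g = float(np.mean(a_g_neighbors))  # interpolate AG from closest green pixels
    s0 = a_g                             # total intensity
    s1 = a_g0 - a_g90                    # horizontal vs. vertical preference
    s2 = a_g45 - a_g_neg45               # diagonal vs. antidiagonal preference
    return s0, s1, s2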
Note using the layout of
A controller (e.g., the controller 230) may determine the S0, S1, and S2 Stokes parameters using the detected intensities from the polarization pixels 420. The controller may determine the S0, S1, and S2 via:
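One consistent reconstruction, assuming ideal polarizers at these three orientations (a sketch; −π/8 and 3π/8 are orthogonal orientations, so their intensities sum to the total intensity):

S_0 = A_{G-\pi/8} + A_{G3\pi/8}, \qquad S_1 = \sqrt{2}\,(A_{G\pi/8} - A_{G3\pi/8}), \qquad S_2 = \sqrt{2}\,(A_{G\pi/8} - A_{G-\pi/8})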
Where AGπ/8 is detected intensity at the polarization pixel 420A, AG-π/8 is detected intensity at the polarization pixel 420B, and AG3π/8 is detected intensity at the polarization pixel 420C. The controller can then use the Stokes parameters to solve for a material property using methods found in the art.
Note using the layout of
A controller (e.g., the controller 230) may determine the S0, S1, and S2 Stokes parameters using the detected intensities from the polarization pixels 520. The controller may determine the S0, S1, and S2 via:
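With only two polarization orientations, a third measurement is needed for S0; a sketch assuming S0 is taken as the intensity AG of a neighboring non-polarized green pixel:

S_0 = A_G, \qquad S_1 = \sqrt{2}\,(A_{G\pi/8} + A_{G-\pi/8} - A_G), \qquad S_2 = \sqrt{2}\,(A_{G\pi/8} - A_{G-\pi/8})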
Where AGπ/8 is detected intensity at the polarization pixel 520A, and AG-π/8 is detected intensity at the polarization pixel 520B. The controller can then use the determined Stokes parameters to solve for a material property using methods found in the art.
Note that in each of the
In other embodiments, the PCA uses both the camera and the polarized light projector to determine material properties of an object. These embodiments are based in part on properties associated with diffuse reflection and specular reflection. Specular reflection occurs at a surface, e.g., on microfacets along a surface of an object. While the total amount of specular reflection does not vary, it is angularly distributed as a function of roughness. Roughness is the main material property affecting specular reflection, through a distribution function. The distribution function describes the density of microfacets as a function of their zenith angle with respect to the surface normal. A distinguishing property of specular reflection is that it fully maintains the polarization of the source after reflection: if the source is polarized, the specular reflection is fully polarized with the same orientation angle, regardless of position on the object. In contrast, diffuse reflection results from some amount of light being transmitted into the object and then backscattered, appearing as reflected light. The fraction of light returned in this way is called the diffuse albedo (between 0 and 1). The diffuse light is depolarized inside the object, so it can only acquire polarization by transmission back through the air-object interface. This transmission is fully unpolarized for directions along the surface normal.
A controller (e.g., the controller 230) instructs the polarized light projector to illuminate a local area with polarized light. Portions of the polarized light are reflected and/or scattered off of objects within the local area and are incident on the sensor.
The controller may be configured to determine a degree of linear polarization for the second direction for a given macro-pixel (e.g., 610) via:
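One consistent form, assuming the specular component is fully polarized along the orientation of the polarization pixel and the diffuse component is unpolarized (a sketch; under these assumptions the polarization pixel detects AG0 = S0-sp + S0-diff/2):

\mathrm{DOLP} = \frac{S_{0\text{-}sp}}{S_0} = \frac{2\,A_{G0} - A_G}{A_G}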
Where AG = S0 = S0-sp + S0-diff, S0-sp is the specular component of S0, S0-diff is the diffuse component of S0, and AG0 is the intensity detected by the polarization pixel (e.g., 620) of the macro-pixel.
The controller may be configured to determine a degree of linear polarization at highlight (“DOLPhighlight”) for the given macro-pixel (e.g., 610) via:
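Consistent with the description below, this may be written as the same ratio evaluated at the macro-pixel of maximal irradiance (a sketch; the exact form used may differ):

\mathrm{DOLP}_{highlight} = \left.\frac{2\,A_{G0} - A_G}{A_G}\right|_{highlight} = \left.\frac{S_{0\text{-}sp}}{S_0}\right|_{highlight}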
The DOLPhighlight refers to the degree of linear polarization at maximal irradiance (characterized by the fact that the viewing direction is perfectly orthogonal to a surface normal of the object). The controller uses the DOLPHorizontal values to determine material properties like, e.g., roughness and/or diffuse albedo, and S0 can be used to determine object position.
The DOLPhighlight (equal to the horizontal DOLP) is the ratio of specular irradiance to full irradiance (partial specular irradiance), and gives information on the ratio of the diffuse albedo (“wdiff”) to a peak of a distribution function D0 (the distribution function at angle origin). The ratio of the diffuse albedo to the peak of the distribution function may be found via:
Where n is the refractive index (e.g., 1.5). The distribution function can be found either from highlight sharpness or from a minimum of DOLPHorizontal near the highlight. For example, the minimum of DOLPHorizontal may be found via:
Where α is an angle and αmin is the angle at which the corresponding function is a minimum. Using equations 13 and 14 above, the controller may determine the distribution function (which, e.g., describes object roughness) and the diffuse albedo for the color channel the polarization pixels are in (e.g., green). The controller may then determine the diffuse albedo for other color channels (e.g., blue and/or red) based on the relative intensity of the other color channels with respect to the color channel the polarization pixels are in.
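For example, one hypothetical form of this cross-channel scaling (AR and AB denote detected intensities in the red and blue channels; the scaling actually used may differ):

w_{diff,R} = w_{diff,G}\,\frac{A_R}{A_G}, \qquad w_{diff,B} = w_{diff,G}\,\frac{A_B}{A_G}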
The controller can determine object position based in part on the determined material properties and detected intensity. For example, a distance (“dhigh”) to the object may be calculated via:
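One standard form, assuming inverse-square falloff of the projected light and folding reflectance and geometry into a hypothetical calibration constant k (not defined in this disclosure):

d_{high} = \sqrt{\frac{k\, I_{light}}{S_{0\text{-}highlight}}}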
Where S0-highlight is the S0 Stokes parameter at the highlight, and Ilight is the source power (e.g., power of light emitted by the polarized light projector 240). Accordingly, the controller is able to determine roughness, diffuse albedo, and position using the equations shown above.
The controller of the PCA may determine, e.g., using the equations described above, object position and material properties. The embodiments described in
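Because the projector's polarization is aligned 90 degrees to a given polarization pixel in each time period, that pixel blocks the specular component and passes half of the unpolarized diffuse component. One consistent extraction (a sketch; not necessarily the exact form used) is:

S_{0\text{-}diff}^{(1)} = 2\,A_{G\pi/4}^{(1)}, \qquad S_{0\text{-}diff}^{(2)} = 2\,A_{G0}^{(2)}, \qquad S_{0\text{-}sp}^{(i)} = A_G^{(i)} - S_{0\text{-}diff}^{(i)}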
Where the superscript (1) refers to a time period in which the sensor captures light 730A reflected from objects in the local area, and the superscript (2) refers to a time period in which the sensor captures light 730B reflected from objects in the local area. As such, AG(1) is the detected intensity of the light 730A at a green pixel (non-polarized), AG(2) is the detected intensity of the light 730B at the green pixel (non-polarized), AG0(2) is the detected intensity of the light 730B at the polarization pixel (e.g., 720A), AGπ/4(1) is the detected intensity of the light 730A at the polarization pixel (e.g., 720B), and AGπ/4(2) is the detected intensity of the light 730B at the polarization pixel (e.g., 720B).
Once these diffuse components are extracted, the controller may determine the orientation of one or more surface normals (i.e., the shape) of the object. The controller may make the determination using one or more methods known in the art.
Note that in each of the
While the invention has been described with wire grid polarization filters, another kind of component can be used: metasurface polarization routers. These metasurfaces are generally dielectric (whereas wire grids are metallic) and are able to route light to the correct pixel as a function of its polarization state. This further reduces photon loss compared to wire grids.
For example, as shown in
The controller (e.g., the controller 230) may use the detected intensities to determine a property of an object based in part on the manner described above for
The headset 1105 includes the display assembly 1130, an optics block 1135, one or more position sensors 1140, and the PCA 200. Some embodiments of headset 1105 have different components than those described in conjunction with
The display assembly 1130 displays content to the user in accordance with data received from the console 1115. The display assembly 1130 displays the content using one or more display elements (e.g., the display elements 120). A display element may be, e.g., an electronic display. In various embodiments, the display assembly 1130 comprises a single display element or multiple display elements (e.g., a display for each eye of a user). Examples of an electronic display include: a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an active-matrix organic light-emitting diode display (AMOLED), a waveguide display, some other display, or some combination thereof. Note in some embodiments, the display element 120 may also include some or all of the functionality of the optics block 1135.
The optics block 1135 may magnify image light received from the electronic display, correct optical errors associated with the image light, and present the corrected image light to one or both eyeboxes of the headset 1105. In various embodiments, the optics block 1135 includes one or more optical elements. Example optical elements included in the optics block 1135 include: an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, a reflecting surface, or any other suitable optical element that affects image light. Moreover, the optics block 1135 may include combinations of different optical elements. In some embodiments, one or more of the optical elements in the optics block 1135 may have one or more coatings, such as partially reflective or anti-reflective coatings.
Magnification and focusing of the image light by the optics block 1135 allows the electronic display to be physically smaller, weigh less, and consume less power than larger displays. Additionally, magnification may increase the field of view of the content presented by the electronic display. For example, the field of view of the displayed content is such that the displayed content is presented using almost all (e.g., approximately 110 degrees diagonal), and in some cases, all of the user's field of view. Additionally, in some embodiments, the amount of magnification may be adjusted by adding or removing optical elements.
In some embodiments, the optics block 1135 may be designed to correct one or more types of optical error. Examples of optical error include barrel or pincushion distortion, longitudinal chromatic aberrations, or transverse chromatic aberrations. Other types of optical errors may further include spherical aberrations, chromatic aberrations, or errors due to the lens field curvature, astigmatisms, or any other type of optical error. In some embodiments, content provided to the electronic display for display is pre-distorted, and the optics block 1135 corrects the distortion when it receives image light from the electronic display generated based on the content.
The position sensor 1140 is an electronic device that generates data indicating a position of the headset 1105. The position sensor 1140 generates one or more measurement signals in response to motion of the headset 1105. The position sensor 190 is an embodiment of the position sensor 1140. Examples of a position sensor 1140 include: one or more IMUs, one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, or some combination thereof. The position sensor 1140 may include multiple accelerometers to measure translational motion (forward/back, up/down, left/right) and multiple gyroscopes to measure rotational motion (e.g., pitch, yaw, roll). In some embodiments, an IMU rapidly samples the measurement signals and calculates the estimated position of the headset 1105 from the sampled data. For example, the IMU integrates the measurement signals received from the accelerometers over time to estimate a velocity vector and integrates the velocity vector over time to determine an estimated position of a reference point on the headset 1105. The reference point is a point that may be used to describe the position of the headset 1105. While the reference point may generally be defined as a point in space, in practice the reference point is defined as a point within the headset 1105.
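A minimal sketch of this dead-reckoning integration (illustrative names only; a real IMU pipeline would also remove gravity, correct sensor bias, and fuse gyroscope data to bound drift):

import numpy as np

def estimate_position(accel_samples, dt, v0=None, p0=None):
    # Integrate accelerometer samples over time to estimate a velocity vector,
    # then integrate velocity over time to estimate the position of a
    # reference point on the headset.
    v = np.zeros(3) if v0 is None else np.asarray(v0, dtype=float)
    p = np.zeros(3) if p0 is None else np.asarray(p0, dtype=float)
    for a in accel_samples:  # one linear-acceleration sample per time step
        v = v + np.asarray(a, dtype=float) * dt
        p = p + v * dt
    return p, v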
The PCA 200 is configured to determine a property of an object in a local area of the headset 1105. Operation and structure of the PCA 200 is described above with regard to
The audio system 1150 provides audio content to a user of the headset 1105. The audio system 1150 is substantially the same as the audio system described above with reference to
The I/O interface 1110 is a device that allows a user to send action requests and receive responses from the console 1115. An action request is a request to perform a particular action. For example, an action request may be an instruction to start or end capture of image or video data, or an instruction to perform a particular action within an application. The I/O interface 1110 may include one or more input devices. Example input devices include: a keyboard, a mouse, a game controller, or any other suitable device for receiving action requests and communicating the action requests to the console 1115. An action request received by the I/O interface 1110 is communicated to the console 1115, which performs an action corresponding to the action request. In some embodiments, the I/O interface 1110 includes an IMU that captures calibration data indicating an estimated position of the I/O interface 1110 relative to an initial position of the I/O interface 1110. In some embodiments, the I/O interface 1110 may provide haptic feedback to the user in accordance with instructions received from the console 1115. For example, haptic feedback is provided when an action request is received, or the console 1115 communicates instructions to the I/O interface 1110 causing the I/O interface 1110 to generate haptic feedback when the console 1115 performs an action.
The console 1115 provides content to the headset 1105 for processing in accordance with information received from one or more of: the PCA 200, the headset 1105, and the I/O interface 1110. In the example shown in
The application store 1155 stores one or more applications for execution by the console 1115. An application is a group of instructions, that when executed by a processor, generates content for presentation to the user. Content generated by an application may be in response to inputs received from the user via movement of the headset 1105 or the I/O interface 1110. Examples of applications include: gaming applications, conferencing applications, video playback applications, or other suitable applications.
The tracking module 1160 tracks movements of the headset 1105 or of the I/O interface 1110 using information from the PCA 200, the one or more position sensors 1140, or some combination thereof. For example, the tracking module 1160 determines a position of a reference point of the headset 1105 in a mapping of a local area based on information from the headset 1105. The tracking module 1160 may also determine positions of an object or virtual object. Additionally, in some embodiments, the tracking module 1160 may use portions of data indicating a position of the headset 1105 from the position sensor 1140 as well as representations of the local area from the PCA 200 to predict a future location of the headset 1105. The tracking module 1160 provides the estimated or predicted future position of the headset 1105 or the I/O interface 1110 to the engine 1165.
The engine 1165 executes applications and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the headset 1105 from the tracking module 1160. Based on the received information, the engine 1165 determines content to provide to the headset 1105 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the engine 1165 generates content for the headset 1105 that mirrors the user's movement in a virtual local area or in a local area augmenting the local area with additional content. Additionally, the engine 1165 performs an action within an application executing on the console 1115 in response to an action request received from the I/O interface 1110 and provides feedback to the user that the action was performed. The provided feedback may be visual or audible feedback via the headset 1105 or haptic feedback via the I/O interface 1110.
The network 1120 couples the headset 1105 and/or the console 1115 to the mapping server 1125. The network 1120 may include any combination of local area and/or wide area networks using wireless and/or wired communication systems. For example, the network 1120 may include the Internet, as well as mobile telephone networks. In one embodiment, the network 1120 uses standard communications technologies and/or protocols. Hence, the network 1120 may include links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 2G/3G/4G mobile communications protocols, digital subscriber line (DSL), asynchronous transfer mode (ATM), InfiniBand, PCI Express Advanced Switching, etc. Similarly, the networking protocols used on the network 1120 can include multiprotocol label switching (MPLS), the transmission control protocol/Internet protocol (TCP/IP), the User Datagram Protocol (UDP), the hypertext transport protocol (HTTP), the simple mail transfer protocol (SMTP), the file transfer protocol (FTP), etc. The data exchanged over the network 1120 can be represented using technologies and/or formats including image data in binary form (e.g., Portable Network Graphics (PNG)), hypertext markup language (HTML), extensible markup language (XML), etc. In addition, all or some of the links can be encrypted using conventional encryption technologies such as secure sockets layer (SSL), transport layer security (TLS), virtual private networks (VPNs), Internet Protocol security (IPsec), etc.
The mapping server 1125 may include a database that stores a virtual model describing a plurality of spaces, wherein one location in the virtual model corresponds to a current configuration of a local area of the headset 1105. The mapping server 1125 receives, from the headset 1105 via the network 1120, information describing at least a portion of the local area and/or location information for the local area. The user may adjust privacy settings to allow or prevent the headset 1105 from transmitting information to the mapping server 1125. The mapping server 1125 determines, based on the received information and/or location information, a location in the virtual model that is associated with the local area of the headset 1105. The mapping server 1125 determines (e.g., retrieves) one or more acoustic parameters associated with the local area, based in part on the determined location in the virtual model and any acoustic parameters associated with the determined location. The mapping server 1125 may transmit the location of the local area and any values of acoustic parameters associated with the local area to the headset 1105.
One or more components of system 1100 may contain a privacy module that stores one or more privacy settings for user data elements. The user data elements describe the user or the headset 1105. For example, the user data elements may describe a physical characteristic of the user, an action performed by the user, a location of the user of the headset 1105, a location of the headset 1105, an HRTF for the user, etc. Privacy settings (or “access settings”) for a user data element may be stored in any suitable manner, such as, for example, in association with the user data element, in an index on an authorization server, in another suitable manner, or any suitable combination thereof.
A privacy setting for a user data element specifies how the user data element (or particular information associated with the user data element) can be accessed, stored, or otherwise used (e.g., viewed, shared, modified, copied, executed, surfaced, or identified). In some embodiments, the privacy settings for a user data element may specify a “blocked list” of entities that may not access certain information associated with the user data element. The privacy settings associated with the user data element may specify any suitable granularity of permitted access or denial of access. For example, some entities may have permission to see that a specific user data element exists, some entities may have permission to view the content of the specific user data element, and some entities may have permission to modify the specific user data element. The privacy settings may allow the user to allow other entities to access or store user data elements for a finite period of time.
The privacy settings may allow a user to specify one or more geographic locations from which user data elements can be accessed. Access or denial of access to the user data elements may depend on the geographic location of an entity who is attempting to access the user data elements. For example, the user may allow access to a user data element and specify that the user data element is accessible to an entity only while the user is in a particular location. If the user leaves the particular location, the user data element may no longer be accessible to the entity. As another example, the user may specify that a user data element is accessible only to entities within a threshold distance from the user, such as another user of a headset within the same local area as the user. If the user subsequently changes location, the entity with access to the user data element may lose access, while a new group of entities may gain access as they come within the threshold distance of the user.
The system 1100 may include one or more authorization/privacy servers for enforcing privacy settings. A request from an entity for a particular user data element may identify the entity associated with the request and the user data element may be sent only to the entity if the authorization server determines that the entity is authorized to access the user data element based on the privacy settings associated with the user data element. If the requesting entity is not authorized to access the user data element, the authorization server may prevent the requested user data element from being retrieved or may prevent the requested user data element from being sent to the entity. Although this disclosure describes enforcing privacy settings in a particular manner, this disclosure contemplates enforcing privacy settings in any suitable manner.
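A minimal sketch of such an authorization check, with hypothetical names covering the blocked list, permitted entities, and time-limited access described above (not an actual system 1100 API):

from dataclasses import dataclass, field
from time import time

@dataclass
class PrivacySetting:
    blocked: set = field(default_factory=set)   # entities denied access
    allowed: set = field(default_factory=set)   # entities granted access
    expires_at: float | None = None             # optional finite access period

def authorize(setting: PrivacySetting, entity: str) -> bool:
    # The authorization server releases a user data element only if the
    # requesting entity passes every check in the element's privacy setting.
    if entity in setting.blocked:
        return False
    if setting.expires_at is not None and time() > setting.expires_at:
        return False
    return entity in setting.allowed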
The foregoing description of the embodiments has been presented for illustration; it is not intended to be exhaustive or to limit the patent rights to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible considering the above disclosure.
Some portions of this description describe the embodiments in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
Embodiments may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
Embodiments may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the patent rights. It is therefore intended that the scope of the patent rights be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the patent rights, which is set forth in the following claims.