The application relates generally to positional tracking systems with holographically encoded positions.
Many applications benefit from knowing the location of an object, such as a virtual reality (VR) or augmented reality (AR) headset, relative to another object such as a display. Computer games, for instance, can benefit from knowing such locations.
A system records a hologram of an array of position encodings onto photographic film (holographic film) to be read by a digital or analog sensor (charge coupled device (CCD), complementary metal-oxide semiconductor (CMOS), photodiodes, etc.). The position of the laser source used for recording the hologram is encoded into the holographic film as a reflection of its light off a series of objects that encode the physical position of the laser source relative to the film. The encoding on the hologram can be a simple series of bars, splotches, lines, bar codes, QR codes, or any noise-robust image-based encoding scheme that encodes one or more values representing the relative Azimuth angle A, Bearing angle B, Roll angle R, X position, Y position, Z position, or any combination of angles and positions forming one, two, or three dimensions of positions and angles of the laser source. These encodings represent the relative pose of the laser source with respect to the holographic film.
A mechanized system can move a laser source and a series of position-encoding object reflectors such that, for each position of the laser emitter, a different encoding object reflection is recorded into one or more areas on the holographic film. The object reflectors are positioned to reflect an image-based encoding of one or more pose values. The pose value(s) are a ground-truth measurement of the laser source's pose relative to the holographic film. The mechanized system moves the laser light source and the pose-encoding reflector objects over a 1D, 2D, or 3D array of positions to record the positional spacing and/or relative angles of the laser source into the holographic film.
A real-time positional tracking system is achieved by placing the previously recorded holographic film over a light-to-electronic sensor such as a CCD, CMOS, or photodiode array. The sensors measure the light from areas of the holographic film that relay an encoding of the pose of a remote laser source. The laser source can be the fixed reference point in this positional tracking system. By decoding the light patterns cast from the holographic film onto the sensor(s), the 1D, 2D, or 3D pose of the remote laser source can be determined. In addition, polarization can be used to improve the robustness of the pose tracking or to add a single axis of orientation (Azimuth, Bearing, or Roll) tracking. A static polarizer on the laser source and on the holographic film can change the encoding patterns to convey additional information to aid the positional tracking.
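The sensor-side decoding step described above may be sketched as follows. This is a minimal illustration, not the disclosed implementation: the threshold value, array length, and bar-style bit encoding are all assumptions introduced for the example.

```python
# Illustrative sketch: turning raw photodiode readings from the film into a
# binary code word. The threshold, array size, and bit encoding are
# assumptions for this example, not part of the disclosure.

def readings_to_code(readings, threshold=0.5):
    """Threshold each photodiode sample to a bit and pack the bits into an integer."""
    code = 0
    for sample in readings:
        code = (code << 1) | (1 if sample >= threshold else 0)
    return code

# Example: a 6-element photodiode strip lit bright/dark/bright/bright/dark/dark
samples = [0.9, 0.1, 0.8, 0.7, 0.2, 0.05]
print(readings_to_code(samples))  # bits 101100 -> 44
```

A real system would add error correction and spatial registration; the point here is only that each illuminated film region resolves to a discrete code value.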
Using the techniques herein, a positional tracking system can be constructed from a coherent laser source and holographic film placed over a light sensor, with software or hardware to decode the holographic patterns into positional and/or orientation tracking information. The laser source can operate in the infrared range, invisible to the naked eye, and can be modulated at a very high carrier frequency to be robust to noise from sunlight.
Accordingly, a method for recording a hologram of an array of position and/or orientation encodings (all examples of pose encodings) onto holographic film includes moving light from an encoding laser across plural position encoding object reflectors. At least some of the reflectors emit patterns of the light differently from each other to establish respective coded emissions. The method includes receiving each of the coded emissions from the reflectors on respective regions of the film, and correlating the coded emissions to respective positions of a laser.
In some implementations, the method can include illuminating the film using at least one indicator laser, juxtaposing the film with at least one sensor to sense light from areas of the film illuminated by the indicator laser and representing at least one of the coded emissions, and decoding signals from the sensor representing the at least one coded emission to return a respective position of a laser. In such embodiments, the position (and if desired other pose information) of a laser returned from decoding the signals is a position of the indicator laser, which may be an IR laser. Light from the indicator laser can be modulated at a carrier frequency of at least one megahertz.
In some embodiments the position encoding object reflectors establish plural different splotches, plural different lines, plural different bar codes, and plural different quick response (QR) codes.
In another aspect, an apparatus includes at least one indicator laser and at least one holographically recorded film having plural coded regions, with each coded region representing a code different from other coded regions on the film. At least one sensor is provided to sense light from at least one coded region of the film illuminated by the indicator laser. Also, at least one decoder is configured for decoding signals from the sensor representing the at least one coded region to return a respective position, orientation, or other pose information of the indicator laser.
In another aspect, an apparatus includes at least one holographically recorded film having plural coded regions. Each coded region represents a code different from other coded regions on the film. At least one data storage medium correlates the coded regions to respective positions of a laser. Alternatively or in addition, a circuit such as but not limited to an application specific integrated circuit (ASIC) may be provided for decoding information in the coded regions to render an output representing the pose information of the laser.
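The data storage medium correlating coded regions to laser poses may be sketched as a simple lookup table. The code values and pose tuples below are invented for illustration only; an actual apparatus could store this mapping in firmware, a file, or an ASIC.

```python
# Hypothetical correlation of decoded region codes to recorded laser poses.
# The specific codes and pose values are invented sample data.

CODE_TO_POSE = {
    0b1011: {"x": 0.0, "y": 0.0, "azimuth_deg": 0.0},
    0b1101: {"x": 0.0, "y": 1.0, "azimuth_deg": 0.0},
    0b1110: {"x": 1.0, "y": 0.0, "azimuth_deg": 15.0},
}

def pose_for_code(code):
    """Return the recorded pose for a decoded region code, or None if unknown."""
    return CODE_TO_POSE.get(code)

print(pose_for_code(0b1101))  # {'x': 0.0, 'y': 1.0, 'azimuth_deg': 0.0}
```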
The details of the present application, both as to its structure and operation, can best be understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:
This disclosure relates generally to computer ecosystems including aspects of consumer electronics (CE) device networks such as but not limited to computer game networks. A system herein may include server and client components, connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices including game consoles such as Sony PlayStation® or a game console made by Microsoft or Nintendo or another manufacturer, virtual reality (VR) headsets, augmented reality (AR) headsets, portable televisions (e.g. smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below. These client devices may operate with a variety of operating environments. For example, some of the client computers may employ, as examples, Linux operating systems, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple Computer or Google. These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access websites hosted by the Internet servers discussed below. Also, an operating environment according to present principles may be used to execute one or more computer game programs.
Servers and/or gateways may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet. Or, a client and server can be connected over a local intranet or a virtual private network. A server or controller may be instantiated by a game console such as a Sony PlayStation®, a personal computer, etc.
Information may be exchanged over a network between the clients and servers. To this end and for security, servers and/or clients can include firewalls, load balancers, temporary storages, proxies, and other network infrastructure for reliability and security. One or more servers may form an apparatus that implements methods of providing a secure community such as an online social website to network members.
As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware and include any type of programmed step undertaken by components of the system.
A processor may be any conventional general purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines, as well as registers and shift registers.
Software modules described by way of the flow charts and user interfaces herein can include various sub-routines, procedures, etc. Without limiting the disclosure, logic stated to be executed by a particular module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.
Present principles described herein can be implemented as hardware, software, firmware, or combinations thereof; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their functionality.
Further to what has been alluded to above, logical blocks, modules, and circuits described below can be implemented or performed with a general purpose processor, a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be implemented by a controller or state machine or a combination of computing devices.
The functions and methods described below, when implemented in software, can be written in an appropriate language such as but not limited to Java, C# or C++, and can be stored on or transmitted through a computer-readable storage medium such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc. A connection may establish a computer-readable medium. Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and digital subscriber line (DSL) and twisted pair wires. Such connections may include wireless communication connections including infrared and radio.
Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged or excluded from other embodiments.
“A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
In the example shown the array 16 is a two dimensional array but could be a three dimensional array as shown by the x-y-z axes 18. Light 20 from the encoding laser 12 that does not impinge on a reflector can interfere with light 22 that passes through a reflector, with the resulting interference pattern being encoded in a region 24 of a holographic film 26. Once illumination of a first reflector “A” is encoded onto the region 24 of the film 26, a motor 28 that is coupled to the encoding laser 12 by a mechanism 30 (such as a gimbal, servo, rail, rack-and-pinion, etc.) can be activated to move the encoding laser 12 to illuminate another one of the reflectors in the array, with each reflector (or groups of reflectors when simultaneously illuminated) establishing its own unique code. If desired, reflectors that are not to be illuminated for a particular location of the encoding laser 12 can be masked by, for example, a movable physical mask (not shown for clarity) with a single opening placed over the reflector sought to be illuminated and with a mask substrate that blocks light from other reflectors. A similar movable mask 32 can also be formed with an opening 34 and positioned over the region 24 during encoding to mask other nearby regions, reducing cross-talk. A polarization filter 35 may be disposed in the opening 34 if desired. Motors and mechanisms similar to those used to move the laser can be used to move the mask(s). Various other mechanisms can be utilized for masking the exposure for the areas outside of region 24, including but not limited to a LCD polarizing screen and other forms of dynamic light blocking schemes.
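The mechanized recording sequence above can be sketched as a control loop. The hardware interfaces (motor, mask, exposure) are placeholders shown only as comments; the code illustrates just the bookkeeping that pairs each laser position with the unique reflector code recorded at the corresponding film region.

```python
# Sketch of the mechanized recording loop: for each ground-truth laser
# position, one reflector is unmasked and its unique code is exposed into
# the corresponding film region. Hardware calls are omitted placeholders.

def record_hologram(positions, reflector_codes):
    """Step the encoding laser across the array, pairing each position
    with a unique reflector code at a distinct film region."""
    film = {}  # region index -> (code, ground-truth pose)
    for region, (pose, code) in enumerate(zip(positions, reflector_codes)):
        # move_laser_to(pose); unmask_reflector(code)  <- hardware, omitted
        film[region] = (code, pose)  # exposure records the code at this region
    return film

positions = [(0, 0), (0, 1), (1, 0)]   # ground-truth (x, y) of the laser
codes = ["A", "B", "C"]                # unique reflector patterns
film = record_hologram(positions, codes)
print(film[1])  # ('B', (0, 1))
```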
Note that the polarization filters herein may be altered spatially for the hologram recording to reduce cross-talk with neighboring encoding areas on the holographic film. The polarization can be dynamic by using an electronically controlled spatial light modulator in addition to or in lieu of the polarizer 14.
Prior to further explanation of present techniques, reference is directed to
In
Once the region 24 has encoded the unique pattern from the reflector A, the encoding laser 12 is moved one increment to a next nearest location as shown in
While
It may now be appreciated that once the film 26 has been encoded as described above, when another laser (referred to herein as an “indicator” laser) subsequently illuminates the film, the indicator laser will illuminate the region of film that was encoded by the encoding laser 12 when the encoding laser 12 was in the same relative location to the film 26 as the subsequent indicator laser is in.
The indicator laser 900 may be an infrared (IR) laser, although other wavelengths including visible and ultraviolet are contemplated. In some embodiments the wavelength of the light emitted by the indicator laser 900 may be greater than 1,000 nanometers, e.g., 1,440 nm to ensure that a game player does not see the laser light. The laser may be pulsed using a pulse repetition rate (PRR) that uniquely identifies the laser from other nearby indicator lasers. The laser may be modulated at a very high carrier frequency, e.g., in excess of thirty kilohertz, more preferably in excess of fifty kilohertz, and more preferably still at least one megahertz to be robust to noise from sunlight.
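The sunlight-rejection benefit of a high carrier frequency can be illustrated with a lock-in style demodulation sketch. The sample rate, carrier, and signal amplitudes below are arbitrary values chosen for the example; a real receiver would implement this in analog or DSP hardware.

```python
import math

# Illustrative lock-in detection of a carrier-modulated laser signal buried
# in a strong constant (sunlight) background. All numeric parameters are
# arbitrary choices for this sketch.

def lock_in_amplitude(samples, carrier_hz, sample_rate_hz):
    """Estimate carrier amplitude by correlating with quadrature references."""
    n = len(samples)
    i_sum = q_sum = 0.0
    for k, s in enumerate(samples):
        phase = 2 * math.pi * carrier_hz * k / sample_rate_hz
        i_sum += s * math.cos(phase)
        q_sum += s * math.sin(phase)
    return 2 * math.hypot(i_sum, q_sum) / n

carrier = 1_000_000.0   # 1 MHz carrier, per the preference stated above
fs = 10_000_000.0       # 10 MS/s sampling (assumed)
sunlight = 50.0         # large constant background, 50x the laser amplitude
laser_amp = 1.0
samples = [sunlight + laser_amp * math.cos(2 * math.pi * carrier * k / fs)
           for k in range(10_000)]
print(round(lock_in_amplitude(samples, carrier, fs), 3))  # ~1.0
```

The constant sunlight term correlates to zero against the carrier reference, so the recovered amplitude reflects only the modulated laser light.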
If desired, light from the indicator laser 900 can be polarized and changed over time using a polarizer 920 to improve signal to noise ratio of encoding. In this way, a plurality of signals can be decoded for each temporal polarization and the encoding with the highest signal to noise ratio may be chosen.
Proceeding to block 1006 the encoding laser 12 is moved to the next location relative to the film 26, and if desired its polarization is changed at block 1008 for reasons explained above. A second reflector B is illuminated by the laser at block 1010 and its code captured (encoded) in the film 26 at block 1012. The described process of moving the encoding laser, changing polarization if desired, and successively illuminating reflectors continues at block 1014 for subsequent locations 3, . . . N to encode subsequent respective unique reflector codes C, . . . N onto the film 26, with each code being recorded and correlated to the respective location information of the laser 12 at block 1016.
Recalling the subsequent location determination system of
Proceeding to block 1102, the film 26 is illuminated with the indicator laser 900. The sensor 904 senses the resultant unique robust code pattern of light emitted from the film and its signal representative thereof is received at block 1104. Image recognition is applied to the signal to recognize the code at block 1106. In one example, the recognized code can be used at block 1108 as entering argument to, e.g., the data structure of
In any case, however derived from the code, the pose information including the location of the laser may be output at block 1110 to an AR or VR computer game console for reasons shortly to be disclosed.
More specifically and turning to
Note that plural fixed illuminators 1200 may be used in a system, each using a unique PRR as indicated above. If desired, each fixed illuminator may be associated with a respective fixed sensor 904 with film 26 assembly, and each illuminator with its sensor and film, in a fixed assembly, can be aware of other fixed illuminator assemblies. This allows multiple illuminators to self-calibrate, enabling a single tracking space.
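Distinguishing fixed illuminators by their unique PRRs could be sketched as follows. The pulse timestamps, PRR table, and tolerance are invented sample data introduced for illustration.

```python
# Sketch of identifying which fixed illuminator is being sensed by matching
# the measured pulse repetition rate (PRR) against a table of known sources.
# The PRR table, timestamps, and tolerance are invented for this example.

def identify_illuminator(pulse_times_s, prr_table_hz, tolerance_hz=5.0):
    """Estimate PRR from inter-pulse intervals and match it to a known source."""
    intervals = [b - a for a, b in zip(pulse_times_s, pulse_times_s[1:])]
    prr = len(intervals) / sum(intervals)  # mean pulse rate in Hz
    for name, known_prr in prr_table_hz.items():
        if abs(prr - known_prr) <= tolerance_hz:
            return name
    return None

table = {"illuminator_1": 1000.0, "illuminator_2": 1500.0}
times = [k / 1500.0 for k in range(10)]    # pulses arriving at 1500 Hz
print(identify_illuminator(times, table))  # illuminator_2
```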
Alternatively and again because the recorded locations in
Assuming the architecture of
Or, the movable film/sensor assembly 1202 may be implemented by a game controller such as the controller 1400 shown in
In any case, it may now be appreciated that the locations of objects such as but not limited to the movable game-related objects described herein can be ascertained with respect to a reference point that can be tied to a computer game. Each movable film/sensor assembly 1202 can determine its location as described above and wirelessly report the location to the game processor. Or, the assembly can simply send a signal representing the unique code being illuminated to the game processor for derivation of the location by the game processor. Regardless, the game processor may then know, for example, the location of a VR/AR headset relative to the display on which the game is presented, and/or the location of the game controller 1400, etc. and tailor VR/AR presentation accordingly.
In some implementations, the encoded regions of the film 26 (e.g., the regions 24, 604) can be exposed to the encoding laser light multiple times to be reused. For example, the region 24 can be exposed for laser position X=0, Y=0 and again for laser position X=0, Y=1. This allows fewer code blocks to be used for encoding. An example can use unsigned binary encoding, in which n code blocks yield 2^n−1 nonzero codes: 2 code blocks=3 codes, 3 code blocks=7 codes, 4 code blocks=15 codes, etc.
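The code-block counts above follow directly from treating the n reusable blocks as bits of an unsigned binary word, reserving the all-dark pattern:

```python
# With n reusable code blocks exposed in binary combinations, the number of
# distinct nonzero codes is 2**n - 1, matching the counts given above.

def codes_for_blocks(n_blocks):
    return 2 ** n_blocks - 1

for n in (2, 3, 4):
    print(n, "code blocks ->", codes_for_blocks(n), "codes")
# 2 code blocks -> 3 codes
# 3 code blocks -> 7 codes
# 4 code blocks -> 15 codes
```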
As mentioned previously, temporally changing polarization of the encoding laser 12 (and subsequent decoding by the indicator laser 900) can be used to improve code block robustness. During the encoding phase, adjoining code blocks on the holographic film 26 may be recorded from differently polarized laser light to reduce cross-talk of laser light into adjoining code blocks. During the sensing phase, the laser light from the indicator laser 900 can be temporally polarized with differing polarizations. The successive polarizations over a short duration and subsequent lit code blocks will facilitate the code-to-position determination as one polarization on a specific code block (that was recorded with that polarization) will have a significantly higher signal-to-noise ratio (SNR) than other polarizations. This technique allows for improved filtering and detection of the correct position encoding code sequence.
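The selection step of the temporal-polarization scheme, choosing the decoding whose code block shows the highest SNR, can be sketched simply. The polarization angles and SNR figures are invented sample data.

```python
# Sketch of choosing the polarization whose decoded code block has the
# highest signal-to-noise ratio, per the temporal-polarization scheme above.
# The angles and SNR figures are invented sample data.

def best_polarization(snr_by_polarization_db):
    """Return the polarization key with the highest measured SNR."""
    return max(snr_by_polarization_db, key=snr_by_polarization_db.get)

readings = {"0_deg": 6.2, "45_deg": 18.9, "90_deg": 7.5, "135_deg": 5.1}
print(best_polarization(readings))  # 45_deg
```

A code block recorded at 45-degree polarization would, as described above, stand out sharply when probed at that same polarization, so the maximum-SNR reading identifies the correct code sequence.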
Now referring to
Accordingly, to undertake such principles the AVD 1612 can be established by some or all of the components shown in
In addition to the foregoing, the AVD 1612 may also include one or more input ports 1626 such as, e.g., a high definition multimedia interface (HDMI) port or a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the AVD 1612 for presentation of audio from the AVD 1612 to a user through the headphones. For example, the input port 1626 may be connected via wire or wirelessly to a cable or satellite source 1626a of audio video content. Thus, the source 1626a may be, e.g., a separate or integrated set top box, or a satellite receiver. Or, the source 1626a may be a game console or disk player containing content that might be regarded by a user as a favorite for channel assignation purposes described further below. The source 1626a when implemented as a game console may include some or all of the components described below in relation to the CE device 1644.
The AVD 1612 may further include one or more computer memories 1628 such as disk-based or solid state storage that are not transitory signals, in some cases embodied in the chassis of the AVD as standalone devices or as a personal video recording device (PVR) or video disk player either internal or external to the chassis of the AVD for playing back AV programs or as removable memory media. Also in some embodiments, the AVD 1612 can include a position or location receiver such as but not limited to a cellphone receiver, GPS receiver and/or altimeter 1630 that is configured to e.g. receive geographic position information from at least one satellite or cellphone tower and provide the information to the processor 1624 and/or determine an altitude at which the AVD 1612 is disposed in conjunction with the processor 1624. However, it is to be understood that another suitable position receiver other than a cellphone receiver, GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the AVD 1612 in e.g. all three dimensions.
Continuing the description of the AVD 1612, in some embodiments the AVD 1612 may include one or more cameras 1632 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the AVD 1612 and controllable by the processor 1624 to gather pictures/images and/or video in accordance with present principles. Also included on the AVD 1612 may be a Bluetooth transceiver 1634 and other Near Field Communication (NFC) element 1636 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element.
Further still, the AVD 1612 may include one or more auxiliary sensors 1637 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture command), etc.) providing input to the processor 1624. The AVD 1612 may include an over-the-air TV broadcast port 1638 for receiving OTA TV broadcasts providing input to the processor 1624. In addition to the foregoing, it is noted that the AVD 1612 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 1642 such as an IR data association (IRDA) device. A battery (not shown) may be provided for powering the AVD 1612.
Still referring to
In the example shown, to illustrate present principles all three devices 1612, 1644, 1646 are assumed to be members of an entertainment network in, e.g., a home, or at least to be present in proximity to each other in a location such as a house. However, present principles are not limited to a particular location, illustrated by dashed lines 1648, unless explicitly claimed otherwise. Any or all of the devices in
The example non-limiting first CE device 1644 may be established by any one of the above-mentioned devices, for example, a portable wireless laptop computer or notebook computer or game controller (also referred to as “console”), and accordingly may have one or more of the components described below. The first CE device 1644 may be a remote control (RC) for, e.g., issuing AV play and pause commands to the AVD 1612, or it may be a more sophisticated device such as a tablet computer, a game controller communicating via wired or wireless link with the AVD 1612, a personal computer, a wireless telephone, etc.
Accordingly, the first CE device 1644 may include one or more displays 1650 that may be touch-enabled for receiving user input signals via touches on the display. The first CE device 1644 may include one or more speakers 1652 for outputting audio in accordance with present principles, and at least one additional input device 1654 such as e.g. an audio receiver/microphone for e.g. entering audible commands to the first CE device 1644 to control the device 1644. The example first CE device 1644 may also include one or more network interfaces 1656 for communication over the network 1622 under control of one or more CE device processors 1658. A graphics processor 1658A may also be included. Thus, the interface 1656 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface, including mesh network interfaces. It is to be understood that the processor 1658 controls the first CE device 1644 to undertake present principles, including the other elements of the first CE device 1644 described herein such as e.g. controlling the display 1650 to present images thereon and receiving input therefrom. Furthermore, note the network interface 1656 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc.
In addition to the foregoing, the first CE device 1644 may also include one or more input ports 1660 such as, e.g., a HDMI port or a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the first CE device 1644 for presentation of audio from the first CE device 1644 to a user through the headphones. The first CE device 1644 may further include one or more tangible computer readable storage medium 1662 such as disk-based or solid state storage. Also in some embodiments, the first CE device 1644 can include a position or location receiver such as but not limited to a cellphone and/or GPS receiver and/or altimeter 1664 that is configured to e.g. receive geographic position information from at least one satellite and/or cell tower, using triangulation, and provide the information to the CE device processor 1658 and/or determine an altitude at which the first CE device 1644 is disposed in conjunction with the CE device processor 1658. However, it is to be understood that another suitable position receiver other than a cellphone and/or GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the first CE device 1644 in e.g. all three dimensions.
Continuing the description of the first CE device 1644, in some embodiments the first CE device 1644 may include one or more cameras 1666 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the first CE device 1644 and controllable by the CE device processor 1658 to gather pictures/images and/or video in accordance with present principles. Also included on the first CE device 1644 may be a Bluetooth transceiver 1668 and other Near Field Communication (NFC) element 1670 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element.
Further still, the first CE device 1644 may include one or more auxiliary sensors 1672 (e.g., a motion sensor such as an accelerometer, gyroscope, cyclometer, or a magnetic sensor, an infrared (IR) sensor, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture command), etc.) providing input to the CE device processor 1658. The first CE device 1644 may include still other sensors such as e.g. one or more climate sensors 1674 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 1676 providing input to the CE device processor 1658. In addition to the foregoing, it is noted that in some embodiments the first CE device 1644 may also include an infrared (IR) transmitter and/or IR receiver and/or IR transceiver 1678 such as an IR data association (IRDA) device. A battery (not shown) may be provided for powering the first CE device 1644. The CE device 1644 may communicate with the AVD 1612 through any of the above-described communication modes and related components.
The second CE device 1646 may include some or all of the components shown for the CE device 1644. Either one or both CE devices may be powered by one or more batteries.
Now in reference to the afore-mentioned at least one server 1680, it includes at least one server processor 1682, at least one tangible computer readable storage medium 1684 such as disk-based or solid state storage, and at least one network interface 1686 that, under control of the server processor 1682, allows for communication with the other devices of
Accordingly, in some embodiments the server 1680 may be an Internet server or an entire server “farm”, and may include and perform “cloud” functions such that the devices of the system 1600 may access a “cloud” environment via the server 1680 in example embodiments for, e.g., network gaming applications. Or, the server 1680 may be implemented by one or more game consoles or other computers in the same room as the other devices shown in
The methods herein may be implemented as software instructions executed by a processor, suitably configured application specific integrated circuits (ASIC) or field programmable gate array (FPGA) modules, or any other convenient manner as would be appreciated by those skilled in the art. Where employed, the software instructions may be embodied in a non-transitory device such as a CD ROM or Flash drive. The software code instructions may alternatively be embodied in a transitory arrangement such as a radio or optical signal, or via a download over the Internet.
It will be appreciated that whilst present principles have been described with reference to some example embodiments, these are not intended to be limiting, and that various alternative arrangements may be used to implement the subject matter claimed herein.