This application relates generally to augmented reality (AR) applications and, more particularly, to a handheld augmented reality projector.
Digital augmented reality is defined as the overlay of digital information on the real world through the use of mobile computing, and it has been realized through a variety of devices, including smart phones, tablets, and various head-mounted displays, where the augmentation exists on a digital screen or display. Analog AR is likewise considered to be the overlay of information, but through the use of active laser projection systems or visible-light computer presentation projectors to project images onto physical surfaces.
As of 2018, augmented reality is used in industrial applications primarily through mobile interface devices. As used herein, the term “mobile interface device” refers to any mobile computing solution that is used to facilitate communication and display information, including, but not limited to, tablet computers, smartphones, and wearable heads-up displays. Typically, cameras and/or other sensors (e.g., infrared, visual-inertial, or GPS) are used by such devices to establish device pose (the position and orientation of the device relative to a reference frame or physical object), and the device displays the camera feed with accurately overlaid digital information.
Mobile interface device display of AR information, however, has significant shortfalls.
Optically transparent, head mounted displays (HMDs) using beam-splitting or wave-guide technology are a common solution to the “hands free” shortfall of tablets and smart phones. As shown in
Modern laser projection systems offer an analog AR solution to many industrial problems such as part placement, inspection, lofting, etc. Laser projection systems can place light fields on physical objects dynamically (changing with time), and with a high degree of accuracy relative to digital AR. Such analog tools also have limitations.
The low cost of standard computer projectors has resulted in their introduction into industrial build sites, where users can project drawings, data, instructions, or other information onto a real surface. This is yet another type of analog AR that can provide value, but it also has shortcomings.
An illustrative aspect of the invention provides a method of dynamically projecting augmented reality information associated with a target object. The method comprises providing an augmented reality projector (ARP) comprising a digital image projector, a visual spectrum camera (VSC), and a sensor arrangement including at least one depth sensor, all aligned along an ARP line of sight. The method further comprises directing the ARP line of sight toward a target surface of the target object. A visual digital image is captured using the VSC and target surface data are captured using the sensor arrangement. The method still further comprises determining, by an automated data processor, a pose of the ARP relative to the target surface, and constructing, by the automated data processor, an augmented reality image for display on the target surface. The method still further comprises projecting, by the digital image projector, the augmented reality image onto the target surface.
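By way of illustration only, the claimed method steps can be expressed as a simple processing loop. The following sketch assumes hypothetical interface objects (vsc, depth_sensor, processor, projector) and helper methods (capture_frame, capture, estimate_pose, build_ar_image, project) standing in for the VSC, sensor arrangement, automated data processor, and digital image projector; none of these names is part of the disclosure.

```python
# A minimal sketch of the method loop; every object and method name here is
# a hypothetical stand-in for the ARP components described above.

def run_arp(vsc, depth_sensor, processor, projector):
    while True:
        visual_image = vsc.capture_frame()      # capture visual digital image (VSC)
        surface_data = depth_sensor.capture()   # capture target surface data

        # Determine the pose of the ARP relative to the target surface.
        pose = processor.estimate_pose(visual_image, surface_data)

        # Construct the augmented reality image for display on the surface.
        ar_image = processor.build_ar_image(pose, surface_data)

        # Project the augmented reality image onto the target surface.
        projector.project(ar_image)
```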
Another illustrative aspect of the invention provides an augmented reality projector comprising a housing and a digital image projector at least partially disposed within the housing. The digital image projector is configured to receive digital image information and use the digital image information to project an image onto a target surface of an object along a projector line of sight. The augmented reality projector further comprises a visual spectrum camera (VSC) at least partially disposed within the housing and having a VSC line of sight substantially parallel to the projector line of sight so as to capture a visual digital image of the target surface. The augmented reality projector also comprises a sensor arrangement including at least one depth sensor oriented and aligned to capture surface information from the target surface. The augmented reality projector also comprises a data processor in communication with the digital image projector, the visual spectrum camera, and the depth sensor arrangement. The data processor is configured to receive the captured visual digital image from the VSC and the captured surface information from the sensor arrangement, to construct an augmented reality image for display on the target surface, and to transmit the augmented reality image to the digital image projector for projection onto the target surface.
Another illustrative aspect of the invention provides an augmented reality projection system comprising a sensor and projection unit and a mobile interface device. The sensor and projection unit comprises a housing and a digital image projector at least partially disposed within the housing. The digital image projector is configured to receive digital image information and use the digital image information to project an image onto a target surface of an object along a projector line of sight. The sensor and projection unit further comprises a visual spectrum camera (VSC) at least partially disposed within the housing and having a VSC line of sight substantially parallel to the projector line of sight so as to capture a visual digital image of the target surface. The sensor and projection unit still further comprises a depth sensor arrangement at least partially disposed within the housing. The depth sensor arrangement is oriented and aligned to capture surface information from the target surface. An input/output arrangement is in communication with the digital image projector, the VSC, and the depth sensor arrangement. The mobile interface device comprises a communication interface selectively connectable to the input/output arrangement of the sensor and projection unit and a data processor. The data processor is in selective communication with the digital image projector, the VSC, and the depth sensor arrangement when the communication interface is connected to the input/output arrangement. The data processor is configured to receive the captured visual digital image from the VSC and the captured surface information from the depth sensor arrangement, to construct an augmented reality image for display on the target surface, and to transmit the augmented reality image to the digital image projector for projection onto the target surface.
The invention can be more fully understood by reading the following detailed description together with the accompanying drawings, in which like reference indicators are used to designate like elements, and in which:
While the invention will be described in connection with particular embodiments, it will be understood that the invention is not limited to these embodiments. On the contrary, it is contemplated that various alternatives, modifications and equivalents are included within the spirit and scope of the invention as described.
The present invention combines analog and digital AR elements to provide an advancement in digital AR hardware and overcome the shortfalls of existing display solutions. A particular aspect of the invention provides an augmented reality projector (ARP) that overcomes the field of view, hands-free, safety, persistence, mobility, calibration, and keystone shortfalls of other AR systems, and through dedicated and unique use, can also overcome the full light field shortfall.
An ARP of the present invention overlays digital content in a new way for mobile platforms. It requires the user neither to look at a video overlay on a screen nor to wear hardware that requires see-through projections. Instead, as shown in
An exemplary ARP according to an embodiment of the invention may be a hand-held or mountable smart projector that, when shined on an object, depicts digital information about that object as pre-programmed using onboard software. Augmented reality/computer vision (AR/CV) and depth sensing technology enable the augmentation to “lock onto” the object, so that as the images projected by the ARP are moved across the surface, the physical augmentations remain locked in place, effectively acting like a flashlight that “reveals” digital information on the physical surface of an object within the projector's light field.
Unlike in a tablet AR solution, the augmentation appears to be present on the physical object itself through light from the visual spectrum projector. Thus, the ARP unit can be set down and the AR user will still see the augmentation on the object, without any need to view a device display screen. This eliminates the need for two people to conduct spatially driven AR tasks. Since the AR user can set the unit down and walk up to the physical object with hands free, the constraints of tablet-driven AR may be eliminated.
In various embodiments, the ARP can be “turned off” by a blackout button or a physical lens cover to eliminate the persistence shortcoming, and there is no interference with an AR user's PPE. In fact, a supplemental light source is a common piece of PPE that the ARP of the invention can replace.
The ARPs of the invention may have a high degree of mobility and easier/simpler calibration requirements than analog laser systems while still providing the “on object” augmentation that laser projection systems are valued for. ARP calibration may be driven by the selected AR/CV solution, and typically a fiducial marker will set scale and origin for the computations, though object recognition or other calibration approaches are also possible, consistent with the selected AR/CV technology.
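As a purely illustrative example of fiducial-driven calibration, the sketch below uses OpenCV's ArUco module to detect a printed marker of known size and recover the origin and metric scale for subsequent pose computations. The camera intrinsics (camera_matrix, dist_coeffs), the marker dictionary, and the marker size are all assumptions, and the calls shown follow the pre-4.7 OpenCV ArUco API.

```python
import cv2
import numpy as np

MARKER_SIZE_M = 0.10  # assumed printed marker edge length in meters

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def marker_pose(frame, camera_matrix, dist_coeffs):
    """Detect a fiducial marker and return its pose (sets scale and origin)."""
    corners, ids, _ = cv2.aruco.detectMarkers(frame, aruco_dict)
    if ids is None:
        return None  # no marker visible; calibration not yet established
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE_M, camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvecs[0])   # marker orientation as a rotation matrix
    t = tvecs[0].reshape(3)          # marker origin in camera coordinates
    return R, t                      # defines origin and metric scale for AR content
```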
Finally, the ARPs of the invention may incorporate digital features and additional sensors to address the keystone effect and full light field issues. Although some projectors include an automatic keystone correction feature, such features tend to distort the scale of the projection. The digital keystone correction used in the ARPs of the invention counters the problem without distortion, assuring accurate scale of the augmentation. Regarding full light field effects, it will be understood that poorly designed augmentation views can present full light field issues. AR solutions, however, do not generally fill very much of the screen. Moreover, careful design practices, such as using a black background to replace the visual spectrum camera feed shown on the display, can be used to ensure minimal light projection.
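One conventional way to realize distortion-free digital keystone correction is to pre-warp each frame with the inverse of the projector-to-surface homography so that the physical projection cancels the skew. The sketch below is a minimal illustration using OpenCV; the four corner correspondences are assumed to come from the pose and depth computations described in this disclosure, not from any particular API.

```python
import cv2
import numpy as np

def keystone_prewarp(ar_image, landing_corners_px, output_size):
    """Pre-warp an AR frame so it appears undistorted on an oblique surface.

    landing_corners_px: where the four image corners would land (in rectified
    projector-frame coordinates) if no correction were applied.
    """
    h, w = ar_image.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32(landing_corners_px)

    # getPerspectiveTransform(dst, src) is the inverse of the physical
    # distortion; warping by it pre-skews the frame so that the physical
    # projection restores an undistorted, correctly scaled image.
    H_inv = cv2.getPerspectiveTransform(dst, src)
    return cv2.warpPerspective(ar_image, H_inv, output_size)
```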
An exemplary embodiment of the invention will now be discussed in more detail. With reference to the schematic representation of
The data processor 120 comprises a central processing unit (CPU) 122 and a display processor or graphics processing unit (GPU) 126. The user interface 150 is in communication with the data processor 120 and may include a physical visual display 152. The visual display 152 may make AR initialization and operation easier, although other human-computer interfaces (e.g., voice) and the projected AR display may be sufficient. In some embodiments of the invention, the data processor 120 and user interface 150 may be provided by a mobile interface device (e.g., a commercial tablet computer) connected to or connectable to the housing 110 and/or components disposed within the housing 110. The operating system of the tablet or other data processor 120 may be variable/non-specific, as may be the software programming language used. The user interface 150 may be configured for touch, voice, gesture, or other input methods and allows a user to launch applications and/or use other controls. A visual display 152 of the user interface 150 (e.g., the display screen of a tablet) can optionally be used to display visual camera views and/or AR content. In some embodiments, the user interface 150 may include a head mounted display (HMD).
The sensor arrangement 130 of the exemplary ARP 100 includes a visual spectrum camera (VSC) 132, a depth sensing camera (DSC) 134, and an inertial measurement unit (IMU) 136. In embodiments where the ARP 100 incorporates a tablet or other mobile interface device, some or all of these sensors may be native to that device. In other embodiments, individual units such as an IR sensor for the DSC 134, a visual camera for the VSC 132, and an accelerometer-based IMU 136 may be disposed within or attached to the housing 110. The output of these sensors is received by the CPU 122, which executes isometric transformation software (ITS) 124 and AR/CV software 123 to perform the calculations necessary to determine the pose of the ARP 100 and to generate the appropriate AR output. Many commercial options are available for the AR/CV software 123 including, without limitation, HoloKit, ARKit, ARCore, Vuforia, and others. In embodiments where a mobile interface device is used, but the sensors are separate from the mobile interface device, the AR/CV software 123 may comprise software such as PrimeSense, MS Kinect, or the like. The specific approach to sensor/software combinations may be driven by such factors as cost, range, accuracy, and battery endurance.
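As one possible, purely illustrative fusion strategy (not specific to any of the commercial packages named above), a complementary filter can blend the IMU's high-rate but drifting orientation estimate with the lower-rate, drift-free orientation reported by the AR/CV tracking software:

```python
import numpy as np

def fuse_orientation(q_imu, q_visual, alpha=0.98):
    """Complementary-filter blend of IMU and visual orientation quaternions.

    q_imu: high-rate orientation integrated from the IMU (subject to drift).
    q_visual: lower-rate orientation from the AR/CV tracker (drift-free).
    alpha: weight favoring the smoother IMU estimate between visual updates.
    """
    q_imu = np.asarray(q_imu, dtype=float)
    q_visual = np.asarray(q_visual, dtype=float)
    if np.dot(q_imu, q_visual) < 0:   # keep both quaternions in one hemisphere
        q_visual = -q_visual
    q = alpha * q_imu + (1.0 - alpha) * q_visual   # normalized linear blend
    return q / np.linalg.norm(q)
```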
The AR/CV software 123 may use previously determined information related to a target object to generate the AR image. Such information may be retrieved by the CPU 122 from an on-board memory 127. In some embodiments, however, the CPU 122 may retrieve the information from an external server via a wireless or wired communication interface 128. In some embodiments, the AR image content may be provided by a user via the user interface 150.
The DSC 134 provides information necessary to determine the pose of the surface on which AR information is to be displayed. This differs from a typical tablet-type display solution, in which the information is merely displayed on a tablet screen or an HMD and such surface pose information is not necessary. Projecting AR onto surfaces in the physical world, however, requires that the isometric orientation of the projector relative to the surface be more accurately understood and taken into account for the projection. The ITS 124 is configured to transform the AR/CV pose-calculated augmentation so that it accounts for the physical keystone effect. It may also account for variations in the physical surface (i.e., the surface topography). Such transformations can only be accomplished with accurate information on the range and orientation of the ARP 100 relative to the projection surface.
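The range and orientation information referred to above can be recovered from the depth sensor output by, for example, fitting a plane to the sampled surface points. A minimal sketch, assuming the DSC output has already been converted to an N x 3 array of 3-D points in the ARP coordinate frame:

```python
import numpy as np

def fit_projection_plane(points_xyz):
    """Least-squares plane fit to depth points; returns (unit normal, range).

    points_xyz: (N, 3) array of surface points in the ARP coordinate frame.
    The normal gives the surface orientation relative to the ARP line of
    sight; the distance gives the projection range.
    """
    centroid = points_xyz.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered point cloud is the least-squares plane normal.
    _, _, vt = np.linalg.svd(points_xyz - centroid)
    normal = vt[-1]
    distance = abs(normal @ centroid)  # perpendicular range from the ARP
    return normal, distance
```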
In some applications, the projection surface distance and orientation may be known in advance and included in the AR/CV geometric model environment. In such cases, calculations may be made based on that digital representation, but this method may be less reliable for most industrial applications.
In some embodiments, the output of the DSC 134 and/or the VSC 132 could also be used for AR tracking.
The projection arrangement 140 includes an optical (visual spectrum) projector (VSP) 142 for projection of digital AR imagery, which may be corrected for keystone effects and surface topography. Any suitable commercial VSP unit may be used. The VSP may be connected to an on-board power supply (e.g., a battery pack). Alternatively, the VSP and/or other ARP components may be powered by an external power source. The VSP is fixed relative to the sensor arrangement 130 to assure that the proper projection pose is maintained.
The ARP 100 may also include a handle arrangement 180 and one or more lens caps 170 to cover one or more of the projector lens, a DSC lens, and a VSC lens. A tripod receiver 190 or other mechanism may be included for attaching the housing 110 to a fixed or portable support.
The ARP 100 is capable of mobility because it uses data from the DSC 134, VSC 132, and IMU 136 to initialize and maintain spatial awareness of its physical environment and uses CV software to calculate device pose. It constantly adjusts not only what content it should project based on its pose, but also how to project that content using its knowledge of the current projection surface. This is in direct contrast to a standard projector.
Another way to understand how this is accomplished is to imagine that a virtual camera exists in the digital environment at the same point and orientation in space that the physical ARP occupies in the physical environment. The physical ARP projects only the point of view of that virtual camera. Because the physical and digital environments are linked by the CV software, the foreshortened view of the digital content captured by the virtual camera creates an equal and opposite skew when physically projected. By this effect, the digital content remains true to the physical environment.
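The virtual camera analogy can be expressed numerically: a 3-D point that is fixed in the digital environment is projected into the projector's image plane using the ARP's current pose as the virtual camera's extrinsics. A minimal sketch, assuming a pinhole model with intrinsics K and a world-to-projector pose (R, t) supplied by the tracking software:

```python
import numpy as np

def project_anchor(point_world, K, R, t):
    """Project a world-fixed 3-D anchor point into projector pixel coordinates.

    K: 3x3 projector intrinsics; (R, t): world-to-projector transform (the
    ARP pose). Because the virtual camera shares the physical projector's
    pose, the foreshortening computed here is exactly undone by the physical
    projection, keeping the content locked to the object.
    """
    p_cam = R @ np.asarray(point_world) + t   # anchor in projector coordinates
    p_img = K @ p_cam                         # pinhole projection
    return p_img[:2] / p_img[2]               # normalized pixel coordinates
```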
In some embodiments, the sensor and projection unit 205 may include an on-board data processor connected to the VSC 232, DSC 234 and VSP 242 and/or additional sensors such as an IMU. In such embodiments, the on-board data processor may receive and process sensor data to determine the pose of the device and generate AR information for projection by the VSP 242. Processed information may also be passed to the mobile interface device 250 via the cable 260 and the communication interface of the mobile device.
In alternative embodiments, the primary processing capability of the system may be provided by a data processor of the mobile interface device 250. In these embodiments, data from the cameras and/or other sensors may be passed via the input/output arrangement 228 to the data processor of the mobile interface device 250, which may then combine this information with its own sensor data (e.g., IMU data), determine the pose of the sensor and projection unit 205, and generate the appropriate AR image information. The AR image information may then be passed via the input/output arrangement 228 to the VSP 242 for projection.
The mobile interface device 250 may be any conventional tablet or other mobile processing device. In various embodiments, the mobile interface device 250 can be used for all user input-output functions. For ease of use, the mobile interface device 250 may be attached or attachable to the sensor unit housing 210 in such a way that it is generally aligned with the viewing direction Dv of the sensor and projection unit 205. Communications to and from the mobile interface device are passed to the device's data processor via a communications interface. A pistol-grip handle 280 may be attached to the bottom of the housing 210 to allow a user to hold the combined sensor and projection unit and mobile interface device with one hand and enter information on the mobile interface device 250 with the other.
At S150, the data processor constructs an AR image for display on a surface of the target object or area. In particular embodiments, the AR image is constructed using information relevant to the target object or area. This information may be set by a user or may be retrieved from data storage by the data processor based on identification of the target object or area. In certain embodiments, the data processor may request and receive object information from a remote server and/or object database. The information may include specific indicia to be displayed and the locations where the indicia are to be displayed relative to the target object.
The AR image is constructed so that its appearance and position relative to the target object are the same regardless of the position and orientation of the ARP. The ARP data processor uses the surface topography of the target object or area and the pose (i.e., distance, line-of-sight angle, and orientation) of the ARP to determine the area of the target object that will be covered by the projected image and the adjustments that must be made to the digital AR image so that, when projected, the image will appear undistorted to any viewer within the environment. At S160, the AR image is projected onto the target object or area. The actions of the method may be repeated continuously in real time so that the pose of the ARP and the AR image are continuously updated. As the ARP is moved, the appearance of the AR image remains unchanged as long as the ARP is directed toward the same portion of the target object or area. If the line of sight of the ARP is moved, the AR image will change, but the content associated with the target object will remain static as illustrated in
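The continuous update at S150/S160 can accordingly be sketched as a per-frame render step in which the AR content is stored in object (world) coordinates and re-projected with the current pose each frame; the indicia list and the draw_text callback below are hypothetical illustrations, not elements of the disclosure.

```python
import numpy as np

def render_ar_frame(indicia, K, R, t, draw_text):
    """Re-render world-anchored indicia with the current ARP pose.

    indicia: list of (label, xyz) pairs anchored in object coordinates.
    K: projector intrinsics; (R, t): current world-to-projector pose.
    draw_text: hypothetical drawing callback in the projector image plane.
    Because the anchors are fixed in object coordinates, re-projecting them
    every frame keeps the augmentation static on the object as the ARP moves.
    """
    for label, anchor in indicia:
        p = K @ (R @ np.asarray(anchor) + t)   # pinhole projection at current pose
        if p[2] > 0:                           # draw only points in front of the ARP
            draw_text(label, (p[0] / p[2], p[1] / p[2]))
```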
It will be readily understood by those persons skilled in the art that the present invention is susceptible to broad utility and application. Many embodiments and adaptations of the present invention other than those herein described, as well as many variations, modifications and equivalent arrangements, will be apparent from or reasonably suggested by the present invention and foregoing description thereof, without departing from the substance or scope of the invention.
This application claims priority to U.S. Prov. App. No. 62/763,597, filed Jun. 22, 2018, the complete disclosure of which is incorporated herein by reference.