The present invention relates to augmented reality methods and systems. More specifically, the present invention relates to methods and systems for accessing applications in a mobile, augmented reality environment. Even more specifically, the present invention relates to methods and systems for initiating the installation of applications and, thereafter, accessing the applications in an augmented reality mobile device.
Augmented reality is changing the way people view the world around them. Augmented reality, in general, involves augmenting one's view of and interaction with the physical, real world environment with graphics, video, sound or other forms of computer-generated information. Augmented reality introduces the computer-generated information so that one's augmented reality experience is an integration of the physical, real world and the computer-generated information.
Augmented reality methods and systems are often implemented in mobile devices, such as smart phones, tablets and, as is well known in the art, augmented reality glasses having wireless communication capabilities. In fact, mobile device technology is, in part, driving the development of augmented reality technology. As such, almost any mobile device user could benefit from augmented reality technology. For example, a tourist wearing a pair of augmented reality glasses who wishes to find a suitable restaurant may select an option that requests a listing of local restaurants. In response, a computer-generated list of local restaurants may appear in the user's field of view on the augmented reality glasses.
In general, software running on mobile devices can be categorized as active software or passive software. Active software requires that the user perform some affirmative action to initiate the software's functionality. Passive software does not require the user to perform any affirmative action to initiate the software's functionality. In the above example, the tourist wishing to find a suitable restaurant must perform one or more affirmative actions in order to obtain the local restaurant listing. For example, the tourist must select the appropriate application so that the operating system will execute the application. The tourist then may have to select an option requesting the specific restaurant listing. It will be understood that the software application providing the restaurant listing is active software.
To some extent, the use of active software applications defeats the purpose of, and diminishes the experience one expects from, augmented reality technology. In a virtual reality environment, for instance, a user must interact with the technology: select a program, enter data, make a selection from a menu. In the real world, by contrast, one does not interact with a virtual world at all. In the augmented reality world, one wants the experience to be as near a real experience as possible, not a virtual experience. It is, therefore, desirable that augmented reality software applications make the user's experience as much like the real world as possible and less like the virtual world.
The present invention obviates the aforementioned deficiencies associated with conventional augmented reality systems and methods. In general, the present invention involves an augmented reality system and method that allows a user to initiate the installation of an application on an augmented reality mobile device (e.g., by downloading into the device over a wireless network connection), with reduced or no direct user interaction. This, in turn, substantially enhances the user's augmented reality experience.
Thus, in accordance with one aspect of the present invention, the above-identified and other objects are achieved by an augmented reality mobile device. The device comprises a processor that includes a module configured to receive and process a first signal, where the first signal reflects the environment in which the augmented reality mobile device is operating. The module is also configured to generate a second signal based on the processed first signal. The mobile device also comprises a passively activated application program. The functionality of the passively activated application program is activated without direct user interaction. The passively activated application program is configured to receive the second signal from the processor, recognize an environmental trigger encoded in the second signal, and effect the installation of an application in the augmented reality mobile device, where the application corresponds with the environmental trigger.
In accordance with another aspect of the present invention, the above-identified and other objects are achieved by a method of installing an application in an augmented reality mobile device. The method comprises receiving and processing a first signal that reflects the environment in which the augmented reality mobile device is operating. The method also comprises generating a second signal that is based on the processed first signal. Then, without any direct, prior user interaction, the method comprises decoding and analyzing the second signal for the presence of an environmental trigger. If it is determined that an environmental trigger is encoded in the second signal, an application is installed on the augmented reality mobile device, where the installed application corresponds with the environmental trigger.
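By way of a non-limiting illustration only, the following Python sketch suggests how the above method might be organized. Every name in it (TRIGGER_TO_APP, process_first_signal, decode_trigger, install_if_triggered, and the install_application callback) is hypothetical, and the signal contents are simplified to dictionaries; the sketch merely makes the first-signal/second-signal flow concrete.

    # Hypothetical sketch of the installation method; identifiers are
    # illustrative only and do not denote any actual product API.
    TRIGGER_TO_APP = {
        "restaurant_glyph": "discount-coupon-app",   # assumed mapping
        "stadium_heat": "cold-beverage-app",
    }

    def process_first_signal(raw_sensor_data):
        """Process the first signal, which reflects the environment in
        which the device is operating, and generate a second signal."""
        # A real implementation would run computer vision, audio
        # analysis, geolocation, etc., over the raw sensor data.
        return {"features": raw_sensor_data}

    def decode_trigger(second_signal):
        """Decode and analyze the second signal for the presence of an
        environmental trigger; returns None if no trigger is present."""
        return second_signal["features"].get("trigger")

    def install_if_triggered(raw_sensor_data, install_application):
        # Note that no direct, prior user interaction occurs anywhere
        # along this path.
        second_signal = process_first_signal(raw_sensor_data)
        trigger = decode_trigger(second_signal)
        if trigger in TRIGGER_TO_APP:
            install_application(TRIGGER_TO_APP[trigger])

Here, raw_sensor_data is assumed to be a dictionary such as {"trigger": "restaurant_glyph"}, and install_application stands in for whatever installation service the device provides.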
Several figures are provided herein to further the explanation of the present invention. More specifically:
It is to be understood that both the foregoing general description and the following detailed description are exemplary. As such, the descriptions herein are not intended to limit the scope of the present invention. Instead, the scope of the present invention is governed by the scope of the appended claims.
As shown in
The augmented reality glasses 10 also include a Global Positioning System (GPS) unit 16. GPS units receive signals transmitted by a plurality of earth-orbiting satellites in order to compute, by trilateration, the location of the GPS unit. In more sophisticated systems, the GPS unit may repeatedly forward a location signal to an IMU to supplement the IMU's ability to compute position and velocity, thereby improving the accuracy of the IMU. GPS units are also well known.
As mentioned above, the augmented reality glasses 10 include a number of features relating to sensory input and sensory output. Here, the augmented reality glasses 10 include at least a front-facing camera 18 to provide visual (e.g., video) input, a display 20 (e.g., a translucent or a stereoscopic translucent display) to provide a medium for displaying computer-generated information to the user, a microphone 22 to provide sound input and ear buds/speakers 24 to provide sound output.
The augmented reality glasses 10 must have network communication capabilities, similar to conventional mobile devices. As such, the augmented reality glasses 10 will be able to communicate with other devices over network connections, including intranet and internet connections through a cellular, WIFI and/or Bluetooth transceiver 26.
Of course, the augmented reality glasses 10 will also comprise an on-board microprocessor 28. The on-board microprocessor 28, in general, will control the aforementioned and other features associated with the augmented reality glasses 10. The on-board microprocessor 28 will, in turn, include certain hardware and software modules and components described in greater detail below.
In the future, augmented reality glasses may include many other features to further enhance the user's augmented reality experience. Such features may include an IMU with barometric sensor capability for detecting accurate elevation changes; multiple cameras; 3D audio; range finders; proximity sensors; an ambient environment thermometer; physiological monitoring sensors (e.g., heartbeat sensors, blood pressure sensors, body temperature sensors, brain wave sensors); and chemical sensors. One of ordinary skill will understand that these additional features are exemplary, and still other features may be employed in the future.
As explained above with respect to
It will be understood that the term “processor,” in the context of
The processor will, of course, execute various routines in order to operate and control the augmented reality mobile device 30. Among these is a software program, referred to herein and throughout this description as the “app store.” In accordance with exemplary embodiments of the present invention, the processor executes the app store program in the background. In accordance with one exemplary embodiment, the processor executes the app store program in the background whenever the augmented reality mobile device is turned on and operating. In another exemplary embodiment, the user may have to initiate the app store program, after which the processor will continue to execute the program in the background.
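A minimal sketch of this background behavior, assuming a simple polling loop on a daemon thread, is given below; environmental_processor, next_signal, and handle_trigger are invented names, and a real system might instead be event-driven.

    import threading
    import time

    def app_store_background_loop(environmental_processor, handle_trigger):
        """Check the environmental processor's output for triggers,
        continuously and without any user involvement."""
        while True:
            signal = environmental_processor.next_signal()  # hypothetical
            if signal is not None and getattr(signal, "trigger", None):
                handle_trigger(signal.trigger)
            time.sleep(0.1)  # modest polling interval

    def start_app_store(environmental_processor, handle_trigger):
        # Started once, e.g., when the device powers on; the daemon flag
        # keeps the loop from ever blocking on direct user interaction.
        threading.Thread(
            target=app_store_background_loop,
            args=(environmental_processor, handle_trigger),
            daemon=True,
        ).start()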
As stated above, the output device may be a translucent display (e.g., translucent display 20). However, other device and display types are possible. For example, if the output device is a display device, the display device may comprise transparent lenses rather than translucent lenses. The display device may even involve opaque lenses, where the images seen by the user are projected onto the opaque lenses based on input signals from a forward-looking camera as well as other computer-generated information. Furthermore, the display may employ a waveguide, or it may project information using holographic images. In fact, the output device may involve something other than a display. As mentioned below, the output device may involve audio, in lieu of, or, more likely, in addition to, video. The key point is that the present invention is not limited by the type and/or nature of the output device.
In
The app store program is passive. As explained above, this means that the functionality associated with the app store program is capable of being initiated by any one of a number of triggers that are present or occur in the surrounding, real world environment. Unlike software providing similar or like functionality in conventional augmented reality methods and systems, direct user action is not required to initiate the app store functionality. As illustrated in
At the present time, the most common triggers are likely to be computer vision based, where the camera (e.g., camera 18) captures an image. Within that image there may be an object or glyph that the app store program recognizes. The recognition of the object or glyph then causes an event, for example, the display of computer-generated information specifically corresponding to that object or glyph. The computer-generated information may be an icon representing an application that the user may wish to install (e.g., download). In the fast food restaurant example described in detail below, the application, if the user chooses to install it, might provide the user with a coupon or other special offers available at the restaurant. The application may allow the user to view a food and beverage menu through the augmented reality mobile device so the user can order food without standing in line—a benefit if the restaurant happens to be crowded. The application may provide nutritional information about the various food and beverage items offered at the restaurant. As technology advances and marketing becomes more creative, other types of triggers are likely to become more prevalent.
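One minimal way to picture this vision-based path is sketched below. The glyph identifiers, the GLYPH_CATALOG mapping, and the injected detect_glyphs and render_icon functions are all assumptions made for illustration.

    # Hypothetical glyph-to-application catalog; entries are illustrative.
    GLYPH_CATALOG = {
        "fastfood_glyph_01": {"app_id": "coupon-app", "icon": "coupon.png"},
    }

    def on_video_frame(frame, detect_glyphs, render_icon):
        """Called for each captured frame; presents an install icon when
        a recognized object or glyph appears in the field of view."""
        for glyph_id in detect_glyphs(frame):   # e.g., marker detection
            entry = GLYPH_CATALOG.get(glyph_id)
            if entry is not None:
                render_icon(entry["icon"], entry["app_id"])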
In another example, the trigger passively initiating the app store program may be a tone played over a sound system in the surrounding environment. The tone would be picked up by the microphone (e.g., microphone 22). If the app store program recognizes the tone, the app store program then causes an event, such as the display of computer-generated information specifically corresponding to that tone.
In yet another example of a trigger passively initiating the app store program, the user may be attending a sporting event, such as a baseball game. If the augmented reality mobile device has a temperature sensor, and the actual temperature at the game exceeds a predefined temperature, that reading, combined with the GPS coordinates of the stadium or of a particular concession stand at the stadium, may trigger the app store program to display computer-generated information, such as an icon that, if selected by the user, initiates the installation of an application that offers a discount on a cold beverage. On a cool day, the application may, alternatively, offer the user a discount on a hot beverage or a warm meal.
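The compound trigger in this example can be pictured as a simple rule over two sensor readings, as in the sketch below; the coordinates, the temperature threshold, and the offer identifiers are invented for illustration, and a real system would use a proper geodesic distance test.

    import math

    STADIUM_LAT, STADIUM_LON = 40.0, -75.0   # illustrative coordinates
    HOT_DAY_CELSIUS = 30.0                   # assumed threshold

    def near(lat, lon, target_lat, target_lon, radius_deg=0.005):
        """Crude proximity test in degrees of latitude/longitude."""
        return math.hypot(lat - target_lat, lon - target_lon) <= radius_deg

    def beverage_offer(temperature_c, lat, lon):
        """Return the offer, if any, implied by temperature plus location."""
        if not near(lat, lon, STADIUM_LAT, STADIUM_LON):
            return None
        if temperature_c >= HOT_DAY_CELSIUS:
            return "cold-beverage-discount-app"
        return "hot-beverage-discount-app"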
Social triggers are also possible. For example, a group of like users who are present in a common place, based on the GPS coordinates of that place, may receive a special, limited offer. If the like users are attending a concert at a venue with GPS coordinates that are recognized by the app store program, the computer-generated information may be an icon that, if selected by the user, would make the user eligible to receive a limited edition t-shirt. The offer may be made available only to the first 100 users who select the icon and install (e.g., download) the corresponding application. In another example of a social trigger, a user may subscribe to a particular social networking group. Then, if one or more subscribers in that group, in proximity to the user, have just downloaded a particular application, the user's mobile device may receive a signal over a network connection, where that signal serves as an environmental trigger initiating the functionality of the app store program to, thereafter, offer the user the same application. One might imagine that this social feature will become quite popular and may be a major driving force in promoting products and motivating users to perform some activity.
Table I below provides a list of exemplary triggers. Some of these triggers may be supported by conventional augmented reality technology, and others may become practical in the near future as the technology advances. The list in Table I is not intended to be limiting in any way.
After the app store program is passively triggered to present computer-generated information to the user through the augmented reality mobile device (e.g., by displaying a corresponding icon on the display or by playing a corresponding audio sequence through the ear buds/speakers), the user may then be required to take some affirmative action (referred to herein as a “processing action”) in order to utilize or otherwise take advantage of the computer-generated information provided by the app store program.
It will be understood that a processing action may take on any number of different forms. Computer vision, for example, offers one convenient way to effect a processing action. In the world of augmented reality, computer vision may allow the user to reach out and “touch” the virtual object (e.g., the icon presented on the display). It will be understood, however, that simply placing a hand over the virtual object may result in false acceptances or accidental selections, as moving one's hand in front of or over the augmented reality mobile device may be a common thing to do even when the user is not trying to initiate a processing action. Accordingly, the processing action should be somewhat unique to avert false acceptances or accidental selections. Thus, the processing action may come in the form of fingers bending in a unique pattern, or moving one's hand along a predefined path that would be hard to mimic accidentally without prior knowledge. Another example might be extending the thumb outward and then moving the hand inward to symbolize a click. The camera would, of course, capture these user movements, and the app store program would be programmed to recognize them as a processing action.
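As a rough sketch only, matching a tracked hand trajectory against such a predefined path might look like the following; the template path, the tolerance, and the fixed-length sampling are all assumptions.

    # Hypothetical gesture matcher: a processing action is accepted only
    # when the tracked hand follows a predefined, hard-to-mimic path.
    CLICK_PATH = [(0.0, 0.0), (0.5, 0.1), (1.0, 0.0)]  # illustrative

    def matches_processing_action(hand_track, template=CLICK_PATH, tol=0.15):
        """Compare a normalized hand trajectory to the template point by
        point; requiring a close match averts false acceptances caused
        by casual hand motion in front of the device."""
        if len(hand_track) != len(template):
            return False
        return all(
            abs(hx - tx) <= tol and abs(hy - ty) <= tol
            for (hx, hy), (tx, ty) in zip(hand_track, template)
        )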
Computer vision is, of course, only one way to implement a processing action. Sound is another. With advancements in speech detection, the app store program will be able to decipher specific words, for example, “select icon,” “purchase item,” “order product” or “cancel order,” just to name a few. In addition, specific sounds, tones, and changes in pitch and amplitude could all be used to implement a user processing action.
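A trivial sketch of the spoken variant follows; the phrase set and the form of the transcript are assumptions, and real speech recognition would of course be far more involved.

    # Hypothetical spoken processing actions; the phrase set is illustrative.
    SPOKEN_ACTIONS = {
        "select icon": "SELECT",
        "purchase item": "PURCHASE",
        "order product": "ORDER",
        "cancel order": "CANCEL",
    }

    def handle_utterance(transcript):
        """Map a recognized utterance to a processing action, if any."""
        return SPOKEN_ACTIONS.get(transcript.strip().lower())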
Table II below is intended to summarize some of the ways in which a user may initiate a processing action. Again, the list presented in Table II is exemplary, and it is not intended to be limiting in any way.
The environmental processor 72 plays a very important role in the present invention. The environmental processor 72 may be implemented in software, hardware or a combination thereof. The environmental processor 72 may be integrated with other processing software and/or hardware, as shown in
The visual module 74 receives and processes information in video frames captured by the augmented reality mobile device camera (e.g., camera 18). In processing each of these video frames, the visual module 74 looks for the occurrence of certain things in the surrounding, real world environment, such as objects, glyphs, gestural inputs and the like. The visual module 74 includes two components, an environmental component and an interactive component. The environmental component looks for objects, glyphs and other passive occurrences in the surrounding environment. In contrast, the interactive component looks for gestural inputs and the like.
The visual module 74 is but one of several modules that make up the environmental processor 72. However, it will be understood that if the functionality associated with the visual module 74 is particularly complex, the visual module 74 may be implemented separately from the environmental processor 72 in the form of its own ASIC.
The audible module 76 receives and processes signals carrying sounds from the surrounding, real world environment. As shown, the audible module 76 includes two components: a speech component for detecting and recognizing words, phrases and speech patterns, and a tonal component for detecting certain tonal sequences, such as musical sequences.
The geolocational module 78 receives and processes signals relating to the location of the augmented reality mobile device. The signals may, for example, reflect GPS coordinates, the location of a WIFI hotspot, or the proximity to one or more local cell towers.
The positional module 80 receives and processes signals relating to the position, velocity, acceleration, direction and orientation of the augmented reality mobile device. The positional module 80 may receive these signals from an IMU (e.g., IMU 12).
The app store program is a separate software element. In accordance with exemplary embodiments of the present invention, it resides in the third party application layer 62, along with any other applications that either came with the mobile device or were later downloaded by the user. Alternatively, the app store program may reside in the augmented reality shell 64. The app store program communicates with the various environmental processor software modules in order to recognize triggers embedded in the information received and processed by the environmental processor software modules. In addition, the app store program communicates with the other software elements in the shell to, for example, display virtual objects and other information to the user or reproduce audible sequences for the user. The app store program communicates with yet other software elements in the shell to upload or download information over a network connection.
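The relationship between the environmental processor's modules and the app store program can be pictured as a simple publish/subscribe arrangement, sketched below with invented names; nothing here describes an actual shell interface.

    class EnvironmentalProcessor:
        """Hypothetical sketch: modules push recognized occurrences to
        any subscribed programs, such as the app store program."""

        def __init__(self):
            self.subscribers = []

        def subscribe(self, callback):
            self.subscribers.append(callback)

        def publish(self, module_name, occurrence):
            # Called by the visual, audible, geolocational, or positional
            # module when it recognizes something in the environment.
            for callback in self.subscribers:
                callback(module_name, occurrence)

    def present_icon(app_id):
        """Stand-in for a hand-off to the shell's rendering service."""
        pass

    def app_store_listener(module_name, occurrence):
        # The app store program checks each occurrence for a known trigger.
        if occurrence.get("is_trigger"):
            present_icon(occurrence["app_id"])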
The example illustrated in
In response to decoding signal 90, the app store program then generates a signal 91 and sends it back to the augmented reality shell 64 (
The icon 71 would then appear on the translucent display 20 as illustrated in story board frame 3 (
It is important to reiterate that, in accordance with a preferred embodiment, the app store program is passively running in the background. Thus, the process of recognizing the object or glyph in the fast food restaurant, the generation and processing of signals 90, 91 and 92, and the rendering of the icon 71 on the translucent display 20, occurred without any direct action or involvement by the user. It is also important to reiterate that while the passive triggering of the app store program was, in the present example, caused by the presence of and recognition of a real world glyph in the fast food restaurant, alternatively, it could have been caused by a sound or tonal sequence picked up by microphone 22, and detected and processed by the tonal component of the audible module 76 in environmental processor 72. Still further, it could have been caused by the augmented reality mobile device 10 coming within a certain range of the GPS coordinates associated with the fast food restaurant, as detected by the geolocational module 78. Even further, it could have been caused by the augmented reality mobile device, or more specifically, the network interaction service module 70, detecting the WIFI hotspot associated with the fast food establishment. One skilled in the art will readily appreciate that these passive triggers are all exemplary, and other triggers are possible, as illustrated in Table I above.
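A compact sketch of the exchange described above follows; the contents of signals 90, 91 and 92 are simplified to dictionaries, and the Shell class and its methods are invented solely to make the hand-off visible.

    def on_environment_signal(signal_90, shell):
        """App store side: decode signal 90 from the environmental
        processor and, if a trigger is recognized, answer the shell
        with signal 91 requesting that an icon be presented."""
        trigger = signal_90.get("trigger")
        if trigger is not None:
            signal_91 = {"render": "icon", "app_id": trigger["app_id"]}
            shell.send(signal_91)

    class Shell:
        """Hypothetical augmented reality shell."""
        def __init__(self, display):
            self.display = display

        def send(self, signal_91):
            # The shell translates signal 91 into a rendering request
            # (signal 92, simplified here) for the translucent display.
            signal_92 = {"draw": signal_91["render"], "id": signal_91["app_id"]}
            self.display.render(signal_92)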
Returning to the exemplary method illustrated in
Although, in the present example, the application is accepted by selecting icon 71, presented on translucent display 20, through the use of a hand gesture, it will be understood from Table II above that the way in which the user accepts the application may differ based on the manner in which the app store program presents the computer-generated information to the user. If, alternatively, the app store program presents the user with an audible option (in contrast to a visual option like icon 71) in response to its recognition of glyph 73, for example, the audible sequence, “ARE YOU INTERESTED IN DOWNLOADING A DISCOUNT FOOD COUPON,” user acceptance may take the form of speaking the word “YES” or “NO.” The user's words would be picked up by microphone 22, detected and processed by audible module 76, and recognized by the app store program. The app store program would then process the user response accordingly, for example, by generating the necessary signals to download the corresponding discount food coupon application into the augmented reality mobile device.
It will be noted that the user may be required to take further action to effect the downloading of the application. In the present example, the user must “drag and drop” icon 71, as indicated in story board 7 (
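A minimal sketch of this two-step acceptance (select the icon, then drag and drop it to confirm) follows; the InstallFlow state machine, the install_tray target, and the download_application callback are assumptions.

    # Hypothetical two-step acceptance: selecting the icon arms the
    # install, and only the later drag-and-drop starts the download.
    class InstallFlow:
        def __init__(self, download_application):
            self.armed_app_id = None
            self.download_application = download_application

        def on_icon_selected(self, app_id):
            self.armed_app_id = app_id          # user accepted the offer

        def on_icon_dropped(self, target):
            if self.armed_app_id and target == "install_tray":
                self.download_application(self.armed_app_id)
                self.armed_app_id = None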
The purpose of the installed application could be almost anything, as suggested above. For example, it may be an application that allows the user to order and purchase food online more quickly, in the event the line of customers waiting to order food is exceedingly long. It may be an application that allows the user to obtain a discount on various food and beverage items offered by the restaurant. It may be an application that provides the user with nutritional information about the various menu items offered by the restaurant. Thus, one skilled in the art will appreciate that the present example is not intended to limit the invention to any one type of application.
The present invention has been described above in terms of a preferred embodiment and one or more alternative embodiments. Various aspects of the present invention have also been described. One of ordinary skill in the art should not interpret the various aspects or embodiments as limiting in any way, but as exemplary. Clearly, other embodiments are well within the scope of the present invention. The scope of the present invention will instead be determined by the appended claims.