Although many companies offer an augmented reality (AR) software development kit (SDK), these software libraries only sense the position of the user's mobile device and display graphical content. Such SDKs make it easy to build only trivial applications for a single user in a single location. It is too expensive and time-consuming for each developer to build all the software that would host a community of users, or to build a persistent shared world filled with content that users can see by moving through the real world.
This limits many promising AR applications. For example, a developer may want to build an AR Wizard of Oz world for thousands of players, spread out across an entire city. AR SDKs can show simple content, but developers are left on their own to host servers, support a user community, persist a shared world, and distribute content across real-world map locations.
A platform would help. A “platform” is a set of shared functionality provided to apps so that developers do not need to code everything from scratch, which reduces the time and cost to build those apps. Such a platform is needed for third-party AR experience developers.
Each of these apps could run in a single shared environment, so that users could easily switch between them, and so that all apps would potentially have access to all users. You would launch the platform, choose your AR experience, and see it all around you through your mobile device.
Briefly, the techniques described herein assume an augmented reality SDK that provides location sensing for a mobile device and graphical display, as shown in the accompanying drawings.
To begin developing an AR app, a developer imports content such as 3D models into the AR content builder. Then the developer writes code in an AR interactivity scripting language to configure how AR items and AR characters behave.
The developer then submits the app for vetting, and the app goes live for users to select. Users may browse apps on an app store by keyword, topic, or popularity, or may discover app content pinned to locations on a map that the user browses.
Users may belong to a single shared community that can search, download, and install an app from a ‘store’ of AR applications. After opening the app, a user is then presented with nearby AR items and characters, which have event listeners. When an event listener is triggered, the AR item or character executes a script and performs the specified actions, if any.
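A minimal sketch of this listener pattern follows, assuming a hypothetical ARItem class; the method names (on, trigger) and the event name are illustrative stand-ins, since the text does not disclose the platform's actual API.

```python
from typing import Callable, Dict, List

class ARItem:
    """A scriptable AR item (or character) that reacts to world events."""

    def __init__(self, name: str):
        self.name = name
        self._listeners: Dict[str, List[Callable]] = {}

    def on(self, event: str, script: Callable) -> None:
        """Register a script to run when the named event fires."""
        self._listeners.setdefault(event, []).append(script)

    def trigger(self, event: str, **context) -> None:
        """Called by the platform when its sensors detect the event."""
        for script in self._listeners.get(event, []):
            script(self, **context)

# Example: greet any user who comes close.
elf = ARItem("greeter")
elf.on("close_by_user", lambda item, user: print(f"{item.name} greets {user}"))
elf.trigger("close_by_user", user="user_42")   # platform-side dispatch
```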
User and app performance metrics are collected, and load balancing and redistribution of the servers may be triggered. App updates may be automatically distributed to users.
Users may rate and review the apps. Users may anonymously communicate with or meet with other users. Users may have an “inventory” that represents owned AR items.
The challenges are then: hosting a persistent shared world on servers, supporting a community of users across many apps, placing and simulating AR content at real-world map locations, and providing common interactivity so that each app need not rebuild it from scratch. These problems are solved with a method according to a preferred embodiment by the platform components described below.
The foregoing and other features and advantages will be apparent from the following more particular description of preferred embodiments, as illustrated in the accompanying drawings, in which like reference characters refer to the same parts throughout the different views.
A description of preferred embodiments follows.
Each platform component is specially suited to an AR development environment. To start with, a third-party app developer uses the AR App Builder 104 to import content such as a 3D model and then to make it AR responsive, as shown in the drawings.
Once the AR app has been created, it is submitted to an AR App Store 105 for possible vetting and for display. From there, a Cloud Hosted Environment 106 allows users to download and install an app, as shown in the drawings.
Users are part of an AR User Community 107 and can message each other, trade items, or meet in the real world. This makes extensive use of AR Mapping 108, where virtual AR items are placed at map locations in the real world. Complicated routes may be calculated for AR items, and sets of AR items may be distributed along a pathway or randomly inside a map region, as shown in the drawings and sketched below.
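The following sketch shows the two placement strategies just mentioned. It assumes simple planar coordinates and illustrative function names; a real implementation would use geodesic distances and map pathways.

```python
import math
import random
from typing import List, Tuple

Point = Tuple[float, float]  # treated as planar coordinates for brevity

def along_path(path: List[Point], count: int) -> List[Point]:
    """Place `count` items evenly spaced along a polyline."""
    def lerp(a: Point, b: Point, t: float) -> Point:
        return (a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t)

    seglen = [math.dist(path[i], path[i + 1]) for i in range(len(path) - 1)]
    total = sum(seglen)
    placed = []
    for k in range(count):
        target = total * k / max(count - 1, 1)   # distance along the path
        for i, s in enumerate(seglen):
            if target <= s or i == len(seglen) - 1:
                placed.append(lerp(path[i], path[i + 1], target / s if s else 0.0))
                break
            target -= s
    return placed

def random_in_region(sw: Point, ne: Point, count: int) -> List[Point]:
    """Scatter `count` items uniformly at random inside a bounding box."""
    return [(random.uniform(sw[0], ne[0]), random.uniform(sw[1], ne[1]))
            for _ in range(count)]

# Twenty coins along a route, five elves somewhere in a park-sized box.
coins = along_path([(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)], count=20)
elves = random_in_region(sw=(0.0, 0.0), ne=(400.0, 400.0), count=5)
```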
Users see these AR items through the AR Display & Sensors 109 on their mobile phones, which is not part of this invention but is supplied by a third party and integrated into the system. The AR Display & Sensors 109 also track the position of the phone so that AR items can be drawn in the correct locations.
Although each app will have its own specific AR item interactions, the platform provides for some of the most common User AR Interactivity 110, such as clicking on an AR item to see a description of the potential experience should the user wish to engage. The platform also allows users to pick up an AR item into the user's inventory, and to place items down from the user's inventory.
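A minimal sketch of the pick-up and place-down interactions follows; the class and method names are assumptions, since the platform's real inventory API is not disclosed here.

```python
from typing import List, Optional, Tuple

class InventoryItem:
    """An AR item that is either placed in the world or held by a user."""
    def __init__(self, name: str, position: Optional[Tuple[float, float]]):
        self.name = name
        self.position = position          # None while held in an inventory

class UserInventory:
    def __init__(self) -> None:
        self.items: List[InventoryItem] = []

    def pick_up(self, item: InventoryItem) -> None:
        """Remove the item from the world and add it to this inventory."""
        item.position = None
        self.items.append(item)

    def place_down(self, item: InventoryItem, where: Tuple[float, float]) -> None:
        """Drop a held item back into the world at a map location."""
        self.items.remove(item)
        item.position = where

backpack = UserInventory()
coin = InventoryItem("gold_coin", position=(42.36, -71.06))
backpack.pick_up(coin)
backpack.place_down(coin, where=(42.37, -71.05))
```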
App developers will naturally wish to track AR App Metrics 111 about how their AR App is performing. They could, for example, track the activity of users and AR Items, especially as users install the app and spend money, as shown in the drawings.
The 3rd party developer does not use the platform for generic tasks such as creating 3D models, images, video, audio, and other forms of media that can become AR items. Instead, External Content 202 is made elsewhere and loaded into the AR App 203. During this loading, AR Items are placed into the environment at real-world map locations, either statically or algorithmically using Map Descriptors 301.
The AR Items are then associated with scripts in an AR Scripting Language 204, either individually or by object-oriented class type, that give each AR Item a relationship with its virtual environment (other AR Items), real-world environment (maps), and users, including user behavior.
The AR Scripting Language 204 includes Event Triggers 207, which are snippets of code activated by AR Senses 206 such as a user's position or motion, as shown in the drawings.
Sensory elements can be combined into high-level sensory algorithms. For example, a user's position and speed could be used in a new algorithm that senses whether an AR Item in motion is being followed by the user.
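As a sketch of such a composed sense, the following detects "the user is following this moving item" from a sliding window of position samples (relative speed is captured implicitly by sampling over time). The window size and distance threshold are assumptions for illustration.

```python
from collections import deque
from math import dist
from typing import Deque, Tuple

Point = Tuple[float, float]  # planar coordinates in meters, for brevity

class FollowSense:
    """Combine low-level position samples into a high-level 'following' sense."""

    def __init__(self, window: int = 10, max_gap_m: float = 30.0):
        self.item_track: Deque[Point] = deque(maxlen=window)
        self.user_track: Deque[Point] = deque(maxlen=window)
        self.max_gap_m = max_gap_m

    def update(self, item_pos: Point, user_pos: Point) -> bool:
        """Record one sample per tick; report True once the user has stayed
        within max_gap_m of the item at every recent tick."""
        self.item_track.append(item_pos)
        self.user_track.append(user_pos)
        if len(self.item_track) < (self.item_track.maxlen or 0):
            return False                     # not enough history yet
        return all(dist(u, i) <= self.max_gap_m
                   for u, i in zip(self.user_track, self.item_track))

# Feed it both positions every tick; it fires after 10 close samples.
sense = FollowSense()
for step in range(12):
    following = sense.update(item_pos=(step * 1.5, 0.0),
                             user_pos=(step * 1.5 - 5.0, 0.0))
print(following)   # True: the user stayed ~5 m behind the moving item
```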
Once the AR Senses 206 activate an Event Trigger 207, an associated snippet of code in the AR Scripting Language 204 is executed, which may result in an AR Action 208 being performed. The platform could come with several item actions that have an augmented reality context, such as an AR Item walking to a map location, turning to face a user, speaking to a user, playing an animation such as pointing, or being picked up into or placed out of a user's inventory.
The AR Actions 208 become building blocks for high-level Coded AR Behaviors 209, which affect what happens in the AR world in the AR App 203. This would typically involve User Interaction 209, but the AR Items can also simply interact with each other and map features, or interact with simulated users as part of a Simulation/Testing Environment 210.
After development and testing, the 3rd Party Developer 201 uploads the AR App 203 into an AR App Store 211 for users to discover, consider, and potentially download, install, and use.
Descriptors are Hooks for Scripted AR Interactivity
Instead of a pinpoint on a map, scripts may also refer to compass headings, paths, and regions. Items may be laid out along paths or across regions. Finally, points, paths, and regions may be used to perform database queries for users and AR Items. A combined sketch of all three ideas follows.
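This sketch is hypothetical: the text does not show the platform's real descriptor syntax, so the types, names, and the bounding-box simplification of regions are all assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

Point = Tuple[float, float]

@dataclass
class Path:
    waypoints: List[Point]                 # an ordered route across the map

@dataclass
class Region:
    sw: Point                              # bounding box, for simplicity;
    ne: Point                              # real regions could be polygons

    def contains(self, p: Point) -> bool:
        return (self.sw[0] <= p[0] <= self.ne[0]
                and self.sw[1] <= p[1] <= self.ne[1])

# Descriptors the scripting layer can pass around:
NORTH_HEADING = 0.0                        # compass heading in degrees
park_loop = Path([(42.360, -71.060), (42.370, -71.050), (42.360, -71.040)])
park = Region(sw=(42.350, -71.070), ne=(42.380, -71.030))

# Layout: hand a descriptor to a placement routine (see the placement sketch
# above), e.g. along_path(park_loop.waypoints, count=20).

def users_in(region: Region, positions: Dict[str, Point]) -> List[str]:
    """Descriptor-driven query: which users are inside the region right now?"""
    return [uid for uid, p in positions.items() if region.contains(p)]

print(users_in(park, {"u1": (42.36, -71.05), "u2": (0.0, 0.0)}))   # ['u1']
```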
Videogames and other kinds of simulations have used behavioral scripts for ages, but an AR App Script is novel because of its real-time awareness of the user (e.g. map position, motion of the phone), map information, and placement of AR items into real world locations.
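The behavioral script itself appears in a drawing that is not reproduced here. The following Python-flavored rendering is a speculative reconstruction built only from the line-by-line walkthrough below; every class, function, and parameter name is an assumption, and the figure's line numbers are noted in comments.

```python
class ARItem:
    """Stub of a scriptable AR creature; methods print instead of acting."""
    def __init__(self, item_class: str, region: str):
        self.item_class, self.region = item_class, region
    def on(self, condition, handler):
        self._condition, self._handler = condition, handler
    def intercept(self, user, give_up_after_s: int):
        print(f"walking toward {user} (give up after {give_up_after_s}s)")
    def face(self, user):
        print(f"turning to face {user}")
    def find_item(self, attribute: str, max_walk_s: int):
        return "gold_chest_17"            # stub: nearest walkable match
    def say(self, user, text: str):
        print(f"to {user}: {text}")
    def walk_to(self, target: str):
        print(f"walking to {target}")
    def animate(self, name: str, target: str):
        print(f"playing '{name}' animation toward {target}")

def query_regions(kind: str, min_area_km2: float):
    """Stub world-map query; placeholder data stands in for a real database."""
    return ["boston_common", "central_park"]

# Figure line 5: ALLPARKS is every outdoor space of at least 0.5 km^2.
ALLPARKS = query_regions(kind="outdoor", min_area_km2=0.5)

# Figure line 1: instantiate GreyElf copies across every matching region (the
# figure also spreads them at least 500 m apart, which this stub omits).
jeffy_the_elf = [ARItem("GreyElf", region) for region in ALLPARKS]

# Figure line 11: CLOSEBYUSER is a user within 300 m, reachable within 240 s
# at walking speed.
CLOSEBYUSER = {"radius_m": 300, "max_transit_s": 240, "gait": "walking"}

# Figure line 15: the behavioral script run when the listener fires.
def ITEMQUEST(elf: ARItem, user: str) -> None:
    elf.intercept(user, give_up_after_s=300)          # line 16
    elf.face(user)                                    # line 17
    treasure = elf.find_item("gold", max_walk_s=300)  # line 18
    if treasure:                                      # line 22
        elf.say(user, "Follow me!")                   # lines 23-24
        elf.walk_to(treasure)                         # line 26
        elf.animate("point", treasure)                # line 27
        elf.say(user, f"The treasure is at {treasure}.")

# Figure line 8: attach the listener to every instantiated elf.
for elf in jeffy_the_elf:
    elf.on(CLOSEBYUSER, ITEMQUEST)
```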
In this behavioral script, a set of ARItem creatures called Jeffy_the_Elf is created on line 1. The creatures are instantiated from a class of ARItem called GreyElf, which is defined elsewhere. These instantiations are distributed randomly (but spread out at least 500 meters apart) through every map region in the world that matches the query ALLPARKS.
ALLPARKS is defined on line 5 to be every outdoor space that is at least 0.5 square kilometers. So copies of Jeffy_the_Elf would pop up across every park space in the world that isn't tiny.
Then on line 8 we attach an Event Listener to the set of AR Items in Jeffy_the_Elf. The Event Listener waits until something changes in the world, here CLOSEBYUSER, and then activates the function ITEMQUEST, passing in the detected user as a parameter.
CLOSEBYUSER is defined on line 11 to be any user within 300 meters that the ARItem can transit to within 240 seconds at a walking speed.
Any instantiation of Jeffy_the_Elf that gets triggered will execute the behavioral script at line 15, ITEMQUEST.
On line 16, the ARItem attempts to intercept the user by walking to the user's location. Of course, the user's location may be changing in real-time, and the AR Item will give up attempting to intercept in 300 seconds, which could happen for example if the user is running away from the ARItem.
On line 17, the ARItem turns to face the user. On line 18, the ARItem searches for a nearby treasure, a database query that returns the closest ARItem with attribute “gold” that can be walked to within 5 minutes, according to map pathways.
On line 22, if a nearby item was found, Jeffy_the_Elf tells the user, “Follow me!” (lines 23 and 24) and then walks to the item (line 26). Presumably the treasure is not moving, so the intercept is straightforward. Jeffy then executes a pointing animation on line 27, pointing to the treasure, and tells the user where it is.
Of course, this script omits several complications, such as the user declining to follow Jeffy, another user in the shared world picking up the treasure before the pair arrives, or the intercept window expiring before the user is ever reached.
One could also imagine an adventure where Jeffy, instead of leading the user to the treasure, gave directions verbally to the user, or spoke about the treasure's location through autogenerated clues such as “It's just to the south of the nearest pharmacy whose name starts with ‘R’. Good luck!”
The App Developer 501 loads the app, including its scripts and items, into the platform's Cloud Servers 502, so that they become Hosted Live App Worlds 503 whose scripts are executing, taking information from real-world inputs such as user locations, and rendering AR items at real-world locations.
The map of the entire world is segmented 504 into computing regions. Each server in the array of Cloud Servers 502 is responsible for responding to user behavior and simulating AR items across a set of specific computing regions. Special algorithms will try to keep each server's set of computing regions contiguous, so that users and AR items that transition from one computing region to a neighboring computing region won't always need to be sent to a different server.
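One simple way to realize the contiguity goal is sketched below under the assumption of a rectangular world grid (the text does not specify the partitioning algorithm): each server takes a contiguous band of grid rows, so crossings between neighboring regions usually stay on one server.

```python
from typing import Dict, List, Tuple

Region = Tuple[int, int]   # (row, col) cell in the segmented world map

def assign_regions(rows: int, cols: int, servers: List[str]) -> Dict[Region, str]:
    """Give each server a contiguous horizontal band of grid cells."""
    assignment: Dict[Region, str] = {}
    band = max(1, rows // len(servers))
    for r in range(rows):
        server = servers[min(r // band, len(servers) - 1)]
        for c in range(cols):
            assignment[(r, c)] = server
    return assignment

# A user crossing from cell (4, 9) to (5, 9) usually stays on the same
# server, because neighboring rows tend to share a band.
placement = assign_regions(rows=100, cols=100, servers=["jp-1", "eu-1", "us-1"])
```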
Algorithms also Optimize Network Traffic 505, for example, reducing communication delays by assigning computing regions to Cloud Servers 502 that are physically housed in a server farm close to the computing region. Servers in Japan should not be responsible for computing regions in Scotland.
Network traffic must also be optimized for bandwidth, as users may have mobile devices without healthy Internet connections. For example, 3D models with reduced quality (but also reduced size) may be sent to users, and content may be preloaded to the user. Some types of behavior script execution may even be delegated client-side to the user's mobile device, so that responses can be instant without any Internet delay. Scripts delegated to a user client device, which may be hacked, could perhaps be fact-checked later by a redundant behavioral script execution on the server side.
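A small sketch of the bandwidth adaptation just described follows; the thresholds and asset file names are illustrative assumptions.

```python
def pick_model_variant(bandwidth_kbps: float) -> str:
    """Map measured bandwidth to a model level-of-detail to send."""
    if bandwidth_kbps >= 5000:
        return "model_high.glb"    # full-quality asset
    if bandwidth_kbps >= 1000:
        return "model_medium.glb"  # reduced mesh and textures
    return "model_low.glb"         # minimal asset, preloaded where possible

assert pick_model_variant(250) == "model_low.glb"   # weak mobile connection
```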
The platform has a shared User Community 506, where users have User Profiles 508 and an inventory for holding AR Items 508. Users may also have User Avatars 509 that represent the user in the world. For example, when an AR Item wishes to fight, it would be facing and fighting the user's avatar, not facing the user's mobile device camera.
Users from the User Community 506 can pick and choose which apps they wish to run 507. AR Content is Displayed 510 to a user from whichever app or apps he or she is running. The user may of course interact 511 with AR Items.
Two users who are physically collocated may of course speak to each other in the real world without any intermediary. However, users who are not collocated will want to interact through Anonymizing Servers 513. These disguise each user's IP address of origin, making it possible for two users to engage with each other safely. Through the Anonymizing Servers 513, a user may, for example, message another user, trade AR Items, or arrange to meet in the real world.
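As a minimal sketch of how an anonymizing server might relay traffic, the following assumes opaque user handles so that neither party ever learns the other's network address; all names are illustrative, as the text does not specify the mechanism.

```python
class AnonymizingRelay:
    """Relay messages between users addressed only by opaque handles."""

    def __init__(self) -> None:
        self._endpoints = {}            # opaque handle -> real network address

    def register(self, handle: str, address: str) -> None:
        self._endpoints[handle] = address

    def send(self, sender: str, recipient: str, message: str) -> None:
        """Deliver to the recipient's real address; only the relay sees it."""
        address = self._endpoints[recipient]
        print(f"deliver to {address}: [from {sender}] {message}")

relay = AnonymizingRelay()
relay.register("elf_fan_77", "203.0.113.9")
relay.send(sender="quest_lover", recipient="elf_fan_77", message="Trade elves?")
```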
Users who get matched together may be given map routes to a designated meeting location, and may then see each other's GPS position live 517, in real time, to reduce anxiety about whether the other person is really coming and to reduce confusion about how to physically meet.
Users may be able to discover AR Apps by where those apps' AR Items are placed in a selected map region. For example, when visiting the Statue of Liberty, a user could see on his or her map the apps that are hosting AR Item content in the vicinity. Users may be forbidden from using an app in certain real-world environments, for example at night, over the water, or while driving. Users may be warned that an app requires users to run, to be able to hear, or to enter areas that may not be handicapped accessible.
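The restriction checks might be enforced as sketched below. The three rules (no night play, no water, no driving) come from the text; the predicate names and the 20 km/h driving cutoff are assumptions.

```python
from dataclasses import dataclass

@dataclass
class PlayContext:
    is_night: bool       # from local time and location
    over_water: bool     # from a map lookup at the user's position
    speed_kmh: float     # from the device's sensors

def may_play(ctx: PlayContext) -> bool:
    """Refuse play at night, over water, or at driving speeds."""
    if ctx.is_night or ctx.over_water:
        return False
    return ctx.speed_kmh < 20     # assumed cutoff between walking and driving

assert not may_play(PlayContext(is_night=False, over_water=False, speed_kmh=60))
```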
This application claims priority to U.S. Provisional Application Ser. No. 62/505,440 filed May 12, 2017, the entire contents of which are hereby incorporated by reference.