The invention generally relates to a system and method for interacting with virtual objects in augmented realities, and in particular, to enabling users to create and deploy virtual objects having custom visual designs and embedded content or other virtual items that can be deployed to any suitable location and shared with other users, interact with virtual objects and embedded content or other virtual items that other users created and deployed into certain locations, participate in treasure or scavenger hunts to locate and/or collect the virtual objects, obtain special offers, coupons, deals, incentives, and targeted advertisements associated with the virtual objects, play games that involve interaction with the virtual objects, and engage in social networking to stay in touch with friends or meet new people via interaction with the virtual objects, among other things.
Augmented reality generally refers to a field in computer technology that relates to combining computer-generated data and real-world data, which typically involves overlaying virtual imagery over real-world imagery. For example, many television sports broadcasts now incorporate augmented reality applications to superimpose or otherwise overlay virtual images over real-world game action to provide viewers additional information about the broadcast or otherwise enhance the viewer experience (e.g., football broadcasts often superimpose a virtual “first down” marker to show the distance that the offense has to cover to continue a current drive, baseball broadcasts often superimpose a virtual “strike zone” marker to indicate whether a certain pitch was a ball or strike, etc.). However, augmented reality systems have historically required substantial computing resources, a requirement which has interfered with the ability to deliver augmented reality applications to everyday users who would otherwise benefit from having virtual imagery provided to better interact with real-world environments. Further, developing augmented reality applications that the common consumer can use has tended to be restrained due to difficulties in suitably tracking the user viewpoints that applications need to know in order to properly render virtual imagery based on where the user may be looking in the real world.
More recently, substantial increases in the computing resources associated with many consumer electronics have brought about new opportunities to deliver augmented reality applications to everyday users. For example, common features in many (if not all) smartphones and computers available in the marketplace today include built-in cameras, video capabilities, location detection systems, high-resolution displays, and high-speed data access, among others. As such, many modern consumer electronics now have capabilities to suitably overlay virtual imagery over real-world imagery, which may support new tools to enhance how users experience physical reality. For example, using the built-in camera and the built-in location detection system on a mobile device to sense the viewpoint and physical location associated with a user, an augmented reality application may add virtual imagery associated with the sensed physical location to a display that represents the sensed viewpoint to visually represent additional information about the physical reality visible through the camera lens. Accordingly, augmented realities have significant potential to change how users view the real world in many ways because augmented realities can show more information about user surroundings than would otherwise be available through the physical senses alone.
According to one aspect of the invention, the system and method described herein may support interacting with virtual objects in augmented realities. In particular, the system and method described herein may generally host augmented reality software on an augmented reality server, which may support the interaction with the virtual objects in augmented realities on mobile devices (e.g., a smartphone, augmented reality glasses, augmented reality contact lenses, head-mounted displays, augmented realities directly tied to the human brain, etc.). In one implementation, the augmented reality interaction supported in the system and method described herein may generally enable users to create and deploy virtual objects having custom visual designs and embedded content or other virtual items that can be deployed to any suitable worldwide location and shared with other users, interact with virtual objects that other users created and deployed in addition to any content or other virtual items that may be embedded therein, participate in treasure hunts, scavenger hunts, or other interactive games that involve interaction with the virtual objects, obtain special offers, coupons, incentives, and targeted advertisements associated with the virtual objects or the interaction therewith, and engage in social networking to stay in touch with friends or meet new people via interaction with the virtual objects, among other things.
According to one aspect of the invention, to use the system and method described herein, a user associated with a mobile device may download an augmented reality application over a network from an application marketplace, wherein the augmented reality application may be free, associated with a one-time fee, or available on a subscription basis. Alternatively (and/or additionally), certain features associated with the augmented reality application may be free and the user may be required to pay to upgrade the augmented reality application or activate additional features associated therewith (e.g., a purchasing option in the augmented reality application may enable the user to buy virtual objects having certain types, buy certain designs that can be applied to the virtual objects, upload custom designs that can be applied thereto, etc.). In one implementation, the augmented reality server may therefore make the augmented reality application available to the mobile device via the application marketplace, which may share collected revenue associated with any fees charged to the user associated with the mobile device with an entity that provides the augmented reality server. Alternatively, in one implementation, the augmented reality server may make the augmented reality application directly available to the mobile device without employing the application marketplace to collect or otherwise charge any fees to the user associated with the mobile device. Moreover, in one implementation, one or more companies or other suitable entities may sponsor certain virtual objects for advertising or promotional purposes, in which case revenue associated therewith may be shared with or paid to the entity that provides the augmented reality server, used to reduce or eliminate fees that may be charged to the user, or otherwise customize certain features or contractual arrangements associated with the system.
According to one aspect of the invention, to support the augmented reality application, the mobile device may include a processor to execute the augmented reality application, location sensors to sense information relating to a current location, position, and/or orientation associated with the mobile device (e.g., a GPS sensor, compass, accelerometer, gyroscope), location data that relates to the current location, position, and/or orientation associated with the mobile device and other worldwide location-dependent information, a camera to sense a physical reality that represents a current viewpoint associated with the mobile device, and a user interface that shows the physical reality sensed with the camera and any virtual objects that may be present therein on a display. In one implementation, the physical reality that may be combined with virtual reality or virtual objects in any particular augmented reality described herein need not be limited to any particular geography, in that the physical reality may be on water, at a certain altitude in the air or above ground, or even in space, on the moon, on other planets, or any other location in the world or the universe to the extent that current or future technologies may permit sensing and tracking location information associated therewith. In addition, the mobile device may include one or more other applications having functionality integrated with the augmented reality application (e.g., a social networking application that the augmented reality application may use to interact with other users).
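By way of a non-limiting illustration, the location, position, and orientation information sensed by such location sensors could be captured in a simple record that the augmented reality application periodically reports to the augmented reality server. The sketch below is hypothetical; the field names and units are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class DevicePose:
    """Hypothetical snapshot of the sensed mobile device state."""
    latitude: float     # degrees, from the GPS sensor
    longitude: float    # degrees, from the GPS sensor
    altitude_m: float   # meters above sea level
    heading_deg: float  # compass heading, 0 = north, 90 = east
    pitch_deg: float    # camera tilt, from accelerometer/gyroscope
    roll_deg: float     # camera roll, from accelerometer/gyroscope

# A pose that might be reported while the camera faces due east
pose = DevicePose(40.7484, -73.9857, 10.0, 90.0, 0.0, 0.0)
print(pose.heading_deg)  # 90.0
```

In practice such a record would be serialized and transmitted over the network; the point of the sketch is only that the sensed location, position, and orientation form one bundle of state.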
In one implementation, the mobile device may further include a database or another suitable repository that contains text, pictures, graphics, audio, video, icons, games, software, or other content or other virtual items that may be embedded in or otherwise associated with virtual objects that the user creates or interacts with via the augmented reality application, wherein content or other virtual items to embed in the virtual objects may be uploaded to the augmented reality server (e.g., to make the embedded content or other virtual items available to other users).
According to one aspect of the invention, the augmented reality server may include a processor, augmented reality software, and various databases or data repositories to support interaction with the virtual objects in the augmented realities via the augmented reality application installed on the mobile device, wherein the various databases or data repositories may contain user and mobile device data, virtual object content, incentive data, and advertising data. In particular, the augmented reality server may initially register the user associated with the augmented reality application in response to the augmented reality application having been installed on the mobile device and used to initiate communication with the augmented reality server, wherein registering the user may include the augmented reality server obtaining personal data associated with the user, identifying data associated with the mobile device, or any other information that may suitably relate to using the augmented reality application to access services, content, virtual items, or other data via the augmented reality server. As such, the augmented reality server may store the personal data associated with the registered user, the identifying data associated with the mobile device, and any other suitable information relating to the user and/or the mobile device in the user and mobile device data repository. In addition, the user may create a personal profile page associated with the augmented reality application and subsequently post, add, link, or otherwise submit information to customize the personal profile page, wherein the information associated therewith may be further stored in the user and mobile device data repository. 
In one implementation, additional information stored in the user and mobile device data repository may include payment information that the user submits to the augmented reality server, usage data associated with the augmented reality application, and records that relate to the location associated with the mobile device, among other things.
According to one aspect of the invention, the virtual object content may include content or other virtual items that the user submits in relation to virtual objects that the user created, collected, or otherwise interacted with via the augmented reality application. For example, the virtual object content may include any text, pictures, graphics, audio, video, icons, games, software, virtual currency, real currency that can be exchanged for actual cash or electronic credits, maps, offers, coupons, or other content or virtual items embedded in or otherwise associated with the virtual objects and any designs or other customizations that have been applied thereto. For example, the user may choose the design to apply to any particular virtual object from defaults available via the augmented reality application, upload a custom design to the augmented reality server, or take a picture to create the design to apply to the virtual object, and in each case the design chosen by the user may be applied to a surface associated with the virtual object (e.g., wrapped around a three-dimensional surface associated with the virtual object). In addition, the virtual object content may include data to represent the virtual objects that the user and/or other users have created and deployed into various worldwide locations, which may be associated with GPS coordinates, compass headings associated with rotational orientations, or other suitable location data that indicates the worldwide locations where the virtual objects have been deployed. In one implementation, the augmented reality server may dynamically update the GPS coordinates, compass headings, or other suitable location data associated with the virtual objects in response to one or more users finding and/or moving the virtual objects to a new location (i.e., to reflect the new locations where the virtual objects may have been moved).
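As a rough, non-limiting sketch of the server-side bookkeeping described above, a deployed virtual object could be represented as a record that bundles its design with its location data, with the location data updated dynamically when a user moves the object. All names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    """Hypothetical server-side record for a deployed virtual object."""
    object_id: str
    owner: str
    design_url: str    # custom design wrapped around the object's surface
    latitude: float    # GPS coordinates of the deployed worldwide location
    longitude: float
    heading_deg: float # compass heading for rotational orientation

def move_object(obj: VirtualObject, lat: float, lon: float) -> None:
    """Dynamically update the object's location data to reflect the
    new location where a user has moved it."""
    obj.latitude, obj.longitude = lat, lon

box = VirtualObject("obj-1", "alice", "https://example.com/design.png",
                    40.7484, -73.9857, 0.0)
move_object(box, 48.8584, 2.2945)  # found and carried to a new location
```

After the call, the record reflects the new coordinates, so other users querying the server would find the object at its new location.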
According to one aspect of the invention, the incentive data may generally include content or other virtual items relating to deals, special offers, coupons, or other incentives that may be available to users associated with the augmented reality application. For example, various third-parties may submit the deals, special offers, coupons, or other incentives to the augmented reality server and specify certain locations where the deals, special offers, coupons, or other incentives may be available via the augmented reality application. Thus, in one implementation, the incentive content relating to the deals, special offers, coupons, or other incentives may be associated with virtual objects that can be found in the specified locations via the augmented reality application, and the augmented reality server may deliver the deals, special offers, coupons, or other incentives to the augmented reality application on the mobile device in response to the user finding and interacting with the associated virtual objects in the specified locations. In one implementation, the advertising data may similarly include advertisement content that third-parties submit to the augmented reality server, which may be delivered to the augmented reality application on the mobile device in a manner targeted to the user associated therewith (e.g., based on the personal data associated with the user, friends associated with the user, the identifying data associated with the mobile device, the location data associated with the mobile device, etc.). In one implementation, the advertising data may similarly be associated with virtual objects that can be found in certain locations, whereby the advertisements may be delivered to the augmented reality application in response to the user finding and interacting with the associated virtual objects. 
Alternatively (or additionally), the advertisements and the deals, special offers, coupons, or other incentives may not necessarily be associated with any particular virtual objects and instead delivered to the augmented reality application in response to certain conditions or criteria.
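One plausible way the augmented reality server could decide whether a user has reached a specified incentive location is a great-circle distance test against the mobile device's GPS coordinates. The following is a minimal sketch under that assumption; the 50-meter radius is an illustrative value, not part of the invention:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def incentive_available(user_lat, user_lon, offer_lat, offer_lon,
                        radius_m=50.0):
    """Deliver the deal, offer, or coupon only when the user is within
    the specified radius of the location the third party specified."""
    return haversine_m(user_lat, user_lon, offer_lat, offer_lon) <= radius_m
```

A server might run this check each time the application reports fresh location data, delivering the incentive content on the first True result.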
According to one aspect of the invention, in one implementation, the augmented reality server may therefore maintain and utilize the user and mobile device data, the virtual object content, the incentive data, and the advertising data to support interaction between the augmented reality application and other users associated with other mobile devices that have the augmented reality application installed thereon. For example, the user may grant the augmented reality application access to social networking or other third-party applications installed on the mobile device that relate to the other users or the other mobile devices, whereby the augmented reality application may access the social networking or other third-party applications to support the interaction between the augmented reality application and the other users associated with the other mobile devices that have the augmented reality application installed thereon. In addition, the augmented reality server may use the user and mobile device data, the virtual object content, the incentive data, and the advertising data to deliver information to the augmented reality application that relates to targeted advertisements, incentives, special offers, coupons, and new features associated with the augmented reality application. Furthermore, the augmented reality server may store cookies or other state data on the mobile device to preserve settings associated with the augmented reality application, or the user may have the option to disable or otherwise decline to store the cookies or other state data on the mobile device, in which case certain features associated with the augmented reality application that require the cookies or other state data may be disabled.
According to one aspect of the invention, in response to having registered the user associated with the augmented reality application and suitably populating the user and mobile device data, the virtual object content, the incentive data, and the advertising data, the augmented reality application may be used to interact with virtual objects in the augmented realities. For example, in one implementation, the location sensors associated with the mobile device may continuously obtain location data that represents the current location, position, and/or orientation associated with the mobile device, which the augmented reality application may continuously communicate to the augmented reality server. As such, the augmented reality server may use the current location data associated with the mobile device to derive real-world coordinates that represent the viewpoint associated with the camera on the mobile device. For example, in one implementation, the augmented reality server may use image registration, image recognition, visual odometry, or other suitable techniques to detect interest points, fiduciary markers, or optical flow information from the location data to detect real-world features that represent the viewpoint associated with the camera (e.g., corners, edges, or other real-world features in a scene that represents the camera viewpoint). The augmented reality server may then map geometry associated with the real-world features that represent the camera viewpoint to construct real-world coordinates that represent the scene corresponding to the camera viewpoint, which may be correlated to the coordinates associated with the virtual objects managed on the augmented reality server.
According to one aspect of the invention, in response to identifying any virtual objects in the virtual object content repository having coordinates that are present within the camera viewpoint, the augmented reality server may deliver the virtual object coordinates and any custom designs, embedded content or other virtual items, or other suitable data relating thereto to the augmented reality application on the mobile device, which may render the identified virtual objects on the user interface associated therewith in combination with the scene corresponding to the camera viewpoint. For example, the augmented reality application may cause the user interface to superimpose the virtual objects over a real-world image that represents the camera viewpoint, thereby generating an augmented reality that combines the virtual objects and the scene that represents the camera viewpoint. Further, as noted above, the augmented reality application may continuously communicate current location, position, and/or orientation data associated with the mobile device to the augmented reality server, which may use the current location, position, and/or orientation data to continuously derive, map, or otherwise determine the current viewpoint associated with the camera. As such, based on the current camera viewpoint, the augmented reality server may determine whether any virtual objects are present in the current camera viewpoint, whereby the augmented reality application on the mobile device may use data that the augmented reality server determines in relation thereto to continually refresh the augmented reality shown in the user interface to reflect movements or changes in the current camera viewpoint. 
For example, the augmented reality application may refresh the location in the augmented reality where the virtual objects exist to reflect the current viewpoint, remove the virtual objects from the augmented reality shown in the user interface if virtual objects that were previously displayed therein are no longer present in the current viewpoint, or otherwise refresh the augmented reality shown in the user interface based on any virtual objects that may or may not be present in the current camera viewpoint. Accordingly, the user may simply point the camera associated with the mobile device at real-world surroundings, and the augmented reality application may transparently communicate with the augmented reality server in a substantially continuous manner to refresh the augmented reality shown in the user interface to reflect whether or not any virtual objects are present in the surroundings where the camera currently points, and further to properly orient and re-orient the virtual objects with respect to distances, positions, and rotations associated therewith and/or other virtual objects relative to where the camera currently points in substantially real-time.
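The refresh behavior described above can be illustrated with a hypothetical mapping from an object's angular offset to a horizontal screen position: an object straight ahead lands at the center column, and an object outside the field of view is removed from the display. The field of view and display width below are assumed values for illustration:

```python
def screen_x(bearing_to_obj, heading_deg, fov_deg=60.0, width_px=1080):
    """Map an object's angular offset from the camera heading to a screen
    column. Returns None when the object lies outside the horizontal
    field of view, i.e., the object is removed from the augmented reality."""
    diff = (bearing_to_obj - heading_deg + 180) % 360 - 180
    if abs(diff) > fov_deg / 2:
        return None
    return int((diff / fov_deg + 0.5) * width_px)

print(screen_x(0.0, 0.0))   # 540 -> object straight ahead, center column
print(screen_x(90.0, 0.0))  # None -> outside the current viewpoint
```

Re-running this mapping each time fresh heading data arrives would continually shift, add, or remove the rendered objects as the camera viewpoint changes.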
Other objects and advantages of the invention will be apparent to those skilled in the art based on the following drawings and detailed description.
In one implementation, in order to use the system 100 and interact with virtual objects in augmented realities, a user associated with the mobile device 110 may download an augmented reality application 130 over a network from an application marketplace 190 (e.g., iTunes, Android Market, etc.), wherein the augmented reality application 130 may be free, associated with a one-time fee, or available on a subscription basis. Furthermore, in one implementation, certain features associated with the augmented reality application 130 may be free and the user may be required to pay one or more fees to upgrade the augmented reality application 130 and activate one or more additional features (e.g., a purchasing option in the augmented reality application 130 may enable the user to buy virtual objects having certain types, buy certain designs that can be applied to the virtual objects, upload custom designs that can be applied thereto, etc.). In one implementation, the augmented reality server 150 may therefore make the augmented reality application 130 available to the mobile device 110 via the application marketplace 190, which may share collected revenue associated with any fees charged to the user associated with the mobile device 110 with an entity that provides the augmented reality server 150. Alternatively, in one implementation, the augmented reality server 150 may make the augmented reality application 130 directly available to the mobile device 110 without employing the application marketplace 190 to collect or otherwise charge any fees to the user associated with the mobile device 110. 
Moreover, in one implementation, one or more companies or other suitable entities may sponsor certain virtual objects for advertising or promotional purposes, in which case revenue associated therewith may be shared with or paid to the entity that provides the augmented reality server 150, used to reduce or eliminate fees that may be charged to the user, or otherwise customize certain features or contractual arrangements associated with the system 100.
In one implementation, to support executing the augmented reality application 130, the mobile device 110 may generally include a processor 140 to execute the augmented reality application 130, location sensors 115a (e.g., a GPS sensor, compass, accelerometer, gyroscope, etc.) to sense information relating to a current location, position, and/or orientation associated with the mobile device 110, location data 115b that relates to maps, points of interest, or other location-dependent information in any suitable worldwide location, a camera 120 to sense a physical reality that represents a current viewpoint associated with the mobile device 110, and a user interface 145 that shows the physical reality sensed with the camera 120 and any virtual objects that may be present therein on a display associated with the mobile device 110. In one implementation, the physical reality that may be combined with virtual reality or virtual objects in any particular augmented reality described herein need not be limited to any particular geography. As such, the augmented reality application 130 may be used to interact with virtual realities or virtual objects in physical realities located on water (e.g., in boats), in three-dimensions upward (e.g., at altitudes in the air or above ground using light aircraft, jetpacks, hang gliders, etc.), in three-dimensions downward (e.g., in underground caves, mines, etc.), or even in space, on the moon, on other planets, or any other location in the world or the universe to the extent that current or future technologies may permit sensing and tracking location information associated therewith. In addition, the mobile device 110 may include one or more other applications 135 having functionality integrated with the augmented reality application 130.
For example, the other applications 135 may include a social networking application that the augmented reality application 130 may use to interact with friends, contacts, or other users (e.g., notifying friends that the user associated with the mobile device 110 has created a virtual object to interact with, finding virtual objects that friends have created, etc.). In one implementation, the mobile device 110 may further include a database or repository containing media content 125, which may include text, pictures, graphics, audio, video, icons, games, software, or other content or virtual items that may be embedded in or associated with virtual objects that the user interacts with via the augmented reality application 130 (e.g., virtual objects that the user created via the augmented reality application 130, virtual objects created by other users via the augmented reality application 130 and subsequently found or collected by the user via the augmented reality application 130, etc.).
In one implementation, to support the augmented reality application 130 on the mobile device 110 interacting with the virtual objects in the augmented realities, the augmented reality server 150 may include various databases or repositories that contain user and mobile device data 155, virtual object content 160, incentive data 165, and advertising data 170, and may further include a processor 175 to execute the augmented reality software 180 hosted thereon and store, maintain, or otherwise utilize the user and mobile device data 155, the virtual object content 160, the incentive data 165, and the advertising data 170 contained in the repositories to support the augmented reality application 130.
In particular, in one implementation, the augmented reality server 150 may register the user associated with the augmented reality application 130 in response to the augmented reality application 130 having been installed on the mobile device 110 and used to initiate communication with the augmented reality server 150. For example, in one implementation, the augmented reality server 150 may obtain personal data associated with the user (e.g., a name, email address, phone number, birthday, etc.), identifying data associated with the mobile device 110 (e.g., a network address, operating system, browser, etc.), or any other suitable information that relates to using the augmented reality application 130 to access services, content, virtual items, or other data via the augmented reality server 150. As such, to register the user, the augmented reality server 150 may store the personal data associated with the user, the identifying data associated with the mobile device 110, and the other information relating to the user and/or the mobile device 110 in the user and mobile device data repository 155. Moreover, in one implementation, the user may create a personal profile page associated with the augmented reality application 130 and subsequently post, add, link, or otherwise submit information to customize the personal profile page, wherein the information associated with the personal profile page may be further stored in the user and mobile device data repository 155.
In one implementation, additional information stored in the user and mobile device data repository 155 may include payment information that the user submits to the augmented reality server 150 (e.g., to purchase the augmented reality application 130, activate additional features associated therewith, etc.), usage data associated with the augmented reality application 130 (e.g., to measure how often users access certain features associated therewith), and records that relate to the location associated with the mobile device 110.
In one implementation, the virtual object content repository 160 may store content or other virtual items that the user submits or otherwise uploads in relation to virtual objects that the user created, collected, or otherwise interacted with via the augmented reality application 130. For example, the content or other virtual items stored in the virtual object content repository 160 may include any text, pictures, graphics, audio, video, icons, games, software, virtual currency, real currency that can be exchanged for actual cash or electronic credits, maps, offers, coupons, or other content or virtual items embedded in or otherwise associated with the virtual objects, and any designs or other customizations that have been applied to the virtual objects. For example, the user may choose the design to apply to any particular virtual object from defaults available via the augmented reality application 130, upload a custom design to the augmented reality server 150, or take a picture (e.g., with the camera 120) to create the design to apply to the virtual object, and in each case the design chosen by the user may be applied to a surface associated with the virtual object (e.g., wrapped around a three-dimensional surface associated with the virtual object). In one implementation, the content or other virtual items to embed in the virtual objects and information relating to the designs or other customizations applied to the virtual objects may be uploaded to the augmented reality server 150 and stored in the virtual object content repository 160 (e.g., to make the embedded content, virtual items, designs, and customizations available to other users that may interact with the virtual objects). 
Furthermore, in one implementation, the virtual objects that the user creates may be secured to restrict or otherwise control whether other users may be permitted to interact therewith (e.g., virtual objects may be secured to only be visible to the user that created the virtual objects, one or more users in a list defined by the user, certain friends or groups of friends defined in a social networking application 135, anyone, and/or users that satisfy certain demographic, geographic, or other criteria). Moreover, in one implementation, the user may define criteria to specify how other users can interact with the virtual objects that the user creates (e.g., sorting the virtual objects according to popularity or distance from the other users, permitting the virtual objects to be viewed on a map, via the augmented reality corresponding to the viewpoint associated with the camera 120, and/or various suitable combinations thereof).
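The securing of virtual objects described above could be enforced server-side with a simple visibility predicate. The rule names ("private", "list", "anyone") and the record layout below are hypothetical illustrations, not part of the claimed invention:

```python
def can_view(viewer: str, obj: dict) -> bool:
    """Hypothetical visibility check for a secured virtual object.
    obj['visibility'] is assumed to be 'private' (creator only),
    'list' (creator plus a user-defined list), or 'anyone'."""
    rule = obj.get("visibility", "anyone")
    if rule == "anyone":
        return True
    if rule == "private":
        return viewer == obj["owner"]
    if rule == "list":
        return viewer == obj["owner"] or viewer in obj.get("allowed", [])
    return False  # unknown rule: fail closed

secret = {"owner": "alice", "visibility": "list", "allowed": ["bob"]}
print(can_view("bob", secret))    # True
print(can_view("carol", secret))  # False
```

Extending the predicate to friend groups or demographic, geographic, or other criteria would follow the same pattern of checking the viewer against the creator-defined rule before returning the object in any query results.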
In addition, the virtual object content repository 160 may further include data to represent the virtual objects that the user and/or other users have created and deployed into various worldwide locations, wherein the virtual object content repository 160 may associate the virtual objects with GPS coordinates, compass headings associated with rotational orientations, or other suitable location data that indicates the worldwide locations where the virtual objects have been deployed. In one implementation, the processor 175 and the augmented reality software 180 associated with the augmented reality server 150 may dynamically update the GPS coordinates, compass headings, or other suitable location data associated with the virtual objects in the virtual object content repository 160 in response to one or more users finding and/or moving the virtual objects to a new location (i.e., to reflect the new locations where the virtual objects may have been moved).
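The location records and the dynamic-update behavior described above can be sketched as follows. This is a minimal illustration only; the class name, field names, and coordinate format are assumptions, not part of the described system.

```python
import time

# Hypothetical sketch of the per-object location records the virtual object
# content repository (160) might keep, and how the server could update them
# when a user finds and moves a virtual object to a new location.

class VirtualObjectRepository:
    def __init__(self):
        self._records = {}  # object_id -> location record

    def deploy(self, object_id, lat, lon, heading_deg):
        """Create a record for a newly deployed virtual object."""
        self._records[object_id] = {
            "lat": lat,
            "lon": lon,
            "heading_deg": heading_deg,  # rotational orientation (compass heading)
            "updated_at": time.time(),
        }

    def move(self, object_id, new_lat, new_lon, new_heading_deg):
        """Dynamically update the stored location to reflect the move."""
        record = self._records[object_id]
        record.update(lat=new_lat, lon=new_lon,
                      heading_deg=new_heading_deg,
                      updated_at=time.time())

    def location(self, object_id):
        r = self._records[object_id]
        return (r["lat"], r["lon"], r["heading_deg"])

repo = VirtualObjectRepository()
repo.deploy("obj-1", 40.7128, -74.0060, 90.0)
repo.move("obj-1", 40.7484, -73.9857, 180.0)
print(repo.location("obj-1"))  # (40.7484, -73.9857, 180.0)
```

In practice such records would also carry the design, embedded content, and security criteria described earlier, and the server would persist them rather than hold them in memory.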
In one implementation, the incentive data repository 165 may generally include content or other virtual items relating to deals, special offers, coupons, or other incentives that may be available to users associated with the augmented reality application 130. For example, various third-parties may submit the deals, special offers, coupons, or other incentives to the augmented reality server 150 and specify certain worldwide locations where the deals, special offers, coupons, or other incentives may be available via the augmented reality application 130. Thus, in one implementation, the incentive data repository 165 may associate the content or other virtual items relating to the deals, special offers, coupons, or other incentives with virtual objects that can be found in the specified locations via the augmented reality application 130, and the processor 175 and the augmented reality software 180 may deliver the deals, special offers, coupons, or other incentives to the augmented reality application 130 in response to the user finding and interacting with the associated virtual objects in the specified locations. In one implementation, the advertising data repository 170 may similarly include advertisement content that third-parties submit to the augmented reality server 150, which may be delivered to the augmented reality application 130 on the mobile device 110 in a manner targeted to the user associated therewith based on the personal data associated with the user, friends associated with the user, the identifying data associated with the mobile device 110, the records that relate to the location associated with the mobile device 110, behavior patterns associated with using the augmented reality application 130, or other suitable criteria. 
Furthermore, in one implementation, the advertising data repository 170 may similarly associate the advertisement content with virtual objects that can be found in certain locations, whereby the advertisements may be delivered to the augmented reality application 130 in response to the user finding and interacting with the associated virtual objects. Alternatively (or additionally), the advertisements and the deals, special offers, coupons, or other incentives may not necessarily be associated with any particular virtual objects and instead delivered to the augmented reality application 130 in response to certain conditions or criteria (e.g., a special offer or coupon may be delivered to a particular user that wins a treasure hunt, scavenger hunt, or other interactive game involving the virtual objects, advertisements may be delivered to mobile devices 110 in certain locations, etc.).
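The association between incentives and particular virtual objects can be sketched as a simple lookup performed when the user interacts with an object. All identifiers and the payload structure below are illustrative assumptions.

```python
# Minimal sketch of how the incentive data repository (165) might associate
# third-party deals or coupons with specific virtual objects, and deliver
# them in response to the user finding and interacting with those objects.

incentives = {
    # object_id -> incentive payload submitted by a third party
    "obj-coffee": {"type": "coupon", "text": "20% off any latte"},
    "obj-arena": {"type": "offer", "text": "Buy one ticket, get one free"},
}

def on_object_interaction(object_id):
    """Return the associated incentive, if any, when the user interacts
    with a virtual object; None when no incentive is attached."""
    return incentives.get(object_id)

print(on_object_interaction("obj-coffee")["text"])  # 20% off any latte
print(on_object_interaction("obj-unknown"))         # None
```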
Accordingly, in one implementation, the augmented reality server 150 may maintain and utilize the user and mobile device data 155, the virtual object content 160, the incentive data 165, and the advertising data 170 contained in the repositories to support the augmented reality application 130 interacting with other users associated with other mobile devices 110 that have the augmented reality application 130 installed thereon (e.g., via granting the augmented reality application 130 access to social networking or other third-party applications 135 that relate to the other users or the other mobile devices 110), and further to provide customized content or other virtual items relating to advertisements, incentives, special offers, coupons, and new features associated with the augmented reality application 130 (e.g., based on the user and mobile device data 155). Furthermore, in one implementation, the augmented reality server 150 may store one or more cookies or other state data on the mobile device 110 to preserve settings associated with the augmented reality application 130, although the user may have the option to disable or otherwise decline to store the cookies or other state data on the mobile device 110, in which case certain features associated with the augmented reality application 130 that require the cookies or other state data may be disabled.
In one implementation, in response to having registered the user associated with the augmented reality application 130, the augmented reality server 150 may execute the augmented reality software 180 to support interaction with the virtual objects in the augmented realities via the augmented reality application 130. For example, in one implementation, the location sensors 115a associated with the mobile device may continuously obtain the location data 115b that represents the current location, position, and/or orientation associated with the mobile device 110, wherein the augmented reality application 130 may continuously communicate the location data 115b that represents the current location, position, and/or orientation associated with the mobile device 110 to the augmented reality server 150. For example, in one implementation, the augmented reality application 130 and the augmented reality server 150 may generally use Comet or similar low-latency communication technology to manage the augmented realities. In particular, the augmented reality application 130 may open a persistent connection with the augmented reality server 150 to send and receive data relating to all events associated with the augmented realities, whereby the low-latency communication technology used in the system 100 may cause the augmented reality server 150 and the augmented reality application 130 to push the data relating to the events associated with the augmented realities to one another at any time without closing the persistent connection.
Accordingly, in one implementation, the low-latency communication technology may substantially reduce latencies between a time when the user takes certain actions with respect to the virtual objects or the augmented realities associated therewith and the time when the actions are reflected on the display associated with other users that may be interacting therewith (e.g., to a half-second or less). Furthermore, in one implementation, the augmented reality server 150 may use the low-latency communication technology to adjudicate situations in which multiple users trigger different actions on a particular virtual object at substantially the same time. For example, if the different actions are incompatible (e.g., because the multiple users moved the virtual object in opposite directions), the augmented reality server 150 may determine which action happened first, execute the action that happened first and discard any subsequent actions incompatible therewith, and then notify the augmented reality applications 130 on the mobile devices 110 associated with any other users that triggered the subsequent incompatible actions that those actions were discarded. In one implementation, the augmented reality applications 130 associated with the other users may then update or otherwise correct the user interface 145 to properly reflect the action that happened first in lieu of the subsequent incompatible actions that the other users triggered (e.g., if the user interface 145 was updated to indicate that the subsequent action occurred, the augmented reality application 130 may update or otherwise correct the user interface 145 to undo the subsequent action even if the user interface 145 previously showed that the action was triggered).
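The first-action-wins adjudication described above can be sketched as a simple timestamp sort on the server. This is an illustrative reduction only; the tuple layout and the assumption that all listed actions are mutually incompatible are simplifications.

```python
# Hedged sketch of adjudicating incompatible actions triggered on the same
# virtual object at substantially the same time: the earliest action wins,
# and later incompatible actions are discarded so the triggering clients can
# be notified to correct their user interfaces.

def adjudicate(actions):
    """Given (timestamp, user_id, action) tuples targeting one virtual
    object, return the winning action and the discarded ones."""
    ordered = sorted(actions, key=lambda a: a[0])
    winner = ordered[0]          # the action that happened first
    discarded = ordered[1:]      # subsequent incompatible actions
    return winner, discarded

actions = [
    (1000.250, "user-b", "move-right"),
    (1000.120, "user-a", "move-left"),  # happened first
]
winner, discarded = adjudicate(actions)
print(winner[1])  # user-a
```

A real server would additionally check whether later actions are actually compatible with the winner before discarding them, and would push the correction events over the persistent connection described above.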
As such, in one implementation, the low-latency communication technology may result in the augmented reality server 150 delivering an event relating to the action that happened first to the augmented reality application 130, wherein the event may cause the augmented reality application 130 to update or otherwise correct the user interface 145 in approximately half a second (e.g., to reflect that the subsequent incompatible action was not triggered on the virtual object and that the first action initiated by the other user was triggered). In one implementation, further exemplary detail relating to the low-latency communication technology that may be used in the system may be provided in “Comet: Low Latency Data for the Browser,” the contents of which are hereby incorporated by reference in their entirety.
As such, in one implementation, the augmented reality server 150 may use the current location data 115b associated with the mobile device 110 to derive real-world coordinates that represent the viewpoint associated with the camera 120. For example, the augmented reality software 180 may use image registration, image recognition, visual odometry, or any other suitable technique to detect interest points, fiduciary markers, or optical flow information from the location data 115b, which may be used to detect real-world features that represent the viewpoint associated with the camera 120 (e.g., corners, edges, or other real-world features in a scene that represents the viewpoint associated with the camera 120). The augmented reality software 180 may then map geometry associated with the real-world features that represent the viewpoint associated with the camera 120 to construct real-world coordinates that represent the scene corresponding to the viewpoint, which may be correlated to the coordinates associated with the virtual objects in the virtual object content repository 160. While the above description provides exemplary techniques that may be used to map the viewpoint associated with the camera 120, other suitable techniques may be used, including those described in “Marker Tracking and HMD Calibration for a Video-based Augmented Reality Conferencing System,” the contents of which are hereby incorporated by reference in their entirety.
As such, in response to identifying any virtual objects in the virtual object content repository 160 having coordinates within the viewpoint associated with the camera 120, the augmented reality software 180 may deliver information relating to the coordinates, any custom designs, embedded content, virtual items, or other suitable data relating to the identified virtual objects to the augmented reality application 130, which may render the identified virtual objects on the user interface 145 associated with the mobile device 110 in combination with the scene corresponding to the viewpoint associated with the camera 120. For example, the augmented reality application 130 may superimpose the virtual objects displayed in the user interface 145 over a real-world image that represents the viewpoint associated with the camera 120 to generate an augmented reality that combines the virtual objects and scene that represents the viewpoint associated with the camera 120. Further, as noted above, the augmented reality application 130 may continuously communicate the current location, position, and/or orientation associated with the mobile device 110 to the augmented reality server 150 over the network, which the augmented reality software 180 hosted thereon may use to continuously derive, map, or otherwise determine the current viewpoint associated with the camera 120. As such, based on the current viewpoint associated with the camera 120, the augmented reality software 180 may determine whether any virtual objects are present in the current viewpoint associated with the camera 120, and the augmented reality application 130 may use data that the augmented reality server 150 delivers in relation thereto to continually refresh the user interface 145 to reflect movements or changes in the current viewpoint associated with the camera 120 in substantially real-time.
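The test for whether a virtual object's stored coordinates fall within the current camera viewpoint can be sketched as a bearing comparison against the device's compass heading. The 60-degree field of view and the simplified spherical bearing math below are assumptions for illustration, not the described implementation.

```python
import math

# Illustrative sketch: given the mobile device's location and compass
# heading, decide whether a virtual object's coordinates lie within the
# camera viewpoint (here, within half the assumed field of view of the
# heading). Distance culling and elevation are omitted for brevity.

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def in_viewpoint(cam_lat, cam_lon, cam_heading, obj_lat, obj_lon, fov=60.0):
    """True if the object's bearing lies within the camera's field of view."""
    b = bearing_deg(cam_lat, cam_lon, obj_lat, obj_lon)
    diff = (b - cam_heading + 180) % 360 - 180  # signed angular difference
    return abs(diff) <= fov / 2

# Camera facing due north (heading 0): an object due north is visible,
# an object due east is not.
print(in_viewpoint(40.0, -74.0, 0.0, 40.001, -74.0))  # True
print(in_viewpoint(40.0, -74.0, 0.0, 40.0, -73.999))  # False
```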
For example, in one implementation, locations where the virtual objects are displayed in the user interface 145 may be refreshed to reflect the current viewpoint, remove the virtual objects from the user interface 145 if previously displayed virtual objects are no longer present in the current viewpoint, reflect movements associated with the camera 120, the virtual objects themselves, or combinations thereof, or otherwise reflect changes to the locations associated with the virtual objects and/or the current viewpoint in substantially real-time. For example, if a particular virtual object moves left and the camera 120 moves right, if multiple virtual objects move into the current viewpoint in different directions, or other changes to the locations associated with the virtual objects and/or the current viewpoint occur, the augmented reality server 150 and the augmented reality application 130 may cooperatively communicate to seamlessly handle refreshing the user interface 145 to reflect the virtual object and camera 120 movements or other changes to the locations associated therewith. Furthermore, in one implementation, the user may simultaneously view the virtual objects from different viewpoints (e.g., on a map, in a live viewpoint associated with the camera 120, etc.), and the augmented reality application 130 may automatically and substantially immediately update the view associated with the virtual objects to reflect any actions that other users may take to interact with the virtual objects that the user may be viewing. 
Accordingly, the user associated with the mobile device 110 may simply point the camera 120 at real-world surroundings, and the augmented reality application 130 may communicate with the augmented reality server 150 in a substantially continuous manner to refresh the user interface 145 based on whether or not any virtual objects are present in the surroundings where the camera 120 currently points, properly orient and re-orient the virtual objects with respect to distances, positions, and rotations associated therewith and/or other virtual objects relative to where the camera 120 currently points, or otherwise manage the virtual objects displayed in the user interface 145.
Having provided the above overview generally describing the architectural components and functionality associated with the system 100 that enables interaction with virtual objects in augmented realities, the following description relating to
For example, in one implementation,
In one implementation, the user interface 200 may further include an interaction menu 210 that includes one or more navigation options to navigate the augmented reality application (e.g., a back button to return to a previous user interface) and one or more interaction options to interact with any virtual objects 260 that may be displayed in the augmented reality area 240. For example, in one implementation, the interaction options may include a take button to collect or otherwise interact with a virtual object 260 displayed in the augmented reality area 240, a destroy button to delete or otherwise discard a virtual object 260 displayed in the augmented reality area 240, or various other buttons that may be used to interact with the virtual objects 260 (e.g., an edit button to modify the virtual objects 260, an open button to view, see, or otherwise interact with content or virtual items embedded in the virtual objects 260, an add button to embed content or virtual items in the virtual objects 260, a move button to relocate the virtual objects 260 to another place, a pocket button to deliver or share the virtual objects 260 with another user, etc.). Furthermore, in one implementation, the user interface 200 may include a virtual object menu 220 having various options to manage virtual objects 260 associated with the user (e.g., the virtual objects 260 created by the user, collected virtual objects 260 that other users created, any virtual objects 260 shown in the augmented reality area 240, etc.). 
For example, in one implementation, the virtual object menu 220 may include a view option to view the virtual objects 260 within the augmented reality area 240, an actions option to take, collect, destroy, move, and/or otherwise interact with the virtual objects 260, a comments option to post and share comments relating to the virtual objects 260 with friends, contacts, or other users, and a contents option to view, embed, remove, or otherwise interact with text, pictures, graphics, audio, video, icons, games, software, or other content or virtual items associated with the virtual objects 260. Moreover, in one implementation, the user interface 200 may include a main menu 230 having various options to use the augmented reality application to interact with the virtual objects 260. For example, in one implementation, the main menu 230 may include a virtual objects option to display the virtual objects 260 associated with the user in the augmented reality area 240 (e.g., virtual objects 260 created by, collected by, shared with, or otherwise associated with the user), a map option to render or otherwise display a map in the augmented reality area 240 in addition to any virtual objects 260 present therein, a live view option to show a current camera viewpoint in the augmented reality area 240 and any virtual objects 260 present therein, a create option to design, develop, or otherwise create a new virtual object 260, and a profile option to create, update, edit, or otherwise modify a personal profile page associated with the user (e.g., to provide name, address, contact data, or other information associated with the user, update payment information needed to upgrade the augmented reality application, activate certain features, buy virtual objects 260 having certain types, buy certain designs that can be applied to the virtual objects 260, upload custom designs that can be applied thereto, post information to share with other users, etc.).
In one implementation,
In one implementation, augmented reality area 340 may further include a compass icon to show the current direction or orientation associated with the mobile device relative to the planetary magnetic poles (i.e., to magnetic north) and a zoom option to increase or decrease the physical area shown in the map. In one implementation, the user interface 300 associated with the map option may include a clustering capability to represent multiple virtual objects 360 having respective locations within a suitable proximity to one another. For example, in response to the user selecting the zoom option to increase the physical area shown in the map or otherwise zoom out to a higher geographic level in which multiple virtual objects 360 are located within a certain physical proximity, the multiple virtual objects 360 may be formed into clusters 360, which may represent the multiple virtual objects 360 located within a proximity to locations associated with the clusters 360. Furthermore, in one implementation, the clusters 360 may have different icons than the virtual objects 360 represented therewith (e.g., the virtual objects 360 themselves may have icons corresponding to designs or other customizations that have been applied thereto, whereas the clusters 360 may have specific icons to indicate that multiple virtual objects 360 are represented thereby in addition to information indicating how many virtual objects 360 are contained therein). In addition, the augmented reality area 340 may change, combine, merge, or separate the clusters 360 or the multiple virtual objects 360 represented therewith depending on how much the user has zoomed in or zoomed out the physical area shown in the map, and the user may touch any particular cluster 360 shown in the map to see a list that identifies the multiple virtual objects 360 contained in the cluster 360 and then select a particular one of the multiple virtual objects 360 to further interact with (if so desired).
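The zoom-dependent clustering behavior described above can be sketched with a simple grid-cell scheme: objects whose map positions fall in the same cell at the current zoom level are merged into one cluster, and zooming in shrinks the cells so clusters separate. The cell-size-per-zoom rule below is an illustrative assumption.

```python
from collections import defaultdict

# Minimal sketch of map clustering: group (object_id, lat, lon) tuples into
# grid-cell clusters whose size depends on the zoom level. Zooming out
# (lower zoom) enlarges the cells, merging nearby objects into clusters.

def cluster_objects(objects, zoom):
    """Group objects into grid-cell clusters; returns cell -> [object_ids]."""
    cell_size = 1.0 / (2 ** zoom)  # degrees per cell at this zoom level
    clusters = defaultdict(list)
    for object_id, lat, lon in objects:
        cell = (int(lat // cell_size), int(lon // cell_size))
        clusters[cell].append(object_id)
    return dict(clusters)

objects = [
    ("obj-1", 40.7128, -74.0060),
    ("obj-2", 40.7130, -74.0058),  # a few dozen meters from obj-1
    ("obj-3", 40.7484, -73.9857),  # roughly 4 km away
]
zoomed_out = cluster_objects(objects, zoom=5)   # obj-1 and obj-2 merge
zoomed_in = cluster_objects(objects, zoom=14)   # all three separate
print(len(zoomed_out), len(zoomed_in))  # 2 3
```

Cluster icons would then display `len(cluster)` as the count of contained virtual objects, and touching a cluster would list its member objects for further interaction.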
For example, in one implementation, the mobile device may include a touch-screen display that shows the user interface 300, the augmented reality area 340 displayed therein, and all other user interfaces described herein, whereby the user may simply touch the cluster 360 on the map to see and further interact with the multiple virtual objects 360 contained therein.
Accordingly, the map option in the main menu 330 may generally cause the augmented reality area 340 to combine one or more images that represent a certain physical area with data that represents virtual objects 360 located in that physical area, thereby representing an augmented reality that superimposes or otherwise overlays the data that represents virtual objects 360 over the images that represent physical reality, which may enable the user to see where any virtual objects 360 that may be present at or otherwise near the current location associated with the mobile device are located. Additionally, in one implementation, the user interface 300 may include a user menu 350 to configure which virtual objects 360 to display in the augmented reality area 340. For example, in one implementation, the user menu 350 may include an option to display all virtual objects 360 located within the physical reality currently shown in the augmented reality area 340, or alternatively to restrict the displayed virtual objects 360 located therein to those that the user created or to those that friends, contacts, or other users (besides the current user) created. Furthermore, the various roads, points of interest, satellite images (in the hybrid mode), and other visual features associated with the physical reality shown in the augmented reality area 340 in combination with the compass, the zoom option, and the indicator 370 that visually represents the current location associated with the mobile device in the augmented reality area 340 may enable the user to navigate to the locations associated with virtual objects 360 and thereby collect or otherwise interact with the virtual objects 360.
More particularly, in response to the location sensors tracking movements or other changes in the current location associated with the mobile device, the indicator 370 in the augmented reality area 340 may be dynamically updated to reflect the current location associated with the mobile device in a substantially continuous manner, and the user may reference the compass, change the zoom level, toggle between the map mode and the hybrid mode, or otherwise interact with the information displayed in the augmented reality area 340 in order to track down and collect virtual objects 360, properly orient and re-orient the virtual objects 360 with respect to distances, positions, and rotations associated therewith and/or other virtual objects 360 relative to the augmented reality area 340, or otherwise manage the virtual objects 360 displayed in the augmented reality area 340.
In one implementation,
Accordingly, the live view option in the main menu 430 may generally cause the augmented reality area 440 to combine a physical reality image that represents a current viewpoint associated with the camera and data that represents virtual objects 460 located in the forward projected plane, thereby creating an augmented reality that may enable the user to locate, collect, and otherwise interact with any virtual objects 460 that may be located therein. Furthermore, the manner in which the augmented reality area 440 shows the virtual objects 460 against the physical reality image may be dynamically updated in response to the location sensors tracking movements or other changes in the current location, position, and/or orientation associated with the mobile device, which may reflect changes in the current viewpoint associated with the camera. For example, in one implementation, based on the changes in the current location, position, and/or orientation associated with the mobile device, the augmented reality application may use the location data associated with the virtual objects 460 to suitably update the current location, position, and/or orientation that the virtual objects 460 have within the current camera viewpoint (e.g., showing visual design features associated with the virtual objects 460 based on the camera viewpoint relative to the location associated with the virtual objects 460, making the virtual objects larger or smaller within the augmented reality area 440 depending on whether the camera viewpoint has moved closer or farther away from the location associated with the virtual objects, etc.). 
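The distance-based scaling mentioned above (objects drawn larger as the camera viewpoint moves closer) matches simple perspective projection and can be sketched as follows. The base size and reference distance are assumed values for illustration.

```python
# Illustrative sketch: scale a virtual object's rendered size inversely with
# its distance from the camera viewpoint, so it grows as the viewpoint moves
# closer and shrinks as the viewpoint moves farther away.

def apparent_size(base_size_px, reference_distance_m, distance_m):
    """Rendered size in pixels for an object at distance_m, given its size
    at reference_distance_m (simple perspective scaling)."""
    distance_m = max(distance_m, 0.1)  # avoid division by zero up close
    return base_size_px * reference_distance_m / distance_m

near = apparent_size(64, 10.0, 5.0)   # half the reference distance
far = apparent_size(64, 10.0, 20.0)   # twice the reference distance
print(near, far)  # 128.0 32.0
```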
In one implementation, in response to the augmented reality application determining that the virtual objects 460 are within a suitable proximity to the current camera viewpoint (e.g., substantially close enough that the user could touch a particular virtual object 460 if the virtual object 460 was physically present), the augmented reality application may then enable the user to collect, view, move, and/or otherwise interact with virtual objects 460.
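The proximity test described above (interaction enabled once the user is close enough to "touch" the virtual object) can be sketched with a great-circle distance check. The 2-meter reach threshold is an assumed value.

```python
import math

# Sketch of enabling interaction only within touching distance: compute the
# great-circle distance between the device and the virtual object via the
# haversine formula, and compare it to an assumed reach threshold.

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def can_interact(device, obj, reach_m=2.0):
    """True once the device is within touching distance of the object."""
    return distance_m(*device, *obj) <= reach_m

print(can_interact((40.712800, -74.006000), (40.712805, -74.006000)))  # True (~0.6 m)
print(can_interact((40.712800, -74.006000), (40.713000, -74.006000)))  # False (~22 m)
```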
Furthermore, in one implementation,
In one implementation,
Additionally, in one implementation, the user interface 500 may include the interaction menu 510 to enable the user to take, collect, destroy, kick, move, and/or otherwise interact with the virtual object 560, and the augmented reality area 540 may further include an action option 590 to invoke the appropriate action on the virtual object 560. For example, as noted above, the touch-screen display associated with the mobile device may show the user interface 500, the augmented reality area 540 displayed therein, and all other user interfaces described herein, whereby the user may simply touch any suitable virtual object 560 on the touch-screen display to open the virtual object 560, see content or other virtual items that may be embedded therein, and interact with that embedded content or those other virtual items if so desired (e.g., as described in further detail below with respect to
Furthermore, in one implementation,
Additionally, in one implementation,
For example, in one implementation,
For example, in one implementation,
In one implementation, the live view 940a may generally superimpose the virtual ball 960 and/or the virtual goal over the physical reality image corresponding to the current camera viewpoint if current locations associated with the virtual ball 960 and/or the virtual goal fall within the current camera viewpoint. Alternatively, if the current locations associated with the virtual ball 960 and/or the virtual goal fall outside the current camera viewpoint, the live view 940a may include a direction indicator 975 and a distance indicator 995 to respectively show where and how far away the virtual ball 960 and/or the virtual goal are located relative to the current camera viewpoint. For example, in one implementation,
In one implementation, to play the interactive game, the live view 940a may include an action option 990 that may be invoked to trigger a bump-kick, nudge, or other action on the virtual ball 960, or the live view 940a may alternatively omit the action option 990 to enable a user (or player) to trigger the action on the virtual ball 960 simply via moving the mobile device. For example, in the latter case, the action may be invoked in response to the player approaching the virtual ball 960 and coming within a certain proximity thereto, at which time the virtual ball 960 may automatically move to a new location based on a current direction, elevation angle, or other orientation associated with the mobile device. Furthermore, in one implementation, the distance that the virtual ball 960 moves may depend on a speed at which the player moved the mobile device over the ground at the time that the mobile device came close enough to the virtual ball 960 to trigger the action. Alternatively, in the former case where the live view 940a includes the action option 990, the player may select the action option 990 to trigger the bump-kick, nudge, or other action on the virtual ball 960 once the mobile device has been moved within the proximity to the virtual ball 960 required to trigger the action. For example, in one implementation, the action option 990 may be disabled or only appear once the mobile device has been moved within the required proximity, at which time the action option 990 may be selected and subsequent gestures that involve moving the mobile device may trigger the action. As such, the action option 990 may be used to move the virtual ball 960 in a generally similar manner to the automatic mechanism described above via the subsequent gestures (e.g., moving the location associated with the virtual ball 960 a certain distance based on the direction, elevation angle, orientation, and speed at which the gestures occurred).
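The bump-kick mechanic described above can be sketched as displacing the ball's stored location along the device's current heading, with the travel distance scaled by the gesture speed. The speed-to-distance factor and the flat-earth offset math below are illustrative assumptions.

```python
import math

# Hedged sketch of the kick action: move the virtual ball's location a
# distance proportional to the gesture speed, in the direction of the
# device's current compass heading (elevation angle omitted for brevity).

def kick_ball(lat, lon, heading_deg, speed_mps, factor=3.0):
    """Return the ball's new (lat, lon) after a kick along heading_deg."""
    distance_m = speed_mps * factor          # faster gesture -> longer kick
    meters_per_deg_lat = 111320.0            # approximate meters per degree
    dlat = distance_m * math.cos(math.radians(heading_deg)) / meters_per_deg_lat
    dlon = (distance_m * math.sin(math.radians(heading_deg))
            / (meters_per_deg_lat * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# Kick due north at walking speed: latitude increases, longitude is unchanged.
new_lat, new_lon = kick_ball(40.0, -74.0, heading_deg=0.0, speed_mps=1.5)
print(new_lat > 40.0, new_lon == -74.0)  # True True
```

The server would then persist the ball's new location (as with any moved virtual object) so all players' live views update over the low-latency connection.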
Accordingly, in one implementation, the user interface 900 shown in
Implementations of the invention may be made in hardware, firmware, software, or any suitable combination thereof. The invention may also be implemented as instructions stored on a machine-readable medium that can be read and executed on one or more processing devices. For example, the machine-readable medium may include various mechanisms that can store and transmit information that can be read on the processing devices or other machines (e.g., read only memory, random access memory, magnetic disk storage media, optical storage media, flash memory devices, or any other storage or non-transitory media that can suitably store and transmit machine-readable information). Furthermore, although firmware, software, routines, or instructions may be described in the above disclosure with respect to certain exemplary aspects and implementations performing certain actions or operations, it will be apparent that such descriptions are merely for the sake of convenience and that such actions or operations in fact result from processing devices, computing devices, processors, controllers, or other hardware executing the firmware, software, routines, or instructions. Moreover, to the extent that the above disclosure describes executing or performing certain operations or actions in a particular order or sequence, such descriptions are exemplary only and such operations or actions may be performed or executed in any suitable order or sequence.
Furthermore, aspects and implementations may be described in the above disclosure as including particular features, structures, or characteristics, but it will be apparent that every aspect or implementation may or may not necessarily include the particular features, structures, or characteristics. Further, where particular features, structures, or characteristics have been described in connection with a specific aspect or implementation, it will be understood that such features, structures, or characteristics may be included with other aspects or implementations, whether or not explicitly described. Thus, various changes and modifications may be made to the preceding disclosure without departing from the scope or spirit of the invention, and the specification and drawings should therefore be regarded as exemplary only, with the scope of the invention determined solely by the appended claims.