Location mapping for large scale augmented-reality

Information

  • Patent Grant
  • Patent Number
    11,915,400
  • Date Filed
    Friday, June 10, 2022
  • Date Issued
    Tuesday, February 27, 2024
Abstract
An augmented-reality (AR) system performs operations that include: accessing a data object that comprises image data, location data, and orientation data; applying a transformation to the data object to produce a rectified data object; generating a point cloud based on the rectified data object; assigning the point cloud to a location based on at least the location data of the data object; detecting a client device at the location; and loading the point cloud to the client device in response to detecting the client device at the location.
Description
TECHNICAL FIELD

Embodiments of the present disclosure relate generally to mobile computing technology and, more particularly, but not by way of limitation, to systems for presenting augmented-reality (AR) content at a client device.


BACKGROUND

Augmented-reality is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. The primary value of augmented reality is the manner in which components of the digital world blend into a person's perception of the real world, not as a simple display of data, but through the integration of immersive sensations, which are perceived as natural parts of an environment.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the figure number in which that element is first introduced.



FIG. 1 is a block diagram showing an example messaging system for exchanging data (e.g., messages and associated content) over a network in accordance with some embodiments, wherein the messaging system includes an augmented-reality system.



FIG. 2 is a block diagram illustrating further details regarding a messaging system, according to example embodiments.



FIG. 3 is a block diagram illustrating various modules of an augmented-reality system, according to certain example embodiments.



FIG. 4 is a flowchart depicting a method of generating a point cloud, according to certain example embodiments.



FIG. 5 is a flowchart depicting a method of generating a point cloud, according to certain example embodiments.



FIG. 6 is a flowchart depicting a method of loading a portion of a point cloud at a client device, according to certain example embodiments.



FIG. 7 is a diagram depicting a method of selecting a portion of a point cloud, according to certain example embodiments.



FIG. 8 is a block diagram illustrating a representative software architecture, which may be used in conjunction with various hardware architectures herein described and used to implement various embodiments.



FIG. 9 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.





DETAILED DESCRIPTION

As discussed above, augmented-reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information. Some AR systems make use of point clouds to generate and present AR content, wherein a point cloud is a set of data points in space which measure surface features and external surfaces of objects around them.


Use of point clouds is generally limited to small areas, due to the amount of data required to actually generate a point cloud. For example, the creation of a point cloud that defines the surface features in a single room may be relatively straightforward, while the creation of a point cloud that represents surface features of a neighborhood or city may be logistically impossible under current systems for a number of reasons. Collection of the data necessary to generate such a large point cloud is inherently tedious and time-consuming and requires a great deal of organization and analysis. Furthermore, the resulting point cloud generated by such a system would be very large and computationally demanding, making it inefficient and impractical for use in the display of AR content at client devices that include mobile devices.


Accordingly, in certain example embodiments, an AR system is disclosed which performs operations that include: accessing a data object that comprises image data, location data, and orientation data; applying a transformation to the data object to produce a rectified data object; generating a point cloud based on the rectified data object; assigning the point cloud to a location based on at least the location data of the data object; detecting a client device at the location; and loading the point cloud to the client device in response to detecting the client device at the location.


In some example embodiments, the data object may include images and videos collected by a plurality of client devices and indexed within a database based on location data that corresponds with the images and videos. The AR system may access the database and generate the point cloud for a given location based on the image data and location data from the images and videos collected from the plurality of client devices.


In some example embodiments, the data object may include images and videos collected from an omnidirectional camera (360 camera), wherein the 360 camera has a field of view that covers approximately an entire sphere or at least a full circle in the horizontal plane. In such embodiments, images and videos may be collected from the 360 camera wherein the images and videos include timestamps and location data.


In some example embodiments, to generate the point cloud, the AR system may access video data that comprises a set of video frames, wherein each video frame comprises a timestamp, location data, orientation data, and image data. The AR system may export a portion of the set of video frames, and generate a point cloud based on the portion of the set of video frames. To generate the point cloud based on the data objects, in certain embodiments, the AR system may perform a transformation upon the data object, wherein the transformation includes a linear rectification.
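As a purely illustrative picture of such per-frame data objects, the sketch below shows one possible representation in Python; the field names, units, and the fixed export stride are assumptions for illustration and are not prescribed by the disclosure (later figures describe selecting frames based on context).

```python
from dataclasses import dataclass
from typing import List

@dataclass
class VideoFrame:
    """Hypothetical per-frame record combining image, location, and orientation data."""
    timestamp: float        # seconds since the epoch
    latitude: float         # location data
    longitude: float
    yaw: float              # orientation data, in degrees
    pitch: float
    roll: float
    image: bytes            # encoded image data (e.g., JPEG)

def export_portion(frames: List[VideoFrame], stride: int = 10) -> List[VideoFrame]:
    """Export every `stride`-th frame as the subset used for point cloud generation."""
    return frames[::stride]
```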


In some embodiments, loading the point cloud at the client device may include operations to identify a portion of the point cloud to be loaded at the client device. For example, as discussed above, a technical issue with the use of point clouds to present AR content in a large environment is the computational demand of a large-scale point cloud. Accordingly, in certain embodiments, the AR system may identify a portion of the point cloud to be loaded at the client device based on one or more contextual conditions or factors.


In some embodiments, the contextual factors may include a location of the client device, wherein the location and orientation of the client device defines a viewpoint of the client device. The AR system may determine what landmarks and surface features are visible from the viewpoint of the client device and identify a portion of the point cloud based on the visible landmarks and surface features from the viewpoint of the client device.


In some embodiments, the contextual factors may include a time of day. In such embodiments, the data objects may comprise images and videos that include timestamps which indicate a time of day in which the images and videos were collected. Accordingly, the point cloud may comprise a plurality of points, wherein a single surface feature may be represented by more than one point, and each point may be based on a different time of day. For example, a given surface feature or landmark may have a first set of points that represent the surface feature or landmark at a first time of day (i.e., morning), and a second set of points that represent the surface feature or landmark at a second time of day (i.e., evening). The AR system may therefore identify the portion of the point cloud based on temporal considerations including a time of day in which the client device is at a given location or in which the client device requests AR content.
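The per-point time-of-day attribute described above might be stored and queried as in the following sketch. The array layout, the hour field, and the `select_points_for_time` helper are illustrative assumptions rather than details taken from the disclosure.

```python
import numpy as np

# Hypothetical layout: each point stores x, y, z plus the hour of day (0-23)
# at which the underlying image data was captured.
points = np.array([
    # x,    y,    z,   hour
    [1.0,  2.0,  0.5,   9],   # captured in the morning
    [1.1,  2.1,  0.5,  21],   # same surface feature, captured in the evening
    [4.0, -1.0,  2.0,   9],
    [4.0, -1.0,  2.0,  22],
])

def select_points_for_time(points: np.ndarray, hour: int, tolerance: int = 3) -> np.ndarray:
    """Return only the points whose capture hour is within `tolerance` hours
    of the requested hour, wrapping around midnight."""
    diff = np.abs(points[:, 3] - hour)
    diff = np.minimum(diff, 24 - diff)          # circular distance on a 24-hour clock
    return points[diff <= tolerance]

# A request arriving at 8 pm would receive only the evening points.
evening_portion = select_points_for_time(points, hour=20)
```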


In some embodiments, the contextual factors may include attributes of the client device itself, including a memory or storage capacity of the client device, as well as a network connectivity speed of the client device. Accordingly, an optimal size of a portion of a point cloud may be determined based on the device attributes of the client device, and a portion of the point cloud may be selected based on the optimal size.
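One way such an optimal size could be derived from device attributes is sketched below; the formula, thresholds, and per-point byte size are illustrative assumptions only.

```python
def point_budget(free_memory_bytes: int,
                 network_speed_bps: float,
                 max_wait_seconds: float = 2.0,
                 bytes_per_point: int = 16,
                 memory_fraction: float = 0.10) -> int:
    """Estimate how many points a client device can reasonably receive.

    The budget is the smaller of (a) a fraction of the device's free memory and
    (b) what the network can deliver within an acceptable wait time.
    All constants here are illustrative defaults, not values from the disclosure.
    """
    memory_limit = int(free_memory_bytes * memory_fraction) // bytes_per_point
    network_limit = int(network_speed_bps / 8 * max_wait_seconds) // bytes_per_point
    return min(memory_limit, network_limit)

# Example: a device with 2 GB free memory and a 20 Mbps connection.
budget = point_budget(free_memory_bytes=2 * 1024**3, network_speed_bps=20e6)
```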



FIG. 1 is a block diagram showing an example messaging system 100 for exchanging data (e.g., messages and associated content) over a network. The messaging system 100 includes one or more client devices 102, each of which hosts a number of applications including a messaging client application 104. Each messaging client application 104 is communicatively coupled to other instances of the messaging client application 104 and a messaging server system 108 via a network 106 (e.g., the Internet).


Accordingly, each messaging client application 104 is able to communicate and exchange data with another messaging client application 104 and with the messaging server system 108 via the network 106. The data exchanged between messaging client applications 104, and between a messaging client application 104 and the messaging server system 108, includes functions (e.g., commands to invoke functions) as well as payload data (e.g., text, audio, video or other multimedia data).


The messaging server system 108 provides server-side functionality via the network 106 to a particular messaging client application 104. While certain functions of the messaging system 100 are described herein as being performed by either a messaging client application 104 or by the messaging server system 108, it will be appreciated that the location of certain functionality either within the messaging client application 104 or the messaging server system 108 is a design choice. For example, it may be technically preferable to initially deploy certain technology and functionality within the messaging server system 108, but to later migrate this technology and functionality to the messaging client application 104 where a client device 102 has a sufficient processing capacity.


The messaging server system 108 supports various services and operations that are provided to the messaging client application 104. Such operations include transmitting data to, receiving data from, and processing data generated by the messaging client application 104. In some embodiments, this data includes message content, client device information, geolocation information, media annotation and overlays, message content persistence conditions, social network information, and live event information, as examples. In other embodiments, other data is used. Data exchanges within the messaging system 100 are invoked and controlled through functions available via GUIs of the messaging client application 104.


Turning now specifically to the messaging server system 108, an Application Program Interface (API) server 110 is coupled to, and provides a programmatic interface to, an application server 112. The application server 112 is communicatively coupled to a database server 118, which facilitates access to a database 120 in which is stored data associated with messages processed by the application server 112.


Dealing specifically with the Application Program Interface (API) server 110, this server receives and transmits message data (e.g., commands and message payloads) between the client device 102 and the application server 112. Specifically, the Application Program Interface (API) server 110 provides a set of interfaces (e.g., routines and protocols) that can be called or queried by the messaging client application 104 in order to invoke functionality of the application server 112. The Application Program Interface (API) server 110 exposes various functions supported by the application server 112, including account registration, login functionality, the sending of messages, via the application server 112, from a particular messaging client application 104 to another messaging client application 104, the sending of media files (e.g., images or video) from a messaging client application 104 to the messaging server application 114, and for possible access by another messaging client application 104, the setting of a collection of media data (e.g., story), the retrieval of a list of friends of a user of a client device 102, the retrieval of such collections, the retrieval of messages and content, the adding and deletion of friends to a social graph, the location of friends within a social graph, and the opening of an application event (e.g., relating to the messaging client application 104).


The application server 112 hosts a number of applications and subsystems, including a messaging server application 114, an image processing system 116, a social network system 122, and an AR system 124. The AR system 124 is configured to generate a point cloud based on image data, and load the point cloud at a client device 102, according to certain example embodiments. Further details of the AR system 124 can be found in FIG. 3 below.


The messaging server application 114 implements a number of message processing technologies and functions, particularly related to the aggregation and other processing of content (e.g., textual and multimedia content) included in messages received from multiple instances of the messaging client application 104. As will be described in further detail, the text and media content from multiple sources may be aggregated into collections of content (e.g., called stories or galleries). These collections are then made available, by the messaging server application 114, to the messaging client application 104. Other processor and memory intensive processing of data may also be performed server-side by the messaging server application 114, in view of the hardware requirements for such processing.


The application server 112 also includes an image processing system 116 that is dedicated to performing various image processing operations, typically with respect to images or video received within the payload of a message at the messaging server application 114.


The social network system 122 supports various social networking functions and services, and makes these functions and services available to the messaging server application 114. To this end, the social network system 122 maintains and accesses an entity graph 304 within the database 120. Examples of functions and services supported by the social network system 122 include the identification of other users of the messaging system 100 with which a particular user has relationships or is “following,” and also the identification of other entities and interests of a particular user.


The application server 112 is communicatively coupled to a database server 118, which facilitates access to a database 120 in which is stored data associated with messages processed by the messaging server application 114.



FIG. 2 is a block diagram illustrating further details regarding the messaging system 100, according to example embodiments. Specifically, the messaging system 100 is shown to comprise the messaging client application 104 and the application server 112, which in turn embody a number of subsystems, namely an ephemeral timer system 202, a collection management system 204, and an annotation system 206.


The ephemeral timer system 202 is responsible for enforcing the temporary access to content permitted by the messaging client application 104 and the messaging server application 114. To this end, the ephemeral timer system 202 incorporates a number of timers that, based on duration and display parameters associated with a message, collection of messages (e.g., a collection of media), or graphical element, selectively display and enable access to messages and associated content via the messaging client application 104. Further details regarding the operation of the ephemeral timer system 202 are provided below.


The collection management system 204 is responsible for managing collections of media (e.g., collections of text, image, video, and audio data). In some examples, a collection of content (e.g., messages, including images, video, text and audio) may be organized into an “event gallery” or an “event story.” Such a collection may be made available for a specified time period, such as the duration of an event to which the content relates. For example, content relating to a music concert may be made available as a “story” for the duration of that music concert. The collection management system 204 may also be responsible for publishing an icon that provides notification of the existence of a particular collection to the user interface of the messaging client application 104.


The collection management system 204 furthermore includes a curation interface 208 that allows a collection manager to manage and curate a particular collection of content. For example, the curation interface 208 enables an event organizer to curate a collection of content relating to a specific event (e.g., delete inappropriate content or redundant messages). Additionally, the collection management system 204 employs machine vision (or image recognition technology) and content rules to automatically curate a content collection. In certain embodiments, compensation may be paid to a user for inclusion of user generated content into a collection. In such cases, the curation interface 208 operates to automatically make payments to such users for the use of their content.


The annotation system 206 provides various functions that enable a user to annotate or otherwise modify or edit media content associated with a message. For example, the annotation system 206 provides functions related to the generation and publishing of media overlays for messages processed by the messaging system 100. The annotation system 206 operatively supplies a media overlay (e.g., a filter, lens) to the messaging client application 104 based on a geolocation of the client device 102. In another example, the annotation system 206 operatively supplies a media overlay to the messaging client application 104 based on other information, such as social network information of the user of the client device 102. A media overlay may include audio and visual content and visual effects. Examples of audio and visual content include pictures, texts, logos, animations, and sound effects, as well as animated facial models. An example of a visual effect includes color overlaying. The audio and visual content or the visual effects can be applied to a media content item (e.g., a photo or video) at the client device 102. For example, the media overlay may include text that can be overlaid on top of a photograph taken by the client device 102. In another example, the media overlay includes an identification of a location overlay (e.g., Venice Beach), a name of a live event, or a name of a merchant overlay (e.g., Beach Coffee House). In another example, the annotation system 206 uses the geolocation of the client device 102 to identify a media overlay that includes the name of a merchant at the geolocation of the client device 102. The media overlay may include other indicia associated with the merchant. The media overlays may be stored in the database 120 and accessed through the database server 118.


In one example embodiment, the annotation system 206 provides a user-based publication platform that enables users to select a geolocation on a map, and upload content associated with the selected geolocation. The user may also specify circumstances under which a particular media overlay should be offered to other users. The annotation system 206 generates a media overlay that includes the uploaded content and associates the uploaded content with the selected geolocation.


In another example embodiment, the annotation system 206 provides a merchant-based publication platform that enables merchants to select a particular media overlay associated with a geolocation via a bidding process. For example, the annotation system 206 associates the media overlay of a highest bidding merchant with a corresponding geolocation for a predefined amount of time.



FIG. 3 is a block diagram illustrating components of the AR system 124 that configure the AR system 124 to perform operations that include generating a point cloud based on image data and loading the point cloud at a client device 102, according to certain example embodiments.


The AR system 124 is shown as including an image module 302, a rectification module 304, and a point cloud module 306, all configured to communicate with each other (e.g., via a bus, shared memory, or a switch). Any one or more of these modules may be implemented using one or more processors 308 (e.g., by configuring such one or more processors to perform functions described for that module) and hence may include one or more of the processors 308. In certain embodiments, the AR system 124 may include or have access to the database 120, wherein the database 120 may comprise a collection of media content indexed based on location data.


Any one or more of the modules described may be implemented using hardware alone (e.g., one or more of the processors 308 of a machine) or a combination of hardware and software. For example, any described module of the AR system 124 may physically include an arrangement of one or more of the processors 308 (e.g., a subset of or among the one or more processors of the machine) configured to perform the operations described herein for that module. As another example, any module of the AR system 124 may include software, hardware, or both, that configure an arrangement of one or more processors 308 (e.g., among the one or more processors of the machine) to perform the operations described herein for that module. Accordingly, different modules of the AR system 124 may include and configure different arrangements of such processors 308 or a single arrangement of such processors 308 at different points in time. Moreover, any two or more modules of the AR system 124 may be combined into a single module, and the functions described herein for a single module may be subdivided among multiple modules. Furthermore, according to various example embodiments, modules described herein as being implemented within a single machine, database, or device may be distributed across multiple machines, databases, or devices.



FIG. 4 is a flowchart depicting a method 400 of generating a point cloud, according to certain example embodiments. Operations of the method 400 may be performed by the modules described above with respect to FIG. 3. As shown in FIG. 4, the method 400 includes one or more operations 402, 404, 406, 408, 410 and 412.


At operation 402, the image module 302 accesses a data object that comprises image data, location data, and orientation data. For example, in some embodiments, the image module 302 may access a repository (i.e., the database 120), wherein the repository comprises a collection of data objects which are indexed based on location. For example, in some embodiments, the data objects may be collected from a plurality of client devices 102 and indexed within the database 120 based on the corresponding location data. In some embodiments, the data objects may be generated by an omnidirectional camera and indexed within the database 120 based on the corresponding location data.


At operation 404, the rectification module 304 applies a transformation to the data object to produce a rectified data object. For example, in some embodiments, the data object may include omnidirectional camera images, wherein the omnidirectional camera images depict a 360-degree view of an area. The rectification module 304 may access the data object and apply one or more linear rectification techniques to bring the image into a common image plane.
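As one concrete (but not mandated) example of a linear rectification that brings image content into a common plane, the sketch below applies a projective warp with OpenCV; the corner coordinates and output size are made-up example values.

```python
import cv2
import numpy as np

def rectify_to_common_plane(image: np.ndarray,
                            src_corners: np.ndarray,
                            out_size: tuple = (512, 512)) -> np.ndarray:
    """Warp the quadrilateral defined by `src_corners` (4x2 pixel coordinates)
    onto an axis-aligned rectangle, producing a rectified view of that surface."""
    w, h = out_size
    dst_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    homography = cv2.getPerspectiveTransform(np.float32(src_corners), dst_corners)
    return cv2.warpPerspective(image, homography, (w, h))

# Example usage with a synthetic image and made-up corner coordinates.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
corners = np.array([[300, 200], [900, 180], [950, 600], [280, 620]])
rectified = rectify_to_common_plane(frame, corners)
```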


At operation 406, the point cloud module 306 generates a point cloud based on the rectified data object, wherein the point cloud comprises a set of data points in space that define properties of the external surfaces of objects in the space. For example, from a given perspective, a point of a point cloud may define a distance of a surface from the perspective.
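If a per-pixel depth estimate were available for the rectified image (for example from multi-view reconstruction), the 3-D points could be obtained by back-projection through a pinhole camera model, as in this sketch; the intrinsics and depth values are hypothetical, and the disclosure does not specify this particular pipeline.

```python
import numpy as np

def depth_to_point_cloud(depth: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth map (metres per pixel) into an N x 3 array of
    points in the camera coordinate frame, using a pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop pixels with no depth

# Example with a flat synthetic depth map and made-up intrinsics.
depth = np.full((480, 640), 3.0)             # every pixel 3 m away
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```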


At operation 408, the point cloud module 306 assigns the point cloud to a location based on at least the location data of the data object. For example, the point cloud module 306 may assign the point cloud to a geo-fence that encompasses a location. In some embodiments, assigning the point cloud to a location may include aligning the point cloud with a location based on landmarks within the location.


At operation 410, the image module 302 detects a client device 102 at a location. For example, the client device 102 may generate a request that includes location data that identifies the location, or may enter into a geo-fence that encompasses the location. In some embodiments, the client device 102 may generate image data that depicts one or more landmarks associated with the location, and the image module 302 may identify the location based on the image data from the client device 102.
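Operations 408 and 410 can be illustrated with a simple radial geo-fence: a point cloud is keyed to a fence, and a device is detected at the location when its reported coordinates fall inside that fence. The haversine distance is a standard formula; the fence centre, radius, and index structure below are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) pairs, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical index: geo-fence (name, centre, radius) -> point cloud identifier.
geofences = {("venice_beach", 33.9850, -118.4695, 150.0): "cloud_venice_beach"}

def cloud_for_device(lat: float, lon: float):
    """Return the identifier of the point cloud whose geo-fence contains the device."""
    for (_, flat, flon, radius_m), cloud_id in geofences.items():
        if haversine_m(lat, lon, flat, flon) <= radius_m:
            return cloud_id
    return None

print(cloud_for_device(33.9851, -118.4694))   # inside the fence -> "cloud_venice_beach"
```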


At operation 412, responsive to the image module 302 detecting the client device 102 at the location, the point cloud module 306 loads the point cloud that corresponds with the location at the client device 102. Accordingly, AR content may be displayed at the client device 102 based on the point cloud.



FIG. 5 is a flowchart depicting a method 500 of generating a point cloud, according to certain example embodiments. Operations of the method 500 may be performed by the modules described above with respect to FIG. 3. As shown in FIG. 5, the method 500 includes one or more operations 502, and 504, that may be performed as a part (i.e., a subroutine) of operation 402 from the method 400.


At operation 502, the image module 302 accesses video data that comprises a set of video frames. For example, the video data may be generated by one or more client devices 102 and indexed at a memory location within a database 120 associated with the location.


At operation 504, the image module 302 exports a portion of the set of video frames, wherein each video frame among the portion of the set of video frames comprises image data, location data, and orientation data.


In some embodiments, the portion of the set of video frames may be exported based on properties of surface features of an area depicted by the video. For example, more complex surfaces may require a greater number of video frames to accurately depict the surfaces with a point cloud, while simple surfaces (i.e., few objects, and only a few surfaces) may require fewer video frames.


In some embodiments, the portion of the set of video frames may be exported based on a collection rate associated with the video data, wherein the collection rate may be defined as a speed of travel. For example, the faster a sensor device moves through an area, the larger the number of video frames that may need to be exported.
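A simple way to tie the export rate to the speed of travel is to target a roughly constant spatial interval between exported frames, as sketched below; the target spacing and parameter names are illustrative assumptions.

```python
def frames_to_export(num_frames: int, fps: float, speed_mps: float,
                     target_spacing_m: float = 0.5) -> list:
    """Pick frame indices so that exported frames are roughly `target_spacing_m`
    metres apart along the capture path. Faster motion yields a smaller stride,
    so a larger share of the frames is exported."""
    metres_per_frame = speed_mps / fps
    stride = max(1, round(target_spacing_m / metres_per_frame))
    return list(range(0, num_frames, stride))

# At walking speed (1.5 m/s, 30 fps) roughly every 10th frame is exported;
# from a vehicle at 15 m/s every frame is exported.
walking = frames_to_export(num_frames=300, fps=30.0, speed_mps=1.5)
driving = frames_to_export(num_frames=300, fps=30.0, speed_mps=15.0)
```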


Accordingly, the method 500 may continue to operation 404 of the method 400, wherein the rectification module 304 performs linear rectification upon the set of video frames to generate the rectified data object.



FIG. 6 is a flowchart depicting a method 600 of loading a portion of a point cloud at a client device 102, according to certain example embodiments. Operations of the method 600 may be performed by the modules described above with respect to FIG. 3. As shown in FIG. 6, the method 600 includes one or more operations 602, 604, and 606, and may be performed as a part of operation 412 of the method 400.


At operation 602, the point cloud module 306 determines a contextual condition associated with the client device 102. The contextual condition may include a temporal condition (i.e., a time of day), a device attribute or property, as well as location data of the client device 102.


For example, in some embodiments, the contextual condition may include an indication of a device type of the client device 102, a network speed associated with the client device 102, as well as a memory capacity of the client device 102.


In some embodiments, the contextual condition may include a time of day associated with a request from the client device 102. For example, the time of day may be determined based on metadata associated with requests from the client device 102 or based on properties of images generated at the client device 102.


In some embodiments, the contextual condition may include a perspective, or point of view, associated with the client device 102, wherein the point of view may provide an indication of landmarks which may be visible from the location of the client device 102.


At operation 604, the point cloud module 306 identifies a portion of the point cloud associated with the location based on the contextual condition of the client device 102. At operation 606, the portion of the point cloud is loaded at the client device 102.



FIG. 7 is a diagram 700 depicting a method of selecting a portion of a point cloud, according to certain example embodiments. The diagram 700 includes depictions of a point cloud 705 and a point cloud 710, wherein the point clouds each represent surface features of the same geographic region.


As discussed in the method 600 depicted in FIG. 6, certain embodiments of the AR system 124 provide functionality to selectively filter portions of a point cloud to be loaded at a client device 102, based on a number of factors that may include contextual factors. Accordingly, the point cloud 710 represents a selected portion of points from the point cloud 705.


In certain embodiments, the point cloud module 306 may select the portion of the point cloud 705 such that a distribution of points remains the same. For example, as seen in the diagram 700, a distribution of points of the point cloud 710 is roughly the same as the distribution of points seen in the point cloud 705.
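Uniform random subsampling is one simple way to reduce a point cloud while keeping its spatial distribution roughly intact, as described above; the sketch and sample sizes below are illustrative only.

```python
import numpy as np

def subsample_preserving_distribution(points: np.ndarray, keep: int,
                                      seed: int = 0) -> np.ndarray:
    """Return `keep` points drawn uniformly at random without replacement.
    Because every point is equally likely to be kept, the spatial density of
    the reduced cloud mirrors that of the full cloud in expectation."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(points), size=min(keep, len(points)), replace=False)
    return points[idx]

full_cloud = np.random.default_rng(1).uniform(-50, 50, size=(100_000, 3))
reduced_cloud = subsample_preserving_distribution(full_cloud, keep=10_000)
```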


In certain embodiments, the point cloud module 306 may select a portion of the point cloud 705 based on a position of a user 715. For example, the point cloud module 306 may access location data from a client device 102 of the user 715 and determine visible landmarks and surfaces from the location of the user 715. A portion of the point cloud 705 may be selected based on what landmarks are visible.
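Selecting points by what is in front of the user can be approximated with a range and field-of-view test around the device heading, as in this sketch; the heading convention, field of view, and range limit are illustrative assumptions rather than parameters from the disclosure.

```python
import numpy as np

def visible_portion(points: np.ndarray, position: np.ndarray, heading_deg: float,
                    fov_deg: float = 90.0, max_range: float = 100.0) -> np.ndarray:
    """Keep points (N x 3, same frame as `position`) that lie within `max_range`
    of the user and within `fov_deg` degrees of the horizontal heading."""
    offsets = points - position
    dist = np.linalg.norm(offsets, axis=1)
    heading = np.radians(heading_deg)
    forward = np.array([np.cos(heading), np.sin(heading)])   # unit vector in the x-y plane
    horiz = offsets[:, :2]
    horiz_norm = np.linalg.norm(horiz, axis=1) + 1e-9
    cos_angle = (horiz @ forward) / horiz_norm
    in_fov = cos_angle >= np.cos(np.radians(fov_deg / 2))
    return points[(dist <= max_range) & in_fov]

cloud = np.random.default_rng(2).uniform(-200, 200, size=(50_000, 3))
user_position = np.array([0.0, 0.0, 1.7])
portion = visible_portion(cloud, user_position, heading_deg=45.0)
```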


In certain embodiments, the point cloud module 306 may select a portion of the point cloud 705 based on context factors associated with the user 715 (i.e., location, time, etc.), and attributes of each point of the point cloud 705. For example, a point may have attributes that indicate a time of day in which it was collected. Accordingly, responsive to determining a current time associated with the user 715, the point cloud module 306 may select all points from the point cloud 705 which represent surface features of an area at the same time. As an illustrative example, the point cloud module 306 may generate the point cloud 705 based on data objects that include image data, wherein the image data represents an object or location at a specific time or time of day (i.e., night, day). Accordingly, each point of the point cloud 705 may comprise attributes that indicate a time of day in which the point was collected.


In some embodiments, the point cloud module 306 may generate the point cloud 710 by starting with an “empty” point cloud, and then adding one point from the point cloud 705 at a time until a target point cloud property has been reached. For example, the property may include a ratio of visible points to all points from a given perspective of a client device 102. In some embodiments, the property may include a size of the point cloud, in terms of bytes.
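The approach of starting from an empty cloud and adding points until a target property is reached might look like the following sketch, where the target property is a byte budget; the per-point size and the random visiting order are illustrative assumptions.

```python
import numpy as np

def grow_cloud_to_budget(candidate_points: np.ndarray, max_bytes: int,
                         bytes_per_point: int = 16, seed: int = 0) -> np.ndarray:
    """Start from an empty cloud and add one candidate point at a time until
    the serialized size would exceed `max_bytes`."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(candidate_points))   # visit candidates in random order
    selected = []
    for i in order:
        if (len(selected) + 1) * bytes_per_point > max_bytes:
            break
        selected.append(candidate_points[i])
    return np.array(selected)

full_cloud = np.random.default_rng(3).uniform(-50, 50, size=(200_000, 3))
portion = grow_cloud_to_budget(full_cloud, max_bytes=1_000_000)   # roughly 62,500 points
```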


Software Architecture



FIG. 8 is a block diagram illustrating an example software architecture 806, which may be used in conjunction with various hardware architectures herein described. FIG. 8 is a non-limiting example of a software architecture and it will be appreciated that many other architectures may be implemented to facilitate the functionality described herein. The software architecture 806 may execute on hardware such as machine 900 of FIG. 9 that includes, among other things, processors 904, memory 914, and I/O components 918. A representative hardware layer 852 is illustrated and can represent, for example, the machine 900 of FIG. 9. The representative hardware layer 852 includes a processing unit 854 having associated executable instructions 804. Executable instructions 804 represent the executable instructions of the software architecture 806, including implementation of the methods, components and so forth described herein. The hardware layer 852 also includes memory and/or storage, shown as memory/storage 856, which also has executable instructions 804. The hardware layer 852 may also comprise other hardware 858.


In the example architecture of FIG. 8, the software architecture 806 may be conceptualized as a stack of layers where each layer provides particular functionality. For example, the software architecture 806 may include layers such as an operating system 802, libraries 820, applications 816 and a presentation layer 814. Operationally, the applications 816 and/or other components within the layers may invoke application programming interface (API) calls 808 through the software stack and receive a response in response to the API calls 808. The layers illustrated are representative in nature and not all software architectures have all layers. For example, some mobile or special purpose operating systems may not provide a frameworks/middleware 818, while others may provide such a layer. Other software architectures may include additional or different layers.


The operating system 802 may manage hardware resources and provide common services. The operating system 802 may include, for example, a kernel 822, services 824 and drivers 826. The kernel 822 may act as an abstraction layer between the hardware and the other software layers. For example, the kernel 822 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on. The services 824 may provide other common services for the other software layers. The drivers 826 are responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 826 include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth depending on the hardware configuration.


The libraries 820 provide a common infrastructure that is used by the applications 816 and/or other components and/or layers. The libraries 820 provide functionality that allows other software components to perform tasks in an easier fashion than interfacing directly with the underlying operating system 802 functionality (e.g., kernel 822, services 824 and/or drivers 826). The libraries 820 may include system libraries 844 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, the libraries 820 may include API libraries 846 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functionality), and the like. The libraries 820 may also include a wide variety of other libraries 848 to provide many other APIs to the applications 816 and other software components/modules.


The frameworks/middleware 818 (also sometimes referred to as middleware) provide a higher-level common infrastructure that may be used by the applications 816 and/or other software components/modules. For example, the frameworks/middleware 818 may provide various graphic user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks/middleware 818 may provide a broad spectrum of other APIs that may be utilized by the applications 816 and/or other software components/modules, some of which may be specific to a particular operating system 802 or platform.


The applications 816 include built-in applications 838 and/or third-party applications 840. Examples of representative built-in applications 838 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a game application. Third-party applications 840 may include an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform, and may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or other mobile operating systems. The third-party applications 840 may invoke the API calls 808 provided by the mobile operating system (such as operating system 802) to facilitate functionality described herein.


The applications 816 may use built-in operating system functions (e.g., kernel 822, services 824 and/or drivers 826), libraries 820, and frameworks/middleware 818 to create user interfaces to interact with users of the system. Alternatively, or additionally, in some systems interactions with a user may occur through a presentation layer, such as presentation layer 814. In these systems, the application/component “logic” can be separated from the aspects of the application/component that interact with a user.



FIG. 9 is a block diagram illustrating components of a machine 900, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 9 shows a diagrammatic representation of the machine 900 in the example form of a computer system, within which instructions 910 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 900 to perform any one or more of the methodologies discussed herein may be executed. As such, the instructions 910 may be used to implement modules or components described herein. The instructions 910 transform the general, non-programmed machine 900 into a particular machine 900 programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments, the machine 900 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 900 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 900 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 910, sequentially or otherwise, that specify actions to be taken by machine 900. Further, while only a single machine 900 is illustrated, the term “machine” shall also be taken to include a collection of machines that individually or jointly execute the instructions 910 to perform any one or more of the methodologies discussed herein.


The machine 900 may include processors 904, memory/storage 906, and I/O components 918, which may be configured to communicate with each other such as via a bus 902. The memory/storage 906 may include a memory 914, such as a main memory, or other memory storage, and a storage unit 916, both accessible to the processors 904 such as via the bus 902. The storage unit 916 and memory 914 store the instructions 910 embodying any one or more of the methodologies or functions described herein. The instructions 910 may also reside, completely or partially, within the memory 914, within the storage unit 916, within at least one of the processors 904 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 900. Accordingly, the memory 914, the storage unit 916, and the memory of processors 904 are examples of machine-readable media.


The I/O components 918 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 918 that are included in a particular machine 900 will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 918 may include many other components that are not shown in FIG. 9. The I/O components 918 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 918 may include output components 926 and input components 928. The output components 926 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth. The input components 928 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.


In further example embodiments, the I/O components 918 may include biometric components 930, motion components 934, environment components 936, or position components 938 among a wide array of other components. For example, the biometric components 930 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components 934 may include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth. The environment components 936 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 938 may include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.


Communication may be implemented using a wide variety of technologies. The I/O components 918 may include communication components 940 operable to couple the machine 900 to a network 932 or devices 920 via coupling 922 and coupling 924 respectively. For example, the communication components 940 may include a network interface component or other suitable device to interface with the network 932. In further examples, communication components 940 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities. The devices 920 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).


Moreover, the communication components 940 may detect identifiers or include components operable to detect identifiers. For example, the communication components 940 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via the communication components 940, such as, location via Internet Protocol (IP) geo-location, location via Wi-Fi® signal triangulation, location via detecting a NFC beacon signal that may indicate a particular location, and so forth.


Glossary

“CARRIER SIGNAL” in this context refers to any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. Instructions may be transmitted or received over the network using a transmission medium via a network interface device and using any one of a number of well-known transfer protocols.


“CLIENT DEVICE” in this context refers to any machine that interfaces to a communications network to obtain resources from one or more server systems or other client devices. A client device may be, but is not limited to, a mobile phone, desktop computer, laptop, portable digital assistants (PDAs), smart phones, tablets, ultra books, netbooks, laptops, multi-processor systems, microprocessor-based or programmable consumer electronics, game consoles, set-top boxes, or any other communication device that a user may use to access a network.


“COMMUNICATIONS NETWORK” in this context refers to one or more portions of a network that may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, a network or a portion of a network may include a wireless or cellular network and the coupling may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or other type of cellular or wireless coupling. In this example, the coupling may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1×RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard setting organizations, other long range protocols, or other data transfer technology.


“EPHEMERAL MESSAGE” in this context refers to a message that is accessible for a time-limited duration. An ephemeral message may be a text, an image, a video and the like. The access time for the ephemeral message may be set by the message sender. Alternatively, the access time may be a default setting or a setting specified by the recipient. Regardless of the setting technique, the message is transitory.


“MACHINE-READABLE MEDIUM” in this context refers to a component, device or other tangible media able to store instructions and data temporarily or permanently and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM)) and/or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., code) for execution by a machine, such that the instructions, when executed by one or more processors of the machine, cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.


“COMPONENT” in this context refers to a device, physical entity or logic having boundaries defined by function or subroutine calls, branch points, application program interfaces (APIs), or other technologies that provide for the partitioning or modularization of particular processing or control functions. Components may be combined via their interfaces with other components to carry out a machine process. A component may be a packaged functional hardware unit designed for use with other components and a part of a program that usually performs a particular function of related functions. Components may constitute either software components (e.g., code embodied on a machine-readable medium) or hardware components. A “hardware component” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware component that operates to perform certain operations as described herein. A hardware component may also be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware component may include dedicated circuitry or logic that is permanently configured to perform certain operations. A hardware component may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware component may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware component may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware components become specific machines (or specific components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware component mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations. Accordingly, the phrase “hardware component” (or “hardware-implemented component”) should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware components are temporarily configured (e.g., programmed), each of the hardware components need not be configured or instantiated at any one instance in time. For example, where a hardware component comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware components) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware component at one instance of time and to constitute a different hardware component at a different instance of time. 
Hardware components can provide information to, and receive information from, other hardware components. Accordingly, the described hardware components may be regarded as being communicatively coupled. Where multiple hardware components exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware components. In embodiments in which multiple hardware components are configured or instantiated at different times, communications between such hardware components may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware components have access. For example, one hardware component may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware component may then, at a later time, access the memory device to retrieve and process the stored output. Hardware components may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information). The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented components that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented component” refers to a hardware component implemented using one or more processors. Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented components. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)). The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented components may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented components may be distributed across a number of geographic locations.


“PROCESSOR” in this context refers to any circuit or virtual circuit (a physical circuit emulated by logic executing on an actual processor) that manipulates data values according to control signals (e.g., “commands”, “op codes”, “machine code”, etc.) and which produces corresponding output signals that are applied to operate a machine. A processor may, for example, be a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC) or any combination thereof. A processor may further be a multi-core processor having two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously.


“TIMESTAMP” in this context refers to a sequence of characters or encoded information identifying when a certain event occurred, for example giving date and time of day, sometimes accurate to a small fraction of a second.
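
As an illustration of the definition above, a minimal Python sketch (not part of the original disclosure) that produces a timestamp giving date and time of day, accurate to a fraction of a second:

```python
from datetime import datetime, timezone

# ISO-8601 sequence of characters identifying when an event occurred,
# including date, time of day, and microseconds.
event_timestamp = datetime.now(timezone.utc).isoformat()
print(event_timestamp)  # e.g. "2024-02-27T18:04:11.532901+00:00"
```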


“LIFT” in this context is a measure of the performance of a targeted model at predicting or classifying cases as having an enhanced response (with respect to a population as a whole), measured against a random choice targeting model.
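
A worked example may make the definition concrete: lift is the response rate among cases targeted by the model divided by the response rate of the population as a whole. The Python sketch below uses hypothetical counts and is illustrative only.

```python
# Hypothetical counts: responders within the model-targeted group versus
# responders within the population as a whole (the random-choice baseline).
targeted_responders, targeted_total = 120, 400
population_responders, population_total = 300, 4000

targeted_rate = targeted_responders / targeted_total      # 0.30
baseline_rate = population_responders / population_total  # 0.075
lift = targeted_rate / baseline_rate                      # 4.0

print(f"lift = {lift:.1f}x over a random-choice targeting model")
```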


“PHONEME ALIGNMENT” in this context, a phoneme is a unit of speech that differentiates one word from another. One phoneme may consist of a sequence of closure, burst, and aspiration events; or, a diphthong may transition from a back vowel to a front vowel. A speech signal may therefore be described not only by what phonemes it contains, but also by where those phonemes occur. Phoneme alignment may thus be described as a “time-alignment” of phonemes in a waveform, in order to determine an appropriate sequence and location of each phoneme in a speech signal.
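
One common way to represent such a time-alignment is as a list of (phoneme, start, end) entries over the waveform; the sketch below is illustrative only, with hypothetical labels and offsets.

```python
from typing import Optional

# Hypothetical alignment for a short utterance: each entry gives the phoneme
# label and its start/end offsets within the speech signal, in seconds.
alignment = [
    ("HH", 0.00, 0.06),
    ("EH", 0.06, 0.14),
    ("L",  0.14, 0.21),
    ("OW", 0.21, 0.35),
]

def phoneme_at(time_s: float) -> Optional[str]:
    # Determine which phoneme the alignment places at a given point in time.
    for label, start, end in alignment:
        if start <= time_s < end:
            return label
    return None

print(phoneme_at(0.10))  # "EH"
```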


“AUDIO-TO-VISUAL CONVERSION” in this context refers to the conversion of audible speech signals into visible speech, wherein the visible speech may include a mouth shape representative of the audible speech signal.
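
A minimal sketch of such a conversion, assuming a lookup from phoneme labels to mouth shapes (visemes); the table and labels below are hypothetical and far smaller than a production mapping.

```python
from typing import List

# Hypothetical phoneme-to-viseme table: each audible phoneme maps to a mouth
# shape representative of that portion of the speech signal.
PHONEME_TO_VISEME = {
    "P": "closed_lips", "B": "closed_lips", "M": "closed_lips",
    "F": "lip_to_teeth", "V": "lip_to_teeth",
    "AA": "open_jaw", "OW": "rounded_lips",
}

def audio_to_visual(phonemes: List[str]) -> List[str]:
    # Convert a sequence of recognized phonemes into a sequence of mouth shapes.
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

print(audio_to_visual(["M", "AA", "P"]))  # ['closed_lips', 'open_jaw', 'closed_lips']
```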


“TIME DELAYED NEURAL NETWORK (TDNN)” in this context, a TDNN is an artificial neural network architecture whose primary purpose is to work on sequential data. An example would be converting continuous audio into a stream of classified phoneme labels for speech recognition.
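
A minimal TDNN-style sketch, assuming PyTorch is available: stacked 1-D convolutions with increasing dilation look at progressively wider temporal context and emit per-frame phoneme scores. The layer sizes and feature dimensions below are hypothetical.

```python
import torch
from torch import nn

class TinyTDNN(nn.Module):
    def __init__(self, n_features: int = 40, n_phonemes: int = 48):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Conv1d(n_features, 64, kernel_size=3, dilation=1, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=3, dilation=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, n_phonemes, kernel_size=1),  # per-frame class scores
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, n_features, time) -> (batch, n_phonemes, time)
        return self.layers(frames)

audio_frames = torch.randn(1, 40, 200)         # ~2 s of hypothetical acoustic features
phoneme_scores = TinyTDNN()(audio_frames)
phoneme_labels = phoneme_scores.argmax(dim=1)  # stream of classified phoneme labels
print(phoneme_labels.shape)                    # torch.Size([1, 200])
```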


“BI-DIRECTIONAL LONG SHORT-TERM MEMORY (BLSTM)” in this context refers to a recurrent neural network (RNN) architecture that remembers values over arbitrary intervals. Stored values are not modified as learning proceeds. RNNs allow forward and backward connections between neurons. BLSTMs are well-suited for the classification, processing, and prediction of time series, given time lags of unknown size and duration between events.
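
A minimal BLSTM sketch, again assuming PyTorch: the bidirectional LSTM reads the sequence forward and backward, so each time step's output reflects both past and future context before per-step classification. Dimensions are hypothetical.

```python
import torch
from torch import nn

class TinyBLSTM(nn.Module):
    def __init__(self, n_features: int = 40, hidden: int = 64, n_classes: int = 48):
        super().__init__()
        self.blstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.classify = nn.Linear(2 * hidden, n_classes)  # forward + backward states

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, n_features) -> (batch, time, n_classes)
        outputs, _ = self.blstm(frames)
        return self.classify(outputs)

scores = TinyBLSTM()(torch.randn(1, 200, 40))
print(scores.shape)  # torch.Size([1, 200, 48])
```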

Claims
  • 1. A method comprising: receiving a request that identifies a location, the request comprising metadata; accessing a point cloud associated with the location responsive to the request; determining a contextual condition based on the metadata from the request, the contextual condition including a perspective of the client device and temporal data that indicates a time of day of the request; identifying a portion of the point cloud based on the contextual condition that includes the perspective and the time of day; and loading the portion of the point cloud at a client device.
  • 2. The method of claim 1, wherein the request comprises an image that comprises a set of image features, and the method further comprises: identifying the location based on the set of image features of the image.
  • 3. The method of claim 1, wherein the contextual condition includes a connectivity speed of a network associated with the client device.
  • 4. The method of claim 1, wherein the contextual condition includes user profile data.
  • 5. The method of claim 1, wherein the request comprises image data, and the method further comprises: identifying the portion of the point cloud based on the image data.
  • 6. The method of claim 1, further comprising: presenting AR content at the client device based on the portion of the point cloud.
  • 7. A system comprising: a memory; and at least one hardware processor coupled to the memory and comprising instructions that cause the system to perform operations comprising: receiving a request that identifies a location, the request comprising metadata; accessing a point cloud associated with the location responsive to the request; determining a contextual condition based on the metadata from the request, the contextual condition including a perspective of the client device and temporal data that indicates a time of day of the request; identifying a portion of the point cloud based on the contextual condition that includes the perspective and the time of day; and loading the portion of the point cloud at a client device.
  • 8. The system of claim 7, wherein the request comprises an image that comprises a set of image features, and the operations further comprise: identifying the location based on the set of image features of the image.
  • 9. The system of claim 7, wherein the contextual condition includes a connectivity speed of a network associated with the client device.
  • 10. The system of claim 7, wherein the contextual condition includes user profile data.
  • 11. The system of claim 7, wherein the request comprises image data, and the operations further comprise: identifying the portion of the point cloud based on the image data.
  • 12. The system of claim 7, further comprising: presenting AR content at the client device based on the portion of the point cloud.
  • 13. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising: receiving a request that identifies a location, the request comprising metadata; accessing a point cloud associated with the location responsive to the request; determining a contextual condition based on the metadata from the request, the contextual condition including a perspective of the client device and temporal data that indicates a time of day of the request; identifying a portion of the point cloud based on the contextual condition that includes the perspective and the time of day; and loading the portion of the point cloud at a client device.
  • 14. The non-transitory machine-readable storage medium of claim 13, wherein the request comprises an image that comprises a set of image features, and the operations further comprise: identifying the location based on the set of image features of the image.
  • 15. The non-transitory machine-readable storage medium of claim 13, wherein the contextual condition includes a connectivity speed of a network associated with the client device.
  • 16. The non-transitory machine-readable storage medium of claim 13, wherein the contextual condition includes user profile data.
  • 17. The non-transitory machine-readable storage medium of claim 13, wherein the request comprises image data, and the operations further comprise: identifying the portion of the point cloud based on the image data.
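
Purely as an illustration of the flow recited in claim 1, the Python sketch below shows one way a server-side handler could receive a request, access a point cloud for the identified location, determine a contextual condition from the request metadata (perspective and time of day), identify a portion of the cloud, and return that portion for loading at a client device. All names, data shapes, and the selection heuristic are hypothetical and are not taken from the specification.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]

# Hypothetical store of point clouds keyed by location identifier.
POINT_CLOUDS: Dict[str, List[Point]] = {"plaza": [(0.0, 0.0, 0.0), (1.0, 2.0, 0.5)]}

@dataclass
class Request:
    location: str
    metadata: Dict[str, str]  # e.g. device heading and request timestamp

def handle_request(request: Request) -> List[Point]:
    # Access the point cloud associated with the identified location.
    cloud = POINT_CLOUDS[request.location]

    # Determine a contextual condition from the request metadata: the device
    # perspective and temporal data indicating the time of day of the request.
    heading = float(request.metadata["heading_degrees"])
    hour = datetime.fromisoformat(request.metadata["timestamp"]).hour

    # Identify a portion of the point cloud based on that condition
    # (a placeholder heuristic stands in for the real selection logic).
    portion = [p for p in cloud if heading < 180.0 or hour >= 18]

    # This portion would then be loaded at the client device.
    return portion

portion = handle_request(
    Request("plaza", {"heading_degrees": "90", "timestamp": "2022-06-10T19:30:00"})
)
print(len(portion), "points selected for loading")
```
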
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of U.S. patent application Ser. No. 16/833,160, filed Mar. 27, 2020, which is incorporated by reference herein in its entirety.

US Referenced Citations (627)
Number Name Date Kind
666223 Shedlock Jan 1901 A
4581634 Williams Apr 1986 A
4975690 Torres Dec 1990 A
5072412 Henderson, Jr. et al. Dec 1991 A
5493692 Theimer et al. Feb 1996 A
5713073 Warsta Jan 1998 A
5754939 Herz et al. May 1998 A
5855008 Goldhaber et al. Dec 1998 A
5883639 Walton et al. Mar 1999 A
5999932 Paul Dec 1999 A
6012098 Bayeh et al. Jan 2000 A
6014090 Rosen et al. Jan 2000 A
6029141 Bezos et al. Feb 2000 A
6038295 Mattes Mar 2000 A
6049711 Yehezkel et al. Apr 2000 A
6154764 Nitta et al. Nov 2000 A
6167435 Druckenmiller et al. Dec 2000 A
6204840 Petelycky et al. Mar 2001 B1
6205432 Gabbard et al. Mar 2001 B1
6216141 Straub et al. Apr 2001 B1
6285381 Sawano et al. Sep 2001 B1
6285987 Roth et al. Sep 2001 B1
6310694 Okimoto et al. Oct 2001 B1
6317789 Rakavy et al. Nov 2001 B1
6334149 Davis, Jr. et al. Dec 2001 B1
6349203 Asaoka et al. Feb 2002 B1
6353170 Eyzaguirre et al. Mar 2002 B1
6446004 Cao et al. Sep 2002 B1
6449657 Stanbach et al. Sep 2002 B2
6456852 Bar et al. Sep 2002 B2
6484196 Maurille Nov 2002 B1
6487601 Hubacher et al. Nov 2002 B1
6523008 Avrunin Feb 2003 B1
6542749 Tanaka et al. Apr 2003 B2
6549768 Fraccaroli Apr 2003 B1
6618593 Drutman et al. Sep 2003 B1
6622174 Ukita et al. Sep 2003 B1
6631463 Floyd et al. Oct 2003 B1
6636247 Hamzy et al. Oct 2003 B1
6636855 Holloway et al. Oct 2003 B2
6643684 Malkin et al. Nov 2003 B1
6658095 Yoakum et al. Dec 2003 B1
6665531 Soderbacka et al. Dec 2003 B1
6668173 Greene Dec 2003 B2
6684238 Dutta Jan 2004 B1
6684257 Camut et al. Jan 2004 B1
6698020 Zigmond et al. Feb 2004 B1
6700506 Winkler Mar 2004 B1
6720860 Narayanaswami Apr 2004 B1
6724403 Santoro et al. Apr 2004 B1
6757713 Ogilvie et al. Jun 2004 B1
6832222 Zimowski Dec 2004 B1
6834195 Brandenberg et al. Dec 2004 B2
6836792 Chen Dec 2004 B1
6898626 Ohashi May 2005 B2
6959324 Kubik et al. Oct 2005 B1
6970088 Kovach Nov 2005 B2
6970907 Ullmann et al. Nov 2005 B1
6980909 Root et al. Dec 2005 B2
6981040 Konig et al. Dec 2005 B1
7020494 Spriestersbach et al. Mar 2006 B2
7027124 Foote et al. Apr 2006 B2
7072963 Anderson et al. Jul 2006 B2
7085571 Kalhan et al. Aug 2006 B2
7110744 Freeny, Jr. Sep 2006 B2
7124164 Chemtob Oct 2006 B1
7149893 Leonard et al. Dec 2006 B1
7173651 Knowles Feb 2007 B1
7188143 Szeto Mar 2007 B2
7203380 Chiu et al. Apr 2007 B2
7206568 Sudit Apr 2007 B2
7227937 Yoakum et al. Jun 2007 B1
7237002 Estrada et al. Jun 2007 B1
7240089 Boudreau Jul 2007 B2
7269426 Kokkonen et al. Sep 2007 B2
7280658 Amini et al. Oct 2007 B2
7315823 Brondrup Jan 2008 B2
7349768 Bruce et al. Mar 2008 B2
7356564 Hartselle et al. Apr 2008 B2
7394345 Ehlinger et al. Jul 2008 B1
7411493 Smith Aug 2008 B2
7423580 Markhovsky et al. Sep 2008 B2
7454442 Cobleigh et al. Nov 2008 B2
7508419 Toyama et al. Mar 2009 B2
7519670 Hagale et al. Apr 2009 B2
7535890 Rojas May 2009 B2
7546554 Chiu et al. Jun 2009 B2
7607096 Oreizy et al. Oct 2009 B2
7639943 Kalajan Dec 2009 B1
7650231 Gadler Jan 2010 B2
7668537 DeVries Feb 2010 B2
7770137 Forbes et al. Aug 2010 B2
7778973 Choi Aug 2010 B2
7779444 Glad Aug 2010 B2
7787886 Markhovsky et al. Aug 2010 B2
7796946 Eisenbach Sep 2010 B2
7801954 Cadiz et al. Sep 2010 B2
7856360 Kramer et al. Dec 2010 B2
8001204 Burtner et al. Aug 2011 B2
8032586 Challenger et al. Oct 2011 B2
8082255 Carlson, Jr. et al. Dec 2011 B1
8090351 Klein Jan 2012 B2
8098904 Ioffe et al. Jan 2012 B2
8099109 Altman et al. Jan 2012 B2
8112716 Kobayashi Feb 2012 B2
8131597 Hudetz Mar 2012 B2
8135166 Rhoads Mar 2012 B2
8136028 Loeb et al. Mar 2012 B1
8146001 Reese Mar 2012 B1
8161115 Yamamoto Apr 2012 B2
8161417 Lee Apr 2012 B1
8195203 Tseng Jun 2012 B1
8199747 Rojas et al. Jun 2012 B2
8208943 Petersen Jun 2012 B2
8214443 Hamburg Jul 2012 B2
8234350 Gu et al. Jul 2012 B1
8276092 Narayanan et al. Sep 2012 B1
8279319 Date Oct 2012 B2
8280406 Ziskind et al. Oct 2012 B2
8285199 Hsu et al. Oct 2012 B2
8287380 Nguyen et al. Oct 2012 B2
8301159 Hamynen et al. Oct 2012 B2
8306922 Kunal et al. Nov 2012 B1
8312086 Velusamy et al. Nov 2012 B2
8312097 Siegel et al. Nov 2012 B1
8326315 Phillips et al. Dec 2012 B2
8326327 Hymel et al. Dec 2012 B2
8332475 Rosen et al. Dec 2012 B2
8352546 Dollard Jan 2013 B1
8379130 Forutanpour et al. Feb 2013 B2
8385950 Wagner et al. Feb 2013 B1
8402097 Szeto Mar 2013 B2
8405773 Hayashi et al. Mar 2013 B2
8418067 Cheng et al. Apr 2013 B2
8423409 Rao Apr 2013 B2
8471914 Sakiyama et al. Jun 2013 B2
8472935 Fujisaki Jun 2013 B1
8510383 Hurley et al. Aug 2013 B2
8527345 Rothschild et al. Sep 2013 B2
8554627 Svendsen et al. Oct 2013 B2
8560612 Kilmer et al. Oct 2013 B2
8594680 Ledlie et al. Nov 2013 B2
8613089 Holloway et al. Dec 2013 B1
8660358 Bergboer et al. Feb 2014 B1
8660369 Llano et al. Feb 2014 B2
8660793 Ngo et al. Feb 2014 B2
8682350 Altman et al. Mar 2014 B2
8718333 Wolf et al. May 2014 B2
8724622 Rojas May 2014 B2
8732168 Johnson May 2014 B2
8744523 Fan et al. Jun 2014 B2
8745132 Obradovich Jun 2014 B2
8761800 Kuwahara Jun 2014 B2
8768876 Shim et al. Jul 2014 B2
8775972 Spiegel Jul 2014 B2
8788680 Naik Jul 2014 B1
8790187 Walker et al. Jul 2014 B2
8797415 Arnold Aug 2014 B2
8798646 Wang et al. Aug 2014 B1
8856349 Jain et al. Oct 2014 B2
8874677 Rosen et al. Oct 2014 B2
8886227 Schmidt et al. Nov 2014 B2
8909679 Root et al. Dec 2014 B2
8909725 Sehn Dec 2014 B1
8972357 Shim et al. Mar 2015 B2
8995433 Rojas Mar 2015 B2
9015285 Ebsen et al. Apr 2015 B1
9020745 Johnston et al. Apr 2015 B2
9040574 Wang et al. May 2015 B2
9055416 Rosen et al. Jun 2015 B2
9094137 Sehn et al. Jul 2015 B1
9100806 Rosen et al. Aug 2015 B2
9100807 Rosen et al. Aug 2015 B2
9113301 Spiegel et al. Aug 2015 B1
9119027 Sharon et al. Aug 2015 B2
9123074 Jacobs et al. Sep 2015 B2
9143382 Bhogal et al. Sep 2015 B2
9143681 Ebsen et al. Sep 2015 B1
9152477 Campbell et al. Oct 2015 B1
9191776 Root et al. Nov 2015 B2
9204252 Root Dec 2015 B2
9225897 Sehn et al. Dec 2015 B1
9258459 Hartley Feb 2016 B2
9344606 Hartley et al. May 2016 B2
9385983 Sehn Jul 2016 B1
9396354 Murphy et al. Jul 2016 B1
9407712 Sehn Aug 2016 B1
9407816 Sehn Aug 2016 B1
9430783 Sehn Aug 2016 B1
9439041 Parvizi et al. Sep 2016 B2
9443227 Evans et al. Sep 2016 B2
9450907 Pridmore et al. Sep 2016 B2
9459778 Hogeg et al. Oct 2016 B2
9489661 Evans et al. Nov 2016 B2
9491134 Rosen et al. Nov 2016 B2
9532171 Allen et al. Dec 2016 B2
9537811 Allen et al. Jan 2017 B2
9628950 Noeth et al. Apr 2017 B1
9710821 Heath Jul 2017 B2
9852149 Taylor Dec 2017 B1
9854219 Sehn Dec 2017 B2
9984499 Jurgenson et al. May 2018 B1
10515480 Hare et al. Dec 2019 B1
10657708 Jurgenson et al. May 2020 B1
10956743 Li et al. Mar 2021 B1
10997783 Jurgenson et al. May 2021 B2
11263459 Li et al. Mar 2022 B2
11348265 Nielsen et al. May 2022 B1
11380051 Jurgenson et al. Jul 2022 B2
11430091 Mccormack et al. Aug 2022 B2
20020047868 Miyazawa Apr 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020087631 Sharma Jul 2002 A1
20020097257 Miller et al. Jul 2002 A1
20020122659 Mcgrath et al. Sep 2002 A1
20020128047 Gates Sep 2002 A1
20020144154 Tomkow Oct 2002 A1
20030001846 Davis et al. Jan 2003 A1
20030016247 Lai et al. Jan 2003 A1
20030017823 Mager et al. Jan 2003 A1
20030020623 Cao et al. Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030037124 Yamaura et al. Feb 2003 A1
20030052925 Daimon et al. Mar 2003 A1
20030101230 Benschoter et al. May 2003 A1
20030110503 Perkes Jun 2003 A1
20030126215 Udell Jul 2003 A1
20030148773 Spriestersbach et al. Aug 2003 A1
20030164856 Prager et al. Sep 2003 A1
20030229607 Zellweger et al. Dec 2003 A1
20040027371 Jaeger Feb 2004 A1
20040064429 Hirstius et al. Apr 2004 A1
20040078367 Anderson et al. Apr 2004 A1
20040111467 Willis Jun 2004 A1
20040158739 Wakai et al. Aug 2004 A1
20040189465 Capobianco et al. Sep 2004 A1
20040203959 Coombes Oct 2004 A1
20040215625 Svendsen et al. Oct 2004 A1
20040243531 Dean Dec 2004 A1
20040243688 Wugofski Dec 2004 A1
20050021444 Bauer et al. Jan 2005 A1
20050022211 Veselov et al. Jan 2005 A1
20050048989 Jung Mar 2005 A1
20050078804 Yomoda Apr 2005 A1
20050097176 Schatz et al. May 2005 A1
20050102381 Jiang et al. May 2005 A1
20050104976 Currans May 2005 A1
20050114783 Szeto May 2005 A1
20050119936 Buchanan et al. Jun 2005 A1
20050122405 Voss et al. Jun 2005 A1
20050193340 Amburgey et al. Sep 2005 A1
20050193345 Klassen et al. Sep 2005 A1
20050198128 Anderson Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20050288954 McCarthy et al. Dec 2005 A1
20060026067 Nicholas et al. Feb 2006 A1
20060107297 Toyama et al. May 2006 A1
20060114338 Rothschild Jun 2006 A1
20060119882 Harris et al. Jun 2006 A1
20060242239 Morishima et al. Oct 2006 A1
20060252438 Ansamaa et al. Nov 2006 A1
20060265417 Amato et al. Nov 2006 A1
20060270419 Crowley et al. Nov 2006 A1
20060287878 Wadhwa et al. Dec 2006 A1
20070004426 Pfleging et al. Jan 2007 A1
20070038715 Collins et al. Feb 2007 A1
20070040931 Nishizawa Feb 2007 A1
20070073517 Panje Mar 2007 A1
20070073823 Cohen et al. Mar 2007 A1
20070075898 Markhovsky et al. Apr 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070136228 Petersen Jun 2007 A1
20070192128 Celestini Aug 2007 A1
20070198340 Lucovsky et al. Aug 2007 A1
20070198495 Buron et al. Aug 2007 A1
20070208751 Cowan et al. Sep 2007 A1
20070210936 Nicholson Sep 2007 A1
20070214180 Crawford Sep 2007 A1
20070214216 Carrer et al. Sep 2007 A1
20070233556 Koningstein Oct 2007 A1
20070233801 Eren et al. Oct 2007 A1
20070233859 Zhao et al. Oct 2007 A1
20070243887 Bandhole et al. Oct 2007 A1
20070244750 Grannan et al. Oct 2007 A1
20070255456 Funayama Nov 2007 A1
20070281690 Altman et al. Dec 2007 A1
20080022329 Glad Jan 2008 A1
20080025701 Ikeda Jan 2008 A1
20080032703 Krumm et al. Feb 2008 A1
20080033930 Warren Feb 2008 A1
20080043041 Hedenstroem et al. Feb 2008 A2
20080049704 Witteman et al. Feb 2008 A1
20080062141 Chandhri Mar 2008 A1
20080076505 Ngyen et al. Mar 2008 A1
20080092233 Tian et al. Apr 2008 A1
20080094387 Chen Apr 2008 A1
20080104503 Beall et al. May 2008 A1
20080109844 Baldeschweiler et al. May 2008 A1
20080120409 Sun et al. May 2008 A1
20080147730 Lee et al. Jun 2008 A1
20080148150 Mall Jun 2008 A1
20080158230 Sharma et al. Jul 2008 A1
20080168033 Ott et al. Jul 2008 A1
20080168489 Schraga Jul 2008 A1
20080189177 Anderton et al. Aug 2008 A1
20080207176 Brackbill et al. Aug 2008 A1
20080208692 Garaventi et al. Aug 2008 A1
20080214210 Rasanen et al. Sep 2008 A1
20080222545 Lemay Sep 2008 A1
20080255976 Altberg et al. Oct 2008 A1
20080256446 Yamamoto Oct 2008 A1
20080256577 Funaki et al. Oct 2008 A1
20080266421 Takahata et al. Oct 2008 A1
20080270938 Carlson Oct 2008 A1
20080288338 Wiseman et al. Nov 2008 A1
20080306826 Kramer et al. Dec 2008 A1
20080313329 Wang et al. Dec 2008 A1
20080313346 Kujawa et al. Dec 2008 A1
20080318616 Chipalkatti et al. Dec 2008 A1
20090006191 Arankalle et al. Jan 2009 A1
20090006565 Velusamy et al. Jan 2009 A1
20090015703 Kim et al. Jan 2009 A1
20090024956 Kobayashi Jan 2009 A1
20090030774 Rothschild et al. Jan 2009 A1
20090030999 Gatzke et al. Jan 2009 A1
20090040324 Nonaka Feb 2009 A1
20090042588 Lottin et al. Feb 2009 A1
20090058822 Chaudhri Mar 2009 A1
20090079846 Chou Mar 2009 A1
20090089678 Sacco et al. Apr 2009 A1
20090089710 Wood et al. Apr 2009 A1
20090093261 Ziskind Apr 2009 A1
20090132341 Klinger May 2009 A1
20090132453 Hangartner et al. May 2009 A1
20090132665 Thomsen et al. May 2009 A1
20090148045 Lee et al. Jun 2009 A1
20090153492 Popp Jun 2009 A1
20090157450 Athsani et al. Jun 2009 A1
20090157752 Gonzalez Jun 2009 A1
20090160970 Fredlund et al. Jun 2009 A1
20090163182 Gatti et al. Jun 2009 A1
20090177299 Van De Sluis Jul 2009 A1
20090192900 Collision Jul 2009 A1
20090199242 Johnson et al. Aug 2009 A1
20090215469 Fisher et al. Aug 2009 A1
20090232354 Camp, Jr. et al. Sep 2009 A1
20090234815 Boerries et al. Sep 2009 A1
20090239552 Churchill et al. Sep 2009 A1
20090249222 Schmidt et al. Oct 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090265647 Martin et al. Oct 2009 A1
20090288022 Almstrand et al. Nov 2009 A1
20090291672 Treves et al. Nov 2009 A1
20090292608 Polachek Nov 2009 A1
20090319607 Belz et al. Dec 2009 A1
20090327073 Li Dec 2009 A1
20100062794 Han Mar 2010 A1
20100082427 Burgener et al. Apr 2010 A1
20100082693 Hugg et al. Apr 2010 A1
20100100568 Papin et al. Apr 2010 A1
20100113065 Narayan et al. May 2010 A1
20100130233 Lansing May 2010 A1
20100131880 Lee et al. May 2010 A1
20100131895 Wohlert May 2010 A1
20100153144 Miller et al. Jun 2010 A1
20100159944 Pascal et al. Jun 2010 A1
20100161658 Hamynen et al. Jun 2010 A1
20100161831 Haas et al. Jun 2010 A1
20100162149 Sheleheda et al. Jun 2010 A1
20100183280 Beauregard et al. Jul 2010 A1
20100185552 Deluca et al. Jul 2010 A1
20100185665 Horn et al. Jul 2010 A1
20100191631 Weidmann Jul 2010 A1
20100197318 Petersen et al. Aug 2010 A1
20100197319 Petersen et al. Aug 2010 A1
20100198683 Aarabi Aug 2010 A1
20100198694 Muthukrishnan Aug 2010 A1
20100198826 Petersen et al. Aug 2010 A1
20100198828 Petersen et al. Aug 2010 A1
20100198862 Jennings et al. Aug 2010 A1
20100198870 Petersen et al. Aug 2010 A1
20100198917 Petersen et al. Aug 2010 A1
20100201482 Robertson et al. Aug 2010 A1
20100201536 Robertson et al. Aug 2010 A1
20100214436 Kim et al. Aug 2010 A1
20100223128 Dukellis et al. Sep 2010 A1
20100223343 Bosan et al. Sep 2010 A1
20100250109 Johnston et al. Sep 2010 A1
20100257196 Waters et al. Oct 2010 A1
20100259386 Holley et al. Oct 2010 A1
20100273509 Sweeney et al. Oct 2010 A1
20100281045 Dean Nov 2010 A1
20100306669 Della Pasqua Dec 2010 A1
20110004071 Faiola et al. Jan 2011 A1
20110010205 Richards Jan 2011 A1
20110029512 Folgner et al. Feb 2011 A1
20110040783 Uemichi et al. Feb 2011 A1
20110040804 Peirce et al. Feb 2011 A1
20110050909 Ellenby et al. Mar 2011 A1
20110050915 Wang et al. Mar 2011 A1
20110064388 Brown et al. Mar 2011 A1
20110066743 Hurley et al. Mar 2011 A1
20110083101 Sharon et al. Apr 2011 A1
20110102630 Rukes May 2011 A1
20110119133 Igelman et al. May 2011 A1
20110137881 Cheng et al. Jun 2011 A1
20110145564 Moshir et al. Jun 2011 A1
20110159890 Fortescue et al. Jun 2011 A1
20110164163 Bilbrey et al. Jul 2011 A1
20110197194 D'Angelo et al. Aug 2011 A1
20110202598 Evans et al. Aug 2011 A1
20110202968 Nurmi Aug 2011 A1
20110211534 Schmidt et al. Sep 2011 A1
20110213845 Logan et al. Sep 2011 A1
20110215966 Kim et al. Sep 2011 A1
20110225048 Nair Sep 2011 A1
20110238763 Shin et al. Sep 2011 A1
20110255736 Thompson et al. Oct 2011 A1
20110273575 Lee Nov 2011 A1
20110282799 Huston Nov 2011 A1
20110283188 Farrenkopf Nov 2011 A1
20110314419 Dunn et al. Dec 2011 A1
20110320373 Lee et al. Dec 2011 A1
20120028659 Whitney et al. Feb 2012 A1
20120033718 Kauffman et al. Feb 2012 A1
20120036015 Sheikh Feb 2012 A1
20120036443 Ohmori et al. Feb 2012 A1
20120054797 Skog et al. Mar 2012 A1
20120059722 Rao Mar 2012 A1
20120062805 Candelore Mar 2012 A1
20120084731 Filman et al. Apr 2012 A1
20120084835 Thomas et al. Apr 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120108293 Law et al. May 2012 A1
20120110096 Smarr et al. May 2012 A1
20120113143 Adhikari et al. May 2012 A1
20120113272 Hata May 2012 A1
20120123830 Svendsen et al. May 2012 A1
20120123871 Svendsen et al. May 2012 A1
20120123875 Svendsen et al. May 2012 A1
20120124126 Alcazar et al. May 2012 A1
20120124176 Curtis et al. May 2012 A1
20120124458 Cruzada May 2012 A1
20120131507 Sparandara et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120143760 Abulafia et al. Jun 2012 A1
20120150978 Monaco Jun 2012 A1
20120165100 Lalancette et al. Jun 2012 A1
20120166971 Sachson et al. Jun 2012 A1
20120169855 Oh Jul 2012 A1
20120172062 Altman et al. Jul 2012 A1
20120173991 Roberts et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120184248 Speede Jul 2012 A1
20120197724 Kendall Aug 2012 A1
20120200743 Blanchflower et al. Aug 2012 A1
20120209924 Evans et al. Aug 2012 A1
20120210244 De Francisco et al. Aug 2012 A1
20120212632 Mate et al. Aug 2012 A1
20120220264 Kawabata Aug 2012 A1
20120226748 Bosworth et al. Sep 2012 A1
20120233000 Fisher et al. Sep 2012 A1
20120236162 Imamura Sep 2012 A1
20120239761 Linner et al. Sep 2012 A1
20120250951 Chen Oct 2012 A1
20120252418 Kandekar et al. Oct 2012 A1
20120254325 Majeti et al. Oct 2012 A1
20120278387 Garcia et al. Nov 2012 A1
20120278692 Shi Nov 2012 A1
20120290637 Perantatos et al. Nov 2012 A1
20120299954 Wada et al. Nov 2012 A1
20120304052 Tanaka et al. Nov 2012 A1
20120304080 Wormald et al. Nov 2012 A1
20120307096 Ford et al. Dec 2012 A1
20120307112 Kunishige et al. Dec 2012 A1
20120319904 Lee et al. Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20120324018 Metcalf et al. Dec 2012 A1
20130006759 Srivastava et al. Jan 2013 A1
20130024757 Doll et al. Jan 2013 A1
20130036364 Johnson Feb 2013 A1
20130045753 Obermeyer et al. Feb 2013 A1
20130050260 Reitan Feb 2013 A1
20130055083 Fino Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130059607 Herz et al. Mar 2013 A1
20130060690 Oskolkov et al. Mar 2013 A1
20130063369 Malhotra et al. Mar 2013 A1
20130067027 Song et al. Mar 2013 A1
20130071093 Hanks et al. Mar 2013 A1
20130080254 Thramann Mar 2013 A1
20130085790 Palmer et al. Apr 2013 A1
20130086072 Peng et al. Apr 2013 A1
20130090171 Holton et al. Apr 2013 A1
20130095857 Garcia et al. Apr 2013 A1
20130104053 Thornton et al. Apr 2013 A1
20130110885 Brundrett, III May 2013 A1
20130111514 Slavin et al. May 2013 A1
20130128059 Kristensson May 2013 A1
20130129252 Lauper May 2013 A1
20130132477 Bosworth et al. May 2013 A1
20130145286 Feng et al. Jun 2013 A1
20130159110 Rajaram et al. Jun 2013 A1
20130159919 Leydon Jun 2013 A1
20130169822 Zhu et al. Jul 2013 A1
20130173729 Starenky et al. Jul 2013 A1
20130182133 Tanabe Jul 2013 A1
20130185131 Sinha et al. Jul 2013 A1
20130191198 Carlson et al. Jul 2013 A1
20130194301 Robbins et al. Aug 2013 A1
20130198176 Kim Aug 2013 A1
20130218965 Abrol et al. Aug 2013 A1
20130218968 Mcevilly et al. Aug 2013 A1
20130222323 Mckenzie Aug 2013 A1
20130227476 Frey Aug 2013 A1
20130232194 Knapp et al. Sep 2013 A1
20130263031 Oshiro et al. Oct 2013 A1
20130265450 Barnes, Jr. Oct 2013 A1
20130267253 Case et al. Oct 2013 A1
20130275505 Gauglitz et al. Oct 2013 A1
20130290443 Collins et al. Oct 2013 A1
20130304646 De Geer Nov 2013 A1
20130311255 Cummins et al. Nov 2013 A1
20130325964 Berberat Dec 2013 A1
20130344896 Kirmse et al. Dec 2013 A1
20130346869 Asver et al. Dec 2013 A1
20130346877 Borovoy et al. Dec 2013 A1
20140006129 Heath Jan 2014 A1
20140011538 Mulcahy et al. Jan 2014 A1
20140019264 Wachman et al. Jan 2014 A1
20140032682 Prado et al. Jan 2014 A1
20140043204 Basnayake et al. Feb 2014 A1
20140045530 Gordon et al. Feb 2014 A1
20140047016 Rao Feb 2014 A1
20140047045 Baldwin et al. Feb 2014 A1
20140047335 Lewis et al. Feb 2014 A1
20140049652 Moon et al. Feb 2014 A1
20140052485 Shidfar Feb 2014 A1
20140052633 Gandhi Feb 2014 A1
20140057660 Wager Feb 2014 A1
20140082651 Sharifi Mar 2014 A1
20140092130 Anderson et al. Apr 2014 A1
20140096029 Schultz Apr 2014 A1
20140114565 Aziz et al. Apr 2014 A1
20140122658 Haeger et al. May 2014 A1
20140122787 Shalvi et al. May 2014 A1
20140129953 Spiegel May 2014 A1
20140143143 Fasoli et al. May 2014 A1
20140149519 Redfern et al. May 2014 A1
20140155102 Cooper et al. Jun 2014 A1
20140173424 Hogeg et al. Jun 2014 A1
20140173457 Wang et al. Jun 2014 A1
20140189592 Benchenaa et al. Jul 2014 A1
20140207679 Cho Jul 2014 A1
20140214471 Schreiner, III Jul 2014 A1
20140222564 Kranendonk et al. Aug 2014 A1
20140258405 Perkin Sep 2014 A1
20140265359 Cheng et al. Sep 2014 A1
20140266703 Dalley, Jr. et al. Sep 2014 A1
20140279061 Elimeliah et al. Sep 2014 A1
20140279436 Dorsey et al. Sep 2014 A1
20140279540 Jackson Sep 2014 A1
20140280537 Pridmore et al. Sep 2014 A1
20140282096 Rubinstein et al. Sep 2014 A1
20140287779 O'keefe et al. Sep 2014 A1
20140289833 Briceno Sep 2014 A1
20140306986 Gottesman et al. Oct 2014 A1
20140317302 Naik Oct 2014 A1
20140324627 Haver et al. Oct 2014 A1
20140324629 Jacobs Oct 2014 A1
20140325383 Brown et al. Oct 2014 A1
20150016666 Payne, Jr. Jan 2015 A1
20150020086 Chen et al. Jan 2015 A1
20150046278 Pei et al. Feb 2015 A1
20150071619 Brough Mar 2015 A1
20150087263 Branscomb et al. Mar 2015 A1
20150088622 Ganschow et al. Mar 2015 A1
20150095020 Leydon Apr 2015 A1
20150096042 Mizrachi Apr 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150169827 Laborde Jun 2015 A1
20150172534 Miyakawa et al. Jun 2015 A1
20150178260 Brunson Jun 2015 A1
20150222814 Li et al. Aug 2015 A1
20150261917 Smith Sep 2015 A1
20150312184 Langholz et al. Oct 2015 A1
20150350136 Flynn, III et al. Dec 2015 A1
20150365795 Allen et al. Dec 2015 A1
20150378502 Hu et al. Dec 2015 A1
20160006927 Sehn Jan 2016 A1
20160014063 Hogeg et al. Jan 2016 A1
20160085773 Chang et al. Mar 2016 A1
20160085863 Allen et al. Mar 2016 A1
20160099901 Allen et al. Apr 2016 A1
20160180887 Sehn Jun 2016 A1
20160182422 Sehn et al. Jun 2016 A1
20160182875 Sehn Jun 2016 A1
20160239248 Sehn Aug 2016 A1
20160277419 Allen et al. Sep 2016 A1
20160321708 Sehn Nov 2016 A1
20170006094 Abou Mahmoud et al. Jan 2017 A1
20170061308 Chen et al. Mar 2017 A1
20170262154 Black et al. Sep 2017 A1
20170287006 Azmoodeh et al. Oct 2017 A1
20180176483 Knorr et al. Jun 2018 A1
20180204469 Moster Jul 2018 A1
20190026400 Fuscoe Jan 2019 A1
20190102941 Khan et al. Apr 2019 A1
20190149725 Adato May 2019 A1
20190188477 Mair Jun 2019 A1
20190279420 Moreno Sep 2019 A1
20200065711 Clément Feb 2020 A1
20200073969 Kursar Mar 2020 A1
20200090409 Fink et al. Mar 2020 A1
20200167956 Herman May 2020 A1
20200219312 Jurgenson et al. Jul 2020 A1
20200250858 Li et al. Aug 2020 A1
20200265548 Burleigh et al. Aug 2020 A1
20200276973 Meijburg Sep 2020 A1
20200363216 Elvanoglu Nov 2020 A1
20210019946 Sonasath et al. Jan 2021 A1
20210125411 Choi et al. Apr 2021 A1
20210174578 Jurgenson et al. Jun 2021 A1
20210297502 Seul Sep 2021 A1
20210303859 Li et al. Sep 2021 A1
20210304369 Mccormack et al. Sep 2021 A1
20220148309 Li et al. May 2022 A1
Foreign Referenced Citations (39)
Number Date Country
2887596 Jul 2015 CA
115335820 Nov 2022 CN
115698907 Feb 2023 CN
2051480 Apr 2009 EP
2151797 Feb 2010 EP
3547157 Oct 2019 EP
2399928 Sep 2004 GB
19990073076 Oct 1999 KR
20010078417 Aug 2001 KR
20220154816 Nov 2022 KR
WO-1996024213 Aug 1996 WO
WO-1999063453 Dec 1999 WO
WO-2000058882 Oct 2000 WO
WO-2001029642 Apr 2001 WO
WO-2001050703 Jul 2001 WO
WO-2006118755 Nov 2006 WO
WO-2007092668 Aug 2007 WO
WO-2009043020 Apr 2009 WO
WO-2011040821 Apr 2011 WO
WO-2011119407 Sep 2011 WO
WO-2013008238 Jan 2013 WO
WO-2013045753 Apr 2013 WO
WO-2014006129 Jan 2014 WO
WO-2014068573 May 2014 WO
WO-2014115136 Jul 2014 WO
WO-2014194262 Dec 2014 WO
WO-2015192026 Dec 2015 WO
WO-2016044424 Mar 2016 WO
WO-2016054562 Apr 2016 WO
WO-2016065131 Apr 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100342 Jun 2016 WO
WO-2016149594 Sep 2016 WO
WO-2016179166 Nov 2016 WO
WO-2021195192 Sep 2021 WO
WO-2021195670 Sep 2021 WO
WO-2021252201 Dec 2021 WO
WO-2022147031 Jul 2022 WO
Non-Patent Literature Citations (41)
Entry
“A Whole New Story”, Snap, Inc., [Online] Retrieved from the Internet: <URL: https://www.snap.com/en-US/news/>, (2017), 13 pgs.
“Adding photos to your listing”, eBay, [Online] Retrieved from the Internet: <URL: http://pages.ebay.com/help/sell/pictures.html>, (accessed May 24, 2017), 4 pgs.
“U.S. Appl. No. 16/833,087, Notice of Allowance dated Nov. 23, 2020”, 10 pgs.
“U.S. Appl. No. 16/833,160, Non Final Office Action dated Nov. 30, 2021”, 8 pgs.
“U.S. Appl. No. 16/833,160, Notice of Allowability dated May 4, 2022”, 2 pgs.
“U.S. Appl. No. 16/833,160, Notice of Allowance dated Apr. 25, 2022”, 6 pgs.
“U.S. Appl. No. 16/833,160, Response filed Feb. 28, 2022 to Non Final Office Action dated Nov. 30, 2021”, 9 pgs.
“U.S. Appl. No. 17/119,597, Non Final Office Action dated Aug. 24, 2021”, 8 Pgs.
“U.S. Appl. No. 17/119,597, Notice of Allowance dated Oct. 20, 2021”, 9 pgs.
“U.S. Appl. No. 17/119,597, Response filed Sep. 14, 2021 to Non Final Office Action dated Aug. 24, 2021”, 7 pgs.
“U.S. Appl. No. 17/119,597, Supplemental Notice of Allowability dated Nov. 3, 2021”, 6 pgs.
“BlogStomp”, StompSoftware, [Online] Retrieved from the Internet: <URL: http://stompsoftware.com/blogstomp>, (accessed May 24, 2017), 12 pgs.
“Cup Magic Starbucks Holiday Red Cups come to life with AR app”, Blast Radius, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20160711202454/http://www.blastradius.com/work/cup-magic>, (2016), 7 pgs.
“Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place”, TechPP, [Online] Retrieved from the Internet: <URL: http://techpp.com/2013/02/15/instaplace-app-review>, (2013), 13 pgs.
“InstaPlace Photo App Tell The Whole Story”, [Online] Retrieved from the Internet: <URL: youtu.be/uF_gFkg1hBM>, (Nov. 8, 2013), 113 pgs., 1:02 min.
“International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015”, 2 pgs.
“International Application Serial No. PCT/US2021/023854, International Search Report dated Jun. 29, 2021”, 4 pgs.
“International Application Serial No. PCT/US2021/023854, Written Opinion dated Jun. 29, 2021”, 8 pgs.
“International Application Serial No. PCT/US2021/070318, International Search Report dated Jun. 30, 2021”, 4 pgs.
“International Application Serial No. PCT/US2021/070318, Written Opinion dated Jun. 30, 2021”, 5 pgs.
“Introducing Snapchat Stories”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20131026084921/https://www.youtube.com/watch?v=88Cu3yN-LIM>, (Oct. 3, 2013), 92 pgs.; 00:47 min.
“Macy's Believe-o-Magic”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20190422101854/https://www.youtube.com/watch?v=xvzRXy3J0Z0&feature=youtu.be>, (Nov. 7, 2011), 102 pgs.; 00:51 min.
“Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 Believe Campaign”, Business Wire, [Online] Retrieved from the Internet: <URL: https://www.businesswire.com/news/home/20111102006759/en/Macys-Introduces-Augmented-Reality-Experience-Stores-Country>, (Nov. 2, 2011), 6 pgs.
“Starbucks Cup Magic”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=RWwQXi9RG0w>, (Nov. 8, 2011), 87 pgs.; 00:47 min.
“Starbucks Cup Magic for Valentine's Day”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=8nvqOzjq10w>, (Feb. 6, 2012), 88 pgs.; 00:45 min.
“Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season”, Business Wire, [Online] Retrieved from the Internet: <URL: http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return>, (Nov. 15, 2011), 5 pgs.
Carthy, Roi, “Dear All Photo Apps: Mobli Just Won Filters”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2011/09/08/mobli-filters>, (Sep. 8, 2011), 10 pgs.
Janthong, Isaranu, “Instaplace ready on Android Google Play store”, Android App Review Thailand, [Online] Retrieved from the Internet: <URL: http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html>, (Jan. 23, 2013), 9 pgs.
Macleod, Duncan, “Macys Believe-o-Magic App”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app>, (Nov. 14, 2011), 10 pgs.
Macleod, Duncan, “Starbucks Cup Magic Lets Merry”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/starbucks-cup-magic>, (Nov. 12, 2011), 8 pgs.
Notopoulos, Katie, “A Guide To The New Snapchat Filters And Big Fonts”, [Online] Retrieved from the Internet: <URL: https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term =. bkQ9qVZWe#.nv58YXpkV>, (Dec. 22, 2013), 13 pgs.
Panzarino, Matthew, “Snapchat Adds Filters, A Replay Function And For Whatever Reason, Time, Temperature And Speed Overlays”, TechCrunch, [Online] Retrieved form the Internet: <URL: https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/>, (Dec. 20, 2013), 12 pgs.
Tripathi, Rohit, “Watermark Images in PHP And Save File on Server”, [Online] Retrieved from the Internet: <URL: http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-server>, (Dec. 28, 2012), 4 pgs.
U.S. Appl. No. 16/833,087 U.S. Pat. No. 10,956,743, filed Mar. 27, 2020, Shared Augmented Reality System.
U.S. Appl. No. 17/119,597 U.S. Pat. No. 11,263,459, filed Dec. 11, 2020, Shared Augmented Reality System.
U.S. Appl. No. 17/584,946, filed Jan. 26, 2022, Shared Augmented Reality System.
U.S. Appl. No. 16/833,160, filed Mar. 27, 2020, Location Mapping for Large Scale Augmented-Reality.
“International Application Serial No. PCT/US2021/023854, International Preliminary Report on Patentability dated Oct. 6, 2022”, 10 pgs.
“International Application Serial No. PCT/US2021/070318, International Preliminary Report on Patentability dated Oct. 6, 2022”, 7 pgs.
“U.S. Appl. No. 17/584,946, Notice of Allowance dated May 12, 2023”, 9 pgs.
“U.S. Appl. No. 17/584,946, Supplemental Notice of Allowability dated May 24, 2023”, 2 pgs.
Related Publications (1)
Number Date Country
20220301122 A1 Sep 2022 US
Continuations (1)
Number Date Country
Parent 16833160 Mar 2020 US
Child 17837713 US