System and method for an augmented reality experience via an artificial intelligence bot

Information

  • Patent Grant
  • Patent Number
    11,842,454
  • Date Filed
    Wednesday, December 15, 2021
  • Date Issued
    Tuesday, December 12, 2023
Abstract
The present disclosure describes systems and methods that apply artificial intelligence to augmented reality for an enhanced user experience. A sensor, such as a camera, on a mobile phone can capture output data of a laptop computer or other computing device, such as a user interface. The captured image can be assessed by the mobile phone to determine an intent of a user and to display a three-dimensional rendering, such as an avatar, on the mobile phone overlaid on top of the user interface of the laptop. The avatar can help the user navigate, such as by pointing to areas of a webpage of interest, or by inviting the user to scroll down to a portion of the webpage that may not be in the current view.
Description
SUMMARY

In traditional customer servicing environments, users are wary of being served by bots because the experience often feels dehumanizing. In modern times, users are also more concerned with their information privacy than ever; they want to restrict third parties' access to Personally Identifiable Information (“PII”). However, through the use of augmented reality and artificial intelligence bots that handle fulfillment, embodiments of an augmented reality user service experience system of the present disclosure can deliver a rich customer experience while also preserving user privacy.


The current limitation of bots is that they can only fulfill a pre-programmed set of actions. Furthermore, those bots inform the user what to do based entirely on rules-based logic. Through the application of artificial intelligence (“AI”), embodiments of the augmented reality (“AR”) user service experience system of the present disclosure can expand a bot's set of actions. The artificial intelligence within the bot can enable the bot to learn and eventually adapt to different circumstances in order to better serve the user. At the same time, by moving to a paradigm where the bot is deployed in augmented reality, the bots of the augmented reality user service experience system can display visual indicators to a user to show the user specific steps on how to fulfill their request, which can result in a better digital replication of a more human experience.


In some embodiments, the augmented reality user service experience system can connect to a mobile phone via a mobile application. The camera of the mobile phone can take a picture of a screen of a second device, such as a laptop or tablet, which is hosting a webpage or an application. The augmented reality user service experience system can deploy an avatar on the mobile phone that hosts the AI bot, and/or the augmented reality user service experience system can deploy the AI bot in the background of the camera. The augmented reality user service experience system can deploy the bot without a visual representation on screen. The camera (or other sensor) can detect some information on the screen it is viewing, such as an anchor, a visual fingerprint, a moving image such as a video or animated file, and/or the like, which is sent to the hosted AI bot. The AI bot can make certain decisions based on the captured image, audio, video, or sensor data, and react accordingly to deliver an immersive experience for the user.


For example, if the length of a webpage exceeds the viewable area of the screen based on the resolution, the user may be looking for a particular control and/or function on another portion of the page that is different from that which is currently being displayed, and thus may not be able to find it. The bot may identify the item based on an assessment of the user's intent and direct the user to where the item can be found. If the control and/or function is off the screen (or on another page of the website), the bot may display or provide an indication to move to another page or portion of the current page, such as, for example, a visual indicator to the user to scroll up or down, and/or a visual indicator of the bot pointing above or below the screen. After the control and/or function of interest appears on the screen, the representation of the bot can overlay onto the control and/or function to show the user exactly where on the screen the item he or she is looking for is located.
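The scroll-guidance decision described above can be sketched as a small function. This is an illustrative example only; the function name, coordinate convention, and return values are assumptions, not taken from the disclosure:

```python
def navigation_hint(item_y, viewport_top, viewport_height):
    """Decide how to guide the user toward a target control on the page.

    item_y: vertical position of the target control on the full page.
    viewport_top, viewport_height: the portion of the page currently visible.
    """
    viewport_bottom = viewport_top + viewport_height
    if item_y < viewport_top:
        return "scroll_up"        # e.g., avatar points above the screen
    if item_y > viewport_bottom:
        return "scroll_down"      # e.g., avatar points below the screen
    return "overlay_on_item"      # item is visible: overlay the indicator on it
```

Once the target scrolls into view, the hint switches from a directional cue to a direct overlay, matching the behavior described above.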


In some embodiments, if there is no bot overlay, the camera or sensor could either overlay directly with a marker that highlights the item, or the augmented reality user service experience system could control the scrolling of the other session to display that item on the website itself. Advantageously, embodiments of the augmented reality user service experience system can improve on the technical problem of having limited display real estate, such as, for example, not having enough space (such as a small screen of a mobile device) to cover a large display area (such as a long webpage). The augmented reality user service experience system can direct the user to a portion of the screen that is not currently being shown based on an assessment of the intent of the user. The user may be a user of a system or device and may include consumers or customers.


Additionally, in other embodiments, the bot can communicate with the screen session's website and reload the page entirely. This embodiment can occur in solutions where the bot experience does not control the complete session, but would need to assume control in order to deliver an enhanced user experience to the consumer. In order to assume control, the screen experience can include a browser plugin, be within a mobile app (where the browser is an overlay within the app), or be within an iframe of a browser where the iframe can communicate with the bot.


Traditional customer self-service systems do not combine augmented reality functionality with artificial intelligence applications. In some embodiments, the augmented reality user service experience system merges artificial intelligence and augmented reality to show users how to perform certain actions (for example, scrolling, click locations for the “buy” button), instead of simply being told what to do (for example, click on the “buy” button).


Currently there are no self-service methods where users are shown what actions to take. Even if another third party agent is involved, there is technical risk. Showing a third party agent personally identifiable information exposes sensitive information to that party and is therefore subject to data security issues: during transfer of the data to the third party agent, and/or at the third party agent's servers, data including sensitive information can be intercepted or hacked. Advantageously, embodiments of the augmented reality user service experience system improve the technical issue of data security by not having to expose the sensitive information to a third party agent. The augmented reality user service experience system can process the data internally on the two devices, such as the mobile phone, a laptop, an AR/VR device, a tablet, and/or the like. For the purposes of this disclosure, a mobile phone and a laptop will be described, but it is understood that other devices, alone or in combination, can be used. In some embodiments, the data transfer can be limited to transfer between the two devices to limit external third party data transfer. In other embodiments, no data may be transferred between the two devices other than the mobile device capturing (such as via a camera) visual data displayed by the other device, such that the mobile device may suggest to a user information for the user to manually enter via the other device (e.g., via a laptop displaying a website or other user interface).


Features are described for embodiments of an augmented reality user service experience system that uses artificial intelligence and/or a bot. In some embodiments, the augmented reality user service experience system provides a two-dimensional (2D) presentation layer that can enable electronic connection to or communications with a bot that can correspond with a webpage, an application (such as a mobile application), a mobile device, and/or the like.


Embodiments of the augmented reality user service experience system can enable a bot to identify a web page and/or web application on one device (such as a laptop) from data (such as an image) captured by a second device (such as a mobile phone). By doing so, the augmented reality user service experience system can deliver a new user experience wherein users can be shown particular steps to perform through an augmented reality display. Advantageously, the augmented reality user service experience system can guide the user to perform certain steps on the first device through the display on the second device, such as a series of clicks on the website to fix things, rather than simply being told what to look for. Embodiments of the augmented reality user service experience system can deliver a new paradigm of customer service where users can be provided highly relevant and specific information useful for the user's purpose. Moreover, the augmented reality user service experience system can provide service to the users while preserving their personally identifiable information (“PII”).


In some embodiments, the augmented reality user service experience system can include and/or embed anchors within the website or application. In some embodiments, the augmented reality user service experience system can read anchors embedded within a website or application. In some embodiments, a sensor (such as a camera) on a second device can capture an image of the anchors on a website, and the second device can process the image to determine where certain items on the webpage or application are located.
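The anchor-based localization described above can be illustrated with a short sketch: once the anchor's pixel position and apparent scale are recovered from the captured image, known page-layout offsets can be mapped into image coordinates. The layout table, item names, and scale convention here are hypothetical:

```python
# Hypothetical layout: offsets of page items relative to the anchor, in page units.
LAYOUT = {
    "credit_lock_toggle": (40, 220),
    "apply_button": (310, 90),
}

def locate_items(anchor_px, scale):
    """Map page-layout offsets into pixel coordinates of the captured image.

    anchor_px: (x, y) of the detected anchor in the captured image.
    scale: page-units-to-pixels factor estimated from the anchor's apparent size.
    """
    ax, ay = anchor_px
    return {name: (ax + dx * scale, ay + dy * scale)
            for name, (dx, dy) in LAYOUT.items()}
```

With the items located in image space, visual indicators can be drawn at those coordinates on the second device's display.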


In some embodiments, a smart phone running a mobile application can launch an AR bot. The user can send a request to the phone to find a particular setting or option for a webpage, such as a selectable button to apply for a loan. The bot can cause the mobile phone to display visual indicators that indicate points on the screen (or in the direction off screen to help navigation) where the setting or option is located. The visual indicators can include three dimensional renderings, such as an avatar that points in a certain direction. If the setting or controllable option is on a different screen, the visual indicators can indicate the page and/or indicate locations of navigation controls that can send the user to the correct page.


In some embodiments, a user can be browsing through a webpage, but may be having trouble figuring out how to lock his or her credit file, or revoke permission to share data with a third party. The user can ask the augmented reality user service experience system “Where is the setting or option to lock my credit?” The bot of the augmented reality user service experience system can indicate to the user to launch a particular mobile application on the user's phone and point the camera toward the user's laptop screen. After the mobile application is launched, the session on the mobile application can synchronize with the screen session on the user's laptop screen. The bot can identify that the data captured by the mobile phone camera and the website screen are synced as both are active at the same time. The bot can launch active content on a user interface of the phone and/or can scan the screen to determine how to process the request. The bot can cause the mobile device to display visual indicators that show the user where the setting or option for the credit lock/unlock is located.


Advantageously, in some embodiments, the augmented reality user service experience system can deliver a “Showing” experience to the user, displaying visual indicators on the mobile device that show the user how to perform certain functions on another device. Embodiments of the augmented reality user service experience system improve the technical problem of graphical user interfaces with limited user interface real estate. Moreover, the augmented reality user service experience system uses a specific and useful way to convey information through a computer interface.


Some embodiments include a method comprising: capturing, by a camera of a mobile phone, an image of a display screen of a computer, wherein the display screen displays a user interface presented by the computer, wherein the computer is a different device than the mobile phone and is physically separate from the mobile phone; identifying, by the mobile phone based on image analysis of the image captured by the mobile phone, an anchor within the user interface as depicted in the image of the display screen; determining one or more content items displayed within the user interface based at least in part on the anchor; identifying an intent of a user based on the determined one or more content items, wherein the intent relates to interaction by the user with the user interface presented by the computer; determining supplemental content associated with the intent, wherein the supplemental content comprises instructions or a recommendation regarding one or more of (a) interacting with the user interface via the computer or (b) information for the user to provide to the user interface via the computer; generating a three-dimensional rendering associated with the supplemental content; orienting the three-dimensional rendering relative to the anchor; and displaying, on the mobile phone, a modified image that places the oriented three-dimensional rendering in coordinates of the image that are determined based at least in part on a position of the anchor within the image.
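The claimed pipeline (capture, anchor detection, content identification, intent, supplemental content, rendering, overlay) can be sketched end-to-end. Every function and data structure below is an illustrative stand-in; real implementations of the image-analysis and rendering stages would be far more involved:

```python
from dataclasses import dataclass

@dataclass
class Anchor:
    position: tuple  # (x, y) of the anchor in image coordinates

# Stand-in stages; the image is modeled as a dict of pre-extracted features.
def detect_anchor(image):
    return Anchor(position=image["anchor_px"])

def identify_content_items(image, anchor):
    return image["visible_items"]

def identify_intent(items):
    # e.g., a user viewing a loan offer likely wants to apply for it
    return "apply_for_loan" if "loan_offer" in items else "browse"

# Supplemental content keyed by intent (hypothetical catalog).
SUPPLEMENTAL = {
    "apply_for_loan": "Point avatar at the 'Apply' button",
    "browse": "Idle avatar",
}

def ar_assist(image):
    anchor = detect_anchor(image)
    items = identify_content_items(image, anchor)
    intent = identify_intent(items)
    content = SUPPLEMENTAL[intent]
    # The rendering is oriented and placed relative to the anchor position.
    return {"overlay": content, "at": anchor.position}
```

The key structural point is that every downstream step (intent, supplemental content, overlay placement) is derived from the anchor and content items recovered from the captured image, as the method recites.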


In some embodiments, the three-dimensional rendering includes an avatar.


In some embodiments, the intent corresponds to a portion of a website that is not currently displayed on the screen of the computer, and wherein the avatar provides information directing the user to the portion of the website not currently displayed.


In some embodiments, the avatar provides information by pointing in a direction for the user to scroll on the website.


In some embodiments, the intent corresponds to a portion of a website that is currently displayed on the screen of the computer, and wherein the avatar points to the portion of the website that corresponds to the display.


Some embodiments include a system comprising: memory; and one or more processors configured by specific executable instructions to: initiate capture, by a camera of a mobile phone, of an image of a user interface displayed on a computer, wherein the computer is a different device than the mobile phone and is physically separate from the mobile phone; identify, based on image analysis of the image captured by the mobile phone, an anchor within the user interface as depicted in the image of the user interface displayed on the computer; determine one or more content items displayed within the user interface based at least in part on the anchor; identify an intent of a user based on the determined one or more content items; determine supplemental content associated with the intent; generate a three-dimensional rendering associated with the supplemental content; orient the three-dimensional rendering relative to the anchor; and display, on the mobile phone, a modified image that places the oriented three-dimensional rendering in coordinates of the image that are determined relative to the anchor.


In some embodiments, the executable instructions are implemented by a processing agent that is configured to perform processes across a plurality of devices, including the second device.


In some embodiments, identifying the intent of the user is further based on an audio snippet played from the computer and received by a microphone on the mobile device.


In some embodiments, the one or more processors are further configured to match the audio snippet with a pre-stored audio fingerprint, wherein the audio snippet comprises at least one of: a chime, audio played in the background, or audio of a certain frequency beyond the human range of hearing.
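A minimal sketch of matching a captured snippet against pre-stored audio fingerprints follows. Here the "fingerprint" is reduced to a single dominant frequency found with a naive DFT peak-pick; production systems would use a real spectral fingerprinting scheme. All names and the tolerance value are assumptions:

```python
import math

def dominant_frequency(samples, rate):
    """Naive DFT peak-pick: return the strongest frequency in the snippet (Hz)."""
    n = len(samples)
    best_f, best_mag = 0.0, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_mag, best_f = mag, k * rate / n
    return best_f

def match_fingerprint(samples, rate, known_tones, tolerance=50.0):
    """Match the snippet's dominant frequency against pre-stored anchor tones."""
    f = dominant_frequency(samples, rate)
    for name, tone in known_tones.items():
        if abs(f - tone) <= tolerance:
            return name
    return None
```

For example, a near-ultrasonic 19 kHz anchor tone sampled at 48 kHz would match a pre-stored 19 kHz fingerprint while remaining largely inaudible to the user.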


In some embodiments, the one or more processors are further configured to determine a distance between the mobile phone and the computer based on a time stamp for the mobile phone receiving the audio anchor.


In some embodiments, the one or more processors are further configured to determine a distance between the computer and the mobile phone based on a size of at least a portion of the image captured by the mobile phone.


In some embodiments, the one or more processors are further configured to determine a size of the three-dimensional rendering based on a size of at least a portion of the image captured by the mobile phone.


In some embodiments, the three-dimensional rendering includes an avatar.


In some embodiments, the intent corresponds to a portion of a website that is not currently displayed on the computer, and wherein the avatar provides information directing the user to the portion of the website not currently displayed.


In some embodiments, the intent corresponds to a portion of a website that is currently displayed on the computer, and wherein the avatar points to the portion of the website that corresponds to the display.


Some embodiments include a non-transitory computer storage medium storing computer-executable instructions that, when executed by a processor, cause the processor to perform operations comprising: initiating capture, by a camera of a mobile computing device, of an image of a user interface displayed on a computer, wherein the computer is a different device than the mobile computing device and is physically separate from the mobile computing device; identifying, based on image analysis of the image captured by the mobile computing device, an anchor within the user interface as depicted in the image of the user interface displayed on the computer; determining one or more content items displayed within the user interface based at least in part on the anchor; identifying an intent of a user based on the determined one or more content items; determining supplemental content associated with the intent; generating a three-dimensional rendering associated with the supplemental content; orienting the three-dimensional rendering relative to the anchor; and displaying, on the mobile computing device, a modified image that places the oriented three-dimensional rendering in coordinates of the image that are determined based at least in part on a position of the anchor.


In some embodiments, the operations further comprise receiving a user selection of an option corresponding to the three-dimensional rendering, wherein in response to the user selection, the computer changes the display to correspond to the user selection.


In some embodiments, the computer adjusts the size of the first anchor based on the size of a display for the mobile device.


In some embodiments, the supplemental content is specific to a particular website presented within the user interface, wherein the particular website is determined based at least in part on one or more of: the anchor or the one or more content items.


In some embodiments, the three-dimensional rendering includes an avatar, wherein the intent corresponds to a portion of a website that is not currently displayed on the computer, and wherein the avatar provides information directing the user to the portion of the website not currently displayed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a depiction of a computing device according to some embodiments.



FIG. 1B depicts two computing devices where one device captures data from another device, according to some embodiments.



FIG. 1C is a depiction of software and hardware components of a computing device, according to some embodiments.



FIG. 1D is a depiction of software and hardware components of multiple computing devices, according to some embodiments.



FIG. 1E is a depiction of a processing agent 185 across multiple computing devices 100a, 100b, 100c, according to some embodiments.



FIG. 2A is a flow diagram of the key states for a multiple device interactive experience according to some embodiments.



FIG. 2B illustrates parallel processing and linking of computing devices, according to some embodiments.



FIG. 3A illustrates a mobile phone camera taking a picture of a user interface displayed on a laptop, according to some embodiments.



FIG. 3B depicts a scenario where a mobile phone uses a sensor device (for example, a camera) whose view is displayed on the screen of the mobile phone, according to some embodiments.



FIG. 3C depicts examples of the three-dimensional representation of the bot interacting with the user, according to some embodiments.



FIG. 3D depicts visual reference anchors displayed on the user interface of a computing device for another computing device to identify, according to some embodiments.



FIG. 3E depicts virtual mapping between two computing devices, according to some embodiments.



FIGS. 4A and 4B depict a diagram of a computing device interacting with servers, according to some embodiments.



FIG. 4C depicts the flow of artificial intelligence bot functions, according to some embodiments.



FIG. 5 illustrates a block diagram of one embodiment of a computing system in communication with a network and various systems that may be used as one or more devices.



FIG. 6 illustrates an example method implemented by a processing agent of a second device according to some embodiments.





DETAILED DESCRIPTION

Computing Device



FIG. 1A is a depiction of a computing device 100a according to some embodiments. The device can include one or more processing components 104 and/or one or more memory components 106 that can be connected by one or more bus or communication components 102.


In some embodiments, these computing devices can include one or more storage components 140, one or more input components 108, one or more output components 110, one or more sensor components 120, and/or one or more network interfaces 150.


In some embodiments, the network interfaces can be connected via one or more electronic communication links 190 to a network 195. The computing device 100a can connect to other computing devices or servers 100b, 100c. In some embodiments, the network can include the Internet, and/or other system where devices are served by servers that include communication information. The electronic communication links 190 can be from a single device to another device.


In some embodiments, the components in the device 100a can be logical entities, and/or, in a physical representation, they may be stored or housed on the same physical chip in any combination. Such physical configurations can include “System-on-Chip” or SoC.


In some embodiments, a device 100a need not be an independent device. In the concept of shared resources, one or more logical devices may operate and share resources within a single physical device. One example of this is the concept of virtual devices.


Two Computing Devices



FIG. 1B depicts two computing devices 100a and 100b working together, according to some embodiments. These devices can communicate through a network 195, or in other embodiments may not communicate with each other aside from a visual and/or audio capture of the first computing device's output via the sensor component 121 of the second computing device. Device 100b can include one or more output components 111 that can be interpreted by the sensor component 121. One such example of this configuration can be a display as the output component 111 with a camera as the sensor component 121. Another such example can be a speaker as the output component 111 with a microphone as the sensor component 121. The examples of output components 111 and sensor components 121 need not be mutually exclusive; that is, a sensor 121 could detect many such output components 111 and/or a single output component 111 could service many different sensors 121.



FIG. 1B depicts two computing devices where one device captures data from another device, according to some embodiments. FIG. 1B depicts a system wherein many other devices and/or servers 100c can be connected to the network. These devices need not be present.


In some embodiments, a sensor component 121, such as a camera, of a first computing device 100a can detect a particular output, such as a user interface or audio, from an output component 111 of a second computing device 100b. The first computing device 100a can identify a location of a user interface and/or an intent of a user based on the detected output. For example, the first computing device 100a can determine the location of a website displayed on the second computing device 100b by matching user interface data captured by a camera. The first computing device 100a can identify an intent of the user by determining that the user is viewing a particular offer, such as a credit card, on a website.


In some embodiments, the first computing device 100a and/or the second computing device 100b communicates the retrieved data from the sensor and/or derived data, such as a location on a website or intent of the user, to another computing device 100c via a network 195. The other computing device 100c can include a computing device corresponding to a marketing company, a bank, or other third party that uses the data to provide targeted marketing to the computing device 100a.


Hardware and Software Components in Computing Devices



FIG. 1C is a depiction of software and hardware components of a computing device 100a, according to some embodiments. In some embodiments, FIG. 1C demonstrates another logical view of device 100a from FIG. 1A. In this view, a device 100a can include hardware components 149 and/or software components 199. In particular, the device 100a can include at least one network interface 150. The software components 199 can include one or more applications 180, which can include one or more sessions 170a1, 170a2, such as instances of run-time execution of software modules.


In some embodiments, the computing device 100a includes, for example, a personal computer that is IBM, Macintosh, or Linux/Unix compatible, or a server or workstation. In one embodiment, the computing device 100a comprises a server, a laptop computer, a smart phone, a personal digital assistant, a kiosk, or a media player, for example. In one embodiment, the exemplary computing device 100a includes one or more central processing units (“CPUs”), which may each include a conventional or proprietary microprocessor. The computing device 100a further includes one or more memory components, such as random access memory (“RAM”) for temporary storage of information, one or more read only memory (“ROM”) components for permanent storage of information, and one or more mass storage devices, such as a hard drive, diskette, solid state drive, or optical media storage device. Typically, the modules of the computing device 100a are connected to the computer using a standards-based bus system. In different embodiments, the standards-based bus system could be implemented in Peripheral Component Interconnect (“PCI”), Microchannel, Small Computer System Interface (“SCSI”), Industrial Standard Architecture (“ISA”) and Extended ISA (“EISA”) architectures, for example. In addition, the functionality provided for in the components and modules of computing device 100a may be combined into fewer components and modules or further separated into additional components and modules.


In some embodiments, the computing device 100a is generally controlled and coordinated by operating system software, such as Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Unix, Linux, SunOS, Solaris, iOS, Blackberry OS, or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the computing device 100a may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


In some embodiments, the exemplary computing device 100a may include one or more commonly available input/output (I/O) devices and interfaces, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces include one or more display devices, such as a monitor, that allows the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example. The computing device 100a may also include one or more multimedia devices, such as speakers, video cards, graphics accelerators, and microphones, for example.


In the embodiment of FIG. 1C, the I/O devices and interfaces provide a communication interface to various external devices. In the embodiment of FIG. 1C, the computing device 100a is electronically coupled to a network, which comprises one or more of a LAN, WAN, and/or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link. The network communicates with various computing devices and/or other electronic devices via wired or wireless communication links.


According to FIG. 1C, in some embodiments information may be provided to the computing device 100a over the network from one or more business location data sources which may include one or more internal and/or external data sources. In some embodiments, one or more of the databases or data sources may be implemented using a relational database, such as Sybase, Oracle, CodeBase, and Microsoft® SQL Server, as well as other types of databases such as, for example, a flat file database, an entity-relationship database, an object-oriented database, and/or a record-based database.


In the embodiment of FIG. 1C, the computing device 100a includes one or more applications 180, or modules. These and other modules in the computing device 100a may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. In the embodiment shown in FIG. 1C, the computing device 100a is configured to perform the various methods and/or processes for the system as described herein (such as the processes described with respect to FIGS. 2A, 6, and others herein).



FIG. 1D is a depiction of software and hardware components of multiple computing devices, according to some embodiments. FIG. 1D illustrates an embodiment of FIG. 1C that can correspond with FIG. 1B, according to some embodiments. In this scenario, the device 100a can include the sensor component 121 that is detecting the output component 111 of device 100b.


In particular to FIG. 1D, devices 100a and 100b can communicate through their network interfaces 150a and 150b, respectively, with a third device (such as, for example, a server or a webserver) 100c. This device 100c can include sessions corresponding to both devices 100a and 100b. In response to device 100a sensing the output from the output component 111 of device 100b via the sensor 121, the device 100a and/or the device 100b can communicate this information to an external device 100c via the network interfaces 150a, 150b, 150c.


In order for these sessions to work together, the webserver can choose one of several approaches. The webserver can keep the corresponding sessions 170a1 and 170b1 separate while acting as a relay to both, and/or it can sync the sessions as depicted by devices 100a and 100b both having the same session 170a5 running.


Processing Agent Across Computing Devices



FIG. 1E is a depiction of a processing agent 185 across multiple computing devices 100a, 100b, 100c, according to some embodiments. FIG. 1E depicts a more abstracted view of FIG. 1D according to some embodiments. In particular, the system can combine the software and hardware layers together into a system component level. In some embodiments, the computing devices 100a, 100b, 100c each have their own processing agents 184a, 184b, 184c (collectively referred to herein as processing agents 184) and/or a collective processing agent 185 across each computing device 100a, 100b, 100c. Although the present disclosure explains embodiments using the processing agent 184, it is understood that the processing agent 185 can be applied, where applicable, and vice versa.


In some embodiments, the processing agent 184 includes an agent that functions on each corresponding device, such as, processing agent 184a to device 100a, processing agent 184b to device 100b, and so forth. In one embodiment, processing agent 184 could be the central processing unit of the device. In some embodiments, the processing agent 184 can include or interface with a mobile application and may also be a webpage.


In another embodiment, this configuration can represent that the processing of the request occurs off the device, and the device merely hosts an agent that dispatches the processing to some other location before receiving the result. For example, a game can be hosted on another service, and the processing occurs on that other service. On the device, an agent may merely dispatch the user's commands to this other service.


In some embodiments, the processing agent 185 can represent a particular scenario where processing occurs on one or more devices with some combination of dispatching commands to other systems and processing commands on its own processing unit. Note that the processing agent 185 need not be on all devices as depicted; it can span any combination of one or more of the devices 100a, 100b, and 100c. An embodiment of the processing agent 185 can be a bot that operates via automated machine learning algorithms. When the user interacts with the bot, some processing can occur on the originating device 100a, which is then sent to be processed on the server 100c. In response to the server 100c finishing processing, it can send a response to the bot, which may then execute more processing before outputting to the user via output component 110a.


In embodiments where the processing agent 185 is an artificial intelligence bot, the bot may not be a true artificial intelligence. The bot may instead combine rules-based logic for outcomes with machine learning processing of the input. In this particular embodiment, the processing agent 185 AI bot differs from other implementations in part because it has the ability to control the output of other devices.


In some embodiments, the processing agent 185 can be designed and/or trained as a single neural network but perform operations across multiple computing devices 100a, 100b, 100c.


In some embodiments, the processing agent 185 can serve all three devices 100a, 100b, and 100c. In some embodiments, after the server processes the input from device 100a, it can instead send its response to devices 100a and 100b, one or more of which may process the output to display to the user.


In some embodiments, while an example embodiment has been discussed with two devices with one server, there is no limitation to the number of devices or the number of servers in the system.


In some embodiments, with respect to augmented reality, there may be a need to revisit the concept of the session 170. In a session, the client device 100a can have a session 170a1 that corresponds to a session 170a1 on a server 100c, and the session is tied to that user. When that same user creates a second session using 100b, a new session 170b1 can be created. With respect to the processing agent 185, these sessions may be independent, so they may not be able to communicate. Thus, these sessions may need to be linked for the processing agent 185 to function across all devices. In some embodiments, these sessions may be able to communicate and/or may already be linked.


In some embodiments, the server 100c can act as a trusted intermediary to communicate between both devices 100a and 100b. Alternatively, a new session 170a5 could be created on all devices that handles the linked session.


Key States for Multiple Device Interactive Experience



FIG. 2A is a flow diagram of the states for a multiple device interactive experience, according to some embodiments. Starting at block 201, a first device with a sensor captures data from a second device that is displaying or otherwise outputting elements the sensor can detect. The sensing device, such as the computing device 100a via its sensor 121, detects an output of the output device, such as the output component 111 of the computing device 100b. For example, the experience-containing elements can include an audio snippet or a visual anchor displayed on the user device.


In some embodiments, after the first device finishes sensing the elements on the second device, at block 203, the sessions between the first and second devices can be linked. For example, the first device can identify that the second device is displaying a certain portion of a webpage based on an image of the user display of the second device (e.g., a laptop) captured by the first device (e.g., a mobile phone).


In some embodiments, the first device can display a three dimensional rendering of information at block 205. The first device can determine an intent of the user based on what is being displayed on the second device. The first device can identify what is being displayed on the second device via the sensor on the first device. The three dimensional rendering of information can correspond to the user's intent. For example, the user's intent may be determined to be to interact with a certain user interface element, and the information may be an instruction or a recommendation regarding how to interact with the given user interface element.


In some embodiments, the first device can receive input from the user at block 220. The user can select an option that was displayed on the first device. For example, the website can include an offer for a credit card and an option to find ways to improve a user's credit score. The portion of the website displayed on the second device may show only the offer for a credit card. The first device can display a three dimensional rendering of a selectable option to go to the portion of the website that is not currently being displayed: the portion that displays ways to improve the credit score. The input can be a selection on the first device to improve the credit score.


At block 230, after this input is processed at the first device, the input can be sent to the processing agent. For example, information indicating that the user wants to improve his or her credit score can be sent to the processing agent.


At block 240, after the processing agent receives the input from the first device, it can transmit the output to another device, such as the second device or an external server. Then, the second device that was displaying the website can perform further actions, such as providing a shortcut to move to the corresponding portion of the website. The website can automatically scroll to the portion of interest.


In some embodiments, the input is identified by the first device interacting with one or more experience-containing elements via the sensor. For example, the first device can detect a different visual anchor that is placed on a different portion of the webpage, and the first device, upon identifying the anchor, can identify that the user has moved the webpage to that different portion.


Parallel Processing and Linking of Computing Devices



FIG. 2B illustrates parallel processing and linking of computing devices, according to some embodiments. FIG. 2B illustrates a process that can run or execute in parallel to the events in FIG. 2A, according to some embodiments. At least a portion of the events can be processed in parallel including from state 206 to states 208 or 213. The events in FIG. 2B can be processed prior to, in conjunction with, or after any of the processes in FIG. 2A (such as prior to state 240), and/or vice versa.


In state 206, a device, such as the first device 100a, can detect a signature (such as an anchor) with a sensor from an output device, such as device 100b. The sensor can be a camera capturing an image of an output, such as a signature, from a display from the output device.


In some embodiments, the device 100a can create a virtual representation of the experience of the session, such as, for example, session 170b1. In state 207, the first device 100a can determine a distance mapping to the second device 100b. The first device 100a can determine a distance from the first device 100a to the second device 100b. In some embodiments, the first device 100a can determine the distance to the second device 100b based on an assessment of an image captured by the first device 100a. For example, the first device 100a can assess the size of a portion of the user interface of the second device 100b that appears in the captured image. The size determined by the first device 100a can be the size of a visual anchor displayed on the user interface, the size of text or a font, and/or the size of an image or video. This mapping can be created using information obtained from the output of output component 111.


In some embodiments, distance between the devices can be determined based on pixels on the first device 100a. The first device 100a can identify the area of the entire screen (or at least a portion thereof) that is rendered on the second device 100b. The image of the screen on the first device 100a can include a pixel length and/or a pixel width that can be used to calculate a ratio or relation to the actual screen area on the second device 100b.
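The pixel-based distance estimate described above can be sketched with a simple pinhole-camera model. All values below (the focal length, the anchor's physical size, and the measured pixel span) are hypothetical assumptions for illustration, not values from the disclosure:

```python
# Minimal sketch: estimating device distance from the apparent size of an
# on-screen anchor, using a pinhole-camera model. FOCAL_LENGTH_PX and
# ANCHOR_SIZE_CM are assumed, illustrative constants.

FOCAL_LENGTH_PX = 1000.0   # assumed camera focal length, in pixels
ANCHOR_SIZE_CM = 2.0       # assumed physical size of the anchor on the laptop screen

def estimate_distance_cm(apparent_size_px: float) -> float:
    """Pinhole model: distance = focal_length * real_size / apparent_size."""
    return FOCAL_LENGTH_PX * ANCHOR_SIZE_CM / apparent_size_px

# If the 2 cm anchor spans 100 px in the captured image,
# the laptop is roughly 20 cm from the phone camera.
print(estimate_distance_cm(100.0))
```

In practice the focal length would come from camera calibration, and the anchor's physical size from the resolution and screen dimensions reported by the second device.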


In some embodiments, an audio snippet can be used to determine the distance. For example, the second device can play an audio snippet. The first device can determine when it received the audio snippet and derive a distance to the second device based on the speed of sound through air.
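The audio-based approach above amounts to a one-way time-of-flight calculation. The sketch below assumes the second device shares the emission timestamp over the network and that the two device clocks are synchronized; names and values are illustrative:

```python
# Hedged sketch: one-way audio time-of-flight distance estimation.
# Assumes the emitting device reports its emission timestamp and the
# clocks of the two devices are synchronized.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 C

def distance_from_audio_m(emitted_at_s: float, received_at_s: float) -> float:
    """Distance = propagation delay * speed of sound."""
    return (received_at_s - emitted_at_s) * SPEED_OF_SOUND_M_S

# A snippet received 2 ms after emission implies roughly 0.686 m.
print(distance_from_audio_m(0.000, 0.002))
```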


In state 208, the first device 100a can generate an overlay model with anchors. The first device 100a can display a live feed from the camera with a three dimensional (3D) rendering of information. For example, the system can determine that the intent of the user is to sign up for a credit card. However, the website displayed on the second device 100b is displaying something else; another portion of the website that is not being displayed has the credit card information. The system may determine that the currently displayed portion of the website is not displaying the credit card information based on a previously stored association of anchors for the particular website with corresponding user interface elements that appear near the anchors on the website (e.g., the anchors may have been placed on the page by a developer or author of the page who desired that the page be usable with the visual augmentation features described herein). The first device 100a can then display a three dimensional rendering overlaid over its current camera view, where the three dimensional rendering redirects the user to the credit card information (such as by pointing in a scroll direction or textually describing how to access the information). After the augmented mapping of the session is created, anchor points can be added to the model in order to represent the potential areas of interest within the model (state 208).


In some embodiments, the overlay model can include the anchors to be displayed on the first device 100a. The anchors can include visual anchors, such as one or more combination of pixels to be displayed on the first device 100a for the second device 100b to process and identify. The anchors can include audio anchors, such as a background sound to be played on the website. In some embodiments, the audio anchors can be played in the background of other audio already being played on the website, such as audio from a video stream.


In some embodiments, the size of the overlay model to be displayed on the first device 100a can be based on the image of the screen for the second device 100b captured by the first device 100a in relation to the actual screen size of the second device 100b. For example, a certain portion of a website displayed on the second device 100b can be 100×100 pixels, but the first device 100a captures a 10×10 pixel image of the portion. Then, there is a ratio of 1-to-10.
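The ratio-based sizing described above can be sketched as follows, reusing the illustrative 100x100 versus 10x10 pixel example from the text:

```python
# Sketch of overlay scaling: the overlay is sized by the ratio between
# the captured image of a screen region and that region's actual pixel
# size on the second device.

def overlay_scale(captured_px: int, actual_px: int) -> float:
    """Linear scale factor to apply to overlay geometry."""
    return captured_px / actual_px

scale = overlay_scale(captured_px=10, actual_px=100)   # the 1-to-10 ratio
# A 3D element authored at 50 px for the laptop screen would be drawn
# proportionally smaller in the phone's camera view.
print(50 * scale)
```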


In some embodiments, the size of the overlay model does not change once it is loaded (unless there's an explicit refresh). In other embodiments, the size of the overlay model continuously changes with the ratio continuously being recalculated, such as based on a continuous stream of images or video stream.


In some embodiments, anchors can be used by the first device 100a to determine the resolution of the second device 100b and/or the aspect ratio of the screen of the second device 100b. One or more of a first type of anchor can be used to determine resolution, such as different audio frequencies or different visual anchors or patterns. One or more of a second type of anchor can be used to determine an aspect ratio based on the relation between pixels.


In some embodiments, anchors can be used to determine an angle that the first device 100a is viewing the second device 100b. For example, if pixel 1 is displayed directly above pixel 2 on the second device 100b, but if on the first device 100a pixel 1 is displayed one pixel above and one pixel to the right, the first device 100a can determine that the first device 100a is viewing the user interface of the second device 100b at an angle.
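The angle inference above can be sketched with basic trigonometry: two anchor pixels known to be vertically aligned on the second device appear horizontally offset in the captured image, and the offset implies an oblique viewing angle. The function and values are illustrative:

```python
# Illustrative sketch: inferring a viewing angle from the displacement of
# two anchor pixels that are known to be vertically aligned on the
# second device's screen.

import math

def viewing_angle_deg(dx_px: float, dy_px: float) -> float:
    """Angle between the observed anchor axis and the expected vertical."""
    return math.degrees(math.atan2(dx_px, dy_px))

# Pixel 1 appears one pixel above and one to the right of pixel 2,
# instead of directly above it: a 45 degree skew.
print(viewing_angle_deg(dx_px=1.0, dy_px=1.0))
```

A full implementation would estimate the complete camera pose from several anchors rather than a single axis.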


In some embodiments, the size and/or orientation of the overlay model can be based on one or more factors, such as the resolution, the aspect ratio, and/or the viewing angle. In state 212, the signature captured by the first device 100a can be sent to an external system, such as a processing server 100c. In state 213, the processing server can link the sessions or create new sessions, such as linking between the first and second devices. Accordingly, the processing server can link the processing agents across the first and second devices to work together. Advantageously, an action performed by the user on one of the devices can take effect on what is being displayed on the other device. For example, the user can select an option displayed by the three dimensional rendering on the first device, where the first device displays the three dimensional rendering overlaid on top of the user display of the second device. When the user selects the option, the selection can be sent from the first device to the second device and/or to a processing server that can cause the second device to change the display of the website. For example, the three dimensional rendering on the first device can provide an option to navigate to a portion of the website that is not currently being displayed. Upon selection, the second device can automatically move the display of the website to that portion.


In some embodiments, a user can use the user device 100a to control the device 100b, for example, if the user owns or has access to or control over both devices. In these embodiments, linking the sessions can be done utilizing the same signature information determined in state 206. After the sessions are linked in state 213, the processing agent 185 can act upon the sessions.
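The server-side session linking in states 206 and 213 can be sketched as a lookup keyed by the sensed signature: sessions that report the same signature are linked to each other. The data structures and identifiers below are illustrative, not from the disclosure:

```python
# Hedged sketch of server-side session linking keyed by a sensed
# signature (e.g., a visual anchor). Names are illustrative.

class SessionLinker:
    def __init__(self):
        self._by_signature = {}   # signature -> list of session ids
        self._links = {}          # session id -> set of linked session ids

    def register(self, signature: str, session_id: str) -> None:
        peers = self._by_signature.setdefault(signature, [])
        # Link the new session with every session sharing the signature.
        for peer in peers:
            self._links.setdefault(peer, set()).add(session_id)
            self._links.setdefault(session_id, set()).add(peer)
        peers.append(session_id)

    def linked(self, session_id: str):
        return sorted(self._links.get(session_id, set()))

linker = SessionLinker()
linker.register("anchor-sig-42", "170a1")   # phone session senses the signature
linker.register("anchor-sig-42", "170b1")   # laptop session shares it
print(linker.linked("170a1"))               # -> ['170b1']
```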


Mobile Phone Camera and Laptop Display



FIG. 3A illustrates a mobile phone camera taking a picture of a user interface displayed on a laptop, according to some embodiments. In some embodiments, FIG. 3A depicts a phone 300a (e.g., corresponding to computing device 100a) that is linking sessions with a computer 300b (e.g., corresponding to computing device 100b) that displays content from the processing server. The phone 300a can take a picture (or continuous video) of a user interface 304 currently being displayed on the laptop 300b via a camera 302, and the phone 300a can identify what is being displayed on the laptop 300b.


In some embodiments, the phone 300a can identify an intent of the user based on an assessment of the image or video captured by the camera. For example, the user interface 304 can be displaying a loan offer 306, a car 308, and text. The phone 300a (and/or an external server that the phone sends the image to) can process the image to identify the loan offer 306, the car 308, and/or the text. The processing can include a mapping to known websites, such as a pixel-by-pixel (or combination-of-pixels) comparison or an artificial intelligence model, to determine whether the website matches a known website in a database. In some embodiments, the image can be processed through an artificial intelligence model trained to detect objects in images. In some embodiments, the text can be assessed to determine an intent. The text can describe features of the car. In this example, the intent can be identified as the user desiring to purchase a car and qualify for an automobile loan.
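The final mapping from detected page content to an intent can be sketched as follows. A production system would use the trained models described above; here a simple keyword lookup stands in for that step, and the intent names and keyword sets are invented for illustration:

```python
# Simplified, assumed sketch: mapping terms detected in the captured user
# interface to a user intent. Keyword sets and intent labels are
# illustrative stand-ins for a trained model.

INTENT_KEYWORDS = {
    "auto_loan": {"car", "loan", "apr", "financing"},
    "credit_card": {"credit", "card", "rewards", "travel"},
}

def infer_intent(detected_terms):
    """Return the intent whose keyword set best overlaps the detected terms."""
    scores = {
        intent: len(keywords & detected_terms)
        for intent, keywords in INTENT_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

# Terms extracted from the loan offer, car image, and text in FIG. 3A.
print(infer_intent({"car", "loan", "sedan"}))   # -> auto_loan
```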


The phone 300a can generate and display a three dimensional rendering of additional (or supplemental) information overlaid over the user interface. Upon a user selection of an option displayed on the phone 300a, the laptop can change what is being displayed. The phone 300a can send a signal back to the laptop, e.g., via Wi-Fi, Bluetooth, or other network communication, to change the display. In some embodiments, upon a selection of an option, the phone 300a can display an updated 3D rendering of information that instructs the user how to get to a certain feature, such as scrolling down the screen of a webpage.



FIG. 3B depicts a scenario where a mobile phone uses a sensor device (for example, a camera) whose view is displayed on the screen of the mobile phone, according to some embodiments. Additionally, a processing agent bot 185 in the form of an augmented reality character 385 can be displayed on the screen 315a of the mobile phone 300a. Device 300b can be a laptop with display 315b.


In some embodiments, section 316 can display the entire viewable area of the content that was created by the processing server 100c, with the view currently centered at the box 315b (which corresponds to the laptop). In some embodiments, section 316 is displayed on the mobile phone and/or the laptop for ease of navigation throughout the webpage.


In some embodiments, the laptop can display anchors, such as anchors 360a and 360b, which can represent anchors in the context of the site. These anchors 360a and 360b can represent the anchor points that correspond to the outcomes of the states in the processing agent 185. By mapping these possibilities to the anchor points, the processing agent 185 can initiate display of a three dimensional rendering that helps the user solve their problem within the experience.



FIG. 3C depicts examples of the three dimensional representation of the bot 385 interacting with the user, according to some embodiments. The bots 385a, 385b, 385c, 385d (collectively referred to herein as bot 385) can direct the user to another portion of the webpage. For example, the bot 385a can indicate that the content the user is looking for, corresponding to anchor 360a, is in a portion of the website above what is currently being displayed. The bot 385a can point upward, indicating that the user should scroll up.


In some embodiments, the bot 385c depiction can correspond to anchor 360b, which is below the content currently being displayed. If the mobile phone determines that the intent is for the content corresponding to anchor 360b, the bot 385c can direct the user to scroll down the webpage. The depiction of bot 385b illustrates the bot inviting the user to move his or her mouse or click the anchor on the screen. In one embodiment, the bot does not move position; in another embodiment, the bot can move to show exactly where the anchor is.


Visual Reference Anchors



FIG. 3D depicts visual reference anchors displayed on the user interface of a computing device for another computing device to identify, according to some embodiments. In some embodiments, FIG. 3D depicts an embodiment where the reference anchors 361a, 361b, 361c, 361d, 361e, 361f, 361g, and 361h are unique beacons or visual combinations of pixels embedded within the experience 316. The number of reference anchors 361 can vary, such as from 361a to 361h in FIG. 3D. In some embodiments, the anchors can be different from each other. For example, anchor 361a can be smaller than or a different shape from anchor 361b. In some embodiments, at least a subset of anchors can be placed along an axis, such as a horizontal, vertical, or diagonal axis.


In some embodiments, the first computing device 300b can be displaying only a portion of the text. As shown in FIG. 3D, the first computing device 300b can display the text portion 315b and reference anchors 361c, 361d, 361e. The second computing device 300a can identify the reference anchors 361c, 361d, 361e being displayed by the first computing device 300b and determine that the first computing device 300b is on a certain portion of the website and/or that the text 315b is being displayed.
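Inferring which portion of the page is on screen from the subset of anchors the camera detects can be sketched as a lookup over known anchor positions. The anchor layout below is a made-up example, not the one depicted in FIG. 3D:

```python
# Hedged sketch: inferring the visible page region from the subset of
# reference anchors currently detected. The vertical anchor positions
# are illustrative assumptions.

ANCHOR_PAGE_Y = {            # vertical page position (px) of each anchor
    "361a": 0, "361b": 400, "361c": 800,
    "361d": 1200, "361e": 1600, "361f": 2000,
}

def visible_page_range(detected_anchors):
    """Return the (min, max) page offsets spanned by the detected anchors."""
    ys = [ANCHOR_PAGE_Y[a] for a in detected_anchors]
    return min(ys), max(ys)

# Seeing 361c, 361d, 361e implies the viewport covers roughly 800-1600 px.
print(visible_page_range({"361c", "361d", "361e"}))
```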


In some embodiments, the number, size, and/or type of anchors can vary depending on the size of the screen 316. For example, for a smaller screen, smaller visual anchors can be used, or a smaller collection of anchors can be used than for a larger screen. Advantageously, the system can vary the application of anchors to adjust for limited user interface real estate. This is a technical improvement on user interface technology as user interface sizes are becoming smaller.


In one embodiment, these beacons or anchors can be visual graphics embedded within the user interface or page displayed on a screen. Due to some properties of these graphics, the sensor may be able to determine the relative positions of objects, infer where the anchor points are in the scene, and/or determine which user was involved in the session. For example, an anchor can be placed near or around an image or text such that the computing system with the sensor can associate the anchor with the image or text. Such location information can be used to identify an intent of a user, such as whether the user viewed a credit card offer for an extended period of time, highlighted certain text, or clicked certain options.


An anchor may be a particular icon, graphic, shape/color combination, pattern, and/or other visual content that may be included within the page at a particular position to enable a device viewing the page to determine a relative location within the page that is currently being displayed on screen, such as by the particular anchor's appearance being globally unique to that anchor position. Another embodiment of an anchor 361 can be a QR code or other two-dimensional graphical code or barcode. In the case of the QR code, only a single anchor 361 may be required per session, but others could be used.


Alternatively, if the sensor includes a microphone and the output includes a speaker, then these reference anchors can include one or more sounds, such as sound clips or audio at certain frequencies. The frequency can be beyond the human range of hearing so that the anchor is hidden from the user. Each frequency can be designated to a certain function or portion of the webpage. For example, an audio clip with a frequency of 30 hertz may correspond to a first advertisement, whereas a frequency of 35 hertz may correspond to a second advertisement.
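The frequency-to-content designation above can be sketched as a nearest-match lookup that tolerates small measurement error in the detected tone. The frequency table, labels, and tolerance are illustrative assumptions:

```python
# Sketch (assumed values): matching a detected dominant tone frequency
# to the content it designates, tolerating small measurement error.

FREQUENCY_MAP_HZ = {
    30.0: "advertisement-1",
    35.0: "advertisement-2",
    19000.0: "page-section-offers",   # high-frequency, less audible tone
}

def content_for_frequency(detected_hz, tolerance_hz=1.0):
    """Return the designated content for the nearest known frequency."""
    nearest = min(FREQUENCY_MAP_HZ, key=lambda f: abs(f - detected_hz))
    if abs(nearest - detected_hz) <= tolerance_hz:
        return FREQUENCY_MAP_HZ[nearest]
    return None

print(content_for_frequency(30.4))   # -> advertisement-1
```

In a real system the dominant frequency would first be extracted from the microphone signal, e.g., via a Fourier transform.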


In some embodiments including three dimensional ("3D") experiences, one option for controlling the experience is via the three dimensional rendering of the bot 385. The user input can directly control the display that displays the bot 385 on a first computing device via inputs to a second computing device. In this scenario, the sessions 170 of the first and second computing devices can be linked by the external server.


Virtual Mapping Between Two Computing Devices



FIG. 3E depicts virtual mapping between two computing devices, according to some embodiments. In some embodiments, FIG. 3E depicts a virtual mapping of the scenario from FIG. 3D. In this embodiment, the positions of the reference anchors can be used to create a 3D xyz coordinate system. As discussed herein, the distance between the devices can be determined based on resolutions and/or aspect ratios. The mobile device can then calculate a mapping of the number of pixels displayed on the user device for at least a portion of the laptop screen to the number of pixels actually displayed on the laptop screen for the same portion of the website. For example, if the anchor is 40 pixels on the laptop and the phone sees 10 pixels of the anchor, then the mapping is 4:1. The distance can be determined based on this mapping.


In some embodiments, a mapping can be determined based on pixel distance between anchors. For example, suppose pixels A and B are 500 pixels apart on the laptop, and the mobile phone determines that the text of interest is midway between them (at the 250 pixel mark). If the image the mobile phone captures shows a distance of 100 pixels between pixels A and B, the mobile phone can generate a three dimensional representation (e.g., an avatar) that points to the 50 pixel mark between A and B on the mobile phone screen.
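The interpolation just described is a proportional mapping between the page's coordinate space and the captured image's coordinate space, sketched below with the same example numbers:

```python
# Sketch of the anchor interpolation: a point of interest at a known page
# offset between two anchors is mapped into the phone image, where the
# same two anchors appear a different number of pixels apart.

def map_to_image(page_offset_px, page_span_px, image_span_px):
    """Proportionally map a position between two anchors into the image."""
    return page_offset_px * image_span_px / page_span_px

# 250 of 500 page pixels -> 50 of 100 image pixels: the avatar points there.
print(map_to_image(250, 500, 100))
```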


In some embodiments, once this mapping is created, it can still be possible to deploy an actual depiction of the bot 385 in the model to interact even without the anchors. As an alternative, in some embodiments, an interpolation among the reference points may be used to calculate where the anchors should be, and the bot 385 may be directed to those points.


Computing Device Communicating with Servers



FIGS. 4A and 4B depict a diagram of a computing device interacting with servers, according to some embodiments. In some embodiments, FIGS. 4A and 4B depict a diagram of a device 100a interacting with a server 100c, which further offloads processing to another server 100d, according to some embodiments. Such offloading can be initiated by the processing agents 185.


In some embodiments, within 100c, the processing agent 185 can include a combination of rules-based logic with machine learning algorithms or artificial intelligence models. The bot 410 can comprise one or more rules called intents 411a, 411b, and those intents may include one or more slots 411a1, 411a2, 411b1, 411b2. Each intent may also include a fulfillment function 418a, 418b that handles the final output after the slots are satisfied or when a determined condition is met.
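The intent/slot/fulfillment structure described above can be sketched as follows. The class and field names are illustrative, not from the disclosure, and a real fulfillment function 418 would perform an action rather than return a string:

```python
# Hedged structural sketch of the bot 410: intents with slots and a
# fulfillment step that runs once the slots are satisfied.

from dataclasses import dataclass, field

@dataclass
class Intent:
    name: str
    slots: dict = field(default_factory=dict)   # slot name -> value or None

    def fulfilled(self) -> bool:
        """True when every slot has been filled."""
        return all(v is not None for v in self.slots.values())

    def fulfill(self) -> str:
        # Stand-in for a fulfillment function such as 418a/418b.
        return f"fulfilled {self.name} with {self.slots}"

intent = Intent("auto_loan", {"vehicle_price": None, "credit_tier": None})
intent.slots["vehicle_price"] = 25000
print(intent.fulfilled())                 # one slot still open
intent.slots["credit_tier"] = "prime"
print(intent.fulfill())
```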


Additionally, in some embodiments, the bot 410 can include one or more function handlers 415a, 415b, and so forth. In some embodiments, the handler 415a need not correlate to intent 411a, the handler 415 may handle one or more intents 411, and/or multiple handlers 415 may be used to service a single intent 411.


In some embodiments, the handlers 415 can comprise at least a part of the intent 411a, the slots 411a1, and/or the fulfillment 418a. A single handler could be used for all of these, or there could be a separate handler for each.


In some embodiments, the software within the handlers 415 can be executed to initiate connecting to, or performing steps in, a learning system 450. An instance of 100c may include or be in electronic communication with a system that includes one or more learning systems 450a, 450b, and so forth. In some embodiments, the mapping of 450a need not have any relation to either the function handler 415a or the intent 411a. A single learning system 450 could handle one or more function handlers 415, or multiple learning systems 450 may be used to handle a single function handler 415.


In some embodiments, a learning system 450 may include one or more real time handlers for in-session data 451 as well as a continuous learning system 452, sometimes referred to as or including a model.


In some embodiments, a single server 100c may include one or more bots 410.


In some embodiments, a function handler 415 may call a remote learning system 455 stored in a cloud server 100d. These remote learning systems 455 can have the same or similar characteristics as the learning system 450, and/or may be called to handle a certain use case. For example, the remote learning system 455 may be a voice-to-text processing framework, while the learning system 450 includes the session-specific knowledge for how to handle the input after it has been changed, converted, or translated to text. A single function handler 415 can instantiate any combination of requests to local learning systems 450 or remote learning systems 455 in order to satisfy the request.


Artificial Intelligence Bot Functions



FIG. 4C depicts the flow of artificial intelligence bot functions, according to some embodiments. After receiving a first input 460, the artificial intelligence bot can process the input locally 471 or remotely 475. Locally, the artificial intelligence bot can process the input through processes stored internal to the device and/or through layers of a neural network. Remotely, the artificial intelligence bot can send the input to an external device or server for that external device or server to process the input data. The input can be an initial input or an intermediary input received by the bot. In some embodiments, local processing can include the processing that occurs within the server 100c; it may or may not involve either a function handler 415 or a learning system 450.


In some embodiments, the artificial intelligence bot can map the input to an intent of the user at 461. The artificial intelligence bot can map the input with the intent of the user based on an assessment of the input. The input can include an image taken by the second computing device of a user interface of the first device. The image can be assessed to identify objects, text, video, pictures, and/or anchors that can provide contextual information that is being displayed on the website. The contextual information can provide information related to the user's intent. For example, if the user interface is displaying text related to a car purchase, a video and image of a car, and an anchor corresponding to a loan application, the artificial intelligence bot can determine that the intent is that the user is looking to purchase a car.


In some embodiments, the input can include an audio anchor played by the second computing device and captured by the first computing device. The artificial intelligence bot can match the audio snippet to audio stored in a database that corresponds to particular interests and intents of the user.


If the input satisfies the slots of the intent, or when a determined condition is met, then the bot can fulfill the request 480. The artificial intelligence bot can be programmed and/or trained to automatically perform a certain action in response to a determination of a certain intent and/or a determined intent that meets a certain threshold. For example, if the artificial intelligence bot determines that the user is looking for a credit card related to travel awards, and/or that the user is 90% likely looking for such a credit card based on contextual information identified in the user interface of the first device, the second device can send a command to the first device to display a different part of a website, or a different website, with credit card offers related to travel awards.


If the slots are not fulfilled, then the artificial intelligence bot may prompt for or request further inputs 462 from the user. The further inputs can be processed, for example, in some combination of local 472 and remote 476 processing.


After the processing completes, then a mapping to the slot can occur at 463. If there are more slots to resolve, then the process can repeat from 462 to 463 until all slots are satisfied or until a determined condition is met. When all slots are satisfied or the determined condition is met, the bot can fulfill the request 480. If the artificial intelligence bot cannot determine an intent of the user, the artificial intelligence bot can display a three dimensional rendering on the second device requesting more information, such as selecting from a list of options (e.g., credit cards, loans, car purchases). The artificial intelligence bot can receive a selection of one or more of the options and determine an intent. If the user selects car and loan, the artificial intelligence bot can determine that the user is looking for an automobile loan for the purchase of a car.
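The slot-resolution loop described above (repeating steps 462 and 463 until the slots of the intent are satisfied) could be sketched as follows. The slot names, the prompting mechanism, and the return value are illustrative assumptions.

```python
# Minimal sketch of the slot-filling loop (steps 462 -> 463): map each
# received input onto a slot of the intent, and prompt for further
# inputs until every slot is filled, at which point the request can be
# fulfilled (step 480).

def fulfill_intent(required_slots, initial_input, prompt_user):
    """Fill each required slot from user inputs; return the completed
    slot mapping, ready for the fulfillment handler."""
    slots = {name: None for name in required_slots}
    pending = dict(initial_input)
    while True:
        # Step 463: map each received input onto its slot.
        for name, value in pending.items():
            if name in slots:
                slots[name] = value
        unfilled = [name for name, value in slots.items() if value is None]
        if not unfilled:
            return slots  # all slots satisfied -> fulfill the request (480)
        # Step 462: prompt for or request further inputs.
        pending = prompt_user(unfilled)
```

A `prompt_user` callable stands in here for whatever mechanism the bot uses to request further input (for example, the three dimensional rendering presenting a list of options).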


For fulfillment 480, the fulfillment handler 418 can be utilized, which could handle the fulfillment in any combination of local or remote fulfillment. After the fulfillment is completed, the state can revert to 460 to await another first input.


Note that a single bot 410 can handle many concurrent instances of the process depicted in FIG. 4C. These instances can map to a session such as 170a1 on the device 100a.


For these sessions to be served in such a manner, some presentation components 430 may be utilized. For example, if a represented figure of a bot 385 were created, there would be some configuration within the presentation components 430.


In FIG. 4C, the devices 100c and 100d can differ in that the devices 100a and 100c may be within a "controlled experience," in the sense that some authentication is used to protect those transactions. For example, the 100d server can authenticate with device 100c (but not device 100a). As another example, the device 100e can interact with device 100c without being authenticated.


Computing System Embodiments



FIG. 5 illustrates a block diagram of one embodiment of a computing system in communication with a network and various systems that may be used as one or more devices. In some embodiments, any of the devices, systems, servers, or components referenced herein may take the form of the computing system 502 shown in FIG. 5. The exemplary computing system 502 includes a central processing unit ("CPU") 510, which may include one or more conventional microprocessors that comprise hardware circuitry configured to read computer-executable instructions and to cause portions of the hardware circuitry to perform operations specifically defined by the circuitry. The computing system 502 may also include a memory 512, such as random access memory ("RAM") for temporary storage of information and read only memory ("ROM") for permanent storage of information, which may store some or all of the computer-executable instructions prior to their being communicated to the processor for execution. The computing system may also include one or more mass storage devices 504, such as a hard drive, diskette, CD-ROM drive, DVD-ROM drive, or optical media storage device, that may store the computer-executable instructions for relatively long periods, including, for example, when the computer system is turned off. Typically, the modules of the computing system are connected using a standards-based bus system. In different embodiments, the standards-based bus system could be Peripheral Component Interconnect ("PCI"), Microchannel, Small Computer System Interface ("SCSI"), Industrial Standard Architecture ("ISA"), or Extended ISA ("EISA") architectures, for example. In addition, the functionality provided for in the components and modules of the computing system may be combined into fewer components and modules or further separated into additional components and modules.
The illustrated structure of the computing system 502 may also be used to implement other computing components and systems described in the disclosure. It is recognized that the components discussed herein may be implemented as different types of components. For example, a server may be implemented as a module executing on a computing device, a mainframe may be implemented on a non-mainframe server, a server or other computing device may be implemented using two or more computing devices, and/or various components could be implemented using a single computing device.


Also, it is recognized that a variety of embodiments may be used and that some of the blocks in FIG. 5 may be combined, separated into sub-blocks, and rearranged to run in a different order and/or in parallel.


In one embodiment, the computing system 502 is a server, a workstation, a mainframe, or a minicomputer. In other embodiments, the system may be a personal computer that is IBM, Macintosh, or Linux/Unix compatible, a laptop computer, a tablet, a handheld device, a mobile phone, a smart phone, a smart watch, a personal digital assistant, a car system, or other user device. Servers may include a variety of servers such as database servers (for example, Oracle, DB2, Informix, Microsoft SQL Server, MySQL, or Ingres), application servers, data loader servers, or web servers. In addition, the servers may run a variety of software for data visualization, distributed file systems, distributed processing, web portals, enterprise workflow, form management, and so forth.


The computing system 502 may be generally controlled and coordinated by operating system software, such as Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Windows 7, Windows 8, Windows 10, Unix, Linux, SunOS, Solaris, Maemo, MeeGo, BlackBerry Tablet OS, Android, webOS, Sugar, Symbian OS, MAC OS X, or iOS, or other operating systems. In other embodiments, the computing system 502 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface ("GUI"), among other things.


The exemplary computing system 502 includes one or more commonly available input/output ("I/O") devices and interfaces 508, such as a keyboard, mouse, touchpad, speaker, microphone, or printer. In one embodiment, the I/O devices and interfaces 508 include one or more display devices, such as a touchscreen, display, or monitor, which allow the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example. The central processing unit 510 may be in communication with a display device that is configured to perform some of the functions defined by the computer-executable instructions. For example, some of the computer-executable instructions may define the operation of displaying, to a display device, an image that is like one of the screen shots included in this application. The computing system may also include one or more multimedia devices 506, such as speakers, video cards, graphics accelerators, and microphones, for example. A skilled artisan would appreciate that, in light of this disclosure, a system, including all hardware components, such as the central processing unit 510, display device, memory 512, and mass storage device 504 that are necessary to perform the operations illustrated in this application, is within the scope of the disclosure.


In the embodiment of FIG. 5, the I/O devices and interfaces provide a communication interface to various external devices and systems. The computing system may be electronically coupled to a network 518, which comprises one or more of a LAN, WAN, the Internet, or cloud computing networks, for example, via a wired, wireless, or combination of wired and wireless, communication links. The network 518 communicates with various other systems 520, as well as various data sources 522, via wired or wireless communication links.


Information may be provided to the computing system 502 over the network from one or more data sources. The network may communicate with other data sources or other computing devices such as a third party survey provider system or database, for example. The data sources may include one or more internal or external data sources. In some embodiments, one or more of the databases or data sources may be implemented using a relational database, such as Sybase, Oracle, CodeBase and Microsoft® SQL Server as well as other types of databases such as, for example, a flat file database, an entity-relationship database, a no-SQL database, object-oriented database, or a record-based database.


In the embodiment of FIG. 5, the computing system 502 also includes a subsystem module 514, which may be executed by the CPU 510, to run one or more of the processes discussed herein. This system may include, by way of example, components, such as software components, object-oriented software components, class components, task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, or variables. In one embodiment, the subsystem module 514 may include one or more of the modules shown in the other figures.


Embodiments can be implemented such that all functions illustrated herein are performed on a single device, while other embodiments can be implemented in a distributed environment in which the functions are collectively performed on two or more devices that are in communication with each other. Moreover, while the computing system has been used to describe one embodiment of a subsystem module 514, it is recognized that the user or customer systems may be implemented as computing systems as well.


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


It is recognized that the term “remote” may include systems, data, objects, devices, components, or modules not stored locally, that are not accessible via the local bus. Thus, remote data may include a system that is physically stored in the same room and connected to the computing system via a network. In other situations, a remote device may also be located in a separate geographic area, such as, for example, in a different location, country, and so forth.


Visual Indicators Via Website or Browser Plug-in


In some embodiments, a processing agent of a second device (such as a mobile phone with an integrated camera) may identify content items on a webpage displayed by a first device based on visual indicators appearing on the webpage, as captured in an image or video of the first device's display screen taken by the second device's camera. The first device may include a software module, such as a webpage plug-in. The website server can add the webpage plug-in to enable the site to include visual indicators for the processing agent.


In some embodiments, the software module can include a browser plug-in or script that adds visual indicators to the webpage. The browser plug-in, such as a web browser extension, may be installed by a user of the browser and can collect information on a website, such as HTML code, and determine content items of the webpage. The extension can scan the webpage data to determine relevant data and enhance the webpage by adding visual indicators. For example, the plug-in can analyze the HTML code of the webpage to identify types of content items in the webpage, such as images, price information, interactive user interface controls, text, and/or the like. The plug-in can add visual indicators on the webpage to be displayed on the first device for purposes of enabling an augmented reality view of the webpage by the second device.
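The plug-in's analysis of the webpage code to identify content-item types could be sketched as follows. This is a minimal illustration using Python's standard HTML parser; the content-type names and the price pattern are illustrative assumptions, and a real browser extension would operate on the live DOM rather than raw HTML text.

```python
# Hedged sketch: scan a page's HTML to decide which content items
# (here, images and prices) should receive visual indicators.
import re
from html.parser import HTMLParser

class IndicatorPlanner(HTMLParser):
    """Collect (content_type, detail) pairs for indicator placement."""
    def __init__(self):
        super().__init__()
        self.indicators = []

    def handle_starttag(self, tag, attrs):
        # Image elements get an "image" indicator.
        if tag == "img":
            src = dict(attrs).get("src", "")
            self.indicators.append(("image", src))

    def handle_data(self, data):
        # Dollar amounts in text get a "price" indicator (illustrative pattern).
        for price in re.findall(r"\$\d[\d,]*(?:\.\d{2})?", data):
            self.indicators.append(("price", price))

def plan_indicators(html):
    planner = IndicatorPlanner()
    planner.feed(html)
    return planner.indicators
```

Each planned entry would then drive the insertion of a visual indicator near the corresponding element on the displayed page.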


In some embodiments, the software module can include both a website plug-in and a browser plug-in. The website plug-in can add the visual indicators directly into the HTML code of the website. The browser plug-in can assess the HTML code and add additional indicators into the website. For example, the website plug-in may add indicators regardless of the browser type. However, the browser plug-in can adapt or add additional indicators based on the browser type or display type, such as a browser for a mobile phone or a laptop, browsers from different companies, different resolutions on the user interface, and/or the like.


Method of a Processing Agent



FIG. 6 illustrates an example method implemented by a processing agent of a second device according to some embodiments. Some embodiments are described as using indicators, but it is understood that the embodiments can apply to anchors, and/or vice versa, where applicable.


In some embodiments, the visual indicators or graphical indicators added within a webpage on a first device to be recognized by a processing agent or bot operating on a different device (e.g., a device that analyzes an image of the webpage as displayed by the first device) may be similar in some respects to the anchors discussed above. However, in some embodiments, such visual indicators or graphical indicators added within the displayed first page on the device that displays the first page may indicate more information to the processing agent than which portion of a page is currently displayed or where to spatially present augmented reality content by a second device. For example, the graphical indicators added to the page display by a plug-in or other code executed by the browser may each be associated with mapping information or association information that associates each of a number of unique graphical symbols, images or grouping of pixels with different content types, types or information, or other data or metadata regarding a nearby piece of content appearing on the page. For example, a green triangle may mean a product name appears nearby on the page, a red circle inside an orange square may mean a product's UPC code is nearby, a pink rectangle may mean a flight number is nearby, etc.
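The mapping or association information described above could be represented as a simple lookup from recognized symbol to nearby content type. The symbol names mirror the examples in the text; the representation itself is an illustrative assumption.

```python
# Sketch of association information: each unique graphical symbol is
# associated with the type of content appearing nearby on the page.
SYMBOL_MAP = {
    "green_triangle": "product_name",
    "red_circle_in_orange_square": "upc_code",
    "pink_rectangle": "flight_number",
}

def content_type_near(symbol):
    """Return the content type a recognized symbol signals, if known."""
    return SYMBOL_MAP.get(symbol)
```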


In some embodiments, a user can open the webpage on the first device, such as a desktop computer or a laptop computer. As an example, the webpage may present a product, such as a television, for sale by an operator of the webpage. The plug-in can add visual indicators near certain identified information on the webpage, such as one visual indicator near a name of the television and a second visual indicator near the price of the television as displayed by a browser on the first device. The user can take a picture or video from a second device, such as a mobile phone, of the user interface of the first device showing the television and the price.


In some embodiments, the processing agent on the second device can analyze the image or video to identify the visual indicators for the television and the price being displayed on the user interface of the first device. At block 602, the processing agent can capture an image or video of a user interface displayed by a first device. For example, the processing agent can be installed on a mobile device, and the camera of the mobile device captures an image of a user interface displayed on a user's laptop screen. At block 604, the processing agent can identify one or more visual indicators within the image or video of the user interface captured at block 602.


At block 606, the processing agent can determine or identify one or more of the content items displayed within the user interface displaying the webpage based at least in part on the identified visual indicator(s). In some embodiments, the visual indicator itself may be uniquely associated with the product or item, while in other embodiments the visual indicator may signify to the processing agent that a product identifier or name appears next to or near the visual indicator on the webpage (such that the processing agent may extract the product name or identifier from the image data captured by the camera in proximity of the visual indicator). The processing agent can display additional offers, help tips, discount coupons, and/or the like on the second device near the relevant content item, such as an overlay that appears to be part of the actual scene captured by the camera (e.g., using augmented reality techniques to apply realistic perspective in 3D virtual space). For example, the processing agent can display an offer for credit that would provide larger discounts if used for the purchase of the television.
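Extracting a content item from the captured image data in proximity of an identified visual indicator, as described above, could be sketched as follows. The crop geometry and the `ocr` callable are illustrative assumptions; a real processing agent would run an OCR engine over the cropped pixel region.

```python
# Hypothetical sketch: crop a region just to the right of a recognized
# visual indicator in the captured image, and pass it to an OCR routine
# to extract the nearby content item (e.g., a product name).

def extract_near_indicator(indicator_xy, image_size, ocr, margin=(200, 40)):
    """Return the text OCR'd from a region adjacent to the indicator.

    `indicator_xy` is the indicator's pixel position; `margin` is the
    assumed (width, height) of the region to examine to its right.
    """
    x, y = indicator_xy
    width, height = image_size
    left = min(x + 10, width)
    right = min(x + 10 + margin[0], width)
    top = max(y - margin[1] // 2, 0)
    bottom = min(y + margin[1] // 2, height)
    region = (left, top, right, bottom)
    return ocr(region)
```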


In some embodiments, the plug-in can determine the locations of the content items on the webpage. The plug-in can determine that an image is placed on the top right corner of the webpage, and the price is located underneath the image based on an analysis of the HTML code. The plug-in can identify the order in which certain content items are displayed, such as the image first, and then text indicating a price. The plug-in can analyze the code to identify particular locations on the page, such as a style attribute in the HTML code that center aligns a JPEG on the screen.


In some embodiments, based on locations of certain content items on a webpage, the plug-in can add anchors or visual indicators that can be used by the processing agent to identify the content items. For example, the plug-in can identify that the top of the page includes a picture of a house and a price underneath the picture, and that the bottom of the page includes a mortgage loan offer, based on an analysis of the underlying HTML code and/or other page content. The plug-in can overlay a first visual indicator near the picture of a house, a second visual indicator near the price, and a third visual indicator near the mortgage loan offer at the bottom of the page. A user can be viewing the top of the page on a desktop computer, such that the bottom of the page is not displayed. A user can, with the user's mobile phone, take a picture or a video stream of the top of the page displayed on the desktop computer. A processing agent on a user's mobile phone can identify the locations of the house picture and the price based on the identification of the first and second visual indicators in the image or video. In some embodiments, the first and/or second visual indicators can also indicate that a mortgage loan application is at the bottom of the page, as discussed further herein.


In some embodiments, a visual indicator can be overlaid on top of the webpage on a first device by the plug-in installed on the first device. The processing agent on a second device can identify the visual indicator based on an image capturing the webpage displayed on the first device. The visual indicator can include a pixel and/or a group of pixels that can be mapped to characteristics of content items. For example, a particular group of pixels can be mapped to a price, a price range, a car, an application, an offer, and/or the like. A group of pixels can be mapped to a type of content item, such as a house and/or a mortgage loan. A group of pixels can indicate one or more characteristics of the content items. For example, a group of pixels can indicate a single characteristic, such as a price range. A group of pixels can also indicate a group of characteristics, such as a price range for a house and a mortgage loan. The visual indicator can include an icon, such as a group of pixels creating a dollar sign in a certain shape, color or configuration.


In some embodiments, the visual indicators can be placed near or in a certain proximity or relation to the associated content item by the plug-in. The visual indicators can be placed closer to the content item of interest than other content items. The visual indicator can indicate a type of content item. For example, a first visual indicator can indicate an image and a second visual indicator can indicate a price. Then, the processing agent can correlate the nearest image with the first visual indicator and the nearest price with the second visual indicator.


In some embodiments, the visual indicator can be on the left, right, top, and/or bottom of the content item according to a predefined rule or template known by both the plug-in and an associated processing agent on a second device (e.g., the user's mobile phone). In some embodiments, the visual indicator can be embedded in a style of the content item. For example, if the content item is part of a list (such as a bullet point list), the visual indicator can be placed on top of or in place of the bullet point. In some embodiments, the visual indicator can be overlaid on the content item, and/or overlaid on the frame of an image.


In some embodiments, visual indicators can be placed subject to certain constraints of the webpage. For example, the plug-in can place only a single visual indicator at each vertical (Y-axis) position. As such, the processing agent can look for visual indicators along the Y-axis (such as on the far left of the display screen).
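The single-indicator-per-vertical-position constraint described above means the processing agent need only scan one column of the captured image. A minimal sketch, in which pixel colors are represented by illustrative string labels:

```python
# Sketch: because at most one indicator is placed at each Y-axis
# position, scanning the far-left pixel column suffices to find the
# rows where indicators are located.

def find_indicator_rows(left_column_pixels, indicator_colors):
    """Return the row indices whose far-left pixel matches a known
    indicator color."""
    return [y for y, color in enumerate(left_column_pixels)
            if color in indicator_colors]
```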


In some embodiments, the visual indicators can be integrated into the webpage in a manner that may not be readily apparent to the consumer viewing the user interface. For example, the visual indicators can be of different shades of colors. A bullet point may be a black circle, and the visual indicators can include different shades of black and grey. In other embodiments, the visual indicators may be apparent to the user as a design feature to encourage the user to capture an image of the page with a camera (e.g., the user may know from the appearance of the visual indicators that the page enables an augmented reality experience via inclusion of the graphical indicators).


In some embodiments, the plug-in and/or the processing agent can perform analysis on the content items on a page. For example, the plug-in can identify an image on a webpage. The plug-in can perform image recognition to determine an indication of a likely object in an image, such as a car or a house. The plug-in can perform optical character recognition (OCR) or transcribe audio to text to identify content items. For example, an image can contain text that the plug-in can extract and analyze. The plug-in can transcribe audio of a video stream to categorize the content item, such as a video explaining how to shop for mortgage loans.


In some embodiments, the visual indicator can be an entire background or framing of a webpage or other interface displayed on the user display. For example, a background image of a webpage that exceeds the viewable area of the screen can include a diagonal line from the top left to the bottom right of the webpage. Relative placement of the diagonal line to the content items can be used for placement of other visual indicators. The position of the diagonal line on the screen can indicate the position of the entire webpage currently being displayed. For example, if the processing agent identifies that the top of the current view of the webpage includes the diagonal background line starting with a top horizontal position in the middle of the display area, that may indicate that the top of the currently displayed portion of the webpage is the midpoint of the page as a whole (such as because half of the page has already been scrolled through).
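The diagonal-line example above implies a simple proportional relationship: if the background line runs from the page's top-left to its bottom-right, the horizontal position where the line crosses the top of the visible area indicates how far the page has been scrolled. A sketch under that assumption:

```python
# Sketch: recover the scroll position of the webpage from the diagonal
# background line. Assumes the line runs from the full page's top-left
# corner to its bottom-right corner, so the line's horizontal position
# at the top of the viewport is proportional to the scroll offset.

def scroll_fraction(line_x_at_viewport_top, page_width):
    """Fraction of the page already scrolled above the viewport
    (0.0 = top of page, approaching 1.0 = bottom)."""
    return line_x_at_viewport_top / page_width
```

Consistent with the text's example, a line starting at the horizontal midpoint of the display indicates that half of the page has been scrolled through.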


Advantageously, the plug-in and/or the processing agent can identify content items and determine an intent of the user. At block 608, the processing agent can determine an intent of the user based on the visual indicator(s) and/or the determined content item(s). For example, the plug-in can identify that the page includes an image of a house, a price for the house, and a mortgage loan application, and determine that the user is shopping for the house and may need a mortgage loan.


In some embodiments, the processing agent can identify the content items based on a visual indicator created by the plug-in, and can identify the intent of the user. The processing agent can determine that the user is looking for a mortgage loan, assess a user's credit history, and display offers for credit on the user's second device.


At block 610, the processing agent can display, on the second device, supplemental information associated with the intent of the user. The processing agent can overlay offers, such as credit offers, over the second device's display of the camera's capture of the website displayed on the first device. In some embodiments, the visual indicator can indicate an intent of a consumer, such as a red circle indicating that the consumer is looking to purchase a car. The object of the intent (such as the specific car to be purchased) may be determined from a graphical indicator and/or associated content of the page near a graphical indicator (such as a graphical indicator that signifies to the processing agent that a make and model of a car is indicated next to the graphical indicator on the page, such that the processing agent extracts the nearby image data of the captured page, applies OCR, and extracts the make/model of the car using predefined rules that indicate typical text formatting of car make information).


In some embodiments, the HTML code of the website can be programmed to work with the plug-in. For example, the website HTML code can include plug-in references for the plug-in to recognize when placing visual indicators. For example, a developer of the page may include HTML tags or other code within the page at points where particular visual indicators should be added dynamically at the time of displaying the page by the browser or an associated browser plug-in.


In some embodiments, the plug-in can be an extension for a web browser. The plug-in can access HTML code of a website loading in the web browser. The plug-in can analyze the HTML or other code or markup language to dynamically and automatically place the visual indicators within or on top of the webpage displayed via the web browser. Advantageously, the plug-in and/or the processing agent may not have any prior knowledge of the webpage, but can dynamically add visual indicators on top of the webpage for the processing agent to display help service to a consumer, offers, and/or the like on a second device.


In some embodiments, the plug-in can modify the HTML code. For example, the plug-in can add the visual indicators directly into the website's HTML code and/or on top of the displayed webpage dynamically at the time of display. The web browser can load the website's HTML code with the HTML code for the visual indicator on the first device.


In some embodiments, dynamic and automatic placement of visual indicators on websites provides the technical advantage of not having to share control of, or modify, the webpage code. For example, an airline website may not want to program plug-in references into the website's HTML for the plug-in to add visual indicators, or to add additional software modules that specifically communicate with the plug-in. The plug-in itself (or the processing agent, or a combination thereof) can identify the content items on the webpage based on an analysis of the page, and automatically place visual indicators onto the displayed page.


In some embodiments, dynamic and automatic placement of visual indicators on websites provides the technical advantage of not having to share additional information between the servers of the webpage and the plug-in and/or processing agent system. For example, the plug-in on the client side can analyze the HTML code of an airline's website without having to share additional information with the airline website's server. The visual indicators can be placed on top of the airline website via the web browser. Advantageously, when the processing agent determines and displays offers on the second device's user interface that are not originally on the webpage, the processing agent may be identifying characteristics of the consumer, such as by accessing a consumer profile on a database that the airline website server does not have access to. The consumer profile that the processing agent has access to can include sensitive information, such as an address, credit information, social security information, transaction data, and/or the like. Thus, the dynamic and automatic placement of the visual indicators improves data security and protects against identity theft by performing the analysis entirely within the plug-in and/or the processing agent system, and displaying the advertisements or offers on top of the airline's website on the second device, without having to share the sensitive consumer data with the airline website server (or otherwise sending any sensitive or personal information to or from the first client device that displays the webpage).


Certain offers may be preferable for a consumer with a certain credit score range. Thus, even sharing the offer with the airline web server could allow a consumer's credit score range to be reverse engineered. Advantageously, displaying the offers only on the second device further ensures data security and privacy. The processing agent and/or the plug-in can request authentication of the consumer, such as based on a consumer's input of authentication data, in order to protect sensitive information of the user. In some embodiments, the processing agent can provide more customized offers to an authenticated consumer, whereas general offers are displayed to a user who has not been authenticated.


Moreover, the airline website does not have to share its own consumer data with the plug-in and/or the processing agent system. The airline website may have its own database of information on the consumer. The plug-in and/or the processing agent system can rely on its own internal information database (or information accessible via a secure connection to a related server or network-accessible data source) to make its determinations on offers to display.


In some embodiments, the extension can communicate with the processing agent. The processing agent can indicate to the plug-in that the processing agent has identified a particular visual indicator. The plug-in can send contextual information on the content item associated with the visual indicator, information on the webpage, user behavior, and/or the like. For example, the plug-in can send page history, view timeline, click stream, and/or the like. The plug-in can send data on content items that may not be displayed on the second device, but that the plug-in knows is on the webpage (such as a content item at the bottom of the webpage). This communication may be via a server that is in communication with both the first and second device over a network, such as the Internet, in some embodiments.


In some embodiments, the browser plug-in may not communicate information to and/or from the processing agent. The processing agent can map characteristics of the content item based on the visual indicator. For example, the processing agent on the second device can map a group of pixels it receives to a scenario in which the consumer is looking to purchase a house and may be in need of a mortgage loan. The processing agent can access credit information on the consumer and display an offer for a mortgage loan next to an image of the house on the second device. The processing agent can identify optimal financial services and/or products, discount codes, helpful tips, and/or the like for the user's intent. For example, the processing agent can identify a credit card that would provide additional bonuses for the purchase, such as a merchant specific bonus or a bonus for the purchase of the content item type. Advantageously, the processing agent may have additional data on the consumer that is accessible to the processing agent but not to the website's server, and thus, the processing agent may be able to provide more targeted or customized offers to the user based on the accessible data. Thus, the consumer is not limited to the website server's offers.
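Once the processing agent has decoded a visual indicator into a short code, mapping that code to an intent and then to an offer reduces to a lookup against data only the agent can access. The codes, intents, and offer strings below are invented placeholders.

```python
from typing import Optional

# Hypothetical decode table: a matched group of pixels (the visual
# indicator) yields a short code, which maps to a consumer intent.
INTENT_MAP = {"IND-HOUSE": "purchase_home", "IND-TV": "purchase_electronics"}

# Profile data the processing agent can access but the website cannot.
CONSUMER_PROFILE = {"credit_score": 715}

def offer_for_indicator(code: str) -> Optional[str]:
    """Map an indicator code to an offer, consulting the locally held
    consumer profile; the website server sees none of this data."""
    intent = INTENT_MAP.get(code)
    if intent == "purchase_home" and CONSUMER_PROFILE["credit_score"] >= 620:
        return "Pre-qualified 30-year mortgage offer"
    if intent == "purchase_electronics":
        return "Card with bonus cash back on electronics"
    return None
```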


In some embodiments, the processing agent can display offers on the second device, overlaid on top of the screen of the first device captured by the camera on the second device. The user can select an offer, such as an offer to apply for a loan. The user can be routed to an offer for a loan on the second device. In some embodiments, the processing agent can communicate with the plug-in and/or the website browser to launch a loan application webpage on the first device, such as on a new tab of the website browser.
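The selection flow above can be sketched as a simple router on the second device. The message shapes and URL are illustrative; in practice the "open a new tab" action would be relayed through the plug-in and web browser on the first device.

```python
def handle_offer_selection(selection: str) -> dict:
    """Route a selected overlay offer: either continue on the second
    device, or ask the plug-in to open a page on the first device."""
    if selection == "loan_offer":
        # Hypothetical instruction for the plug-in to launch the loan
        # application in a new browser tab on the first device.
        return {"action": "open_tab_on_first_device",
                "url": "https://lender.example/apply"}
    # Default: keep the interaction within the second device's overlay.
    return {"action": "show_on_second_device", "view": selection}
```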


In some embodiments, the plug-in can gather information from multiple webpages loaded on the webpage browser. The plug-in can analyze HTML code and/or communicate with a software module on one or more of the webpages to identify content items and/or determine consumer intent. For example, the consumer can be viewing a television on a first webpage, a television wall mount on a second webpage, and a universal remote on a third webpage. The plug-in can aggregate the data regarding the content items, and identify an optimal financial service and/or product for the combined purchase, such as an optimal credit card to use across the three products. An optimal financial product and/or service can be determined based on a total value. For example, the total value can include a price, a shipping cost, a shipping time, consumer data protection, privacy, rewards, best credit card to use, warranty coverage for the product, warranty coverage based on a credit card, a combination thereof, and/or the like. The plug-in can generate visual indicators for each of the webpages. A visual indicator on the first webpage can be processed by the processing agent to identify the three products. In some embodiments, the visual indicator for the first product can initiate the processing agent to communicate with the plug-in to receive information related to the other two products.
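The cross-page aggregation and "total value" scoring can be sketched as below. The items, card terms, reward rates, and warranty valuation are all invented for illustration; a real scorer would weigh the additional factors the text lists (shipping, privacy, and so on).

```python
# Content items aggregated from three separate browser tabs.
ITEMS = [
    {"name": "television", "price": 899.0},
    {"name": "wall mount", "price": 120.0},
    {"name": "universal remote", "price": 45.0},
]

# Hypothetical card terms known to the plug-in / processing agent.
CARDS = [
    {"name": "CashBack Basic", "reward_rate": 0.015, "extended_warranty": False},
    {"name": "Electronics Plus", "reward_rate": 0.03, "extended_warranty": True},
]

def best_card(items, cards, warranty_value=25.0):
    """Pick the card maximizing a simple total-value score: rewards on
    the combined purchase plus an assumed flat value for extended
    warranty coverage."""
    total = sum(item["price"] for item in items)
    def value(card):
        v = total * card["reward_rate"]
        if card["extended_warranty"]:
            v += warranty_value
        return v
    return max(cards, key=value)["name"]
```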


ADDITIONAL EMBODIMENTS

Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The code modules may be stored on any type of non-transitory computer-readable medium or computer storage device, such as hard drives, solid state memory, optical disc, and/or the like. The systems and modules may also be transmitted as generated data signals (for example, as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (for example, as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps may be stored, persistently or otherwise, in any type of non-transitory computer storage such as, for example, volatile or non-volatile storage.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


As used herein, the terms “determine” or “determining” encompass a wide variety of actions. For example, “determining” may include calculating, computing, processing, deriving, generating, obtaining, looking up (for example, looking up in a table, a database or another data structure), ascertaining and the like via a hardware element without user intervention. Also, “determining” may include receiving (for example, receiving information), accessing (for example, accessing data in a memory) and the like via a hardware element without user intervention. Also, “determining” may include resolving, selecting, choosing, establishing, and the like via a hardware element without user intervention.


As used herein, the terms “provide” or “providing” encompass a wide variety of actions. For example, “providing” may include storing a value in a location of a storage device for subsequent retrieval, transmitting a value directly to the recipient via at least one wired or wireless communication medium, transmitting or storing a reference to a value, and the like. “Providing” may also include encoding, decoding, encrypting, decrypting, validating, verifying, and the like via a hardware element.


As used herein, the term “message” encompasses a wide variety of formats for communicating (for example, transmitting or receiving) information. A message may include a machine readable aggregation of information such as an XML document, fixed field message, comma separated message, or the like. A message may, in some implementations, include a signal utilized to transmit one or more representations of the information. While recited in the singular, it will be understood that a message may be composed, transmitted, stored, received, and so forth, in multiple parts.


As used herein “receive” or “receiving” may include specific algorithms for obtaining information. For example, receiving may include transmitting a request message for the information. The request message may be transmitted via a network as described above. The request message may be transmitted according to one or more well-defined, machine readable standards which are known in the art. The request message may be stateful in which case the requesting device and the device to which the request was transmitted maintain a state between requests. The request message may be a stateless request in which case the state information for the request is included within the messages exchanged between the requesting device and the device serving the request. One example of such state information includes a unique token that can be generated by either the requesting or serving device and included in messages exchanged. For example, the response message may include the state information to indicate what request message caused the serving device to transmit the response message.


As used herein “generate” or “generating” may include specific algorithms for creating information based on or using other input information. Generating may include retrieving the input information, such as from memory or as input parameters provided to the hardware performing the generating. Once obtained, the generating may include combining the input information. The combination may be performed through specific circuitry configured to provide an output indicating the result of the generating. The combination may be dynamically performed such as through dynamic selection of execution paths based on, for example, the input information, device operational characteristics (for example, hardware resources available, power level, power source, memory levels, network connectivity, bandwidth, and the like). Generating may also include storing the generated information in a memory location. The memory location may be identified as part of the request message that initiates the generating. In some implementations, the generating may return location information identifying where the generated information can be accessed. The location information may include a memory location, network location, file system location, or the like.


As used herein, “activate” or “activating” may refer to causing or triggering a mechanical, electronic, or electro-mechanical state change to a device. Activation of a device may cause the device, or a feature associated therewith, to change from a first state to a second state. In some implementations, activation may include changing a characteristic from a first state to a second state such as, for example, changing the viewing state of a lens of stereoscopic viewing glasses. Activating may include generating a control message indicating the desired state change and providing the control message to the device to cause the device to change state.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


All of the methods and processes described above may be embodied in, and partially or fully automated via, software code modules executed by one or more general purpose computers. For example, the methods described herein may be performed by the computing system and/or any other suitable computing device. The methods may be executed on the computing devices in response to execution of software instructions or other executable code read from a tangible computer readable medium. A tangible computer readable medium is a data storage device that can store data that is readable by a computer system. Examples of computer readable mediums include read-only memory, random-access memory, other volatile or non-volatile memory devices, CD-ROMs, magnetic tape, flash drives, and optical data storage devices.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems and methods can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the systems and methods should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the systems and methods with which that terminology is associated.

Claims
  • 1. A computer-implemented method comprising: capturing, by a camera of a first computing device, an image of a display screen of a second computing device, wherein the display screen displays a user interface presented by the second computing device, wherein the second computing device is a different device than the first computing device and is physically separate from the first computing device; identifying an anchor within the user interface as depicted in the image of the display screen; determining one or more content items expected to be present within the user interface based at least in part on the anchor, wherein the one or more content items are determined regardless of whether the one or more content items are visible in the image; identifying an intent of the user based at least in part on the determined one or more content items; determining supplemental content based at least in part on the intent of the user, wherein the supplemental content comprises a recommendation regarding (a) interacting with the user interface via the second computing device and (b) information for a user to provide to the user interface via the second computing device; and causing display on the first computing device of the supplemental content as an overlay on top of a visual representation of a current user interface of the second computing device captured in real-time by the first computing device.
  • 2. The computer-implemented method of claim 1, wherein the first computing device comprises a mobile phone.
  • 3. The computer-implemented method of claim 1, wherein the supplemental content includes a three-dimensional rendering that provides the recommendation to the user.
  • 4. The computer-implemented method of claim 3, wherein the three-dimensional rendering includes an animated avatar.
  • 5. The computer-implemented method of claim 1, wherein the intent corresponds to a portion of a website that is not currently displayed on the display screen of the second computing device, and wherein the supplemental content provides information directing the user to the portion of the website not currently displayed.
  • 6. The computer-implemented method of claim 5, wherein the supplemental content provides information by pointing in a direction for the user to scroll on the website.
  • 7. The computer-implemented method of claim 1, wherein the intent corresponds to a portion of a website that is currently displayed on the display screen of the second computing device, and wherein the supplemental content points to the portion of the website that corresponds to the display screen.
  • 8. The computer-implemented method of claim 1, wherein the intent of the user is identified further based on audio played by the second computing device and received by a microphone of the first computing device.
  • 9. A first computing device comprising: a memory; a camera configured to capture an image of a display screen of a second computing device, wherein the display screen displays a user interface presented by the second computing device, wherein the second computing device is a different device than the first computing device and is physically separate from the first computing device; and one or more processors configured by specific executable instructions to: identify an anchor within the user interface as depicted in the image of the display screen; determine one or more content items expected to be present within the user interface based at least in part on the anchor, wherein the one or more content items are determined regardless of whether the one or more content items are visible in the image; identify an intent of the user based at least in part on the determined one or more content items; determine supplemental content based at least in part on the intent of the user, wherein the supplemental content comprises a recommendation regarding (a) interacting with the user interface via the second computing device or (b) information for a user to provide to the user interface via the second computing device; and cause display on the first computing device of the supplemental content as an overlay on top of a visual representation of a current user interface of the second computing device captured in real-time by the first computing device.
  • 10. The first computing device of claim 9, wherein a processing agent causes performance of processes across a plurality of devices including implementation of the executable instructions by the one or more processors of the first computing device.
  • 11. The first computing device of claim 9, wherein the one or more processors are further configured to match an audio snippet emitted by the second computing device with a pre-stored audio fingerprint, wherein the audio snippet comprises at least one of: a chime or audio of a certain frequency beyond the human range of hearing.
  • 12. The first computing device of claim 9, wherein the one or more processors are further configured to determine a distance between the first computing device and the second computing device based on a time stamp for the first computing device receiving an audio anchor emitted by the second computing device.
  • 13. The first computing device of claim 9, wherein the one or more processors are further configured to determine a distance between the first computing device and the second computing device based on a size of at least a portion of the image captured by the first computing device.
  • 14. The first computing device of claim 9, wherein the one or more processors are further configured to determine a size of the supplemental content to be displayed based on a size of at least a portion of the image captured by the first computing device.
  • 15. A non-transitory computer storage medium storing computer-executable instructions that, when executed by a processor, cause the processor to perform operations comprising: identifying an anchor within a user interface as depicted in an image of a display screen of a second computing device, wherein the image is captured by a camera of a first computing device, wherein the second computing device is a different device than the first computing device and is physically separate from the first computing device; determining one or more content items expected to be present within the user interface based at least in part on the anchor, wherein the one or more content items are determined regardless of whether the one or more content items are visible in the image; determining supplemental content based on the determined one or more content items, wherein the supplemental content comprises a recommendation regarding interacting with the user interface via the second computing device, wherein the supplemental content provides information directing the user to a portion of a website not currently displayed by the second computing device; and transmitting the supplemental content to be displayed on the first computing device as an overlay on top of a visual representation of a current user interface of the second computing device captured in real-time by the first computing device.
  • 16. The non-transitory computer storage medium of claim 15, wherein the operations further comprise receiving a user selection of an option corresponding to the supplemental content, wherein in response to the user selection, the second computing device changes a display for the first computing device to correspond to the user selection.
  • 17. The non-transitory computer storage medium of claim 15, wherein the second computing device adjusts a size of the anchor based on a size of a display for the first computing device.
  • 18. The non-transitory computer storage medium of claim 15, wherein the supplemental content is specific to a particular website presented within a user interface of the second computing device, wherein the particular website is determined based at least in part on one or more of: the anchor or the one or more content items.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of, and claims priority from, U.S. application Ser. No. 16/797,697, filed on Feb. 21, 2020, which claims priority from provisional U.S. App. No. 62/809,469, filed on Feb. 22, 2019, and provisional U.S. App. No. 62/832,159, filed on Apr. 10, 2019, each of which is hereby incorporated by reference in its entirety.

7529698 Joao May 2009 B2
7530097 Casco-Arias et al. May 2009 B2
7536329 Goldberg et al. May 2009 B2
7536348 Shao et al. May 2009 B2
7536354 deGroeve et al. May 2009 B1
7542468 Begley et al. Jun 2009 B1
7542922 Bennett et al. Jun 2009 B2
7542993 Satterfield et al. Jun 2009 B2
7543739 Brown et al. Jun 2009 B2
7546271 Chmielewski et al. Jun 2009 B1
7548886 Kirkland et al. Jun 2009 B2
7552080 Willard et al. Jun 2009 B1
7552086 Rajasekar et al. Jun 2009 B1
7552089 Bruer et al. Jun 2009 B2
7552190 Freishtat et al. Jun 2009 B1
7552467 Lindsay Jun 2009 B2
7555459 Dhar et al. Jun 2009 B2
7558748 Ehring et al. Jul 2009 B2
7558777 Santos Jul 2009 B1
7558795 Malik et al. Jul 2009 B2
7559217 Bass Jul 2009 B2
7562184 Henmi et al. Jul 2009 B2
7562382 Hinton et al. Jul 2009 B2
7562814 Shao et al. Jul 2009 B1
7571138 Miri et al. Aug 2009 B2
7571322 Karoubi Aug 2009 B2
7571473 Boydstun et al. Aug 2009 B1
7575157 Barnhardt et al. Aug 2009 B2
7577665 Ramer et al. Aug 2009 B2
7577934 Anonsen et al. Aug 2009 B2
7580884 Cook Aug 2009 B2
7581112 Brown et al. Aug 2009 B2
7583682 Hopkins Sep 2009 B2
7584126 White Sep 2009 B1
7584146 Duhon Sep 2009 B1
7587368 Felsher Sep 2009 B2
7593891 Kornegay et al. Sep 2009 B2
7594019 Clapper Sep 2009 B2
7596716 Frost et al. Sep 2009 B2
7606752 Hazlehurst et al. Oct 2009 B2
7610216 May et al. Oct 2009 B1
7610229 Kornegay Oct 2009 B1
7613600 Krane Nov 2009 B2
7620596 Knudson et al. Nov 2009 B2
7620602 Jakstadt et al. Nov 2009 B2
7620653 Swartz Nov 2009 B1
7623844 Herrmann et al. Nov 2009 B2
7624433 Clark et al. Nov 2009 B1
7630903 Vaidyanathan Dec 2009 B1
7630932 Danaher et al. Dec 2009 B2
7630933 Peterson et al. Dec 2009 B2
7631803 Peyret et al. Dec 2009 B2
7634651 Gerde et al. Dec 2009 B1
7634737 Beringer et al. Dec 2009 B2
7636686 Pierdinock et al. Dec 2009 B2
7640200 Gardner et al. Dec 2009 B2
7640209 Brooks et al. Dec 2009 B1
7644023 Kumar et al. Jan 2010 B2
7644035 Biffle et al. Jan 2010 B1
7644285 Murray et al. Jan 2010 B1
7647274 Peterson et al. Jan 2010 B2
7647344 Skurtovich, Jr. et al. Jan 2010 B2
7653592 Flaxman et al. Jan 2010 B1
7653600 Gustin Jan 2010 B2
7653613 DeGraaff et al. Jan 2010 B1
7653688 Bittner Jan 2010 B2
7664725 Murray et al. Feb 2010 B2
7665657 Huh Feb 2010 B2
7672833 Blume et al. Mar 2010 B2
7672865 Kumar et al. Mar 2010 B2
7672879 Kumar et al. Mar 2010 B1
7672944 Holladay et al. Mar 2010 B1
7676410 Petralia Mar 2010 B2
7676463 Thompson et al. Mar 2010 B2
7680772 Kronberg Mar 2010 B2
7685209 Norton et al. Mar 2010 B1
7685525 Kumar et al. Mar 2010 B2
7686214 Shao et al. Mar 2010 B1
7688813 Shin et al. Mar 2010 B2
7689487 Britto et al. Mar 2010 B1
7689505 Kasower Mar 2010 B2
7689563 Jacobson Mar 2010 B1
7690032 Peirce Mar 2010 B1
7693787 Provinse Apr 2010 B2
7697520 Hopkins Apr 2010 B2
7698214 Lindgren Apr 2010 B1
7698217 Phillips et al. Apr 2010 B1
7702576 Fahner et al. Apr 2010 B2
7707117 Jimenez et al. Apr 2010 B1
7707122 Hull et al. Apr 2010 B2
7707271 Rudkin et al. Apr 2010 B2
7708190 Brandt et al. May 2010 B2
7711626 Nanjundamoorthy et al. May 2010 B2
7711635 Steele et al. May 2010 B2
7711707 Kelley May 2010 B2
7715832 Zhou May 2010 B2
7720705 Stein May 2010 B2
7720750 Brody May 2010 B2
7720846 Bayliss May 2010 B1
7725385 Royer et al. May 2010 B2
7729283 Ferguson et al. Jun 2010 B2
7729959 Wells et al. Jun 2010 B1
7729969 Smith, III et al. Jun 2010 B1
7730078 Schwabe et al. Jun 2010 B2
7734522 Johnson et al. Jun 2010 B2
7734541 Kumar et al. Jun 2010 B2
7734637 Greifeneder et al. Jun 2010 B2
7739193 Zimmer et al. Jun 2010 B2
7739707 Sie et al. Jun 2010 B2
7747520 Livermore et al. Jun 2010 B2
7747521 Serio Jun 2010 B2
7752179 Brown Jul 2010 B1
7752286 Anderson et al. Jul 2010 B2
7756789 Welker et al. Jul 2010 B2
7757944 Cline et al. Jul 2010 B2
7761373 Metz Jul 2010 B2
7761384 Madhogarhia Jul 2010 B2
7761569 Hopkins Jul 2010 B2
7765148 German et al. Jul 2010 B2
7765166 Beringer et al. Jul 2010 B2
7765279 Kaib et al. Jul 2010 B1
7765525 Davidson et al. Jul 2010 B1
7769697 Fieschi et al. Aug 2010 B2
7769998 Lynch et al. Aug 2010 B2
7770002 Weber Aug 2010 B2
7774257 Maggioncalda et al. Aug 2010 B2
7774270 MacCloskey Aug 2010 B1
7778868 Haugen et al. Aug 2010 B2
7783515 Kumar et al. Aug 2010 B1
7783749 Hopkins Aug 2010 B2
7787869 Rice et al. Aug 2010 B2
7788040 Haskell et al. Aug 2010 B2
7792715 Kasower Sep 2010 B1
7792725 Booraem et al. Sep 2010 B2
7792747 Chin Sep 2010 B2
7792903 Fischer et al. Sep 2010 B2
7793835 Coggeshall et al. Sep 2010 B1
7797224 Barone et al. Sep 2010 B2
7797252 Rosskamm et al. Sep 2010 B2
7797644 Bhojan Sep 2010 B1
7797734 Babi et al. Sep 2010 B2
7801807 DeFrancesco et al. Sep 2010 B2
7801811 Merrell et al. Sep 2010 B1
7801828 Candella et al. Sep 2010 B2
7801896 Szabo Sep 2010 B2
7801956 Cumberbatch et al. Sep 2010 B1
7802104 Dickinson Sep 2010 B2
7805348 Nanjundamoorthy et al. Sep 2010 B2
7805362 Merrell et al. Sep 2010 B1
7805439 Elliott et al. Sep 2010 B2
7809398 Pearson Oct 2010 B2
7809624 Smith, III et al. Oct 2010 B1
7809797 Cooley et al. Oct 2010 B2
7810036 Bales et al. Oct 2010 B2
7814002 DeFrancesco et al. Oct 2010 B2
7814005 Imrey et al. Oct 2010 B2
7814431 Quinn et al. Oct 2010 B1
7818228 Coulter Oct 2010 B1
7818229 Imrey et al. Oct 2010 B2
7818382 Sommerer Oct 2010 B2
7822624 Erdmann et al. Oct 2010 B2
7822667 Smith, III et al. Oct 2010 B1
7827108 Perlman et al. Nov 2010 B2
7827115 Weller et al. Nov 2010 B2
7831609 Alexander Nov 2010 B1
7832006 Chen et al. Nov 2010 B2
7835983 Lefner et al. Nov 2010 B2
7835990 Coleman Nov 2010 B2
7840484 Haggerty et al. Nov 2010 B2
7840597 Showalter et al. Nov 2010 B2
7840674 Sterling Nov 2010 B1
7841004 Balducci et al. Nov 2010 B1
7841008 Cole et al. Nov 2010 B1
7844520 Franklin Nov 2010 B1
7844604 Baio et al. Nov 2010 B2
7848972 Sharma Dec 2010 B1
7848978 Imrey et al. Dec 2010 B2
7849014 Erikson Dec 2010 B2
7849397 Ahmed Dec 2010 B1
7853493 DeBie et al. Dec 2010 B2
7853522 Chin Dec 2010 B2
7856203 Lipovski Dec 2010 B2
7856376 Storey Dec 2010 B2
7856386 Hazlehurst et al. Dec 2010 B2
7856453 Malik et al. Dec 2010 B2
7860769 Benson Dec 2010 B2
7860790 Monk Dec 2010 B2
7865412 Weiss et al. Jan 2011 B1
7866548 Reed et al. Jan 2011 B2
7870066 Lin et al. Jan 2011 B2
7870068 Chin Jan 2011 B2
7870078 Clark et al. Jan 2011 B2
7870485 Seliutin et al. Jan 2011 B2
7870491 Henderson et al. Jan 2011 B1
7873563 Barone et al. Jan 2011 B2
7873573 Realini Jan 2011 B2
7873677 Messing et al. Jan 2011 B2
7877304 Coulter Jan 2011 B1
7877402 Weiss et al. Jan 2011 B1
7877784 Chow et al. Jan 2011 B2
7880728 de los Reyes et al. Feb 2011 B2
7890403 Smith Feb 2011 B1
7895227 Henderson Feb 2011 B1
7899750 Klieman et al. Mar 2011 B1
7899757 Talan et al. Mar 2011 B1
7904447 Russell et al. Mar 2011 B1
7904899 Robalewski et al. Mar 2011 B2
7908242 Achanta Mar 2011 B1
7909246 Hogg et al. Mar 2011 B2
7911673 Yap Mar 2011 B1
7912778 Nanjundamoorthy Mar 2011 B2
7912842 Bayliss et al. Mar 2011 B1
7912865 Akerman et al. Mar 2011 B2
7913173 Hebard et al. Mar 2011 B2
7917412 Wang et al. Mar 2011 B1
7917754 Harrison et al. Mar 2011 B1
7925285 Indirabhai Apr 2011 B2
7925582 Kornegay et al. Apr 2011 B1
7925982 Parker Apr 2011 B2
7930239 Pierdinock et al. Apr 2011 B2
7930285 Abraham et al. Apr 2011 B2
7930302 Bandaru et al. Apr 2011 B2
7930411 Hayward Apr 2011 B1
7933834 Kumar et al. Apr 2011 B2
7937325 Kumar et al. May 2011 B2
7941560 Friesen et al. May 2011 B1
7953213 Babi et al. May 2011 B2
7954698 Pliha Jun 2011 B1
7958046 Doerner et al. Jun 2011 B2
7962361 Ramchandani et al. Jun 2011 B2
7965275 Lew Jun 2011 B1
7966192 Pagliari et al. Jun 2011 B2
7966325 Singh Jun 2011 B2
7970676 Feinstein Jun 2011 B2
7970679 Kasower Jun 2011 B2
7970698 Gupta et al. Jun 2011 B2
7970701 Lewis et al. Jun 2011 B2
7970796 Narayanan Jun 2011 B1
7971141 Quinn et al. Jun 2011 B1
7975299 Balducci et al. Jul 2011 B1
7979908 Millwee Jul 2011 B2
7983932 Kane Jul 2011 B2
7983979 Holland, IV Jul 2011 B2
7984436 Murray Jul 2011 B1
7987173 Alexander Jul 2011 B2
7987501 Miller et al. Jul 2011 B2
7990895 Ferguson et al. Aug 2011 B2
7991673 Kumar et al. Aug 2011 B2
7991688 Phelan et al. Aug 2011 B2
8001041 Hoadley et al. Aug 2011 B2
8001153 Skurtovich, Jr. et al. Aug 2011 B2
8001235 Russ et al. Aug 2011 B2
8001582 Hulten et al. Aug 2011 B2
8005755 Freishtat et al. Aug 2011 B2
8006261 Haberman et al. Aug 2011 B1
8010422 Lascelles et al. Aug 2011 B1
8010674 Fong Aug 2011 B2
8014756 Henderson Sep 2011 B1
8015083 Sterling et al. Sep 2011 B1
8015107 Kornegay et al. Sep 2011 B2
8019066 Efrati et al. Sep 2011 B1
8024660 Quinn et al. Sep 2011 B1
8027975 Gabriel et al. Sep 2011 B2
8032822 Artamonov et al. Oct 2011 B1
8032930 Hicks Oct 2011 B2
8032932 Speyer et al. Oct 2011 B2
8036941 Bennett et al. Oct 2011 B2
8037097 Guo et al. Oct 2011 B2
8037115 Scalora et al. Oct 2011 B1
8037176 Hopkins Oct 2011 B2
8041127 Whitelaw Oct 2011 B2
8051074 Eom et al. Nov 2011 B2
8055904 Cato et al. Nov 2011 B1
8060404 Storey Nov 2011 B2
8060424 Kasower Nov 2011 B2
8060438 Dhar et al. Nov 2011 B2
8060508 Gabriel et al. Nov 2011 B2
8060532 White et al. Nov 2011 B2
8060916 Bajaj et al. Nov 2011 B2
8065175 Lewis Nov 2011 B1
8065233 Lee et al. Nov 2011 B2
8065367 Stanley Nov 2011 B1
8069213 Bloch et al. Nov 2011 B2
8069407 Armandpour et al. Nov 2011 B1
8073785 Candella et al. Dec 2011 B1
8078453 Shaw Dec 2011 B2
8078516 Weiss et al. Dec 2011 B1
8078524 Crawford et al. Dec 2011 B2
8078527 Cerise et al. Dec 2011 B2
8078528 Vicente et al. Dec 2011 B1
8078881 Liu Dec 2011 B1
8078986 Rhyne et al. Dec 2011 B1
8086508 Dheer et al. Dec 2011 B2
8086525 Atwood et al. Dec 2011 B2
8090794 Kilat et al. Jan 2012 B1
8095443 DeBie Jan 2012 B2
8095458 Peterson et al. Jan 2012 B2
8095534 Alexander Jan 2012 B1
8095614 Hopkins Jan 2012 B2
8098239 Moore Jan 2012 B1
8099309 Bober Jan 2012 B1
8099341 Varghese Jan 2012 B2
8099356 Feinstein et al. Jan 2012 B2
8099376 Serrano-Morales et al. Jan 2012 B2
8103587 Kumar et al. Jan 2012 B2
8104671 Besecker et al. Jan 2012 B2
8104679 Brown Jan 2012 B2
8108301 Gupta et al. Jan 2012 B2
8117648 Slaton et al. Feb 2012 B2
8122133 Hopkins Feb 2012 B2
8126456 Lotter et al. Feb 2012 B2
8126820 Talan et al. Feb 2012 B1
8127982 Casey et al. Mar 2012 B1
8127986 Taylor et al. Mar 2012 B1
8130075 Hingole Mar 2012 B1
8131598 Goolkasian et al. Mar 2012 B2
8131685 Gedalius et al. Mar 2012 B1
8131777 McCullouch Mar 2012 B2
8131846 Hernacki et al. Mar 2012 B1
8140847 Wu Mar 2012 B1
8145189 Power et al. Mar 2012 B2
8145554 Kumar et al. Mar 2012 B2
8150161 Laaser et al. Apr 2012 B2
8151343 Wang et al. Apr 2012 B1
8151344 Channakeshava Apr 2012 B1
8155950 Bickerstaff Apr 2012 B1
8156175 Hopkins Apr 2012 B2
8160624 Kumar et al. Apr 2012 B2
8160960 Fei et al. Apr 2012 B1
8171471 Daly May 2012 B1
8175889 Girulat et al. May 2012 B1
8194956 Chandler Jun 2012 B2
8195549 Kasower Jun 2012 B2
8196113 Miller et al. Jun 2012 B2
8200966 Grinberg et al. Jun 2012 B2
8201257 Andres et al. Jun 2012 B1
8204812 Stewart et al. Jun 2012 B2
8209659 Mathew Jun 2012 B2
8219473 Gardner et al. Jul 2012 B2
8219771 Le Neel Jul 2012 B2
8224723 Bosch et al. Jul 2012 B2
8224747 Kumar et al. Jul 2012 B2
8224974 Flora et al. Jul 2012 B1
8225270 Frasher et al. Jul 2012 B2
8225288 Miller et al. Jul 2012 B2
8225383 Channakeshava et al. Jul 2012 B1
8225395 Atwood et al. Jul 2012 B2
8229850 Dilip et al. Jul 2012 B2
8229911 Bennett Jul 2012 B2
8234498 Britti et al. Jul 2012 B2
8244635 Freishtat et al. Aug 2012 B2
8244646 Johnston et al. Aug 2012 B2
8244848 Narayanan et al. Aug 2012 B1
8249965 Tumminaro Aug 2012 B2
8249968 Oldham et al. Aug 2012 B1
8255298 Nesladek Aug 2012 B1
8255868 Robalewski Aug 2012 B1
8255978 Dick Aug 2012 B2
8256013 Hernacki et al. Aug 2012 B1
8260649 Ramanujan et al. Sep 2012 B2
8260699 Smith et al. Sep 2012 B2
8260805 Venu et al. Sep 2012 B1
8261204 Huynh et al. Sep 2012 B1
8261334 Hazlehurst et al. Sep 2012 B2
8261974 Hull Sep 2012 B2
8266065 Dilip et al. Sep 2012 B2
8266515 Satyavolu Sep 2012 B2
8271362 Fasching Sep 2012 B2
8271393 Twining et al. Sep 2012 B2
8271650 Alexander Sep 2012 B2
8271894 Mayers Sep 2012 B1
8271899 Blackburn et al. Sep 2012 B1
8271906 Fong Sep 2012 B1
8271961 Chithambaram Sep 2012 B1
8275683 Wolfson et al. Sep 2012 B2
8280723 Laaser Oct 2012 B1
8280879 Alexander Oct 2012 B2
8281372 Vidal Oct 2012 B1
8285613 Coulter Oct 2012 B1
8285640 Scipioni Oct 2012 B2
8285641 Cataline et al. Oct 2012 B2
8285656 Chang et al. Oct 2012 B1
8290835 Homer et al. Oct 2012 B2
8290840 Kasower Oct 2012 B2
8290845 Leibon et al. Oct 2012 B2
8290856 Kasower Oct 2012 B1
8290941 Alexander Oct 2012 B2
8296206 Del Favero et al. Oct 2012 B1
8296229 Yellin et al. Oct 2012 B1
8306255 Degnan Nov 2012 B1
8306889 Leibon et al. Nov 2012 B2
8311792 Podgorny et al. Nov 2012 B1
8312033 McMillan Nov 2012 B1
8320944 Gibson et al. Nov 2012 B1
8321339 Imrey et al. Nov 2012 B2
8321413 Gabriel et al. Nov 2012 B2
8324080 Yang et al. Dec 2012 B2
8326725 Elwell et al. Dec 2012 B2
8327429 Speyer et al. Dec 2012 B2
8335741 Kornegay et al. Dec 2012 B2
8341545 Hebard Dec 2012 B2
8346226 Gibson et al. Jan 2013 B2
8346615 Connors et al. Jan 2013 B2
8347364 Babi et al. Jan 2013 B2
8352564 Campise et al. Jan 2013 B1
8353027 Dennis et al. Jan 2013 B2
8353029 Morgan et al. Jan 2013 B2
8355935 Hellman et al. Jan 2013 B2
8355967 Debie et al. Jan 2013 B2
8359278 Domenikos et al. Jan 2013 B2
8364662 Moyer et al. Jan 2013 B1
8364969 King Jan 2013 B2
8370340 Yu et al. Feb 2013 B1
8374885 Stibel et al. Feb 2013 B2
8374973 Herbrich et al. Feb 2013 B2
8375331 Mayers Feb 2013 B1
8380803 Stibel et al. Feb 2013 B1
8381120 Stibel et al. Feb 2013 B2
8386966 Attinasi et al. Feb 2013 B1
8392230 Stibel et al. Mar 2013 B2
8392969 Park et al. Mar 2013 B1
8396743 Alvin Mar 2013 B2
8396747 Bachenheimer Mar 2013 B2
8400970 Bajar et al. Mar 2013 B2
8401875 Fish et al. Mar 2013 B2
8402526 Ahn Mar 2013 B2
8407141 Mullen et al. Mar 2013 B2
8407194 Chaput et al. Mar 2013 B1
8412593 Song et al. Apr 2013 B1
8413239 Sutton et al. Apr 2013 B2
8417644 Ferguson et al. Apr 2013 B2
8423285 Paterson et al. Apr 2013 B2
8429073 Ferguson et al. Apr 2013 B2
8432275 Patel et al. Apr 2013 B2
8433654 Subbarao et al. Apr 2013 B2
8443202 White et al. May 2013 B2
8453068 Stibel et al. May 2013 B2
8453218 Lan et al. May 2013 B2
8456293 Trundle et al. Jun 2013 B1
8458062 Dutt et al. Jun 2013 B2
8463595 Rehling et al. Jun 2013 B1
8463939 Galvin Jun 2013 B1
8464939 Taylor et al. Jun 2013 B1
8468028 Stibel et al. Jun 2013 B2
8473318 Nielson et al. Jun 2013 B2
8478674 Kapczynski et al. Jul 2013 B1
8484186 Kapczynski et al. Jul 2013 B1
8489480 Kassir Jul 2013 B2
8490197 Herz Jul 2013 B2
8494973 Dignan et al. Jul 2013 B1
8498914 Hazelhurst Jul 2013 B2
8498944 Solomon Jul 2013 B2
8499348 Rubin Jul 2013 B1
8504470 Chirehdast Aug 2013 B1
8515828 Wolf et al. Aug 2013 B1
8515844 Kasower Aug 2013 B2
8527357 Ganesan Sep 2013 B1
8527596 Long et al. Sep 2013 B2
8533118 Weller et al. Sep 2013 B2
8539599 Gomez et al. Sep 2013 B2
8543498 Silbernagel et al. Sep 2013 B2
8544091 Stibel Sep 2013 B2
8548903 Becker Oct 2013 B2
8554584 Hargroder Oct 2013 B2
8555357 Gauvin Oct 2013 B1
8560161 Kator et al. Oct 2013 B1
8560436 Ingram et al. Oct 2013 B2
8560438 Hankey et al. Oct 2013 B2
8560444 Rosenblatt et al. Oct 2013 B2
8560447 Hinghole et al. Oct 2013 B1
8566187 Keld et al. Oct 2013 B2
8572083 Snell et al. Oct 2013 B1
8578036 Holfelder et al. Nov 2013 B1
8578496 Krishnappa Nov 2013 B1
8600768 Stibel et al. Dec 2013 B2
8600886 Ramavarjula et al. Dec 2013 B2
8601602 Zheng Dec 2013 B1
8606694 Campbell et al. Dec 2013 B2
8606869 Stibel et al. Dec 2013 B2
8626137 Devitt et al. Jan 2014 B1
8626637 Gooch et al. Jan 2014 B1
8630893 Stibel et al. Jan 2014 B2
8630938 Cheng et al. Jan 2014 B2
8639930 Stibel et al. Jan 2014 B2
8646051 Paden et al. Feb 2014 B2
8650189 Fertik et al. Feb 2014 B2
8660541 Beresniewicz et al. Feb 2014 B1
8660919 Kasower Feb 2014 B2
8671115 Skurtovich, Jr. et al. Mar 2014 B2
8676684 Newman et al. Mar 2014 B2
8688543 Dominquez Apr 2014 B2
8689001 Satish Apr 2014 B1
8694420 Oliai Apr 2014 B1
8705718 Baniak et al. Apr 2014 B2
8706599 Koenig et al. Apr 2014 B1
8706616 Flynn Apr 2014 B1
8712789 Stibel et al. Apr 2014 B2
8712907 Stibel et al. Apr 2014 B1
8713651 Stibel Apr 2014 B1
8725605 Plunkett May 2014 B1
8725613 Celka et al. May 2014 B1
8732004 Ramos et al. May 2014 B1
8732803 Stibel et al. May 2014 B2
8738449 Cupps et al. May 2014 B1
8738516 Dean et al. May 2014 B1
8745698 Ashfield et al. Jun 2014 B1
8751378 Dornhelm et al. Jun 2014 B2
8768914 Scriffignano et al. Jul 2014 B2
8781951 Lewis et al. Jul 2014 B2
8781953 Kasower Jul 2014 B2
8782217 Arone et al. Jul 2014 B1
8818888 Kapczynski et al. Aug 2014 B1
8819789 Orttung et al. Aug 2014 B2
8825544 Imrey et al. Sep 2014 B2
8856894 Dean et al. Oct 2014 B1
8856945 Carter et al. Oct 2014 B2
8860763 Privault et al. Oct 2014 B2
8868914 Teppler Oct 2014 B2
8882509 Nunamaker Nov 2014 B1
8930251 DeBie Jan 2015 B2
8930263 Mahacek et al. Jan 2015 B1
8938399 Herman Jan 2015 B1
8949981 Trollope et al. Feb 2015 B1
8954459 McMillan et al. Feb 2015 B1
8972400 Kapczynski et al. Mar 2015 B1
9002753 Anschutz et al. Apr 2015 B2
9058627 Wasser et al. Jun 2015 B1
9092616 Kumar et al. Jul 2015 B2
9106691 Burger et al. Aug 2015 B1
9111281 Stibel et al. Aug 2015 B2
9118614 Rogers et al. Aug 2015 B1
9147042 Haller et al. Sep 2015 B1
9218481 Belisario Oct 2015 B2
9183377 Sobel et al. Nov 2015 B1
9202200 Stibel et al. Dec 2015 B2
9225704 Johansson et al. Dec 2015 B1
9230283 Taylor et al. Jan 2016 B1
9256624 Skurtovich, Jr. et al. Feb 2016 B2
9256904 Haller et al. Feb 2016 B1
9324080 Shafron et al. Apr 2016 B2
9349145 Rozman et al. May 2016 B2
9400589 Wasser et al. Jul 2016 B1
9406085 Hunt, III et al. Aug 2016 B1
9418213 Roth et al. Aug 2016 B1
9443268 Kapczynski et al. Sep 2016 B1
9449346 Hockey et al. Sep 2016 B1
9477737 Charyk et al. Oct 2016 B1
9479471 Schoenrock Oct 2016 B2
9483606 Dean et al. Nov 2016 B1
9501583 Nordstrom et al. Nov 2016 B2
9536238 Garrett et al. Jan 2017 B2
9536263 Dean et al. Jan 2017 B1
9542553 Burger et al. Jan 2017 B1
9542682 Taylor et al. Jan 2017 B1
9569797 Rohn et al. Feb 2017 B1
9589266 Pourgallah et al. Mar 2017 B2
9595023 Hockey et al. Mar 2017 B1
9613382 Newstadt et al. Apr 2017 B1
9619751 Woon et al. Apr 2017 B2
9654541 Kapczynski et al. May 2017 B1
9665854 Burger et al. May 2017 B1
9697568 Hunt, III Jul 2017 B1
9704107 Baker, IV et al. Jul 2017 B1
9710523 Skurtovich, Jr. et al. Jul 2017 B2
9710852 Olson et al. Jul 2017 B1
9767513 Taylor et al. Sep 2017 B1
9824199 Kshirsagar et al. Nov 2017 B2
9830646 Wasser et al. Nov 2017 B1
9853959 Kapczynski et al. Dec 2017 B1
9870589 Arnold et al. Jan 2018 B1
9892457 Kapczynski Feb 2018 B1
9916621 Wasser et al. Mar 2018 B1
9972048 Dean et al. May 2018 B1
9990674 Taylor et al. Jun 2018 B1
10002075 O'Leary et al. Jun 2018 B1
10003591 Hockey et al. Jun 2018 B2
10025842 Charyk et al. Jul 2018 B1
10043214 Hunt, III Aug 2018 B1
10061936 Burger et al. Aug 2018 B1
10075446 McMillan et al. Sep 2018 B2
10102570 Kapczynski et al. Oct 2018 B1
10104059 Hockey et al. Oct 2018 B2
10176233 Dean et al. Jan 2019 B1
10187341 Schoenrock Jan 2019 B2
10235965 Horneff Mar 2019 B2
10255598 Dean et al. Apr 2019 B1
10262364 Taylor et al. Apr 2019 B2
10269065 Kapczynski et al. Apr 2019 B1
10277659 Kapczynski et al. Apr 2019 B1
D847840 Poschel et al. May 2019 S
D851126 Tauban Jun 2019 S
D851127 Tauban Jun 2019 S
D851128 Tauban Jun 2019 S
10319029 Hockey et al. Jun 2019 B1
10325314 Kapczynski et al. Jun 2019 B1
10366450 Mahacek et al. Jul 2019 B1
10482532 Kapczynski Nov 2019 B1
10523653 Hockey et al. Dec 2019 B2
10530761 Hockey et al. Jan 2020 B2
10614463 Hockey et al. Apr 2020 B1
10614519 Taylor et al. Apr 2020 B2
10621657 Kasower Apr 2020 B2
10628448 Charyk et al. Apr 2020 B1
10642999 Burger et al. May 2020 B2
10671749 Felice-Steele et al. Jun 2020 B2
10685398 Olson et al. Jun 2020 B1
10686773 Britti et al. Jun 2020 B2
10706453 Morin et al. Jul 2020 B1
10726491 Hockey et al. Jul 2020 B1
10798113 Muddu et al. Oct 2020 B2
10798197 Dean et al. Oct 2020 B2
10839446 Mupkala et al. Nov 2020 B1
10878499 Taylor et al. Dec 2020 B2
10880313 Manna et al. Dec 2020 B2
10891691 Courbage et al. Jan 2021 B2
10916220 Ngo Feb 2021 B2
10929925 Hunt, III Feb 2021 B1
10949428 Poirel et al. Mar 2021 B2
10963959 Wasser et al. Mar 2021 B2
11012491 Kapczynski et al. May 2021 B1
11025629 Chasman et al. Jun 2021 B2
11025638 Ford et al. Jun 2021 B2
11050767 Black et al. Jun 2021 B2
11087022 Burger et al. Aug 2021 B2
11113759 Kapczynski et al. Sep 2021 B1
11132742 Wasser et al. Sep 2021 B1
11157872 McMillan et al. Oct 2021 B2
11200620 Dean et al. Dec 2021 B2
11238656 Lin et al. Feb 2022 B1
11265324 Felice-Steele et al. Mar 2022 B2
11308551 Mahacek et al. Apr 2022 B1
11315179 Rehder et al. Apr 2022 B1
11356430 Kapczynski et al. Jun 2022 B1
11373109 Zoldi et al. Jun 2022 B2
11379916 Taylor et al. Jul 2022 B1
11399029 Manna et al. Jul 2022 B2
11425144 Bondugula et al. Aug 2022 B2
11436626 Lawrence et al. Sep 2022 B2
11461364 Charyk et al. Oct 2022 B1
11489834 Carroll et al. Nov 2022 B1
11514519 Hunt, III Nov 2022 B1
11580598 Rehder et al. Feb 2023 B1
20010014878 Mitra et al. Aug 2001 A1
20010029470 Schultz et al. Oct 2001 A1
20010029482 Tealdi et al. Oct 2001 A1
20010032181 Jakstadt et al. Oct 2001 A1
20010034631 Kiselik Oct 2001 A1
20010037204 Horn et al. Nov 2001 A1
20010037289 Mayr et al. Nov 2001 A1
20010039532 Coleman, Jr. et al. Nov 2001 A1
20010039563 Tian Nov 2001 A1
20010042785 Walker et al. Nov 2001 A1
20010044729 Pomerance Nov 2001 A1
20010044756 Watkins et al. Nov 2001 A1
20010044764 Arnold Nov 2001 A1
20010047332 Gonen-Friedman et al. Nov 2001 A1
20010049274 Degraeve Dec 2001 A1
20010053989 Keller et al. Dec 2001 A1
20020010616 Itzaki Jan 2002 A1
20020010635 Tokiwa Jan 2002 A1
20020013827 Edstrom et al. Jan 2002 A1
20020013899 Faul Jan 2002 A1
20020023108 Daswani et al. Feb 2002 A1
20020029192 Nakagawa et al. Mar 2002 A1
20020032635 Harris et al. Mar 2002 A1
20020033846 Balasubramanian et al. Mar 2002 A1
20020035480 Gordon et al. Mar 2002 A1
20020035520 Weiss Mar 2002 A1
20020045154 Wood et al. Apr 2002 A1
20020049624 Raveis, Jr. Apr 2002 A1
20020052841 Guthrie et al. May 2002 A1
20020055906 Katz et al. May 2002 A1
20020059139 Evans May 2002 A1
20020059201 Work May 2002 A1
20020062249 Iannacci May 2002 A1
20020069122 Yun et al. Jun 2002 A1
20020069182 Dwyer Jun 2002 A1
20020073017 Robertson Jun 2002 A1
20020077964 Brody et al. Jun 2002 A1
20020087460 Hornung Jul 2002 A1
20020091635 Dilip et al. Jul 2002 A1
20020099635 Guiragosian Jul 2002 A1
20020103933 Garon et al. Aug 2002 A1
20020111816 Lortscher et al. Aug 2002 A1
20020111890 Sloan et al. Aug 2002 A1
20020116247 Tucker et al. Aug 2002 A1
20020116331 Cataline et al. Aug 2002 A1
20020120537 Morea et al. Aug 2002 A1
20020120757 Sutherland et al. Aug 2002 A1
20020120846 Stewart et al. Aug 2002 A1
20020126449 Casebolt Sep 2002 A1
20020128917 Grounds Sep 2002 A1
20020128962 Kasower Sep 2002 A1
20020130894 Young et al. Sep 2002 A1
20020133365 Grey et al. Sep 2002 A1
20020133462 Shteyn Sep 2002 A1
20020133504 Vlahos et al. Sep 2002 A1
20020138409 Bass Sep 2002 A1
20020138470 Zhou Sep 2002 A1
20020143943 Lee et al. Oct 2002 A1
20020147801 Gullotta et al. Oct 2002 A1
20020149794 Yoshioka et al. Oct 2002 A1
20020152166 Dutta et al. Oct 2002 A1
20020156676 Ahrens et al. Oct 2002 A1
20020161664 Shaya et al. Oct 2002 A1
20020173994 Ferguson, III Nov 2002 A1
20020174010 Rice Nov 2002 A1
20020174016 Cuervo Nov 2002 A1
20020174048 Dheer et al. Nov 2002 A1
20020174061 Srinivasan et al. Nov 2002 A1
20020188511 Johnson et al. Dec 2002 A1
20020194120 Russell et al. Dec 2002 A1
20020194140 Makuck Dec 2002 A1
20020198800 Shamrakov Dec 2002 A1
20020198806 Blagg et al. Dec 2002 A1
20020198822 Munoz et al. Dec 2002 A1
20020198830 Randell et al. Dec 2002 A1
20030002671 Inchalik et al. Jan 2003 A1
20030004853 Ram et al. Jan 2003 A1
20030004855 Dutta et al. Jan 2003 A1
20030004922 Schmidt et al. Jan 2003 A1
20030007283 Ostwald et al. Jan 2003 A1
20030009411 Ram et al. Jan 2003 A1
20030009415 Lutnick et al. Jan 2003 A1
20030009418 Green et al. Jan 2003 A1
20030009426 Ruiz-Sanchez Jan 2003 A1
20030018578 Schultz Jan 2003 A1
20030023531 Fergusson Jan 2003 A1
20030028466 Jenson et al. Feb 2003 A1
20030028477 Stevenson et al. Feb 2003 A1
20030028529 Cheung Feb 2003 A1
20030036952 Panttaja et al. Feb 2003 A1
20030036995 Lazerson Feb 2003 A1
20030041019 Vagim, III et al. Feb 2003 A1
20030041031 Hedy Feb 2003 A1
20030046311 Baidya et al. Mar 2003 A1
20030048294 Arnold Mar 2003 A1
20030050929 Bookman et al. Mar 2003 A1
20030061104 Thomson et al. Mar 2003 A1
20030061155 Chin Mar 2003 A1
20030061163 Durfield Mar 2003 A1
20030069839 Whittington et al. Apr 2003 A1
20030069943 Bahrs et al. Apr 2003 A1
20030078897 Florance et al. Apr 2003 A1
20030078926 Uthe et al. Apr 2003 A1
20030088472 Offutt et al. May 2003 A1
20030090586 Jan et al. May 2003 A1
20030093289 Thornley et al. May 2003 A1
20030093311 Knowlson May 2003 A1
20030097342 Whittingtom May 2003 A1
20030097380 Mulhern et al. May 2003 A1
20030101111 Dang et al. May 2003 A1
20030101344 Wheeler et al. May 2003 A1
20030105646 Siepser Jun 2003 A1
20030105710 Barbara et al. Jun 2003 A1
20030105733 Boreham Jun 2003 A1
20030105742 Boreham et al. Jun 2003 A1
20030115133 Bian Jun 2003 A1
20030154162 Danaher et al. Aug 2003 A1
20030158960 Engberg Aug 2003 A1
20030163435 Payone Aug 2003 A1
20030163513 Schaeck et al. Aug 2003 A1
20030163733 Barriga-Caceres et al. Aug 2003 A1
20030171942 Gaito Sep 2003 A1
20030186200 Selix Oct 2003 A1
20030187768 Ryan et al. Oct 2003 A1
20030187837 Culliss Oct 2003 A1
20030191711 Jamison et al. Oct 2003 A1
20030191731 Stewart et al. Oct 2003 A1
20030195805 Storey Oct 2003 A1
20030195859 Lawrence Oct 2003 A1
20030200142 Hicks et al. Oct 2003 A1
20030204429 Botscheck et al. Oct 2003 A1
20030204752 Garrison Oct 2003 A1
20030208412 Hillestad et al. Nov 2003 A1
20030212745 Caughey Nov 2003 A1
20030212909 Chandrashekhar Nov 2003 A1
20030214775 Fukuta et al. Nov 2003 A1
20030219709 Olenick et al. Nov 2003 A1
20030220858 Lam et al. Nov 2003 A1
20030225742 Tenner et al. Dec 2003 A1
20030229504 Hollister Dec 2003 A1
20030229580 Gass et al. Dec 2003 A1
20030236701 Rowney et al. Dec 2003 A1
20040001565 Jones et al. Jan 2004 A1
20040006536 Kawashima et al. Jan 2004 A1
20040010458 Friedman Jan 2004 A1
20040015714 Abraham et al. Jan 2004 A1
20040015715 Brown Jan 2004 A1
20040019549 Gulbrandsen Jan 2004 A1
20040019799 Vering et al. Jan 2004 A1
20040024671 Freund Feb 2004 A1
20040024709 Yu et al. Feb 2004 A1
20040030574 DiCostanzo et al. Feb 2004 A1
20040030649 Nelson et al. Feb 2004 A1
20040039586 Garvey et al. Feb 2004 A1
20040044563 Stein Mar 2004 A1
20040044601 Kim et al. Mar 2004 A1
20040044628 Mathew et al. Mar 2004 A1
20040044673 Brady et al. Mar 2004 A1
20040045028 Harris Mar 2004 A1
20040046033 Kolodziej et al. Mar 2004 A1
20040059786 Caughey Mar 2004 A1
20040062213 Koss Apr 2004 A1
20040083159 Crosby et al. Apr 2004 A1
20040083230 Caughey Apr 2004 A1
20040088237 Moenickheim et al. May 2004 A1
20040088255 Zielke et al. May 2004 A1
20040093278 Burchetta et al. May 2004 A1
20040098418 Hein May 2004 A1
20040098546 Bashant et al. May 2004 A1
20040102197 Dietz May 2004 A1
20040107250 Marciano Jun 2004 A1
20040110119 Riconda et al. Jun 2004 A1
20040111359 Hudock Jun 2004 A1
20040117302 Weichert et al. Jun 2004 A1
20040122681 Ruvolo et al. Jun 2004 A1
20040122696 Beringer Jun 2004 A1
20040122697 Becerra et al. Jun 2004 A1
20040128150 Lundegren Jul 2004 A1
20040128156 Beringer et al. Jul 2004 A1
20040128215 Florance et al. Jul 2004 A1
20040133440 Carolan et al. Jul 2004 A1
20040133493 Ford et al. Jul 2004 A1
20040133509 McCoy et al. Jul 2004 A1
20040133513 McCoy et al. Jul 2004 A1
20040133514 Zielke et al. Jul 2004 A1
20040133515 McCoy et al. Jul 2004 A1
20040138992 DeFrancesco et al. Jul 2004 A1
20040138994 DeFrancesco et al. Jul 2004 A1
20040138997 DeFrancesco et al. Jul 2004 A1
20040141005 Banatwala et al. Jul 2004 A1
20040143546 Wood et al. Jul 2004 A1
20040143596 Sirkin Jul 2004 A1
20040148200 Hodges Jul 2004 A1
20040158723 Root Aug 2004 A1
20040159700 Khan et al. Aug 2004 A1
20040167793 Masuoka et al. Aug 2004 A1
20040172360 Mabrey et al. Sep 2004 A1
20040177035 Silva Sep 2004 A1
20040186807 Nathans et al. Sep 2004 A1
20040193538 Raines Sep 2004 A1
20040193891 Ollila Sep 2004 A1
20040198386 Dupray Oct 2004 A1
20040199789 Shaw et al. Oct 2004 A1
20040210661 Thompson Oct 2004 A1
20040215584 Yao Oct 2004 A1
20040215673 Furukawa et al. Oct 2004 A1
20040220865 Lozowski et al. Nov 2004 A1
20040220918 Scriffignano et al. Nov 2004 A1
20040225545 Turner et al. Nov 2004 A1
20040225609 Greene Nov 2004 A1
20040225643 Alpha et al. Nov 2004 A1
20040230499 Stack Nov 2004 A1
20040230527 Hansen et al. Nov 2004 A1
20040236688 Bozeman Nov 2004 A1
20040243450 Bernard, Jr. et al. Dec 2004 A1
20040243508 Samson et al. Dec 2004 A1
20040243588 Tanner et al. Dec 2004 A1
20040249811 Shostack Dec 2004 A1
20040250107 Guo Dec 2004 A1
20040254935 Chagoly et al. Dec 2004 A1
20040255127 Arnouse Dec 2004 A1
20040267714 Frid et al. Dec 2004 A1
20050004864 Lent et al. Jan 2005 A1
20050010494 Mourad et al. Jan 2005 A1
20050010513 Duckworth et al. Jan 2005 A1
20050015273 Iyer Jan 2005 A1
20050021457 Johnson et al. Jan 2005 A1
20050021551 Silva et al. Jan 2005 A1
20050027632 Zeitoun et al. Feb 2005 A1
20050027666 Beck Feb 2005 A1
20050027817 Novik et al. Feb 2005 A1
20050027983 Klawon Feb 2005 A1
20050033660 Solomon Feb 2005 A1
20050050027 Yeh et al. Mar 2005 A1
20050055231 Lee Mar 2005 A1
20050055296 Hattersley et al. Mar 2005 A1
20050058262 Timmins et al. Mar 2005 A1
20050060244 Goolkasian et al. Mar 2005 A1
20050060332 Bernstein et al. Mar 2005 A1
20050071328 Lawrence Mar 2005 A1
20050080716 Belyi et al. Apr 2005 A1
20050080723 Burchetta et al. Apr 2005 A1
20050080796 Midgley Apr 2005 A1
20050086126 Patterson Apr 2005 A1
20050086261 Mammone Apr 2005 A1
20050091164 Varble Apr 2005 A1
20050097017 Hanratty May 2005 A1
20050097039 Kulcsar et al. May 2005 A1
20050097320 Golan et al. May 2005 A1
20050102180 Gailey et al. May 2005 A1
20050102209 Sagrillo et al. May 2005 A1
20050105719 Huda May 2005 A1
20050108396 Bittner May 2005 A1
20050108631 Amorin et al. May 2005 A1
20050114335 Wesinger, Jr. et al. May 2005 A1
20050114344 Wesinger, Jr. et al. May 2005 A1
20050114345 Wesinger, Jr. et al. May 2005 A1
20050125291 Demkiw Grayson et al. Jun 2005 A1
20050125397 Gross et al. Jun 2005 A1
20050125686 Brandt Jun 2005 A1
20050137899 Davies et al. Jun 2005 A1
20050144143 Freiberg Jun 2005 A1
20050154664 Guy et al. Jul 2005 A1
20050154665 Kerr Jul 2005 A1
20050154769 Eckart et al. Jul 2005 A1
20050160051 Johnson Jul 2005 A1
20050160280 Caslin et al. Jul 2005 A1
20050171884 Arnott Aug 2005 A1
20050198377 Ferguson et al. Sep 2005 A1
20050203768 Florance Sep 2005 A1
20050203844 Ferguson et al. Sep 2005 A1
20050203864 Schmidt et al. Sep 2005 A1
20050208461 Krebs et al. Sep 2005 A1
20050216434 Haveliwala et al. Sep 2005 A1
20050216524 Gomes et al. Sep 2005 A1
20050216953 Ellingson Sep 2005 A1
20050216955 Wilkins et al. Sep 2005 A1
20050226224 Lee et al. Oct 2005 A1
20050240578 Biederman et al. Oct 2005 A1
20050251474 Shinn et al. Nov 2005 A1
20050256766 Garcia et al. Nov 2005 A1
20050267840 Holm-Blagg et al. Dec 2005 A1
20050273431 Abel et al. Dec 2005 A1
20050283415 Studnitzer et al. Dec 2005 A1
20050288998 Verma et al. Dec 2005 A1
20060004623 Jasti Jan 2006 A1
20060004626 Holmen et al. Jan 2006 A1
20060010391 Uemura et al. Jan 2006 A1
20060031158 Orman Feb 2006 A1
20060031177 Rule Feb 2006 A1
20060032909 Seegar Feb 2006 A1
20060036543 Blagg et al. Feb 2006 A1
20060036748 Nusbaum et al. Feb 2006 A1
20060041464 Powers et al. Feb 2006 A1
20060041670 Musseleck et al. Feb 2006 A1
20060047605 Ahmad Mar 2006 A1
20060059062 Wood et al. Mar 2006 A1
20060059083 Friesen et al. Mar 2006 A1
20060059110 Madhok et al. Mar 2006 A1
20060059362 Paden et al. Mar 2006 A1
20060074986 Mallalieu et al. Apr 2006 A1
20060074991 Lussier et al. Apr 2006 A1
20060079211 Degraeve Apr 2006 A1
20060080210 Mourad et al. Apr 2006 A1
20060080216 Hausman et al. Apr 2006 A1
20060080230 Freiberg Apr 2006 A1
20060080235 Fukuda et al. Apr 2006 A1
20060080251 Fried et al. Apr 2006 A1
20060080263 Willis et al. Apr 2006 A1
20060080274 Mourad Apr 2006 A1
20060085334 Murphy Apr 2006 A1
20060085361 Hoerle et al. Apr 2006 A1
20060095289 Bunning May 2006 A1
20060101508 Taylor May 2006 A1
20060106670 Cai et al. May 2006 A1
20060116931 Storey Jun 2006 A1
20060116932 Storey Jun 2006 A1
20060129419 Flaxer et al. Jun 2006 A1
20060129472 Harrington Jun 2006 A1
20060129481 Bhatt et al. Jun 2006 A1
20060129533 Purvis Jun 2006 A1
20060131390 Kim Jun 2006 A1
20060136524 Wohlers et al. Jun 2006 A1
20060136595 Satyavolu Jun 2006 A1
20060155780 Sakairi et al. Jul 2006 A1
20060161435 Atef et al. Jul 2006 A1
20060161478 Turner et al. Jul 2006 A1
20060161554 Lucovsky et al. Jul 2006 A1
20060173776 Shalley et al. Aug 2006 A1
20060173792 Glass Aug 2006 A1
20060178971 Owen et al. Aug 2006 A1
20060179050 Giang et al. Aug 2006 A1
20060184585 Grear et al. Aug 2006 A1
20060190394 Fraser et al. Aug 2006 A1
20060195351 Bayburtian Aug 2006 A1
20060200583 Le Lann et al. Sep 2006 A1
20060202012 Grano et al. Sep 2006 A1
20060212407 Lyon Sep 2006 A1
20060212486 Kennis et al. Sep 2006 A1
20060213985 Walker et al. Sep 2006 A1
20060218407 Toms Sep 2006 A1
20060223043 Dancy-Edwards et al. Oct 2006 A1
20060224498 Chin Oct 2006 A1
20060229943 Mathias et al. Oct 2006 A1
20060229961 Lyftogt et al. Oct 2006 A1
20060230343 Armandpour et al. Oct 2006 A1
20060235935 Ng Oct 2006 A1
20060239512 Petrillo Oct 2006 A1
20060245731 Lai Nov 2006 A1
20060248021 Jain et al. Nov 2006 A1
20060248048 Jain et al. Nov 2006 A1
20060248525 Hopkins Nov 2006 A1
20060253358 Delgrosso et al. Nov 2006 A1
20060253463 Wu et al. Nov 2006 A1
20060262929 Vatanen et al. Nov 2006 A1
20060271456 Romain et al. Nov 2006 A1
20060271457 Romain et al. Nov 2006 A1
20060271633 Adler Nov 2006 A1
20060277089 Hubbard et al. Dec 2006 A1
20060277102 Agliozzo Dec 2006 A1
20060282359 Nobili et al. Dec 2006 A1
20060282373 Stone Dec 2006 A1
20060282374 Stone Dec 2006 A1
20060282429 Hernandez-Sherrington et al. Dec 2006 A1
20060282819 Graham et al. Dec 2006 A1
20060282886 Gaug Dec 2006 A1
20060287764 Kraft Dec 2006 A1
20060287765 Kraft Dec 2006 A1
20060287766 Kraft Dec 2006 A1
20060287767 Kraft Dec 2006 A1
20060288090 Kraft Dec 2006 A1
20060293987 Shapiro Dec 2006 A1
20060294199 Bertholf Dec 2006 A1
20070005508 Chiang Jan 2007 A1
20070005984 Florencio et al. Jan 2007 A1
20070016500 Chatterji et al. Jan 2007 A1
20070022141 Singleton et al. Jan 2007 A1
20070027816 Writer Feb 2007 A1
20070032240 Finnegan et al. Feb 2007 A1
20070038568 Greene et al. Feb 2007 A1
20070039049 Kupferman et al. Feb 2007 A1
20070040015 Carlson et al. Feb 2007 A1
20070043577 Kasower Feb 2007 A1
20070047714 Baniak et al. Mar 2007 A1
20070050777 Hutchinson et al. Mar 2007 A1
20070055621 Tischler et al. Mar 2007 A1
20070057947 Yokoyama Mar 2007 A1
20070061260 deGroeve et al. Mar 2007 A1
20070067235 Nathans et al. Mar 2007 A1
20070067297 Kublickis Mar 2007 A1
20070072190 Aggarwal Mar 2007 A1
20070073577 Krause Mar 2007 A1
20070073889 Morris Mar 2007 A1
20070078908 Rohatgi et al. Apr 2007 A1
20070078985 Shao et al. Apr 2007 A1
20070078990 Hopkins Apr 2007 A1
20070080826 Chang Apr 2007 A1
20070083460 Bachenheimer Apr 2007 A1
20070083463 Kraft Apr 2007 A1
20070088821 Sankuratripati et al. Apr 2007 A1
20070093234 Willis et al. Apr 2007 A1
20070094230 Subramaniam et al. Apr 2007 A1
20070094241 M. Blackwell et al. Apr 2007 A1
20070112667 Rucker May 2007 A1
20070112668 Celano et al. May 2007 A1
20070112670 DeFrancesco et al. May 2007 A1
20070121843 Atazky et al. May 2007 A1
20070124235 Chakraborty et al. May 2007 A1
20070124256 Crooks et al. May 2007 A1
20070130347 Rangan et al. Jun 2007 A1
20070131755 Chang Jun 2007 A1
20070136109 Yager et al. Jun 2007 A1
20070143123 Goldberg et al. Jun 2007 A1
20070149184 Viegers et al. Jun 2007 A1
20070152068 Kurita Jul 2007 A1
20070153085 Chang Jul 2007 A1
20070153710 Hopkins Jul 2007 A1
20070156554 Nikoley et al. Jul 2007 A1
20070156692 Rosewarne Jul 2007 A1
20070157107 Bishop Jul 2007 A1
20070160458 Yen Jul 2007 A1
20070162369 Hardison Jul 2007 A1
20070162458 Fasciano Jul 2007 A1
20070174166 Jones Jul 2007 A1
20070174186 Hokland Jul 2007 A1
20070174448 Ahuja et al. Jul 2007 A1
20070174903 Greff Jul 2007 A1
20070180380 Khavari et al. Aug 2007 A1
20070192167 Lei et al. Aug 2007 A1
20070198432 Pitroda et al. Aug 2007 A1
20070203954 Vargas et al. Aug 2007 A1
20070204033 Bookbinder et al. Aug 2007 A1
20070204212 Chamberlain et al. Aug 2007 A1
20070204338 Aiello et al. Aug 2007 A1
20070205266 Carr et al. Sep 2007 A1
20070206917 Ono et al. Sep 2007 A1
20070208640 Banasiak et al. Sep 2007 A1
20070219966 Baylis et al. Sep 2007 A1
20070220003 Chern et al. Sep 2007 A1
20070220092 Heitzeberg et al. Sep 2007 A1
20070220275 Heitzeberg et al. Sep 2007 A1
20070220581 Chang Sep 2007 A1
20070226047 Ward Sep 2007 A1
20070226122 Burrell et al. Sep 2007 A1
20070233591 Newton Oct 2007 A1
20070236562 Chang Oct 2007 A1
20070239493 Sweetland et al. Oct 2007 A1
20070240206 Wu et al. Oct 2007 A1
20070244807 Andringa et al. Oct 2007 A1
20070245245 Blue et al. Oct 2007 A1
20070250441 Paulsen et al. Oct 2007 A1
20070250459 Schwarz et al. Oct 2007 A1
20070262140 Long, Sr. Nov 2007 A1
20070266439 Kraft Nov 2007 A1
20070273558 Smith Nov 2007 A1
20070276780 Iriyama et al. Nov 2007 A1
20070282743 Lovelett Dec 2007 A1
20070287415 Yamada Dec 2007 A1
20070288355 Roland et al. Dec 2007 A1
20070288360 Seeklus Dec 2007 A1
20070294195 Curry et al. Dec 2007 A1
20070299770 Delinsky Dec 2007 A1
20070299772 Mastie et al. Dec 2007 A1
20080004957 Hildreth et al. Jan 2008 A1
20080010203 Grant Jan 2008 A1
20080010206 Coleman Jan 2008 A1
20080010687 Gonen et al. Jan 2008 A1
20080015919 Busse et al. Jan 2008 A1
20080015979 Bentley Jan 2008 A1
20080021802 Pendleton Jan 2008 A1
20080021816 Lent et al. Jan 2008 A1
20080027859 Nathans et al. Jan 2008 A1
20080028435 Strickland et al. Jan 2008 A1
20080028446 Burgoyne Jan 2008 A1
20080033956 Saha et al. Feb 2008 A1
20080040176 Ehling Feb 2008 A1
20080040475 Bosworth et al. Feb 2008 A1
20080040610 Fergusson Feb 2008 A1
20080047017 Renaud Feb 2008 A1
20080052170 Storey Feb 2008 A1
20080052182 Marshall Feb 2008 A1
20080052244 Tsuei et al. Feb 2008 A1
20080059352 Chandran Mar 2008 A1
20080059364 Tidwell et al. Mar 2008 A1
20080059447 Winner et al. Mar 2008 A1
20080060054 Srivastava Mar 2008 A1
20080065774 Keeler Mar 2008 A1
20080066188 Kwak Mar 2008 A1
20080072316 Chang et al. Mar 2008 A1
20080077526 Arumugam Mar 2008 A1
20080079809 Chang Apr 2008 A1
20080082536 Schwabe et al. Apr 2008 A1
20080083021 Doane et al. Apr 2008 A1
20080086400 Ardelean et al. Apr 2008 A1
20080086431 Robinson et al. Apr 2008 A1
20080091519 Foss Apr 2008 A1
20080091530 Egnatios et al. Apr 2008 A1
20080097822 Schigel et al. Apr 2008 A1
20080103798 Domenikos et al. May 2008 A1
20080103800 Domenikos et al. May 2008 A1
20080103972 Lanc May 2008 A1
20080109308 Storey May 2008 A1
20080109422 Dedhia May 2008 A1
20080109740 Prinsen et al. May 2008 A1
20080110973 Nathans et al. May 2008 A1
20080114670 Friesen May 2008 A1
20080114855 Welingkar et al. May 2008 A1
20080115226 Welingkar et al. May 2008 A1
20080120155 Pliha May 2008 A1
20080120204 Conner et al. May 2008 A1
20080120416 Hopkins et al. May 2008 A1
20080120569 Mann et al. May 2008 A1
20080120716 Hall et al. May 2008 A1
20080122920 Chang May 2008 A1
20080126233 Hogan May 2008 A1
20080133273 Marshall Jun 2008 A1
20080133278 Stanfield Jun 2008 A1
20080133657 Pennington Jun 2008 A1
20080140476 Anand et al. Jun 2008 A1
20080140734 Wagner Jun 2008 A1
20080140780 Hopkins et al. Jun 2008 A1
20080141346 Kay et al. Jun 2008 A1
20080147523 Mulry et al. Jun 2008 A1
20080148368 Zurko et al. Jun 2008 A1
20080148392 Akens Jun 2008 A1
20080154758 Schattmaier et al. Jun 2008 A1
20080162236 Sommerer Jul 2008 A1
20080162317 Banaugh et al. Jul 2008 A1
20080162350 Allen-Rouman et al. Jul 2008 A1
20080162383 Kraft Jul 2008 A1
20080172304 Berkowitz Jul 2008 A1
20080175360 Schwarz et al. Jul 2008 A1
20080177655 Zalik Jul 2008 A1
20080183480 Carlson et al. Jul 2008 A1
20080183585 Vianello Jul 2008 A1
20080184351 Gephart Jul 2008 A1
20080195548 Chu et al. Aug 2008 A1
20080201257 Lewis et al. Aug 2008 A1
20080201401 Pugh et al. Aug 2008 A1
20080208548 Metzger et al. Aug 2008 A1
20080208726 Tsantes et al. Aug 2008 A1
20080208735 Balet et al. Aug 2008 A1
20080212845 Lund Sep 2008 A1
20080215640 Hartz et al. Sep 2008 A1
20080216156 Kosaka Sep 2008 A1
20080221972 Megdal et al. Sep 2008 A1
20080222027 Megdal et al. Sep 2008 A1
20080222706 Renaud et al. Sep 2008 A1
20080228556 Megdal et al. Sep 2008 A1
20080228775 Abhyanker et al. Sep 2008 A1
20080229415 Kapoor et al. Sep 2008 A1
20080249869 Angell et al. Oct 2008 A1
20080249925 Nazari et al. Oct 2008 A1
20080255992 Lin Oct 2008 A1
20080263013 Hopkins Oct 2008 A1
20080263638 McMurtry et al. Oct 2008 A1
20080270038 Partovi et al. Oct 2008 A1
20080270209 Mauseth et al. Oct 2008 A1
20080270292 Ghosh et al. Oct 2008 A1
20080270294 Lent et al. Oct 2008 A1
20080270295 Lent et al. Oct 2008 A1
20080275816 Hazlehurst Nov 2008 A1
20080277465 Pletz et al. Nov 2008 A1
20080281737 Fajardo Nov 2008 A1
20080282324 Hoal Nov 2008 A1
20080284586 Chang Nov 2008 A1
20080288283 Baldwin, Jr. et al. Nov 2008 A1
20080288299 Schultz Nov 2008 A1
20080294501 Rennich et al. Nov 2008 A1
20080297602 Chang Dec 2008 A1
20080301016 Durvasula et al. Dec 2008 A1
20080306846 Ferguson Dec 2008 A1
20080307063 Caughey Dec 2008 A1
20080316010 Chang Dec 2008 A1
20080319861 Hopkins Dec 2008 A1
20080319889 Hammad Dec 2008 A1
20080319896 Carlson et al. Dec 2008 A1
20090006230 Lyda et al. Jan 2009 A1
20090006582 Daswani et al. Jan 2009 A1
20090018986 Alcorn et al. Jan 2009 A1
20090024462 Lin Jan 2009 A1
20090024484 Walker et al. Jan 2009 A1
20090024485 Haugen et al. Jan 2009 A1
20090030776 Walker et al. Jan 2009 A1
20090037279 Chockalingam et al. Feb 2009 A1
20090037323 Feinstein et al. Feb 2009 A1
20090037332 Cheung et al. Feb 2009 A1
20090043691 Kasower Feb 2009 A1
20090048957 Celano Feb 2009 A1
20090048999 Gupta et al. Feb 2009 A1
20090055287 Chin Feb 2009 A1
20090055312 Chin Feb 2009 A1
20090055322 Bykov et al. Feb 2009 A1
20090055404 Heiden et al. Feb 2009 A1
20090064297 Selgas et al. Mar 2009 A1
20090069000 Kindberg Mar 2009 A1
20090070148 Skocic Mar 2009 A1
20090076950 Chang et al. Mar 2009 A1
20090076966 Bishop et al. Mar 2009 A1
20090089190 Girulat Apr 2009 A1
20090089193 Palantin Apr 2009 A1
20090089869 Varghese Apr 2009 A1
20090094237 Churi et al. Apr 2009 A1
20090094675 Powers Apr 2009 A1
20090099941 Berkowitz Apr 2009 A1
20090100047 Jones et al. Apr 2009 A1
20090106141 Becker Apr 2009 A1
20090106150 Pelegero et al. Apr 2009 A1
20090106846 Dupray et al. Apr 2009 A1
20090119116 Steen May 2009 A1
20090119299 Rhodes May 2009 A1
20090125369 Kloostra et al. May 2009 A1
20090125972 Hinton et al. May 2009 A1
20090132347 Anderson et al. May 2009 A1
20090132813 Schibuk May 2009 A1
20090146879 Chang Jun 2009 A1
20090147774 Caughey Jun 2009 A1
20090157564 Cross Jun 2009 A1
20090157693 Palahnuk Jun 2009 A1
20090158030 Rasti Jun 2009 A1
20090164380 Brown Jun 2009 A1
20090164582 Dasgupta et al. Jun 2009 A1
20090164929 Chen et al. Jun 2009 A1
20090171723 Jenkins Jul 2009 A1
20090172788 Veldula et al. Jul 2009 A1
20090172795 Ritari et al. Jul 2009 A1
20090177529 Hadi Jul 2009 A1
20090177562 Peace et al. Jul 2009 A1
20090177670 Grenier et al. Jul 2009 A1
20090183259 Rinek et al. Jul 2009 A1
20090187607 Yoo et al. Jul 2009 A1
20090195377 Chang Aug 2009 A1
20090198557 Wang et al. Aug 2009 A1
20090198602 Wang et al. Aug 2009 A1
20090199294 Schneider Aug 2009 A1
20090204514 Bhogal et al. Aug 2009 A1
20090204599 Morris et al. Aug 2009 A1
20090210241 Calloway Aug 2009 A1
20090210807 Xiao et al. Aug 2009 A1
20090216640 Masi Aug 2009 A1
20090217342 Nadler Aug 2009 A1
20090222364 McGlynn et al. Sep 2009 A1
20090222449 Hom et al. Sep 2009 A1
20090222527 Arconati et al. Sep 2009 A1
20090228295 Lowy Sep 2009 A1
20090228392 Pinson, III Sep 2009 A1
20090228918 Rolff et al. Sep 2009 A1
20090228990 Chen et al. Sep 2009 A1
20090233579 Castell et al. Sep 2009 A1
20090234665 Conkel Sep 2009 A1
20090234775 Whitney et al. Sep 2009 A1
20090234814 Boerries et al. Sep 2009 A1
20090234876 Schigel et al. Sep 2009 A1
20090240624 James et al. Sep 2009 A1
20090247122 Fitzgerald et al. Oct 2009 A1
20090248573 Haggerty et al. Oct 2009 A1
20090249451 Su et al. Oct 2009 A1
20090254375 Martinez et al. Oct 2009 A1
20090254476 Sharma et al. Oct 2009 A1
20090254656 Vignisson et al. Oct 2009 A1
20090254971 Herz et al. Oct 2009 A1
20090258334 Pyne Oct 2009 A1
20090260064 Mcdowell et al. Oct 2009 A1
20090271265 Lay et al. Oct 2009 A1
20090276368 Martin et al. Nov 2009 A1
20090280467 Ahart Nov 2009 A1
20090281816 Houga et al. Nov 2009 A1
20090281941 Worth Nov 2009 A1
20090281951 Shakkarwar Nov 2009 A1
20090282479 Smith et al. Nov 2009 A1
20090289110 Regen et al. Nov 2009 A1
20090300066 Guo et al. Dec 2009 A1
20090300604 Barringer Dec 2009 A1
20090300641 Friedman et al. Dec 2009 A1
20090307778 Mardikar Dec 2009 A1
20090313562 Appleyard et al. Dec 2009 A1
20090319648 Dutta et al. Dec 2009 A1
20090327054 Yao et al. Dec 2009 A1
20090327120 Eze et al. Dec 2009 A1
20090327270 Teevan et al. Dec 2009 A1
20090327487 Olson et al. Dec 2009 A1
20100009320 Wilkelis Jan 2010 A1
20100009332 Yaskin et al. Jan 2010 A1
20100010935 Shelton Jan 2010 A1
20100010993 Hussey, Jr. et al. Jan 2010 A1
20100011428 Atwood et al. Jan 2010 A1
20100023434 Bond Jan 2010 A1
20100023440 Fraser et al. Jan 2010 A1
20100023448 Eze Jan 2010 A1
20100023506 Sahni et al. Jan 2010 A1
20100025820 Suekawa Feb 2010 A1
20100030578 Siddique et al. Feb 2010 A1
20100030649 Ubelhor Feb 2010 A1
20100030677 Melik-Aslanian et al. Feb 2010 A1
20100036697 Kelnar Feb 2010 A1
20100036769 Winters et al. Feb 2010 A1
20100042537 Smith et al. Feb 2010 A1
20100042542 Rose et al. Feb 2010 A1
20100042732 Hopkins Feb 2010 A1
20100043055 Baumgart Feb 2010 A1
20100049803 Ogilvie et al. Feb 2010 A1
20100063906 Nelsen et al. Mar 2010 A1
20100063942 Arnott et al. Mar 2010 A1
20100063993 Higgins et al. Mar 2010 A1
20100076833 Nelsen Mar 2010 A1
20100076966 Strutton et al. Mar 2010 A1
20100077483 Stolfo et al. Mar 2010 A1
20100082445 Hodge et al. Apr 2010 A1
20100082476 Bowman Apr 2010 A1
20100083371 Bennetts et al. Apr 2010 A1
20100088188 Kumar et al. Apr 2010 A1
20100094768 Miltonberger Apr 2010 A1
20100094774 Jackowitz et al. Apr 2010 A1
20100094910 Bayliss Apr 2010 A1
20100100945 Ozzie et al. Apr 2010 A1
20100114724 Ghosh et al. May 2010 A1
20100114744 Gonen May 2010 A1
20100114776 Weller et al. May 2010 A1
20100122324 Welingkar et al. May 2010 A1
20100122333 Noe et al. May 2010 A1
20100130172 Vendrow et al. May 2010 A1
20100136956 Drachev et al. Jun 2010 A1
20100145836 Baker et al. Jun 2010 A1
20100153278 Farsedakis Jun 2010 A1
20100153290 Duggan Jun 2010 A1
20100161486 Liu et al. Jun 2010 A1
20100161816 Kraft et al. Jun 2010 A1
20100169159 Rose et al. Jul 2010 A1
20100174638 Debie et al. Jul 2010 A1
20100174813 Hildreth et al. Jul 2010 A1
20100179906 Hawkes Jul 2010 A1
20100185546 Pollard Jul 2010 A1
20100188684 Kumara Jul 2010 A1
20100198636 Choudhary et al. Aug 2010 A1
20100205076 Parson et al. Aug 2010 A1
20100205662 Ibrahim et al. Aug 2010 A1
20100211445 Bodington Aug 2010 A1
20100211636 Starkenburg et al. Aug 2010 A1
20100214090 Sartini et al. Aug 2010 A1
20100215270 Manohar et al. Aug 2010 A1
20100217837 Ansari et al. Aug 2010 A1
20100223160 Brown Sep 2010 A1
20100223184 Perlman Sep 2010 A1
20100223192 Levine et al. Sep 2010 A1
20100228658 Ketelsen et al. Sep 2010 A1
20100229245 Singhal Sep 2010 A1
20100241535 Nightengale et al. Sep 2010 A1
20100248681 Phills Sep 2010 A1
20100250338 Banerjee et al. Sep 2010 A1
20100250410 Song et al. Sep 2010 A1
20100250411 Ogrodski Sep 2010 A1
20100250416 Hazlehurst Sep 2010 A1
20100250509 Andersen Sep 2010 A1
20100253686 Alsbury et al. Oct 2010 A1
20100257102 Perlman Oct 2010 A1
20100257234 Caughey Oct 2010 A1
20100257577 Grandison et al. Oct 2010 A1
20100258623 Beemer et al. Oct 2010 A1
20100258625 Stanfield et al. Oct 2010 A1
20100259373 Chang Oct 2010 A1
20100262339 Chang Oct 2010 A1
20100262535 Lent et al. Oct 2010 A1
20100262606 Bedolla et al. Oct 2010 A1
20100262932 Pan Oct 2010 A1
20100268557 Faith et al. Oct 2010 A1
20100268768 Kurtenbach et al. Oct 2010 A1
20100274815 Vanasco Oct 2010 A1
20100280914 Carlson Nov 2010 A1
20100281020 Drubner Nov 2010 A1
20100293090 Domenikos et al. Nov 2010 A1
20100306834 Grandison et al. Dec 2010 A1
20100312691 Johnson, Jr. Dec 2010 A1
20100323446 Barnett et al. Dec 2010 A1
20100324999 Conway et al. Dec 2010 A1
20100325048 Carlson et al. Dec 2010 A1
20100332393 Weller et al. Dec 2010 A1
20110004498 Readshaw Jan 2011 A1
20110023115 Wright Jan 2011 A1
20110029388 Kendall et al. Feb 2011 A1
20110029566 Grandison et al. Feb 2011 A1
20110029660 Hopkins Feb 2011 A1
20110035305 Imrey et al. Feb 2011 A1
20110035315 Langley Feb 2011 A1
20110035452 Gittleman Feb 2011 A1
20110040629 Chiu et al. Feb 2011 A1
20110047086 Heisterkamp et al. Feb 2011 A1
20110047606 Blomquist Feb 2011 A1
20110066495 Ayloo et al. Mar 2011 A1
20110066618 Sigurbjornsson et al. Mar 2011 A1
20110066695 Hopkins Mar 2011 A1
20110071950 Ivanovic Mar 2011 A1
20110078073 Annappindi et al. Mar 2011 A1
20110083181 Nazarov Apr 2011 A1
20110107265 Buchanan et al. May 2011 A1
20110107400 Shankaranarayanan et al. May 2011 A1
20110112899 Strutton et al. May 2011 A1
20110113084 Ramnani May 2011 A1
20110113086 Long et al. May 2011 A1
20110119169 Passero et al. May 2011 A1
20110119765 Hering et al. May 2011 A1
20110125924 McAleer May 2011 A1
20110126275 Anderson et al. May 2011 A1
20110131123 Griffin et al. Jun 2011 A1
20110137760 Rudie et al. Jun 2011 A1
20110137765 Nonaka Jun 2011 A1
20110142213 Baniak et al. Jun 2011 A1
20110143711 Hirson et al. Jun 2011 A1
20110145064 Anderson et al. Jun 2011 A1
20110145899 Cao et al. Jun 2011 A1
20110148625 Velusamy Jun 2011 A1
20110161155 Wilhem et al. Jun 2011 A1
20110166988 Coulter Jul 2011 A1
20110167011 Paltenghe et al. Jul 2011 A1
20110178841 Rane et al. Jul 2011 A1
20110178899 Huszar Jul 2011 A1
20110179139 Starkenburg et al. Jul 2011 A1
20110184780 Alderson et al. Jul 2011 A1
20110193704 Harper et al. Aug 2011 A1
20110196791 Dominguez Aug 2011 A1
20110211445 Chen Sep 2011 A1
20110213670 Strutton et al. Sep 2011 A1
20110214187 Wittenstein et al. Sep 2011 A1
20110243406 Chandler Oct 2011 A1
20110252071 Cidon Oct 2011 A1
20110264531 Bhatia et al. Oct 2011 A1
20110264566 Brown Oct 2011 A1
20110264581 Clyne Oct 2011 A1
20110270618 Banerjee et al. Nov 2011 A1
20110270754 Kelly et al. Nov 2011 A1
20110271329 Hulten et al. Nov 2011 A1
20110276382 Ramchandani et al. Nov 2011 A1
20110276396 Rathod Nov 2011 A1
20110276604 Hom et al. Nov 2011 A1
20110282711 Freishtat et al. Nov 2011 A1
20110282783 Ferguson et al. Nov 2011 A1
20110282943 Anderson et al. Nov 2011 A1
20110289151 Hopkins Nov 2011 A1
20110289209 Hopkins Nov 2011 A1
20110296003 McCann et al. Dec 2011 A1
20110302083 Bhinder Dec 2011 A1
20110302653 Frantz et al. Dec 2011 A1
20110304646 Kato Dec 2011 A1
20110307397 Benmbarek Dec 2011 A1
20110307434 Rostampour et al. Dec 2011 A1
20110307474 Hom et al. Dec 2011 A1
20110307494 Snow Dec 2011 A1
20110307938 Reeves, Jr. et al. Dec 2011 A1
20110307957 Barcelo et al. Dec 2011 A1
20110313915 Tang Dec 2011 A1
20110314100 Hopkins Dec 2011 A1
20110314383 Abdo et al. Dec 2011 A1
20110320582 Lewis Dec 2011 A1
20110321137 Iida et al. Dec 2011 A1
20120005070 McFall et al. Jan 2012 A1
20120005221 Ickman et al. Jan 2012 A1
20120005542 Petersen et al. Jan 2012 A1
20120010927 Attenberg et al. Jan 2012 A1
20120011063 Killian et al. Jan 2012 A1
20120011158 Avner et al. Jan 2012 A1
20120011432 Strutton Jan 2012 A1
20120015717 Mosites et al. Jan 2012 A1
20120016948 Sinha Jan 2012 A1
20120022990 Kasower Jan 2012 A1
20120030216 Churi et al. Feb 2012 A1
20120030771 Pierson et al. Feb 2012 A1
20120036053 Miller Feb 2012 A1
20120036065 Orttung et al. Feb 2012 A1
20120036127 Work et al. Feb 2012 A1
20120036565 Gamez et al. Feb 2012 A1
20120042237 Armandpour et al. Feb 2012 A1
20120047174 Avner et al. Feb 2012 A1
20120047219 Feng et al. Feb 2012 A1
20120054088 Edrington et al. Mar 2012 A1
20120054592 Jaffe et al. Mar 2012 A1
20120060105 Brown et al. Mar 2012 A1
20120066044 Honnef et al. Mar 2012 A1
20120066106 Papadimitriou Mar 2012 A1
20120072382 Pearson et al. Mar 2012 A1
20120078766 Rose et al. Mar 2012 A1
20120079598 Brock et al. Mar 2012 A1
20120084866 Stolfo Apr 2012 A1
20120089438 Tavares et al. Apr 2012 A1
20120101938 Kasower Apr 2012 A1
20120101939 Kasower Apr 2012 A1
20120109752 Strutton et al. May 2012 A1
20120110467 Blake et al. May 2012 A1
20120110677 Abendroth et al. May 2012 A1
20120116913 Goolkasian May 2012 A1
20120116969 Kumar et al. May 2012 A1
20120124033 Gabriel et al. May 2012 A1
20120124498 Santoro et al. May 2012 A1
20120131009 Nath et al. May 2012 A1
20120131656 Slaton et al. May 2012 A1
20120135705 Thaker May 2012 A1
20120136763 Megdal et al. May 2012 A1
20120136774 Imrey et al. May 2012 A1
20120144461 Rathbun Jun 2012 A1
20120151045 Anakata et al. Jun 2012 A1
20120151046 Weiss et al. Jun 2012 A1
20120158562 Kassir Jun 2012 A1
20120158654 Behren et al. Jun 2012 A1
20120173339 Flynt et al. Jul 2012 A1
20120173417 Lohman et al. Jul 2012 A1
20120185515 Ferrel et al. Jul 2012 A1
20120191596 Kremen et al. Jul 2012 A1
20120191693 Alexander Jul 2012 A1
20120195412 Smith Aug 2012 A1
20120198556 Patel et al. Aug 2012 A1
20120203733 Zhang Aug 2012 A1
20120215682 Lent et al. Aug 2012 A1
20120215719 Verlander Aug 2012 A1
20120216125 Pierce Aug 2012 A1
20120221467 Hamzeh Aug 2012 A1
20120235897 Hirota Sep 2012 A1
20120239417 Pourfallah et al. Sep 2012 A1
20120239497 Nuzzi Sep 2012 A1
20120239583 Dobrowolski Sep 2012 A1
20120240223 Tu Sep 2012 A1
20120242473 Choi Sep 2012 A1
20120246060 Conyack, Jr. et al. Sep 2012 A1
20120246092 Stibel et al. Sep 2012 A1
20120246093 Stibel et al. Sep 2012 A1
20120253852 Pourfallah et al. Oct 2012 A1
20120262386 Kwon et al. Oct 2012 A1
20120262472 Garr et al. Oct 2012 A1
20120271660 Harris et al. Oct 2012 A1
20120271712 Katzin et al. Oct 2012 A1
20120278217 Sui et al. Nov 2012 A1
20120278226 Kolo Nov 2012 A1
20120278767 Stibel et al. Nov 2012 A1
20120284280 Kumar Nov 2012 A1
20120290486 Dobrowolski et al. Nov 2012 A1
20120290660 Rao et al. Nov 2012 A1
20120290740 Tewari et al. Nov 2012 A1
20120296804 Stibel et al. Nov 2012 A1
20120297484 Srivastava Nov 2012 A1
20120317013 Luk et al. Dec 2012 A1
20120317014 Cerise et al. Dec 2012 A1
20120321202 Fertik et al. Dec 2012 A1
20120323695 Stibel Dec 2012 A1
20120324388 Rao et al. Dec 2012 A1
20120330689 McLaughlin et al. Dec 2012 A1
20130006843 Tralvex Jan 2013 A1
20130006844 Kremen Jan 2013 A1
20130007012 Selkowe Fertik et al. Jan 2013 A1
20130007014 Fertik et al. Jan 2013 A1
20130007891 Mogaki Jan 2013 A1
20130013513 Ledbetter et al. Jan 2013 A1
20130013553 Stibel et al. Jan 2013 A1
20130018798 Scipioni Jan 2013 A1
20130018811 Britti et al. Jan 2013 A1
20130018838 Parnaby et al. Jan 2013 A1
20130018877 Gabriel et al. Jan 2013 A1
20130018892 Castellanos et al. Jan 2013 A1
20130018957 Parnaby et al. Jan 2013 A1
20130024367 Bellefeuille et al. Jan 2013 A1
20130024520 Siminoff Jan 2013 A1
20130024813 Schnorr et al. Jan 2013 A1
20130030826 Blom Jan 2013 A1
20130031105 Stibel et al. Jan 2013 A1
20130031109 Roulson et al. Jan 2013 A1
20130031624 Britti et al. Jan 2013 A1
20130036466 Penta et al. Feb 2013 A1
20130040619 Grube et al. Feb 2013 A1
20130041798 Unger Feb 2013 A1
20130041810 Murrell et al. Feb 2013 A1
20130041949 Biesecker et al. Feb 2013 A1
20130054357 Mager et al. Feb 2013 A1
20130061335 Schwabe Mar 2013 A1
20130066716 Chen et al. Mar 2013 A1
20130066775 Milam Mar 2013 A1
20130066884 Kast et al. Mar 2013 A1
20130066922 Jang et al. Mar 2013 A1
20130067582 Donovan et al. Mar 2013 A1
20130073366 Heath Mar 2013 A1
20130080322 Adolphe Mar 2013 A1
20130080467 Carson et al. Mar 2013 A1
20130085804 Leff et al. Apr 2013 A1
20130085894 Chan et al. Apr 2013 A1
20130085939 Colak et al. Apr 2013 A1
20130085953 Bhola et al. Apr 2013 A1
20130086075 Scott et al. Apr 2013 A1
20130090982 Ross Apr 2013 A1
20130103464 Kuznetsov Apr 2013 A1
20130103571 Chung et al. Apr 2013 A1
20130104216 Dennis et al. Apr 2013 A1
20130110557 Kasower May 2013 A1
20130110565 Means et al. May 2013 A1
20130110585 Nesbitt et al. May 2013 A1
20130111436 Phan et al. May 2013 A1
20130117072 Nish May 2013 A1
20130117087 Coppinger May 2013 A1
20130124855 Varadarajan et al. May 2013 A1
20130125010 Strandell May 2013 A1
20130132151 Stibel et al. May 2013 A1
20130159411 Bowen Jun 2013 A1
20130173447 Rothschild Jul 2013 A1
20130173449 Ng et al. Jul 2013 A1
20130173451 Kornegay et al. Jul 2013 A1
20130179338 Evans Jul 2013 A1
20130185210 Dodson et al. Jul 2013 A1
20130185293 Boback Jul 2013 A1
20130187923 Yoshimoto et al. Jul 2013 A1
20130204762 Harris et al. Aug 2013 A1
20130205135 Lutz Aug 2013 A1
20130211986 Debie et al. Aug 2013 A1
20130212187 Mortazavi et al. Aug 2013 A1
20130238387 Stibel et al. Sep 2013 A1
20130262226 LaChapelle et al. Oct 2013 A1
20130267171 Sarkar et al. Oct 2013 A1
20130278515 Kikuchi Oct 2013 A1
20130279676 Baniak et al. Oct 2013 A1
20130282819 Mehta et al. Oct 2013 A1
20130290164 Salm Oct 2013 A1
20130293363 Plymouth Nov 2013 A1
20130297499 Mukherjee Nov 2013 A1
20130298238 Shah et al. Nov 2013 A1
20130332341 Papadimitriou Dec 2013 A1
20130332342 Kasower Dec 2013 A1
20130332352 Imrey et al. Dec 2013 A1
20130339141 Stibel et al. Dec 2013 A1
20130339249 Weller et al. Dec 2013 A1
20130347059 Fong et al. Dec 2013 A1
20140012733 Vidal Jan 2014 A1
20140012737 Evans Jan 2014 A1
20140015860 Kim Jan 2014 A1
20140019348 Daley Jan 2014 A1
20140025562 Rothrock et al. Jan 2014 A1
20140032300 Zhang et al. Jan 2014 A1
20140032723 Nema Jan 2014 A1
20140046872 Arnott et al. Feb 2014 A1
20140052732 Softky Feb 2014 A1
20140061302 Hammad Mar 2014 A1
20140074689 Lund et al. Mar 2014 A1
20140089166 Padawer Mar 2014 A1
20140089167 Kasower Mar 2014 A1
20140089191 Brown Mar 2014 A1
20140095640 Stibel et al. Apr 2014 A1
20140096249 Dupont et al. Apr 2014 A1
20140098142 Lee et al. Apr 2014 A1
20140098229 Lu et al. Apr 2014 A1
20140110477 Hammad Apr 2014 A1
20140114735 Isaacson et al. Apr 2014 A1
20140122354 Stibel et al. May 2014 A1
20140129942 Rathod May 2014 A1
20140156500 Lassen et al. Jun 2014 A1
20140156501 Howe Jun 2014 A1
20140156503 Lassen et al. Jun 2014 A1
20140164112 Kala Jun 2014 A1
20140164398 Smith et al. Jun 2014 A1
20140164519 Shah Jun 2014 A1
20140172681 Lamp et al. Jun 2014 A1
20140173732 Stibel Jun 2014 A1
20140180919 Brown Jun 2014 A1
20140181285 Stevens et al. Jun 2014 A1
20140237377 Meissner Aug 2014 A1
20140258083 Achanta et al. Sep 2014 A1
20140258084 Padawer et al. Sep 2014 A1
20140279329 Dancel Sep 2014 A1
20140279382 Drakeley et al. Sep 2014 A1
20140279391 Gallo et al. Sep 2014 A1
20140282977 Madhu et al. Sep 2014 A1
20140298485 Gardner Oct 2014 A1
20140310151 Shishkov et al. Oct 2014 A1
20140317023 Kim Oct 2014 A1
20140372367 McLean et al. Dec 2014 A1
20140379554 Grossman et al. Dec 2014 A1
20150026060 Krietzman et al. Jan 2015 A1
20150127490 Puertas May 2015 A1
20150134506 King et al. May 2015 A1
20150135305 Cabrera et al. May 2015 A1
20150142639 Padawer May 2015 A1
20150161738 Stempora Jun 2015 A1
20150178829 Weiss Jun 2015 A1
20150199757 Lindholme et al. Jul 2015 A1
20150200948 Cairns et al. Jul 2015 A1
20150262249 Wical Sep 2015 A1
20150278277 Agrawal et al. Oct 2015 A1
20150302521 Bartmann Oct 2015 A1
20150310543 DeBie Oct 2015 A1
20150324920 Wilson et al. Nov 2015 A1
20160232546 Ranft Aug 2016 A1
20160232605 Zhang Aug 2016 A1
20170131964 Baek May 2017 A1
20170132700 Kazerani et al. May 2017 A1
20170161486 Jeon et al. Jun 2017 A1
20170200223 Kasower Jul 2017 A1
20170221121 Davis et al. Aug 2017 A1
20170262821 Imrey et al. Sep 2017 A1
20170352014 Smith et al. Dec 2017 A1
20170352186 Dauphiny et al. Dec 2017 A1
20180040063 Buechler et al. Feb 2018 A1
20180082372 Diana Mar 2018 A1
20180089935 Froy, Jr. Mar 2018 A1
20180101994 Da Veiga Apr 2018 A1
20180164877 Miller et al. Jun 2018 A1
20180176267 Malatesha et al. Jun 2018 A1
20180204279 Painter et al. Jul 2018 A1
20180218448 Thomas et al. Aug 2018 A1
20180343265 McMillan et al. Nov 2018 A1
20180349992 Dean et al. Dec 2018 A1
20190019185 Chitalia et al. Jan 2019 A1
20190034625 Ford et al. Jan 2019 A1
20190051305 Liddell et al. Feb 2019 A1
20190066203 Smith et al. Feb 2019 A1
20190102438 Murray et al. Apr 2019 A1
20190147366 Sankaran et al. May 2019 A1
20190188717 Putnam et al. Jun 2019 A1
20190188781 O'Brien et al. Jun 2019 A1
20190197528 Dean et al. Jun 2019 A1
20190258818 Yu et al. Aug 2019 A1
20190295165 Kapczynski et al. Sep 2019 A1
20190296804 Eitan et al. Sep 2019 A1
20190318122 Hockey et al. Oct 2019 A1
20190332400 Spoor et al. Oct 2019 A1
20190355362 Brown et al. Nov 2019 A1
20200034927 Smith et al. Jan 2020 A1
20200051115 Lawrence et al. Feb 2020 A1
20200051527 Ngo Feb 2020 A1
20200074100 Raneri et al. Mar 2020 A1
20200074541 Finneran et al. Mar 2020 A1
20200074745 Lyren Mar 2020 A1
20200076813 Felice-Steele et al. Mar 2020 A1
20200090265 Quinn et al. Mar 2020 A1
20200106764 Hockey et al. Apr 2020 A1
20200106765 Hockey et al. Apr 2020 A1
20200126126 Briancon et al. Apr 2020 A1
20200137110 Tyler et al. Apr 2020 A1
20200143384 Koontz et al. May 2020 A1
20200160372 Andrick May 2020 A1
20200174010 Pfeiffer et al. Jun 2020 A1
20200193413 Jangama et al. Jun 2020 A1
20200193423 Jangama et al. Jun 2020 A1
20200201878 Putnam et al. Jun 2020 A1
20200202425 Taylor-Shoff et al. Jun 2020 A1
20200211099 Smith et al. Jul 2020 A1
20200213206 Bracken et al. Jul 2020 A1
20200311168 Rokos Oct 2020 A1
20200342039 Bakir et al. Oct 2020 A1
20200342527 Kasower Oct 2020 A1
20200364785 Olson et al. Nov 2020 A1
20200372173 Burger et al. Nov 2020 A1
20200380599 Wasser et al. Dec 2020 A1
20200389461 Felice-Steele et al. Dec 2020 A1
20200402159 Arnold et al. Dec 2020 A1
20210004703 Zoldi et al. Jan 2021 A1
20210027357 Bonfigli et al. Jan 2021 A1
20210152567 Huston, III et al. May 2021 A1
20210194885 Manna Jun 2021 A1
20210234869 Bondugula et al. Jul 2021 A1
20220027853 McMillan et al. Jan 2022 A1
20220217146 Felice-Steele et al. Jul 2022 A1
20220374744 Zoldi et al. Nov 2022 A1
20230007007 Manna Jan 2023 A1
Foreign Referenced Citations (47)
Number Date Country
2 509 842 Dec 2005 CA
0 542 298 May 1993 EP
1 239 378 Sep 2002 EP
1 301 887 Apr 2003 EP
1 591 931 Nov 2005 EP
1 850 278 Oct 2007 EP
2 088 743 Aug 2009 EP
2 151 793 Feb 2010 EP
2 472 423 Jul 2012 EP
2 102 606 Feb 1983 GB
2005-208945 Aug 2005 JP
10-2000-0063313 Nov 2000 KR
10-2002-0039203 May 2002 KR
10-2007-0081504 Aug 2007 KR
I256569 Jun 2006 TW
WO 9116691 Oct 1991 WO
WO 00051052 Aug 2000 WO
WO 00055778 Sep 2000 WO
WO 01009752 Feb 2001 WO
WO 01009792 Feb 2001 WO
WO 01046889 Jun 2001 WO
WO 01084281 Nov 2001 WO
WO 02029636 Apr 2002 WO
WO 2004031986 Apr 2004 WO
WO 2005010683 Feb 2005 WO
WO 2005033979 Apr 2005 WO
WO 2005098630 Oct 2005 WO
WO 2006050278 May 2006 WO
WO 2006069199 Jun 2006 WO
WO 2007084555 Jul 2007 WO
WO 2007103203 Sep 2007 WO
WO 2008021104 Feb 2008 WO
WO 2008042614 Apr 2008 WO
WO 2009064694 May 2009 WO
WO 2009064840 May 2009 WO
WO 2009102391 Aug 2009 WO
WO 2010001406 Jan 2010 WO
WO 2010062537 Jun 2010 WO
WO 2010077989 Jul 2010 WO
WO 2010150251 Dec 2010 WO
WO 2011005876 Jan 2011 WO
WO 2011109576 Sep 2011 WO
WO 2012097171 Jul 2012 WO
WO 2013015746 Jan 2013 WO
WO 2019089439 May 2019 WO
WO 2020051154 Mar 2020 WO
WO 2020072239 Apr 2020 WO
Non-Patent Literature Citations (187)
U.S. Appl. No. 16/797,697, System and Method for an Augmented Reality Experience Via an Artificial Intelligence Bot, filed Feb. 21, 2020.
U.S. Appl. No. 12/705,489, filed Feb. 12, 2010, Bargoli et al.
U.S. Appl. No. 12/705,511, filed Feb. 12, 2010, Bargoli et al.
Actuate, “Delivering Enterprise Information for Corporate Portals”, White Paper, 2004, pp. 1-7.
“Aggregate and Analyze Social Media Content: Gain Faster and Broader Insight to Market Sentiment,” SAP Partner, Mantis Technology Group, Apr. 2011, pp. 4.
Aktas et al., “Personalizing PageRank Based on Domain Profiles”, WEBKDD workshop: Webmining and Web Usage Analysis, Aug. 22, 2004, pp. 83-90.
Aktas et al., “Using Hyperlink Features to Personalize Web Search”, WEBKDD workshop: Webmining and Web Usage Analysis, Aug. 2004.
“Arizona Company Has Found Key in Stopping ID Theft,” PR Newswire, New York, Aug. 10, 2005 http://proquest.umi.com/pqdweb?did=880104711&sid=1&Fmt=3&clientId=19649&RQT=309&Vname=PQD.
ABC News Now:Money Matters, as broadcasted Nov. 15, 2005 with guest Todd Davis (CEO of Lifelock), pp. 6.
Anonymous, “Credit-Report Disputes Await Electronic Resolution,” Credit Card News, Chicago, Jan. 15, 1993, vol. 5, No. 19, p. 5.
Anonymous, “MBNA Offers Resolution of Credit Card Disputes,” Hempstead, Feb. 2002, vol. 68, No. 2, p. 47.
Anonymous, “Feedback”, Credit Management, ABI/INFORM Global, Sep. 2006, pp. 6.
Awoonor-Williams, Princess Josephine, Ph.D. “Gender and Credit: An Analysis of Women's Experience in the Credit Market”, ProQuest Dissertations and Theses, May 2004, pp. 148.
“Beware of ‘Who Viewed My Profile’ Apps on Facebook” Tech for Luddites, Mar. 15, 2010 printed Sep. 27, 2013 http://www.techforluddites.com/2010/03/beware-of-who-viewed-my-profile-apps-on-facebook.html.
Bielski, Lauren, “Will you Spend to Thwart ID Theft?” ABA Banking Journal, Apr. 2005, pp. 54, 56-57, 60.
BlueCava, “What We Do”, http://www.bluecava.com/what-we-do/, printed Nov. 5, 2012 in 3 pages.
Buxfer, http://www.buxfer.com/ printed Feb. 5, 2014 in 1 page.
Check, http://check.me/ printed Feb. 5, 2014 in 3 pages.
Chores & Allowances, “Do Kids Have Credit Reports?” Oct. 15, 2007, http://choresandallowances.blogspot.com/2007/10/do-kids-have-credit-reports.html, pp. 5.
Comlounge.net, “plonesocial.auth.rpx” http://web.archive.org/web/20101026041841/http://comlounge.net/rpx as captured Oct. 26, 2010 in 9 pages.
CreditKarma, http://www.creditkarma.com printed Feb. 8, 2013 in 2 pages.
CreditSesame, http://www.creditsesame.com/how-it-works/our-technology/ printed Feb. 5, 2013 in 2 pages.
Collins, Michael J., “Exploring the Design of Financial Counseling for Mortgage Borrowers in Default,” Journal of Family and Economic Issues, Springer Science+Business Media, Mar. 13, 2007, pp. 207-226.
Consumer Financial Protection Bureau (CFPB): Analysis of Difference between Consumer- and Creditor-Purchased Credit Scores, Sep. 2012, pp. 1-42.
“Consumers Gain Immediate and Full Access to Credit Score Used by Majority of U.S. Lenders”, PR Newswire, ProQuest Copy, Mar. 19, 2001, p. 1.
“CreditCheck Monitoring Services,” Dec. 11, 2000, pp. 1, lines 21-23.
“Credit Improvement”, CreditRepair.com, Mar. 10, 2010, http://web.archive.org/web/20100310134914/http://www.creditrepair.com/credit/, as archived Mar. 10, 2010 in 2 pages.
Credit Plus, Inc., “Score Wizard”, http://web.archive.org/web/20030806080310/www.creditplus.com/scorewizard.asp, as archived Aug. 6, 2003 in 1 page.
Cullen, Terri; “The Wall Street Journal Complete Identity Theft Guidebook:How to Protect Yourself from the Most Pervasive Crime in America”; Chapter 3, pp. 59-79; Jul. 10, 2007.
“D&B Corporate Family Linkage”, D&B Internet Access for U.S. Contract Customers, https://www.dnb.com/ecomp/help/linkage.htm as printed Dec. 17, 2009, pp. 1.
“Data Loss Prevention (DLP) Software”, http://www.symantec.com/data-loss-prevention/ printed Apr. 8, 2013 in 8 pages.
“Data Protection”, http://compliantprocessing.com/data-protection/ printed Apr. 8, 2013 in 4 pages.
Day, Jo and Kevin; “ID-ology: A Planner's Guide to Identity Theft”; Journal of Financial Planning:Tech Talk; pp. 36-38; Sep. 2004.
“Debt Settlement: Watch Video on how to Pay Your Debt Faster”, http://www.debtconsolidationcare.com/debt-settlement.html printed Jan. 9, 2013 in 6 pages.
Demby, Elayne, “Special Report: Letting Consumers Know the Score—and More”, Collections and Credit Risk, New York, Feb. 2003, vol. 8, Issue 2, p. 53, pp. 3.
“Disputes in Cyberspace 2001: Update of online dispute resolution for consumers in cross-border disputes”, Consumers International, Nov. 2001, pp. 45, http://web.archive.org/web/20160414183303/http://www.consumersinternational.org/media/304196/disputes%20in%20cyberspace%202001.%20update%20of%20online%20dispute%20resolution%20for%20consumers%20in%20cross-border%20disputes..pdf.
Elangovan, A.R., “Managerial Third-Party Dispute Intervention: A Prescriptive Model of Strategy Selection”, Academy of Management, Oct. 1, 1995, vol. 20, No. 4, pp. 800-830.
Elliehausen et al., The Impact of Credit Counseling on Subsequent Borrower Behavior, The Journal of Consumer Affairs, Summer 2007, vol. 41, No. 1, pp. 1-28.
Equifax, “Business Status Alerts: User Guide”, Jul. 2009, pp. 1-21.
Equifax Consumer Credit Report http://www.equifax.com/home/, as retrieved on Sep. 17, 2008.
Equifax; “Equifax Credit Watch”; https://www.econsumer.equifax.co.uk/consumer/uk/sitepage.ehtml, dated Jun. 27, 2007 on www.archive.org.
“Equifax: Debt Wise™ Credit Monitoring Service,” Product Review, http://www.mdmproofing.com/iym/reviews/equifax/debt-wise/, Jan. 2010, pp. 11.
Equifax; “Places”, http://web.archive.org/web/20111111113930/http://www.equifax.com/places as archived Nov. 11, 2011 in 1 page.
Equifax; “Places”, http://www.equifax.com/places/ as printed Nov. 16, 2015 in 1 page.
Equifax; “Welcome to Equifax Mobile”, http://www.equifax.com/mobile/ as printed Mar. 18, 2011 in 2 pages.
Ettorre, “Paul Kahn on Exceptional Marketing,” Management Review, vol. 83, No. 11, Nov. 1994, pp. 48-51.
Experian Consumer Credit Report http://www.experian.com/, as retrieved on Sep. 17, 2008.
Facebook, “Facebook helps you connect and share with the people in your life,” www.facebook.com printed Nov. 16, 2010 in 1 page.
FamilySecure.com, “Frequently Asked Questions”, http://www.familysecure.com/FAQ.aspx as archived Jul. 15, 2007 in 3 pages.
FamilySecure.com; “Identity Theft Protection for the Whole Family | FamilySecure.com” http://www.familysecure.com/, as retrieved on Nov. 5, 2009.
Fenner, Peter, “Mobile Address Management and Billing for Personal Communications”, 1st International Conference on Universal Personal Communications, 1992, ICUPC '92 Proceedings, pp. 253-257.
“Fictitious Business Name Records”, Westlaw Database Directory, http://directory.westlaw.com/scope/default.asp?db=FBN-ALL&RS-W...&VR=2.0 as printed Dec. 17, 2009, pp. 5.
Fisher, Greg, “Credit Score Distribution and Practical Range Statistics”, Feb. 23, 2010, The Credit Scoring Site, pp. 2.
Fitzpatrick, Alex, “Facebook Monitors Your Chats for Criminal Activity [Report],” Mashable, Jul. 12, 2012 printed Sep. 27, 2013 http://mashable.com/2012/07/12/facebook-scanning-chats/.
“Fraud Alert | Learn How”. Fight Identity Theft. http://www.fightidentitytheft.com/flag.html, accessed on Nov. 5, 2009.
“Fund Manager,” Portfolio Management Software website, indexed into Google on Jan. 7, 2005, Retrieved Oct. 24, 2014 http://www.fundmanagersoftware.com/, http://www.fundmanagersoftware.com/help/gph_tp_pieasset.html, http://www.fundmanagersoftware.com/demo2.html.
Gibbs, Adrienne; “Protecting Your Children from Identity Theft,” Nov. 25, 2008, http://www.creditcards.com/credit-card-news/identity-ID-theft-and-kids-children-1282.php, pp. 4.
“GLBA Compliance and FFIEC Compliance” http://www.trustwave.com/financial-services.php printed Apr. 8, 2013 in 1 page.
Gordon et al., “Identity Fraud: A Critical National and Global Threat,” LexisNexis, Oct. 28, 2003, pp. 1-48.
“Guide to Benefits, MasterCard® Cardholder Smart Shopper Benefits”, May 2005, pp. 10.
Herzberg, Amir, “Payments and Banking with Mobile Personal Devices,” Communications of the ACM, May 2003, vol. 46, No. 5, pp. 53-58.
Hoofnagle, Chris Jay, “Identity Theft: Making the Known Unknowns Known,” Harvard Journal of Law & Technology, Fall 2007, vol. 21, No. 1, pp. 98-122.
Hunt, Robert M.; Whither Consumer Credit Counseling? Business Review, Dec. 31, 2005, pp. 9-20.
ID Analytics, “ID Analytics® Consumer Notification Service” printed Apr. 16, 2013 in 2 pages.
ID Theft Assist, “Do You Know Where Your Child's Credit Is?”, Nov. 26, 2007, http://www.idtheftassist.com/pages/story14, pp. 3.
“ID Thieves These Days Want Your Number, Not Your Name”, The Columbus Dispatch, Columbus, Ohio, http://www.dispatch.com/content/stories/business/2014/08/03/id-thieves-these-days-want-your-number-not-your-name.html, Aug. 3, 2014 in 2 pages.
Identity Theft Resource Center; Fact Sheet 120 A—To Order a Credit Report for a Child; Fact Sheets, Victim Resources; Apr. 30, 2007.
“Identity Thieves Beware: Lifelock Introduces Nation's First Guaranteed Proactive Solution to Identity Theft Protection,” PR Newswire, New York, Jun. 13, 2005 http://proquest.umi.com/pqdweb?did=852869731&sid=1&Fmt=3&clientId=19649&RQT=309&Vname=PQD.
Ideon, Credit-Card Registry that Bellyflopped this Year, Is Drawing some Bottom-Fishers, The Wall Street Journal, Aug. 21, 1995, pp. C2.
Information Brokers of America, “Information Brokers of America Child Identity Theft Protection” http://web.archive.org/web/20080706135451/http://iboainfo.com/child-order.html as archived Jul. 6, 2008 in 1 page.
Information Brokers of America, “Safeguard Your Child's Credit”, http://web.archive.org/web/20071215210406/http://www.iboainfo.com/child-id-protect.html as archived Dec. 15, 2007 in 1 page.
Intelius, “People Search—Updated Daily, Accurate and Fast!” http://www.intelius.com/people-search.html ?=&gclid=CJqZIZP7paUCFYK5KgodbCUJJQ printed Nov. 16, 2010 in 1 page.
Iovation, Device Identification & Device Fingerprinting, http://www.iovation.com/risk-management/device-identification printed Nov. 5, 2012 in 6 pages.
Irby, LaToya, “How Will a Late Payment Hurt My Credit Score?” http://web.archive.org/web/20101024113603/http://credit.about.com/od/creditscorefaq/f/how-late-payment-affects-credit-score.htm, Oct. 24, 2010, pp. 1.
“Judging Credit: Consumers Need Better Finance Tools”, News Journal, Daytona Beach, FL, Dec. 28, 2002.
Kaushik, Nishant, “The Epic Hacking of Mat Honan and Our Identity Challenge,” Aug. 7, 2012, http://blog.talkingidentity.com/2012/08/the-epic-hacking-of-mat-honan-and-our-identity-challenge.html.
Khan, Mickey Alam, “Equifax Recognizes Changing Customer Behavior with Four-Pronged Mobile Strategy”, Mobile Marketer, http://web.archive.org/web/20151117005818/http://www.mobilemarketer.com/cms/news/strategy/9733.html, Apr. 19, 2011 in 10 pages.
Lan, Joe, “The Top Portfolio Management Software,” http://www.aaii.com/computerizedinvesting/article/the-top-portfolio-management-software, Includes Discussion thread, Fourth Quarter 2011, pp. 17.
Lang et al., “A Collaborative Web-Based Help-System”, Proceedings of the 2nd international conference on web intelligence, mining and semantics, Jun. 13-15, 2012, pp. 5.
Lang et al., “An Avatar-Based Help System for Web-Portals”, International Conference on Human-Computer Interaction, Springer, Berlin, Heidelberg, 2011, pp. 10.
Lanubile, et al., “Evaluating Empirical Models for the Detection of High-Risk Components: Some Lessons Learned”, 20th Annual Software Engineering Workshop, Nov. 29-30, 1995, Greenbelt, Maryland, pp. 1-6.
Lauwers et al., “Five Hundred Years of Bookkeeping: A Portrait of Luca Pacioli”, Tijdschrift voor Economie en Management, 1994, vol. 39. No. 3, pp. 289-304.
Lee, W.A.; “Experian, on Deal Hunt, Nets Identity Theft Insurer”, American Banker: The Financial Services Daily, Jun. 4, 2003, New York, NY, 1 page.
Leskovec, Jure, “Social Media Analytics: Tracking, Modeling and Predicting the Flow of Information through Networks”, WWW 2011-Tutorial, Mar. 28-Apr. 1, 2011, Hyderabad, India, pp. 277-278.
Letter to Donald A. Robert from Carolyn B. Maloney, dated Oct. 31, 2007, pp. 2.
Letter to Donald A. Robert from Senator Charles E. Schumer, dated Oct. 11, 2007, pp. 2.
Letter to Harry C. Gambill from Carolyn B. Maloney, dated Oct. 31, 2007, pp. 2.
Letter to Harry C. Gambill from Senator Charles E. Schumer, dated Oct. 11, 2007, pp. 2.
Letter to Richard F. Smith from Carolyn B. Maloney, dated Oct. 31, 2007, pp. 2.
Letter to Richard F. Smith from Senator Charles E. Schumer, dated Oct. 11, 2007, pp. 2.
Li et al., “Automatic Verbal Information Verification for User Authentication”, IEEE Transactions on Speech and Audio Processing, vol. 8, No. 5, Sep. 2000, pp. 585-596.
LifeLock, http://web.archive.org/web/20110724011010/http://www.lifelock.com/? as archived Jul. 24, 2011 in 1 page.
LifeLock, “How LifeLock Works,” http://www.lifelock.com/lifelock-for-people printed Mar. 14, 2008 in 1 page.
LifeLock, “LifeLock Launches First ID Theft Prevention Program for the Protection of Children,” Press Release, Oct. 14, 2005, http://www.lifelock.com/about-us/press-room/2005-press-releases/lifelock-protection-for-children.
LifeLock; “How Can LifeLock Protect My Kids and Family?” http://www.lifelock.com/lifelock-for-people/how-we-do-it/how-can-lifelock-protect-my-kids-and-family printed Mar. 14, 2008 in 1 page.
LifeLock, “Personal Identity Theft Protection & Identity Theft Products,” http://www.lifelock.com/lifelock-for-people, accessed Nov. 5, 2007.
LifeLock, Various Pages, www.lifelock.com/, Jan. 9, 2007, pp. 49.
Littwin, Angela, “Beyond Usury: A Study of Credit-Card Use and Preference Among Low-Income Consumers”, Texas Law Review, vol. 86, No. 3, pp. 451-506; Feb. 2008.
Lobo, Jude, “MySAP.com Enterprise Portal Cookbook,” SAP Technical Delivery, Feb. 2002, vol. 1, pp. 1-13.
Lund, Graham, “Credit Bureau Data: Maximizing the Benefits,” Credit Management, May 2004, ProQuest Central, pp. 44-45.
Magid, Lawrence, J., Business Tools: When Selecting an ASP Ensure Data Mobility, Los Angeles Times, Los Angeles, CA, Feb. 26, 2001, vol. C, Issue 4, pp. 3.
“Managing Debt?” Federal Trade Commission: Consumer Information, http://www.consumer.ftc.gov/articles/0158-managing-debt, printed Mar. 22, 2013 in 4 pages.
Manilla, http://www.manilla.com/how-it-works/ printed Feb. 5, 2014 in 1 page.
Mannan et al., “Mercury: Recovering Forgotten Passwords Using Personal Devices*”, Dec. 17, 2011, Pre-Proceedings of Financial Cryptography and Data Security 2011, pp. 1-16.
Meyers et al., “Using Your Social Networking Accounts To Log Into NPR.org,” NPR.org, Jun. 24, 2010, http://web.archive.org/web/20100627034054/http://www.npr.org/blogs/inside/2010/06/24/128079309/using-your-social-networking-accounts-to-log-into-npr-org in 3 pages.
Micarelli et al., “Personalized Search on the World Wide Web,” The Adaptive Web, LNCS 4321, 2007, pp. 195-230.
Microsoft, “Expand the Reach of Your Business,” Microsoft Business Solutions, 2004, in 16 pages.
Mint.com, http://www.mint.com/ printed Sep. 18, 2008 in 2 pages.
Mint.com, http://www.mint.com/how-it-works/ printed Feb. 5, 2013 in 2 pages.
MS Money Software by Microsoft http://www.microsoft.com/Money/default.mspx as retrieved on Sep. 17, 2008.
Mvelopes, http://www.mvelopes.com/ printed Feb. 5, 2014 in 2 pages.
My Call Credit http://www.mycallcredit.com/products.asp?product=ALR dated Dec. 10, 2005 on www.archive.org.
My Call Credit http://www.mycallcredit.com/rewrite.asp?display=faq dated Dec. 10, 2005 on www.archive.org.
My ID Alerts, “Why ID Alerts” http://www.myidalerts.com/why-id-alerts.jsps printed Apr. 3, 2012 in 2 pages.
My ID Alerts, “How it Works” http://www.myidalerts.com/how-it-works.jsps printed Apr. 3, 2012 in 3 pages.
MyRatePlan.com, “Cell Phone Buying Guide”, http://web.archive.org/web/20061116103256/http://myrateplan.com/cell_phone_buying_guide/family_plans/ , as archived Nov. 16, 2006 in 2 pages.
MyReceipts, http://www.myreceipts.com/, printed Oct. 16, 2012 in 1 page.
MyReceipts—How it Works, http://www.myreceipts.com/howItWorks.do, printed Oct. 16, 2012 in 1 page.
“Name Availability Records”, Westlaw Database Directory, http://directory.westlaw.com/scope/default.asp?db=NA-ALL&RS=W...&VR=2.0 as printed Dec. 17, 2009, pp. 5.
National Alert Registry Launches RegisteredOffendersList.org to Provide Information on Registered Sex Offenders, May 16, 2005, pp. 2, http://www.prweb.com/printer/240437.htm accessed on Oct. 18, 2011.
National Alert Registry Offers Free Child Safety “Safe From Harm” DVD and Child Identification Kit, Oct. 24, 2006. pp. 2, http://www.prleap.com/pr/53170 accessed on Oct. 18, 2011.
National Alert Registry website titled, “Does a sexual offender live in your neighborhood”, Oct. 22, 2006, pp. 2, http://web.archive.org/web/20061022204835/http://www.nationallertregistry.com/ accessed on Oct. 13, 2011.
“New for Investors: Asset Allocation, Seasoned Returns and More,” Prosper, http://blog.prosper.com/2011/10/27/new-for-investors-asset-allocation-seasoned-returns-and-more/, Oct. 27, 2011, pp. 4.
Next Card: About US, http://web.cba.neu.edu/˜awatson/NextCardCase/NextCardAboutUs.htm printed Oct. 23, 2009 in 10 pages.
Ogg, Erica, “Apple Cracks Down on UDID Use”, http://gigaom.com/apple/apple-cracks-down-on-udid-use/ printed Nov. 5, 2012 in 5 Pages.
Oracle: Recommendations for Leveraging the Critical Patch Update and Maintaining a Proper Security Posture, Nov. 2010, An Oracle White Paper, pp. 1-30.
Organizing Maniac's Blog—Online Receipts Provided by MyQuickReceipts.com, http://organizingmaniacs.wordpress.com/2011/01/12/online-receipts-provided-by-myquickreceipts.com/ dated Jan. 12, 2011 printed Oct. 16, 2012 in 3 pages.
Paustian, Chuck, “Every Cardholder a King Customers get the Full Treatment at Issuers' Web Sites,” Card Marketing, New York, Mar. 2001, vol. 5, No. 3, pp. 4.
Peltier, Jon, “Conditional Formatting of Excel Charts”, Peltier Tech Blog, as posted Feb. 13, 2012, http://peltiertech.com/conditional-formatting-of-excel-charts/, pp. 1-5.
People Finders, http://www.peoplefinders.com/?CMP=Google&utm_source=google&utm_medium=cpc printed Nov. 16, 2010 in 1 page.
People Lookup, “Your Source for Locating Anyone!” www.peoplelookup.com/people-search.html printed Nov. 16, 2010 in 1 page.
People Search, “The Leading Premium People Search Site on the Web,” http://www.peoplesearch.com printed Nov. 16, 2010 in 2 pages.
PersonalCapital.com, http://www.personalcapital.com/how-it-works printed Feb. 5, 2014 in 5 pages.
Phinisee, Tamarind, “Banks, FTC Step Up Efforts to Address Identity Theft”, San Antonio Business Journal; San Antonio, Jul. 5, 2002, vol. 16, No. 24, pp. 5.
Pinola, Melanie, “How Can I Protect Against Social Engineering Hacks?” Aug. 9, 2012, http://lifehacker.com/5933296/how-can-i-protect-against-hackers-who-use-sneaky-social-engineering- techniques-to-get-into-my-accounts.
Planet Receipt—Home, http://www.planetreceipt.com/home printed Oct. 16, 2012 in 1 page.
Planet Receipt—Solutions & Features, http://www.planetreceipt.com/solutions-features printed Oct. 16, 2012 in 2 pages.
Planwise, http://planwise.com printed Feb. 8, 2013 in 5 pages.
Press Release—“Helping Families Protect Against Identity Theft—Experian Announces FamilySecure.com; Parents and guardians are alerted for signs of potential identity theft for them and their children; product features an industry-leading $2 million guarantee”; PR Newswire; Irvine, CA; Oct. 1, 2007.
PrivacyGuard, http://web.archive.org/web/20110728114049/http://www.privacyguard.com/ as archived Jul. 28, 2011 in 1 page.
Privacy Rights Clearinghouse, “Identity Theft: What to do if it Happens to You,” http://web.archive.org/web/19990218180542/http://privacyrights.org/fs/fs17a.htm printed Feb. 18, 1999.
“Qualifying For Debt Settlement”, http://www.certifieddebt.com/debt/settlement-qualifications.shtml printed Jan. 9, 2013 in 2 pages.
Quantix Software, “Investment Account Manager,” available at https://www.youtube.com/watch?v=1UwNTEER1Kk, as published Mar. 21, 2012.
Quicken Online by Intuit http://www.quicken.intuit.com/, as retrieved on Sep. 17, 2008.
“Quicken Support”, http://web.archive.org/web/20071231040130/http://web.intuit.com/support/quicken/docs/d_qif.html as archived Dec. 31, 2007 in 6 pages.
Ramaswamy, Vinita M., Identity-Theft Toolkit, The CPA Journal, Oct. 1, 2006, vol. 76, Issue 10, pp. 66-70.
Rawe, Julie; “Identity Thieves”, Time Bonus Section, Inside Business, Feb. 2002, pp. 2.
Repici et al., “The Comma Separated Value (CSV) File Format”, http://creativyst.com/Doc/Articles/CSV/CSV01.htm, Creativyst, Inc., 2002, pp. 10.
Reppler.com, “Learn More: Basic Information about how TrustedID Reppler Works For You,” www.reppler.com/learn/ printed Oct. 24, 2012 in 2 pages.
“Resolve Debt for Less: With Help from Freedom Financial” http://www.debtsettlementusa.com/ printed Jan. 9, 2013 in 6 pages.
Romig, Shane, “The Truth About Credit Repair”, Credit.com, May 5, 2010, http://web.archive.org/web/20100505055526/http://www.credit.com/credit_information/credit_help/The-Truth-About-Credit-Repair.jsp printed Mar. 22, 2013 in 4 pages.
Roth, Andrew, “CheckFree to Introduce E-Mail Billing Serving,” American Banker, New York, Mar. 13, 2001, vol. 166, No. 49, pp. 3.
SAS, “SAS® Information Delivery Portal”, Fact Sheet, 2008, in 4 pages.
Schmidt et al., “A Set of Multi-Touch Graph Interaction Techniques”, ITS '10, Nov. 7-10, 2010, Saarbrucken, Germany, pp. 1-4.
Scholastic Inc.:Parent's Request for Information http://web.archive.org/web/20070210091055/http://www.scholastic.com/inforequest/index.htm as archived Feb. 10, 2007 in 1 page.
Scholastic Inc.:Privacy Policy http://web.archive.org/web/20070127214753/http://www.scholastic.com/privacy.htm as archived Jan. 27, 2007 in 3 pages.
Screenshot for Investment Account Manager v.2.8.3, published at http://www.aaii.com/objects/get/1642.gif by at least Aug. 30, 2011 in 1 page.
Sealey, Geraldine, “Child ID Theft Can Go Unnoticed for Years”, http://abcnews.go.com/US/story?id=90257, Sep. 12, 2003 in 9 pages.
“Settling Your Debts—Part 1 in Our Debt Settlement Series”, http://www.creditinfocenter.com/debt/settle_debts.shtml printed Jan. 9, 2013 in 6 pages.
Shin, Laura, “See An Error On Your Credit Report? Credit Karma Now Makes It Easy To Dispute”, Nov. 12, 2015, http://www.forbes.com/sites/laurashin/2015/11/12/see-an-error-on-your-credit-report-credit-karma-now-makes-it-easy-to-dispute/, pp. 4.
ShoeBoxed, https://www.shoeboxed.com/sbx-home/ printed Oct. 16, 2012 in 4 pages.
Simpson, Glyn, “Microsoft (MS) Money MSMoney FAQ, Help and Information Pages”, pp. 2, Copyright © Glyn Simpson 1998-2007, http://web.archive.org/web/20071018075531/http://money.mvps.org/faq/article/196.aspx.
Singletary, Michelle, “The Littlest Victims of ID Theft”, The Washington Post, The Color Of Money, Oct. 4, 2007.
Solapurkar, Prajakta, “Building Secure Healthcare Services Using OAuth 2.0 and JSON Web Token in IOT Cloud Scenario”, IEEE, 2nd International Conference on Contemporary Computing and Informatics (ic3i), 2016, pp. 99-104.
Srinivasa et al., “Augmented Reality Adaptive Web Content”, 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC), pp. 4.
Stauffer et al., “Using HTML 3.2,” Second Edition, 1996, Que Publishing, pp. 192-193.
Tajik, S., “Conditional Plotting, Changing Color of Line Based on Value”, MathWorks®, MATLAB Answers™, Question Posted Feb. 10, 2011 to https://www.mathworks.com/matlabcentral/answers/1156-conditional-plotting-changing-color-of-line-based-on-value?requestedDomain=www.mathworks.com, pp. 8.
TheMorningCall.Com, “Cheap Ways to Foil Identity Theft,” www.mcall.com/business/columnists/all-karp.5920748jul01,0 . . . , published Jul. 1, 2007.
Thompson, Herbert H., “How I Stole Someone's Identity”, https://www.scientificamerican.com/article/anatomy-of-a-social-hack/#, Aug. 18, 2008, pp. 5.
Todorova, Aleksandra, “Protecting Your Child's Identity”, Smart Money, Published Aug. 2, 2007, pp. 1-5.
“TransUnion—Child Identity Theft Inquiry”, TransUnion, http://www.transunion.com/corporate/personal/fraudIdentityTheft/fraudPrevention/childIDInquiry.page as printed Nov. 5, 2009 in 4 pages.
TransUnion Consumer Credit Report http://www.transunion.com/, as retrieved on Sep. 17, 2008.
Truston, “Checking if your Child is an ID Theft Victim can be Stressful,” as posted by Michelle Pastor on Jan. 22, 2007 at http://www.mytruston.com/blog/credit/checking_if_your_child_is_an_id_theft_vi.html.
US Legal, Description, http://www.uslegalforms.com/US/US-00708-LTR.htm printed Sep. 4, 2007 in 2 pages.
“Use of Alternative Data to Enhance Credit Reporting to Enable Access to Digital Financial Services by Individuals and SMEs Operating in the Informal Economy”, Guidance Note, International Committee on Credit Reporting (ICCR), Jun. 28, 2018, pp. 35.
Vamosi, Robert, “How to Handle ID Fraud's Youngest Victims,” Nov. 21, 2008, http://news.cnet.com/8301-10789_3-10105303-57.html.
Waggoner, Darren J., “Having a Global Identity Crisis,” Collections & Credit Risk, Aug. 2001, vol. 6, No. 8, pp. 6.
Wesabe.com http://www.wesabe.com/, as retrieved on Sep. 17, 2008.
Yahoo! Search, “People Search,” http://people.yahoo.com printed Nov. 16, 2010 in 1 page.
YODLEE | Money Center, https://yodleemoneycenter.com/ printed Feb. 5, 2014 in 2 pages.
You Need A Budget, http://www.youneedabudget.com/features printed Feb. 5, 2014 in 3 pages.
“12 Mag: How Debt Settlement is Helpful in Improving the Credit Score”, Weblog post. Newstex Trade & Industry Blogs, Newstex, Oct. 8, 2017, pp. 2.
Caldeira et al., “Characterizing and Preventing Chargebacks in Next Generation Web Payments Services”, 2012 Fourth International Conference on Computational Aspects of Social Networks (CASON), 2012 IEEE, pp. 333-338.
Gramazio, Connor C., “Colorgorical: Creating Discriminable and Preferable Color Palettes for Information Visualization”, IEEE Transactions on Visualization and Computer Graphics, Jan. 2017, vol. 23, No. 1, pp. 521-530.
Koka et al., “Online Review Analysis by Visual Feature Selection”, 2017 IEEE 15th Intl Conf on Dependable, Autonomic and Secure Computing, 15th Intl Conf on Pervasive Intelligence and Computing, 3rd Intl Conf on Big Data Intelligence and Computing and Cyber Science and Technology Congress (DASC/PiCom/DataCom/CyberSciTech), 2017, pp. 1084-1091.
Peng et al., “Factors Affecting Online Purchase Behavior: Implications to Improve the Existing Credit Rating System”, 2009 International Conference on Management and Service Science, 2009 IEEE, pp. 1-4.
Shibata et al., “3D Retrieval System Based on Cognitive Level—Human Interface for 3D Building Database”, Proceedings of the 2004 International Conference on Cyberworlds (CW'04), 2004, pp. 6.
Provisional Applications (2)
Number Date Country
62832159 Apr 2019 US
62809469 Feb 2019 US
Continuations (1)
Number Date Country
Parent 16797697 Feb 2020 US
Child 17644540 US