Soliciting information based on a computer user's context

Information

  • Patent Grant
  • Patent Number
    8,103,665
  • Date Filed
    Monday, May 11, 2009
  • Date Issued
    Tuesday, January 24, 2012
Abstract
A user search request is received and context information for the user is identified. The user search request and the context information are then combined to generate search criteria corresponding to the user search request, providing for information solicitation based on a computer user's context.
Description
TECHNICAL FIELD

This disclosure relates generally to computer-assisted solicitation of desired information, and more particularly to soliciting information based on a computer user's context.


BACKGROUND

As computers become increasingly powerful and ubiquitous, users increasingly employ their computers for a broad variety of tasks. For example, in addition to traditional activities such as running word processing and database applications, users increasingly rely on their computers as an integral part of their daily lives. Programs to schedule activities, generate reminders, and provide rapid communication capabilities are becoming increasingly popular. Moreover, computers are increasingly present during virtually all of a person's daily activities. For example, hand-held computer organizers (e.g., PDAs) are more common, and communication devices such as portable phones are increasingly incorporating computer capabilities. Thus, users may be presented with output information from one or more computers at any time.


Accompanying the increasing use and portability of computers is an increasing desire on the part of users to obtain information through wireless and other communication media. When consumers become aware of a situation in which they perceive a need that might be fulfilled by goods or services that may or may not be available, they are currently limited in how they can gain product information. Often, when the need arises, the consumer is not in a convenient circumstance to review printed materials, ask others, or wait for uncontrolled media like radio or television to present an advertisement or review. This inconvenience may lead the user to decide that if significant effort or time is required to learn about potential product claims, availability, or cost, then the offered goods and services are not worth pursuing.


The advent of computers, especially when coupled to the data-rich environment of the Internet, expands consumers' ability to gain product information without regard for geographic proximity or time of day. However, current product search techniques rely on either what the user has directly specified (e.g., in a search text box) or past behavior (e.g., Internet merchants tracking past purchases). And, even though many product providers collect and sell individual and aggregate consumer profiles, and so do sometimes provide assistance to consumers as they consider offered products, there is currently no general mechanism by which detailed user characterizations can facilitate the location of a specific desired product or information.


Some Internet-related products, such as the Microsoft® Internet Explorer web browser, can record the information that a user enters in form fields. When the user begins filling out a new form, those values can automatically be entered or suggested, easing the form completion. Despite this easing, problems still exist with such products. One problem is that the user is limited to data already entered in other forms. Another problem is that such products require presentation to the user of form fields that are already filled out, which can be inconvenient for the user and detract from the user-friendliness of the solution (e.g., it can be inconvenient for the user to see his or her name for every form).


Accordingly, there is a need for improved techniques for soliciting information.


SUMMARY

Soliciting information based on a computer user's context is described herein.


According to one aspect, a user search request is received and context information for the user is identified. The user search request and the context information are then combined to generate search criteria corresponding to the user search request. The context information includes, for example, information regarding one or more of: the user's physical environment, the user's mental environment, the user's computing environment, and the user's data environment.


According to another aspect, a product interest characterization (PIC) is generated that includes multiple fields, some fields being populated with user-defined data inputs and other fields being populated with automatically-generated user context information. The generated PIC is then communicated to one or more information sources where the PIC is compared with information at these sources to identify content that matches the parameters in the various fields of the PIC. The matching content is then presented to the user.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows an exemplary system 100 including multiple information sources and multiple clients.



FIG. 2 illustrates an exemplary implementation of a client as a body-mounted wearable computer worn by a user.



FIG. 3 illustrates an exemplary information solicitation environment including multiple computing resources and searchable information sources.



FIG. 4 is a flowchart illustrating an exemplary process for soliciting information based at least in part on the user's context.



FIG. 5 illustrates an exemplary product interest characterization manager.



FIG. 6 illustrates product interest characterization generation and storage in additional detail.



FIG. 7 is a flow diagram illustrating an exemplary process followed by a product interest characterization manager.



FIG. 8 illustrates an exemplary information solicitation system employing a product interest characterization broker.



FIG. 9 illustrates another exemplary information solicitation system.





DETAILED DESCRIPTION

This disclosure describes soliciting information for a user based at least in part on the user's context. Search parameters or other data associated with a user's search request are combined with context information for the user to generate search criteria. The search criteria can then be compared with data (stored locally and/or remotely) to identify information that matches the search criteria. The user is able to solicit any of a wide variety of information, such as advertisements (e.g., of products or services), reference materials (e.g., electronic books or articles), as well as actual goods or products themselves (e.g., in electronic form (such as audio content that can be downloaded and played immediately), or for more traditional physical delivery (such as ordering a coat and having it shipped via an overnight shipping agent)).



FIG. 1 shows a system 100 in which multiple information sources 102(1), 102(2), . . . , 102(N) transmit information over one or more networks 104 to multiple clients 106(1), 106(2), . . . , 106(M). The information is typically solicited by the clients, and hence is said to be “pulled” from the information sources 102 to the clients 106.


Information sources 102 may be implemented in a number of ways, such as a host server at a Website, a dedicated search engine (e.g., that stores information for searching but not the content for search hits), a voice-driven telephony system, and so forth. The content can be organized and made available to clients 106 in any of a wide variety of conventional manners. As one exemplary implementation, an information source, as represented by source 102(1), may include a content store 110 to store the information and a content server 112 to serve the content to clients 106. The information communicated from the information sources may be in any data type (e.g., text, graphics, audio, video, etc.) and contain essentially any type of subject matter. As one particular example, the information may be in the form of solicited advertisements or product/service descriptions pulled to clients 106 from advertisers.


Network 104 is representative of many different network types, including public networks (e.g., the Internet) and/or proprietary networks. The network may be implemented using wireless technologies (e.g., RF, microwave, cellular, etc.), wire-based technologies (e.g., cable, fiber optic, wire, etc.), or a combination of them. Any one or more of many diverse protocols and formats may be used to package data and transmit it from a source 102 to a client 106.


Clients 106 may be implemented in a variety of ways, including as computers, personal digital assistants (PDAs), communication devices, and the like. The clients are equipped with conventional mechanisms to receive the information from network 104, such as ports, network cards, receivers, modems, and so on.


Each client, as represented by client 106(1), is equipped with a Condition-Dependent Output Supplier (CDOS) system 120 that monitors the user and the user's environment. As the user moves about in various environments, the CDOS system receives various input information including explicit user input, sensed user information, and sensed environment information. The CDOS system maintains and updates a model of the user condition. One or more sensors 122 provide data to the CDOS system 120 pertaining to the user's environment.
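
By way of illustration only (the following sketch is not part of the original disclosure, and every name in it is invented), the model maintenance described above can be pictured as a store of per-attribute readings that is refreshed as sensor data arrives:

```python
# Illustrative sketch only: a user-condition model refreshed from sensor input.
# SensorReading and UserConditionModel are invented names, not from the patent.
from dataclasses import dataclass, field
from typing import Any


@dataclass
class SensorReading:
    attribute: str    # e.g., "location", "heart_rate", "ambient_noise"
    value: Any
    timestamp: float


@dataclass
class UserConditionModel:
    attributes: dict[str, SensorReading] = field(default_factory=dict)

    def update(self, reading: SensorReading) -> None:
        # Keep only the most recent reading for each attribute.
        current = self.attributes.get(reading.attribute)
        if current is None or reading.timestamp >= current.timestamp:
            self.attributes[reading.attribute] = reading

    def get(self, attribute: str, default: Any = None) -> Any:
        reading = self.attributes.get(attribute)
        return reading.value if reading is not None else default
```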



FIG. 2 illustrates one suitable implementation of client 106 as a body-mounted wearable computer worn by a user 150. The computer 106 includes a variety of body-worn input devices, such as a microphone 152, a hand-held flat panel display 154 with character recognition capabilities, and various other user input devices 156. Examples of other types of input devices with which a user can supply information to the computer 106 include speech recognition devices, traditional qwerty keyboards, chording keyboards, half qwerty keyboards, dual forearm keyboards, chest mounted keyboards, handwriting recognition and digital ink devices, a mouse, a track pad, a digital stylus, a finger or glove device to capture user movement, pupil tracking devices, a gyropoint, a trackball, a voice grid device, digital cameras (still and motion), and so forth.


The computer 106 also has a variety of body-worn output devices, including the hand-held flat panel display 154, an earpiece speaker 158, and a head-mounted display in the form of an eyeglass-mounted display 159. Other output devices 160 may also be incorporated into the computer 106, such as a tactile display, an olfactory output device, and the like.


The computer 106 may also be equipped with one or more body-worn user sensor devices 162. For example, a variety of sensors can provide information about the current physiological state of the user and current user activities. Examples of such sensors include thermometers, sphygmometers, heart rate sensors, shiver response sensors, skin galvanometry sensors, eyelid blink sensors, pupil dilation detection sensors, EEG and EKG sensors, sensors to detect brow furrowing, blood sugar monitors, etc. In addition, sensors elsewhere in the near environment can provide information about the user, such as motion detector sensors (e.g., whether the user is present and is moving), badge readers, still and video cameras (including low light, infra-red, and x-ray), remote microphones, etc. These sensors can be either passive (i.e., detecting information generated external to the sensor, such as a heart beat) or active (i.e., generating a signal to obtain information, such as sonar or x-rays).


The computer 106 may also be equipped with various environment sensor devices 164 that sense conditions of the environment surrounding the user. For example, devices such as microphones or motion sensors may be able to detect whether there are other people near the user and whether the user is interacting with those people. Sensors can also detect environmental conditions that may affect the user, such as air thermometers or Geiger counters. Sensors, either body-mounted or remote, can also provide information related to a wide variety of user and environment factors including location, orientation, speed, direction, distance, and proximity to other locations (e.g., GPS and differential GPS devices, orientation tracking devices, gyroscopes, altimeters, accelerometers, anemometers, pedometers, compasses, laser or optical range finders, depth gauges, sonar, etc.). Identity and informational sensors (e.g., bar code readers, biometric scanners, laser scanners, OCR, badge readers, etc.) and remote sensors (e.g., home or car alarm systems, remote camera, national weather service web page, a baby monitor, traffic sensors, etc.) can also provide relevant environment information.


The computer 106 further includes a central computing unit 166 that may or may not be worn on the user. The various inputs, outputs, and sensors are connected to the central computing unit 166 via one or more data communications interfaces 168 that may be implemented using wire-based technologies (e.g., wires, coax, fiber optic, etc.) or wireless technologies (e.g., RF, etc.).


The central computing unit 166 includes a central processing unit (CPU) 170, a memory 172, and a storage device 174. The memory 172 may be implemented using both volatile and non-volatile memory, such as RAM, ROM, Flash, EEPROM, disk and so forth. The storage device 174 is typically implemented using non-volatile permanent memory, such as ROM, EEPROM, diskette, memory cards, and the like.


One or more application programs 176 are stored in memory 172 and executed by the CPU 170. The application programs 176 generate data that may be output to the user via one or more of the output devices 154, 158, 159, and 160.


In the illustrated implementation, the CDOS system 120 is shown stored in memory 172 and executes on the processing unit 170. The CDOS system 120 monitors the user and the user's environment, and creates and maintains an updated model of the current condition of the user. As the user moves about in various environments, the CDOS system receives various input information including explicit user input, sensed user information, and sensed environment information. The CDOS system updates the current model of the user condition, and presents output information to the user via appropriate output devices.


A more detailed explanation of the CDOS system 120 may be found in a co-pending U.S. patent application Ser. No. 09/216,193, entitled “Method and System For Controlling Presentation of Information To a User Based On The User's Condition”, which was filed Dec. 18, 1998, and is commonly assigned to Tangis Corporation. The reader might also be interested in a more detailed discussion of context attributes (or condition variables) discussed in U.S. patent application Ser. No. 09/724,902, entitled “Dynamically Exchanging Computer User's Context”, which was filed Nov. 28, 2000, and is commonly assigned to Tangis Corporation. These applications are hereby incorporated by reference.


An optional information solicitation manager 178 is also shown stored in memory 172 and executes on processing unit 170. Information solicitation manager 178 utilizes data from CDOS system 120 to generate search criteria based on the user's current environment. Alternatively, information solicitation manager 178 and CDOS system 120 may be implemented at a remote location (e.g., not in close physical proximity to the user 150).


The body-mounted computer 106 may be connected to one or more networks through wired or wireless communication technologies (e.g., wireless RF, a cellular phone or modem, infrared, physical cable, a docking station, etc.). For example, the body-mounted computer of a user could make use of output devices in a smart room, such as a television and stereo when the user is at home, if the body-mounted computer can transmit information to those devices via a wireless medium or if a cabled or docking mechanism is available to transmit the information. Alternately, kiosks or other information devices can be installed at various locations (e.g., in airports or at tourist spots) to transmit relevant (and typically, unsolicited) information to body-mounted computers within the range of the information device.



FIG. 3 illustrates an exemplary information solicitation environment 200 including multiple computing resources 202 and 204, as well as searchable information sources or providers 102. Solicitation environment 200 allows a user to solicit information from information sources 102. The user is able to input a search request via one or more local input devices 208 (e.g., devices 152 or 156 of FIG. 2). Information that is found based at least in part on the input search request is then presented to the user via one or more output devices 210 (e.g., devices 154, 158, 159, or 160). The input and output devices 208 and 210 are local resources 202, being local (in close physical proximity) to the user. Other resources, discussed in more detail below, can be implemented local to the user and/or remote from the user.


User search requests are input to an information solicitation management component, which, in the illustrated example, is a product interest characterization (PIC) manager 212. PIC manager 212 receives the user request and combines the request with the user's current context from context awareness model 214 in order to generate search criteria. The generated search criteria are then communicated to the locally and/or remotely situated information source 102. The search criteria are compared to the information at source 102 (e.g., an Internet search engine) to determine what information (if any) at source 102 matches the search criteria, and optionally how well that information matches the search criteria. The results of the comparison are then returned to PIC manager 212, which returns the results as appropriate to output device(s) 210 for presentation to the user. The results returned to PIC manager 212 may be sufficient to present to the user, or alternatively may only identify content that needs to be accessed by PIC manager 212 and presented to the user. For example, the results returned to PIC manager 212 may be a set of uniform resource locators (URLs). Those URLs may be presented to the user, or alternatively PIC manager 212 may access the locations identified by those URLs and return the content at those locations for presentation to the user.
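
As a rough illustration of this combination step (a sketch only; the function name, field names, and example values are assumptions, not the patent's format):

```python
# Sketch: merging an explicit user request with context attributes to form
# search criteria. Field names and the dict format are assumptions.
def generate_search_criteria(user_request: dict, context: dict) -> dict:
    return {
        "keywords": user_request.get("keywords", []),
        # Context attributes narrow the search (e.g., location, time, activity).
        "context_attributes": {
            key: context[key]
            for key in ("location", "time", "activity")
            if key in context
        },
    }


criteria = generate_search_criteria(
    {"keywords": ["rain jacket", "sale"]},
    {"location": "downtown mall", "activity": "walking"},
)
# The criteria would be sent to an information source; results could come back
# as URLs, which the PIC manager may either present or fetch on the user's behalf.
```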


Context awareness model 214 maintains context information for the user, allowing a characterization module 216 to attempt to characterize the user's context (e.g., his or her current context at the time a user search request is made by the user and/or received by PIC manager 212) and communicate this context information to PIC manager 212. Context awareness model 214 is built based on input from various modules 218, 220, 222, and 224 that capture and pass information based on inputs from one or more sensors 226 (e.g., environment sensors 164, user sensors 162, etc. of FIG. 2). Sensors 226 monitor the environment parameters and provide data to the modules 218-224, and can be local to the user and/or remote from the user. Sensors 226 can be any transducer or software module that provides data used (or potentially used) in the context awareness model 214.


In the illustrated implementation, the context awareness model 214 gathers information on (1) the user's physical environment from module 218, (2) the user's mental environment from module 220, (3) the user's computing environment from module 222, and (4) the user's data environment from module 224.


Physical environment module 218 generates information pertaining to the user's present location (e.g., geographical, relative to a structure such as a building, etc.), the current time, and surrounding objects that may be used as a basis for searching. As an example of this latter situation, a user with a wearable computer may be traversing a mall having numerous stores therein. While in this location, the user may request product sale information, and only advertisements of products sold in stores in the mall and currently on sale are presented to the user.


The mental environment module 220 generates information pertaining to the user's likely intentions, preferences, and current attention. For instance, the mental environment module 220 may use data from a pupil tracking sensor or head orientation sensor to identify a direction or object on which the user is focused. If the user appears to be focused on administrative items presented on the heads-up display, then the mental environment module 220 might determine that it is safe to present search results.


The computing environment module 222 generates information pertaining to the computing capabilities of the client, including available I/O devices, connectivity, processing capabilities, available storage space, and so on. The data environment module 224 generates information pertaining to the data and software resources on the client computer, including the communication resources, applications, operating system, and data.
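
Treating the four environments as one illustrative data structure (names here are invented; the patent does not prescribe a format) makes the division of labor among modules 218-224 concrete:

```python
# Illustrative grouping of the four environments gathered by modules 218-224.
from dataclasses import dataclass


@dataclass
class ContextAwarenessModel:
    physical: dict    # location, current time, surrounding objects (module 218)
    mental: dict      # likely intentions, preferences, attention (module 220)
    computing: dict   # I/O devices, connectivity, storage (module 222)
    data: dict        # communication resources, applications, OS, data (module 224)

    def characterize(self) -> dict:
        """Flatten the four environments into one attribute set for the PIC manager."""
        merged = {}
        for prefix, env in (("physical", self.physical), ("mental", self.mental),
                            ("computing", self.computing), ("data", self.data)):
            merged.update({f"{prefix}.{key}": value for key, value in env.items()})
        return merged
```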


The search criteria generated by PIC manager 212 are encapsulated in a data structure referred to as a PIC. A PIC is the data that is sent from the consumer computing system (e.g., PIC manager 212) to information sources 102. If the information provider determines that there is content that sufficiently conforms to the consumer's interest (e.g., matches all of the search criteria, or at least a threshold amount of the search criteria), an indication of a match, optionally with product description information and other commerce-facilitating code and data, can be sent to the consumer.


A PIC can contain a variety of information for a variety of purposes. Table I illustrates exemplary information that may be included in a PIC. A PIC, however, need not include all of the information in Table I. Rather, different PICs can include different subsets of the information described in Table I.










TABLE I

  • Keywords: A distillation of desired information (e.g., product characteristics). Keywords are typically chosen for brevity and precision, and can serve as search terms for many of the currently available Internet search engines.
  • Context Awareness Attributes: Contain any data (e.g., name/value pair(s)) characterizing the user's current or past context. Based on information received from the characterization module.
  • Security Keys: Allows some or all PIC data to be read by only intended recipients. This may be optionally included when security issues are important to the consumer and/or the information provider.
  • Internet Cookies: Allows an Internet site to identify and profile a particular consumer. This may be optionally included by the user to facilitate repeat business or information requests.
  • User Comments: Information the user may wish to include to more fully characterize their interests. Note that information providers may supplement their automated processes with people who can review PICs of interest. It should therefore not be assumed that a PIC must contain only machine understandable data. For instance, a PIC can be in the form of an audio file, which the user recorded and has the computer send to product information providers' telephony systems.
  • Code: Supports a variety of executable code formats. For instance, information providers may support advanced queries using SQL, or automatic purchase mechanisms may be shared. These mechanisms may first be provided by the information provider, and then included in the PIC during subsequent purchase requests. May be used to support the convenient purchase of items satisfying a sufficient number of parameters in the PIC.
  • Filters: A special case of code. Filters aid the interpretation of interest characterizations. They can also be used by the information return process to restrict when search results are presented to the user.
  • Authorizations: Allows the user to indicate how much data can be provided to different classes of product information providers. This information can include purchase-enabling information like credit card numbers. This is part of a general CA permissioning scheme that supports dynamic authorizations. Therefore, depending on the current context of the user, the PIC can change its: exposure (who sees it), content (what it contains), and validity (how well it matches desired goods, services, or information).
  • Consumer Identification: Provides an identification of the user. This identification may not necessarily correspond with any legal identification. For instance, it may be unique to a particular product information provider, or class of product information provider.
  • PIC Version: Identifies what version of the PIC manager the PIC data conforms to.
  • PIC Certificates: Securely identifies the origin of the component generating the PIC.
  • PIC Description: Describes the fields included in a particular PIC, as well as their purpose and use.
  • Previous Search Results: Provides search facilities with a history of what the consumer has already been provided, which, among other functions, allows the search engine to avoid providing repetitive information.
  • Weighting: Characterizes what the consumer found interesting in previous information searches.
  • Purchase History: Characterizes what the consumer had previously purchased. In some cases, this information can be very detailed and so provide a rich product interest characterization.









PIC manager 212 is thus able to formulate search criteria (e.g., in the form of PICs) encompassing a wide variety of different information. This can include, for example, basic keyword inputs by the user which are then combined with other information (e.g., from context awareness model 214) by PIC manager 212 to formulate the search criteria.
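
For concreteness, a PIC carrying a small subset of the Table I fields might be modeled as follows (a sketch under assumed names and types; the patent does not specify a serialization):

```python
# Sketch of a PIC with a subset of the Table I fields.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ProductInterestCharacterization:
    keywords: list[str] = field(default_factory=list)
    context_attributes: dict[str, str] = field(default_factory=dict)  # name/value pairs
    consumer_id: Optional[str] = None   # need not match any legal identity
    pic_version: str = "1.0"
    authorizations: dict[str, str] = field(default_factory=dict)
    previous_search_results: list[str] = field(default_factory=list)
    weighting: dict[str, float] = field(default_factory=dict)


pic = ProductInterestCharacterization(
    keywords=["infant", "car seat"],
    context_attributes={"location": "mall", "time": "14:30"},
)
```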



FIG. 4 is a flowchart illustrating an exemplary process for soliciting information based at least in part on the user's context. The process of FIG. 4 is performed by, for example, PIC manager 212 of FIG. 3, and may be implemented in software.


Initially, user input is received (act 252). The current user context is then identified (act 254), and search criteria (e.g., a PIC) generated based on both the received user input and the identified user context (act 256). A search is then performed for information that satisfies the search criteria (act 258). The search may be performed by the component that generates the search criteria (in act 256), or alternatively the search may be performed by communicating the search criteria to a search component (e.g., an information store 102 of FIG. 3). Once the search is at least partly completed, the search results are presented to the user (act 260). Search results may be presented to the user as they are received by PIC manager 212, or alternatively after all searching has been completed.
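
A compact sketch of acts 252-260 (hypothetical function names; the actual transport and UI are left abstract):

```python
# Sketch of acts 252-260; identify_context, search_source, and present_to_user
# stand in for the client's actual context model, transport, and UI.
def solicit_information(user_input, identify_context, search_source, present_to_user):
    # Act 252: user input has been received (passed in here).
    context = identify_context()                            # act 254
    criteria = {"request": user_input, "context": context}  # act 256
    for result in search_source(criteria):                  # act 258
        # Results can be shown as they arrive, or buffered until the search ends.
        present_to_user(result)                             # act 260
```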


One example of soliciting information involves the user passing (walking, riding, driving, etc.) a store and submitting an advertisement search request. The search criteria include the advertisement request as well as context information indicating that the user is in close proximity to the store. The search results include an advertisement that the store is selling a product (e.g., a specific brand of cigarettes that, according to the user's context, the user has purchased in the past) at a price at which the user may be willing to purchase the item (e.g., the cigarettes are on sale, or cheaper than at other stores, or cheaper than the user's last purchase of cigarettes). The cigarette advertisement is thus presented to the user. In general terms, the user's context determines whether a particular criterion is met, and the advertisement (in this case generated by the store, but not directed at specific consumers) is presented to the user.



FIG. 5 illustrates an exemplary product interest characterization manager 212 in additional detail. PIC manager 212 is illustrated communicating with both remote information stores 302 and local information store 304, although alternatively PIC manager 212 may communicate with only one or the other of stores 302 and 304.


Additionally, PIC manager 212 may optionally maintain a user profile(s) for each user. By using this detailed, automatically updated, user-controlled profile, PICs can be further customized or personalized to individual users. The information maintained in a profile can include, for example, the user's needs (explicitly defined by the user or inferred by the system), desires (courses on dog training, good bargains, etc.), preferences (red leather with prominent logos, German sedans, etc.), budget (current cash, monthly goals, shared funds, credit limits, etc.), legal constraints (age, criminal history, licenses, etc.), physical limitations (require wheelchair entry/exit ramps, need sign-language interpretation, location must not be subject to cold winds, etc.), time availability (does the user have enough time in his or her schedule to review information or get the product?, is the movie too late in the evening?), route (is a supplier of the product convenient to the planned route?), access to transportation (when will the family car be available, what is the bus schedule, etc.), need (is the product already included in a shopping list?, is a product currently being used going to be depleted soon?), and so forth. The preceding considerations can be derived from, or supplemented by, past individual or aggregate consumer behaviors.


To solicit information, or add data to a PIC for subsequent solicitation requests, the user interacts with PIC manager 212. PIC manager 212 includes multiple PIC functionality modules: a profile manager 306, a PIC builder 308, a PIC sender 310, a PIC receiver 312, and a presentation manager 314. The user 316 can interact, either directly or indirectly, with these functionality modules.


Profile manager 306 allows user access and control for individual functions and specific PICs, and also allows the user to access and modify portions of the user's context model pertinent to product interest. For example, this is where a user can modify his or her address, credit card numbers and authorizations, shirt size, and so forth.


Profile manager module 306 presents various choices to the user, including:

    • Use profile: allows the user to select his or her profile (or one of his or her profiles) for use.
    • Change profile: allows the user to change information stored in his or her profile.
    • View active/inactive PIC results: allows the user to view any search results that have been received and stored by the PIC manager (e.g., because they were not supposed to be presented to the user yet).
    • Change active/inactive PIC status: allows the user to have multiple PICs defined and toggle individual PICs between an active status (causing searches to be performed based on the data in the PIC) and an inactive status (for which searches are not performed).
    • Initiate new PICs: allows the user to create a new PIC, such as by entering search terms (keywords).
    • Help: makes a user help function available to the user.


PIC builder module 308 allows the user to generate new PICs and modify existing PICs. Once the user has generated a PIC, he or she can set the PIC status to active, causing PIC manager 212 to share the PIC with specified agencies, or with whoever is interested and has a compatible information description data store. PIC builder module 308 provides interfaces to help the user create PICs. In one implementation, PIC builder module 308 provides both blank PIC forms and default PIC forms to aid the user in the PIC creation process.


Blank PIC forms can be built from scratch using menus, tool bars, and other UI elements providing prompts for elemental PIC form fields (common fields like time, location, price, store, quality, material, and so forth) and query-building logic (AND, OR, NOT, SIMILAR, ONLY, ALL, INCLUDING, wildcards, and so forth). Blank forms can include automatically visible or hidden fields with values derived from the context model.
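
The query-building logic can be illustrated with simple predicate combinators (an assumption about representation, shown only to make the AND/OR/NOT composition concrete):

```python
# Predicate combinators illustrating AND/OR/NOT query building.
def AND(*preds):
    return lambda item: all(p(item) for p in preds)

def OR(*preds):
    return lambda item: any(p(item) for p in preds)

def NOT(pred):
    return lambda item: not pred(item)

def field_equals(name, value):
    return lambda item: item.get(name) == value

# "price under $50 AND (sold in the mall OR online), NOT used"
query = AND(
    lambda item: item.get("price", float("inf")) < 50,
    OR(field_equals("store_location", "mall"), field_equals("channel", "online")),
    NOT(field_equals("condition", "used")),
)
print(query({"price": 40, "channel": "online", "condition": "new"}))  # True
```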


Default PIC Forms are forms that are at least partly filled in, relating to specific information categories. For example, there can be a default PIC form for “New Car”, which would present fields that are useful in specifying a car of interest.


By default, PIC forms do not show form fields that the context awareness model has values available for. These fields can be automatically filled in for the user, thereby freeing him or her of the time needed to do so (and even the knowledge that they are being filled in). Alternatively, these fields can be displayed, or displayed only under certain circumstances. For example, a context model provided by a company may include fields used for accounting, security, and performance measurement that cannot be displayed with default user privilege.


As there are many product area forms potentially useful to a user, organization and search capabilities such as keyword search, graphic information-tree traversal, and many other techniques as provided in file browsers, Internet search engines, and online broadcast program schedules may optionally be made available by PIC builder 308.


Additionally, PICs can include specification of when and how a PIC result should be presented. This specification of when and how PIC results should be presented is limited only by availability of criteria in the context model. However, since the context awareness model is user extensible, users are free to add new model attributes. For example, a user may purchase for his or her car a device that allows him or her to use an alternative fuel. The user could then add to his or her context model a new attribute/field, associated with other attributes related to his or her car, having an indication of interest/desirability/ability to use this alternative fuel. Now a PIC can be created that shows the user a list of sources of the fuel within the limits of the car's fuel-determined cruising range.
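
The alternative-fuel example might look like this in code (attribute names and station data are invented for illustration):

```python
# The alternative-fuel example: the user extends the context model with a new
# attribute, and a PIC rule keys off it together with the car's cruising range.
context_model = {"car.cruising_range_km": 480}

# User-added attribute for the newly installed alternative-fuel device.
context_model["car.alternative_fuel"] = "E85"

def fuel_sources_within_range(stations, context):
    fuel = context.get("car.alternative_fuel")
    limit = context.get("car.cruising_range_km", 0)
    return [s for s in stations
            if fuel in s["fuels"] and s["distance_km"] <= limit]

stations = [{"name": "Station A", "fuels": ["E85"], "distance_km": 30},
            {"name": "Station B", "fuels": ["diesel"], "distance_km": 10}]
print(fuel_sources_within_range(stations, context_model))  # only Station A
```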



FIG. 6 illustrates PIC generation and storage in additional detail. Information to assist in the generation and modification of PICs by PIC builder 308 is available to the user via multiple PIC data stores. In the illustrated example of FIG. 6, these PIC data stores include: new PIC data store 352, previous PIC data store 354, user profile data store 356, generic product characterization data store 358, navigation preferences data store 360, and generic product preferences data store 362. Additionally, a log 364, managed by a logging component 366, is also accessible to PIC builder 308.


New PIC data store 352 is used to generate a unique PIC. Data store 352 can contain different types of information, such as information provided by the user to characterize new information (e.g., a new product) of interest. Data store 352 may also include information previously provided by the user to characterize other information (e.g., product(s)) of interest. This information may be included because the user indicated a desire to have PICs with similar fields share values as defaults. Additionally, system-suggested information may also be included. For example, based on previous PICs, the system can suggest PIC fields and values based on previous user behavior. A more detailed explanation of such predictive behavior can be found in a co-pending U.S. patent application Ser. No. 09/825,159, entitled “Thematic Response To A Computer User's Context, Such As By A Wearable Personal Computer” to James O. Robarts and Eric Matteson, which was filed Apr. 2, 2001, and is commonly assigned to Tangis Corporation. This application is hereby incorporated by reference.


Previous PIC data store 354 includes all PICs generated by the user, either active or inactive, until deleted by the user. These are available for modification (or change of status), as well as for reference when generating new PICs.


User profile PIC data store 356 contains product-independent information. Examples of the type of information contained include: user identification information (e.g., name, alias, anonymizing ID, etc.); financial transaction data (e.g., credit card data, bank account data, authorizations (such as list of trusted institutions, indication of whether explicit user verification is required), transaction limits, etc.); authorizations (e.g., indication of trust per external institution or person, default permissions, permission overrides, need for accounting logs, etc.); and so forth.


Generic product characterization data store 358 allows the user to rely on recognition rather than recall to create a PIC. This is valuable because the PIC fields required for the precise characterization of a product interest differ significantly for different types of products, there are many of them, and they can change over time. Therefore, a generalized taxonomy of generic products is provided that can be navigated (e.g., hierarchically, graphically with pseudo-spatial relationships, by keyword search, and so forth) similarly to actual product catalogs (e.g., online Yellow Pages). As the user traverses the data store, he or she can both learn about general product characteristics (e.g., which manufacturers offer new luxury SUVs, in a bounded price range) and provide candidate fields and values for the PIC builder (for storage in data store 358).


Navigation preferences data store 360 maintains a record of the explicit and inferred user preferences for using the generic product characterization data store 358. Examples of such records include navigation preferences (e.g., showing a hierarchical tree organized by color, building Boolean logic with compositing transparent filter frames like conventional Magic Lens filters, etc.), previously explored areas of the data store (e.g., showing previously navigated links in a different color), and so forth.


Generic product preferences data store 362 records a user's indication that a particular generically described product is of interest.


Log 364 lists all previously used PIC fields. Log 364 can combine values from previous PICs 354, generic product preferences 362, and product interests inferred by a pattern recognizer.


Returning to FIG. 5, once a PIC is generated and made active by user 316, the PIC is made available to PIC sender 310, which distributes the PIC to one or more information sources 302 and/or 304. One or more sending options for the PIC may also be identified by the user or automatically (e.g., based on the user's context). The sending options identify how, when, and/or where the PIC is sent. For example, the PIC may be saved until bandwidth is available, or collected in a set of PICs (e.g., perhaps purchases of products need to be coordinated: medication, scuba diving equipment, computer hardware and software). Once the sending options have been indicated, the actual process of sending a PIC and receiving responses can be transparent to the user. The user may simply see the results of the query.
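
A sketch of the "save until bandwidth is available" sending option (the class, method names, and threshold are assumptions):

```python
# Sketch of a deferred-send queue for PICs: hold them until enough bandwidth
# is available, then transmit the whole batch.
from collections import deque

class PicSender:
    def __init__(self, min_bandwidth_kbps: int = 64):
        self.pending = deque()
        self.min_bandwidth_kbps = min_bandwidth_kbps

    def submit(self, pic) -> None:
        # Queue rather than transmit immediately; sets of related PICs
        # (e.g., coordinated purchases) accumulate here.
        self.pending.append(pic)

    def flush(self, current_bandwidth_kbps: int, send) -> None:
        # Transmit only when the link can support it; otherwise keep queueing.
        if current_bandwidth_kbps < self.min_bandwidth_kbps:
            return
        while self.pending:
            send(self.pending.popleft())
```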


Once sent, the PIC is compared to data in those information sources, and an indication of any match is returned to PIC receiver 312. The matching information is then returned by PIC receiver 312 to presentation manager 314 for presentation to the user. PIC receiver 312 is responsible for handling the communications from the content sources, and is concerned with the content; for instance, PIC receiver 312 could stack-rank the received data. Presentation manager 314, on the other hand, deals primarily with the presentation of the data (e.g., is there a display available? Can it handle color?).


What is received by PIC receiver 312 may be a completed PIC, a data packet associated with a PIC, or even a product itself (e.g., radio cablecast link, MPEG file, etc.). PIC receiver 312 can optionally combine the results of multiple active PICs. For instance, a user may send two similar PICs: one for a new car and one for used cars.


PIC receiver 312 handles all solicited information, optionally verifying that received information matches an active PIC, and storing the information for immediate or delayed presentation. PIC receiver 312 can be explicitly configured by the user and/or determined by rules embedded in the context model and PIC Manager. In one implementation, PIC manager 212 is an extension to the general context model 214 of FIG. 3.


PIC receiver 312 may also optionally include an appropriateness filter that is used to determine whether the query results are returned to presentation manager 314, or the appropriateness filter may be a separate component of PIC manager 212. In some situations, the filter may not be needed. For example, a PIC may be submitted both to a broker trusted not to provide information inappropriate to children, and to other product information sources. It may not be necessary to have the trusted PIC results filtered for inappropriate content, while other results are filtered for inappropriate content.


Additionally, the appropriateness filter may be used by PIC receiver 312 to defer delivery of query results. For example, the user may have insufficient attention to attend to them because he or she is working on other tasks, is sleeping, etc. In this case the query results are available to the user if he or she decides to view them, provided that doing so does not violate some other context model rule (for example, it may be unsafe to do so because the user is driving in heavy traffic, or the system may have security schemes that only allow the use of PICs during certain hours, at specified locations, or while performing particular tasks).


In addition, PIC receiver 312 may use the user context to determine how filters are applied to the content. For example, a user may generate a PIC for information about changing a flat tire. However, the search may take a long time, and the results of the search may not be returned to PIC manager 212 until after the user has fixed the flat tire. In this situation, the appropriateness filter can filter out the search results and not have them presented to the user because, based on the user context (the flat tire having been fixed), the search results are no longer important to the user.
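
One way such an appropriateness filter could behave (the rule set and context keys are illustrative assumptions):

```python
# Sketch of an appropriateness filter: results are dropped when context makes
# them irrelevant, and deferred when the user cannot safely attend to them.
def filter_results(results, context):
    delivered, deferred = [], []
    for result in results:
        # Drop results the context shows are no longer needed, e.g.
        # tire-change instructions after the flat has already been fixed.
        if context.get("task_completed") == result.get("topic"):
            continue
        # Defer everything when presentation would be unsafe or unwanted.
        if context.get("driving_in_heavy_traffic") or context.get("sleeping"):
            deferred.append(result)
        else:
            delivered.append(result)
    return delivered, deferred
```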


PIC receiver 312 (or alternatively presentation manager 314) may also communicate with context awareness model 214 or characterization module 216 of FIG. 3 to present information in an improved form. Context model 214 includes logic that describes functional and presentational characteristics of a desired UI for the current user context. This may include UI layout, selection of presentation surface, privacy, and so forth.



FIG. 7 is a flow diagram illustrating an exemplary process followed by PIC manager 212 of FIG. 5, which may be performed in software. Initially, an indication that a user desires information is received (act 402). A check is made as to whether the user is a first-time user of the information solicitation system (act 404), and if so the user is given the option to create a user profile (act 406). If the user desires to create a user profile, then processing is handed to the profile manager (act 408) to establish a user profile and PIC. However, if the user does not wish to create a user profile, then processing is handed to the PIC builder for generation of an unpersonalized PIC (act 410).


Returning to act 404, if the user is not a first-time user of the system, then a check is made as to whether the user desires a particular user profile (act 412). If no user profile is desired, then processing is handed to the PIC builder for generation of an unpersonalized PIC (act 410). However, if a user profile is desired, then the user is verified as an authorized user of the profile (act 414), such as by user ID and password. Processing is then handed to the profile manager for generation of a personalized PIC (act 416).



FIG. 8 illustrates an exemplary information solicitation system employing a PIC broker 450. PIC broker 450 provides a service in exchange for direct compensation from the user (in the form of money per transaction or subscription, or access to the user's demographics), indirect compensation (the broker provides unsolicited messages that advertise other products), or no user compensation. Product providers may compensate PIC broker 450 at various transaction stages. For example, they may pay for every PIC/product match message sent to the user, when the user views the message, or when their product is purchased.


When the PIC is created, an indication on what to do when a correlation is found can be included. Some of the options include:

    • Immediately provide product—for example, if the PIC characterizes interest in a radio broadcast of discussion of a particular topic, and the content is found to be available, the PIC could have been created authorizing its immediate presentation.
    • Immediately notify the user—notification can include a terse message (a PIC has a match; a particular PIC has a match) or arbitrarily complex descriptions. Descriptions could scale to the entire correlation result (a composite value of strength of match, a description of what characteristics do or do not match, the source of the product, and supplemental information provided by the PIC broker, including ID, recommendations or reviews of the product or product provider, and suggestions).
    • Submit results to Appropriateness Filter—even if a message describes itself as from a trusted source, and provides a perfect match between interest and product characterization, it may not be desirable or safe to present it immediately, in a particular form, or even to a particular user (e.g., though a PIC could indicate that products provided must be suitable for children, PIC brokers may not be reliable). By always submitting product messages to filtering, a higher degree of confidence of appropriateness can be achieved.
    • Cache messages until requested—the PIC Broker can wait until contacted to present correlation results. Note this is in contrast to having the user's computing environment store them. In either case, one convenient way to view them is via the PIC Manager.


Once generated, the PIC is communicated by PIC manager 452 to a PIC receiver 454 at PIC broker 450. The PICs 456 from this user, as well as other PICs 458 from other users, are made available to correlation logic 460. Correlation logic 460 compares the search criteria in the PICs 456 and 458 to multiple product characterizations 462(1), 462(2), 462(3), . . . , 462(X). Any of the product characterizations 462 that satisfy the search criteria are communicated to the product provider(s) 464 corresponding to the matching product characterization(s), which in turn provide the corresponding product information (or the product itself) to the user computing resources 466 (e.g., a client 106 of FIG. 2). Correlation logic 460 may also optionally provide the product characterizations 462 that satisfy the search criteria to the user computing resources 466. The product information or characterization received at resources 466 may also be filtered by filter 468, which may prevent presentation of the information or characterization, or delay its presentation until an appropriate time.
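
The correlation step might be sketched as a simple scoring loop (keyword-overlap scoring is an assumed stand-in for whatever matching the broker actually uses):

```python
# Sketch of the broker's correlation step: score each product characterization
# against each PIC and report sufficiently strong matches.
def correlate(pics, product_characterizations, threshold=0.6):
    matches = []
    for pic in pics:
        wanted = set(pic["keywords"])
        if not wanted:
            continue
        for product in product_characterizations:
            offered = set(product["keywords"])
            # Composite strength of match: fraction of requested terms satisfied.
            strength = len(wanted & offered) / len(wanted)
            if strength >= threshold:
                matches.append((pic, product, strength))
    return matches
```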


Different components may be used besides PIC broker 450 to provide information or products. For example, the functionality of PIC broker 450 may alternatively be provided by a product provider(s) 464. By way of another example, “agents” may be used. Agents are semi-autonomous software objects that are less constrained than a PIC broker in that they can theoretically reach a less constrained source of product descriptions. They may therefore provide a more complete set of query results. However, unless they coordinate with PIC providers on the definition of product interest or descriptions, they may not be as precise. Further, since the source of the agent, and what it returns, may not be as controlled as a PIC broker, the results may not be as appropriate.


Yet another example is a content aggregator. Much like a PIC broker, content aggregators can provide interfaces to their data stores compatible with the user's context model (or vice versa; any party can provide a dictionary and write the translation filter). In this scenario very tight control on the product descriptions, including availability, can be provided, ensuring timely and accurate product offers. There can also be cooperation between different user models. For example, a variety of affinity mechanisms may be used which suggest products that are similar to the ones requested. The user's context models can directly, or through the PIC manager mechanism, indicate if this type of information is desired. The models can also cooperate by sharing the user's desire to have products having a high correlation (clearly satisfying the PIC) be automatically purchased. Necessary information for purchase can also be exchanged securely.



FIG. 9 illustrates another exemplary information solicitation system 500 including a client 502 and a remote service 504. User-defined parameters 506 and context models (CM) 508, also referred to as context awareness models, are combined to determine an interest 510 (e.g., a PIC). The interest 510 is provided to remote service 504, where it is stored along with other interests in an interest rule database 512. The rules (e.g., search parameters) in the interests in database 512 are then compared to messages 514 (e.g., advertisements or other information) provided to remote service 504 from one or more customers or product suppliers 516. Results of the comparison are then returned to the client for presentation to the user.


Revenue can be generated in system 500 in a variety of different manners. For example, the customers or product suppliers 516 may pay for the ability to have their messages stored as part of messages 514, or the customers or product suppliers 516 may pay for each message forwarded to client 502. By way of another example, the user of client 502 may receive payment from the sale of the interest to remote service 504, or for his or her willingness to receive messages from remote service 504 (e.g., for each message presented to the user).


Various aspects of the solicitation of information described herein can be seen from the following examples. A first example is the purchase of a child car seat. Assume that a user has a computer that maintains an explicit, extensible, dynamic model of his or her context. The user has used this model to maintain a data store of personal, detailed, private information, as well as information that is shared with others and available publicly. He or she is about to become a first time parent, and wishes to purchase a car seat for the new infant, but does not have familiarity with car seats and does not have a lot of time for research. Further, he or she understands that there are likely trade-offs between product characteristics such as price and safety.


Using the Generic Product Description feature, the user can traverse a tree to locate a PIC form that gives the user the following blank fields:

    • Weight of Child
    • Built In or Removable
    • Converts to Mobile Chair?
    • Removable Covers?
    • Headrest?
    • Footrest?
    • Optional Padding?


If the user does not use the Generic Product Description feature, he or she can use a similar PIC if he or she had created one, or use a PIC form from some other source, or create one from scratch. If the user creates it from scratch, he or she could include the fields described above.


Regardless, the resulting PIC could have the following fields already filled out. They could be hidden by default, but can be viewed and modified if desired (assuming security authorizations permit modification):

    • Default Priorities: Safety=5, 3rd Party Rating=4, Availability=3, Cost=2, Esthetics=1
    • Car Model
    • Car Interior Color
    • User Location
    • Preferred in-person Stores
    • Availability for in-store shopping
    • Willingness to purchase online
    • Preferred online suppliers
    • Desired product detail (H, M, L)
    • Under what conditions can this PIC result in an automatic purchase? (Never, when the product match is from a PIC broker the user is a subscriber to, and when the PIC and product have a very strong match on the top three priorities)
    • When user prefers to view product information (within 1 minute, within 10 min, at specific time)
    • Who should view product information (self only, self and others, other only)
    • How much personal info to automatically divulge (if more details from manufacturer are requested, should they be shared?)
    • Where should PIC be sent (to specific information sources only? Trusted sources only? Sources that offer anonymous listings only? Anyone offering related product information?)
    • Does this PIC have permission to be forwarded from original receiver?
    • Does product information received in response to the PIC need to go through an Appropriateness Filter?
    • Should this PIC, and resulting information, be logged? If so, should this information be available to data mining applications?


Once the user is satisfied with his or her PIC, the PIC is submitted to outside product description data stores (information sources). This online distribution does not need to happen immediately; it can be delayed by some preset time or by availability of a connection to remote computing/communication resources. Eventually, a match is found and the resultant car seat information is returned and presented to the user for him or her to make a purchase selection. Alternatively, the car seat could be automatically purchased on behalf of the user from a product provider and delivered to the user (e.g., via a mail service) or made available for the user's pickup.
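
Filled in as data, the car-seat PIC from this example might look like the following (a sketch; all values are illustrative, with hidden fields supplied by the context model):

```python
# The car-seat PIC from this example, filled in as plain data.
car_seat_pic = {
    # User-visible fields from the generic product description form:
    "weight_of_child_kg": 4.5,
    "built_in_or_removable": "removable",
    "converts_to_mobile_chair": True,
    # Auto-filled, hidden-by-default fields from the context model:
    "priorities": {"safety": 5, "third_party_rating": 4, "availability": 3,
                   "cost": 2, "esthetics": 1},
    "willing_to_purchase_online": True,
    "auto_purchase_conditions": "never",
}
```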


Another example of a PIC, which makes use of a context model, is searching for a movie. The fields in the PIC include:

    • Location (based on user's current context)
    • Maximum movie rating (or youngest age of party)
    • Budget
    • Recommendation (could engage service like MovieCritic.com, where previous user movie ratings are used with preference & clustering engine to suggest what they would like)
    • List of movies already seen
    • List of movies interested in
    • Time constraints (user's (and parties) schedule/appointments)
    • Time of PIC creation


Another example of a PIC which makes use of a context model is a PIC for getting a repair procedure. Assume the user is driving in a remote area and has an auto (vehicle) breakdown. The user's PIC is a request for assistance. Appropriate responses would be repair advice in written form from the manufacturer or another expertise publisher (e.g., Chilton's), from a remote expert (via cell phone), or a listing of the closest service stations/towing services (phone number, hours, rates). Fields in the PIC include:

    • Location (current and destination)
    • Object context (Make/model of car, car self diagnostic info)
    • User (club memberships (AAA), subscriptions (maybe subscribe to online publishing services), self-rating of auto repair expertise)
    • Desire of user to obtain repair instructions


Another example of a PIC, which can be sent without user interaction or verification (approval is given beforehand within rule logic) is a medical emergency PIC. Fields in the PIC include:

    • User Location
    • User Activity
    • Current Physical Condition
    • Historic Physical Condition
    • Current Emotional State
    • List of Prescribed and taken medication
    • List of Medication on hand
    • List of who is in vicinity
    • Description of immediate environment


CONCLUSION

Although the description above uses language that is specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the invention.

Claims
  • 1. A method of operating a computer, the method comprising: storing in a memory of the computer a data structure, the data structure comprising a field indicating information desired by a user of the computer; with a processor on the computer: formulating a request for information based on the stored data structure and information about a current user context at a first time; communicating the request over a network to an information source; in response to the communicating, receiving information from the information source; determining, based on information about a current user context at a second time, an amount of the received information to present to the user; and selectively presenting the received information to the user based on the determining.
  • 2. The method of claim 1, wherein the current user context is based on a current physical environment of a user that is external to the computing system.
  • 3. The method of claim 2, wherein: the current user context at the second time indicates an activity being performed by a user; and determining an amount of the received information to present comprises determining to present none of the received information when the received information is not relevant to a user performing the activity.
  • 4. The method of claim 2, wherein: when the user context changes from the first time to the second time, the determining comprises suppressing display of a portion of the received information that is relevant in the context at the first time but not the second time.
  • 5. The method of claim 2, wherein: the current user context is determined at least in part by an activity being performed by a user; and determining an amount of the received information to present to the user comprises determining, based on the activity, to delay presenting the received information to the user.
  • 6. The method of claim 2, further comprising: determining current user context information at the first time and the second time based on a physical environment, a computing environment and a data environment.
  • 7. The method of claim 1, wherein: determining an amount of the received information to present to the user comprises determining at the second time to present none of the received information; and the method further comprises: storing the received information; determining, based on information about a user context at a third time, whether to present the received information to the user; and in response to the determining, presenting the stored received information to the user at the third time.
  • 8. The method of claim 1, wherein: the data structure comprises user preference information associated with each of a plurality of products; the data structure comprises timing information about when to present information about each of the plurality of products to the user; the formulating comprises formulating a request for information about a product of the plurality of products; and the determining comprises determining based on timing information in the data structure associated with the product.
  • 9. A computer storage medium comprising computer executable instructions that, when executed by a processor of a computer, perform a method comprising: generating current user context information from information about a plurality of user environments at least a first time and a second time; formulating a request for information based on information about a current user context at the first time and user preference information stored in memory associated with the computer and indicating information desired by a user of the computer; communicating the request over a network to an information source; receiving information from the information source over the network; determining, based on the current user context at the second time, an amount of the received information to present to the user; and selectively presenting the received information based on the determining.
  • 10. The computer storage medium of claim 9, wherein generating current user context information from information about a plurality of user environments comprises generating the current user context information from information about a physical environment, a computing environment and a data environment.
  • 11. The computer storage medium of claim 9, further comprising computer executable instructions that, when executed by a processor of a computer, perform a method comprising: receiving user input about each of a plurality of items; and storing the preference information in data structures associated with each of the plurality of items.
  • 12. The computer storage medium of claim 11, wherein: the plurality of items comprise items available for purchase by the user; receiving user input comprises receiving user input designating at least one data structure associated with an item of the plurality of items as an active data structure; and the formulating based on user preference information comprises determining based on the active data structures.
  • 13. The computer storage medium of claim 11, wherein: the selectively presenting the received information comprises displaying a URL to information matching the request.
  • 14. The computer storage medium of claim 11, wherein: the data structure comprises computer executable instructions defining an information filter for each of the plurality of items; and determining comprises executing the filter for a selected item.
  • 15. A computer storage medium comprising computer executable instructions that, when executed by a processor of a computer, perform a method comprising: determining information about a current user context at a first time, the determining comprising determining the context based on information from each of a plurality of environments including a physical environment, a computing environment and a data environment; formulating a request for information about a product based on information about a current user context at the first time and data stored in a memory associated with the computer, the data comprising data defining user interest in each of a plurality of products; communicating the request over a network to an information source; in response to the request, receiving over a network information about the product from an information source; after the receiving, determining information about a current user context at a second time, the determining comprising determining the context based on information from the plurality of environments including a physical environment, a computing environment and a data environment; determining, based on the current context information at the second time, an amount of the received information to present to the user; and selectively presenting the received information based on the determining.
  • 16. The computer storage medium of claim 15, wherein the determining information about a current user context at a first time and a second time is further based on information generated by sensors detecting the user's current attention.
  • 17. The computer storage medium of claim 15, wherein: the computer storage medium further comprises computer executable instructions that, when executed by a processor of a computer, perform a method comprising receiving user input identifying search query terms; and formulating a request for information comprises combining the search query terms with data relating to the product selected from the data defining user interest.
  • 18. The computer storage medium of claim 15, wherein: determining information about the current user context at the first time and the second time comprises maintaining a context model of the user based on information from each of the plurality of environments; the current user context indicates an activity being performed by a user; and determining whether to present the received information comprises determining whether the received information is relevant to a user performing the activity.
  • 19. The computer storage medium of claim 15, wherein: determining information about the current user context at the first time and the second time comprises maintaining a context model of the user based on information from each of the plurality of environments; the current user context indicates an activity being performed by a user; and determining whether to present the received information comprises determining, based on the activity, to delay presenting the received information to the user.
  • 20. The computer storage medium of claim 15, wherein: determining whether to present the received information comprises determining at a first time not to present the received information based on a context model rule identifying a context in which the user has insufficient attention; and the computer storage medium further comprises computer-executable instructions that, when executed, perform a method comprising: storing the received information; determining, based on information about a user context at a second time, whether to present the received information to the user; and presenting the stored received information to the user at the second time.
RELATED APPLICATIONS

This application claims priority to provisional application No. 60/194,000, filed Apr. 2, 2000, which is hereby incorporated by reference. This application also claims priority to provisional application No. 60/194,758, filed Apr. 9, 2000, which is hereby incorporated by reference.

Related Publications (1)
Number Date Country
20090282030 A1 Nov 2009 US
Provisional Applications (2)
Number Date Country
60194758 Apr 2000 US
60194000 Apr 2000 US
Continuations (2)
Number Date Country
Parent 11179822 Jul 2005 US
Child 12464064 US
Parent 09824900 Apr 2001 US
Child 11179822 US