This disclosure relates generally to computer-assisted solicitation of desired information, and more particularly to soliciting information based on a computer user's context.
As computers become increasingly powerful and ubiquitous, users increasingly employ their computers for a broad variety of tasks. For example, in addition to traditional activities such as running word processing and database applications, users increasingly rely on their computers as an integral part of their daily lives. Programs to schedule activities, generate reminders, and provide rapid communication capabilities are becoming increasingly popular. Moreover, computers are increasingly present during virtually all of a person's daily activities. For example, hand-held computer organizers (e.g., PDAs) are more common, and communication devices such as portable phones are increasingly incorporating computer capabilities. Thus, users may be presented with output information from one or more computers at any time.
Accompanying the increasing use and portability of computers is an increasing desire on the part of users to obtain information through wireless and other communication media. When a consumer becomes aware of a situation in which they perceive a need that might be fulfilled with goods or services that may or may not be available, they are currently limited in how they can gain product information. Often, when the need arises, the consumer is not in a convenient circumstance to review printed materials, ask others, or wait for uncontrolled media like radio or television to present an advertisement or review. This inconvenience may lead the consumer to conclude that if significant effort or time is required to learn about the claims, availability, or cost of the offered goods and services, then pursuing them is not worth it.
The advent of computers, especially when coupled to the data-rich environment of the Internet, expands consumers' ability to gain product information without regard for geographic proximity or time of day. However, current product search techniques rely on either what the user has directly specified (e.g., in a search text box) or past behavior (e.g., Internet merchants tracking past purchases). And, even though many product providers collect and sell individual and aggregate consumer profiles, and so do sometimes provide assistance to consumers as they consider offered products, there is currently no general mechanism by which detailed user characterizations can facilitate the location of a specific desired product or piece of information.
Some Internet-related products, such as the Microsoft® Internet Explorer web browser, can record the information that a user enters in form fields. When the user begins filling out a new form, those values can automatically be entered or suggested, easing the form completion. Despite this easing, problems still exist with such products. One problem is that the user is limited to data already entered in other forms. Another problem is that such products require presentation to the user of form fields that are already filled out, which can be inconvenient for the user and detract from the user-friendliness of the solution (e.g., it can be inconvenient for the user to see his or her name for every form).
Accordingly, there is a need for improved techniques for soliciting information.
Soliciting information based on a computer user's context is described herein.
According to one aspect, a user search request is received and context information for the user is identified. The user search request and the context information are then combined to generate search criteria corresponding to the user search request. The context information includes, for example, information regarding one or more of: the user's physical environment, the user's mental environment, the user's computing environment, and the user's data environment.
According to another aspect, a product interest characterization (PIC) is generated that includes multiple fields, some fields being populated with user-defined data inputs and other fields being populated with automatically-generated user context information. The generated PIC is then communicated to one or more information sources where the PIC is compared with information at these sources to identify content that matches the parameters in the various fields of the PIC. The matching content is then presented to the user.
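The patent text contains no code; the following is a minimal, hypothetical sketch of the PIC-building aspect described above, in which user-defined inputs are combined with automatically generated context fields. All names (`PIC`, `build_pic`, the specific field keys) are illustrative assumptions, not terms from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class PIC:
    """Hypothetical product interest characterization: a bag of named fields."""
    fields: dict[str, Any] = field(default_factory=dict)

def build_pic(user_inputs: dict, context_info: dict) -> PIC:
    """Combine explicit user inputs with automatically gathered context.

    User-supplied values take precedence over context-derived defaults.
    """
    combined = dict(context_info)   # fields populated from the context model
    combined.update(user_inputs)    # fields populated by the user override
    return PIC(fields=combined)

pic = build_pic(
    user_inputs={"keywords": "winter coat", "max_price": 120},
    context_info={"location": "downtown mall", "local_time": "18:40"},
)
```

The resulting PIC carries both kinds of fields side by side, matching the two field populations the summary describes.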
This disclosure describes soliciting information for a user based at least in part on the user's context. Search parameters or other data associated with a user's search request are combined with context information for the user to generate search criteria. The search criteria can then be compared with data (stored locally and/or remotely) to identify information that matches the search criteria. The user is able to solicit any of a wide variety of information, such as advertisements (e.g., of products or services), reference materials (e.g., electronic books or articles), as well as actual goods or products themselves (e.g., in electronic form (such as audio content that can be downloaded and played immediately), or for more traditional physical delivery (such as ordering a coat and having it shipped via an overnight shipping agent)).
Information sources 102 may be implemented in a number of ways, such as a host server at a Website, a dedicated search engine (e.g., that stores information for searching but not the content for search hits), a voice-driven telephony system, and so forth. The content can be organized and made available to clients 106 in any of a wide variety of conventional manners. As one exemplary implementation, an information source, as represented by source 102(1), may include a content store 110 to store the information and a content server 112 to serve the content to clients 106. The information communicated from the information sources may be in any data type (e.g., text, graphics, audio, video, etc.) and contain essentially any type of subject matter. As one particular example, the information may be in the form of solicited advertisements or product/service descriptions pulled to clients 106 from advertisers.
Network 104 is representative of many different network types, including public networks (e.g., the Internet) and/or proprietary networks. The network may be implemented using wireless technologies (e.g., RF, microwave, cellular, etc.), wire-based technologies (e.g., cable, fiber optic, wire, etc.), or a combination of them. Any one or more of many diverse protocols and formats may be used to package data and transmit it from source 102 to a client 106.
Clients 106 may be implemented in a variety of ways, including as computers, portable digital assistants (PDAs), communication devices, and the like. The clients are equipped with conventional mechanisms to receive the information from network 104, such as ports, network cards, receivers, modems, and so on.
Each client, as represented by client 106(1), is equipped with a Condition-Dependent Output Supplier (CDOS) system 120 that monitors the user and the user's environment. As the user moves about in various environments, the CDOS system receives various input information including explicit user input, sensed user information, and sensed environment information. The CDOS system maintains and updates a model of the user condition. One or more sensors 122 provide data to the CDOS system 120 pertaining to the user's environment.
The computer 106 also has a variety of body-worn output devices, including the hand-held flat panel display 154, an earpiece speaker 158, and a head-mounted display in the form of an eyeglass-mounted display 159. Other output devices 160 may also be incorporated into the computer 106, such as an olfactory output device, tactile output devices, and the like.
The computer 106 may also be equipped with one or more various body-worn user sensor devices 162. For example, a variety of sensors can provide information about the current physiological state of the user and current user activities. Examples of such sensors include thermometers, sphygmometers, heart rate sensors, shiver response sensors, skin galvanometry sensors, eyelid blink sensors, pupil dilation detection sensors, EEG and EKG sensors, sensors to detect brow furrowing, blood sugar monitors, etc. In addition, sensors elsewhere in the near environment can provide information about the user, such as motion detector sensors (e.g., whether the user is present and is moving), badge readers, still and video cameras (including low light, infra-red, and x-ray), remote microphones, etc. These sensors can be either passive (i.e., detecting information generated external to the sensor, such as a heartbeat) or active (i.e., generating a signal to obtain information, such as sonar or x-rays).
The computer 106 may also be equipped with various environment sensor devices 164 that sense conditions of the environment surrounding the user. For example, devices such as microphones or motion sensors may be able to detect whether there are other people near the user and whether the user is interacting with those people. Sensors can also detect environmental conditions that may affect the user, such as air thermometers or Geiger counters. Sensors, either body-mounted or remote, can also provide information related to a wide variety of user and environment factors including location, orientation, speed, direction, distance, and proximity to other locations (e.g., GPS and differential GPS devices, orientation tracking devices, gyroscopes, altimeters, accelerometers, anemometers, pedometers, compasses, laser or optical range finders, depth gauges, sonar, etc.). Identity and informational sensors (e.g., bar code readers, biometric scanners, laser scanners, OCR, badge readers, etc.) and remote sensors (e.g., home or car alarm systems, remote camera, national weather service web page, a baby monitor, traffic sensors, etc.) can also provide relevant environment information.
The computer 106 further includes a central computing unit 166 that may or may not be worn on the user. The various inputs, outputs, and sensors are connected to the central computing unit 166 via one or more data communications interfaces 168 that may be implemented using wire-based technologies (e.g., wires, coax, fiber optic, etc.) or wireless technologies (e.g., RF, etc.).
The central computing unit 166 includes a central processing unit (CPU) 170, a memory 172, and a storage device 174. The memory 172 may be implemented using both volatile and non-volatile memory, such as RAM, ROM, Flash, EEPROM, disk and so forth. The storage device 174 is typically implemented using non-volatile permanent memory, such as ROM, EEPROM, diskette, memory cards, and the like.
One or more application programs 176 are stored in memory 172 and executed by the CPU 170. The application programs 176 generate data that may be output to the user via one or more of the output devices 154, 158, 159, and 160.
In the illustrated implementation, the CDOS system 120 is shown stored in memory 172 and executes on the processing unit 170. The CDOS system 120 monitors the user and the user's environment, and creates and maintains an updated model of the current condition of the user. As the user moves about in various environments, the CDOS system receives various input information including explicit user input, sensed user information, and sensed environment information. The CDOS system updates the current model of the user condition, and presents output information to the user via appropriate output devices.
A more detailed explanation of the CDOS system 120 may be found in a co-pending U.S. patent application Ser. No. 09/216,193, entitled “Method and System For Controlling Presentation of Information To a User Based On The User's Condition”, which was filed Dec. 18, 1998, and is commonly assigned to Tangis Corporation. The reader might also be interested in a more detailed discussion of context attributes (or condition variables) discussed in U.S. patent application Ser. No. 09/724,902, entitled “Dynamically Exchanging Computer User's Context”, which was filed Nov. 28, 2000, and is commonly assigned to Tangis Corporation. These applications are hereby incorporated by reference.
An optional information solicitation manager 178 is also shown stored in memory 172 and executes on processing unit 170. Information solicitation manager 178 utilizes data from CDOS system 120 to generate search criteria based on the user's current environment. Alternatively, information solicitation manager 178 and CDOS system 120 may be implemented at a remote location (e.g., not in close physical proximity to the user 150).
The body-mounted computer 106 may be connected to one or more networks through wired or wireless communication technologies (e.g., wireless RF, a cellular phone or modem, infrared, physical cable, a docking station, etc.). For example, the body-mounted computer of a user could make use of output devices in a smart room, such as a television and stereo when the user is at home, if the body-mounted computer can transmit information to those devices via a wireless medium or if a cabled or docking mechanism is available to transmit the information. Alternately, kiosks or other information devices can be installed at various locations (e.g., in airports or at tourist spots) to transmit relevant (and typically, unsolicited) information to body-mounted computers within the range of the information device.
User search requests are input to an information solicitation management component, which, in the illustrated example, is a product interest characterization (PIC) manager 212. PIC manager 212 receives the user request and combines the request with the user's current context from context awareness model 214 in order to generate search criteria. The generated search criteria are then communicated to the locally and/or remotely situated information source 102. The search criteria are compared to the information at source 102 (e.g., an Internet search engine) to determine what information (if any) at source 102 matches the search criteria, and optionally how well that information matches the search criteria. The results of the comparison are then returned to PIC manager 212, which returns the results as appropriate to output device(s) 210 for presentation to the user. The results returned to PIC manager 212 may be sufficient to present to the user, or alternatively may only identify content that needs to be accessed by PIC manager 212 and presented to the user. For example, the results returned to PIC manager 212 may be a set of uniform resource locators (URLs). Those URLs may be presented to the user, or alternatively PIC manager 212 may access the locations identified by those URLs and return the content at those locations for presentation to the user.
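The result-handling branch described above (presentable content versus URLs that must first be dereferenced) can be sketched as follows. This is a hypothetical illustration only; `fetch` and `display` stand in for whatever retrieval and output facilities the client actually provides.

```python
def present_results(results, fetch, display):
    """Present search results, dereferencing any URL results first.

    `results` may mix presentable content with URLs; `fetch` and `display`
    are assumed callables supplied by the client (hypothetical names).
    """
    for item in results:
        if isinstance(item, str) and item.startswith(("http://", "https://")):
            display(fetch(item))   # URL result: access the location first
        else:
            display(item)          # already presentable content
```

A PIC manager following this pattern can either present URLs directly (by passing an identity `fetch`) or retrieve the underlying content on the user's behalf.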
Context awareness model 214 maintains context information for the user, allowing a characterization module 216 to attempt to characterize the user's context (e.g., his or her current context at the time a user search request is made by the user and/or received by PIC manager 212) and communicate this context information to PIC manager 212. Context awareness model 214 is built based on input from various modules 218, 220, 222, and 224 that capture and pass information based on inputs from one or more sensors 226 (e.g., environment sensors 164, user sensors 162, etc.).
In the illustrated implementation, the context awareness model 214 gathers information on (1) the user's physical environment from module 218, (2) the user's mental environment from module 220, (3) the user's computing environment from module 222, and (4) the user's data environment from module 224.
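One way to picture how the context awareness model aggregates the four environment modules is the sketch below. The module names and the readings they return are illustrative assumptions; the disclosure does not prescribe this interface.

```python
def characterize_user(modules):
    """Merge readings from the four environment modules into one snapshot.

    `modules` maps a module name to a callable returning that module's
    current readings (a hypothetical interface for this sketch).
    """
    context = {}
    for name, module in modules.items():
        context[name] = module()   # gather each module's current information
    return context

snapshot = characterize_user({
    "physical":  lambda: {"location": "mall", "time": "18:40"},
    "mental":    lambda: {"attention": "free"},
    "computing": lambda: {"display": "eyeglass", "audio": True},
    "data":      lambda: {"apps": ["calendar"]},
})
```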
Physical environment module 218 generates information pertaining to the user's present location (e.g., geographical, relative to a structure such as a building, etc.), the current time, and surrounding objects that may be used as a basis for searching. As an example of this latter situation, a user with a wearable computer may be traversing a mall having numerous stores therein. While in this location, the user may request product sale information, and only advertisements for products that are sold in stores in the mall and currently on sale are presented to the user.
The mental environment module 220 generates information pertaining to the user's likely intentions, their preferences, and their current attention. For instance, the mental environment module 220 may use data from a pupil tracking sensor or head orientation sensor to identify a direction or object on which the user is focused. If the user appears to be focused on administrative items presented on the heads up display, then the mental environment module 220 might determine that it is safe to present search results.
The computing environment module 222 generates information pertaining to the computing capabilities of the client, including available I/O devices, connectivity, processing capabilities, available storage space, and so on. The data environment module 224 generates information pertaining to the data and software resources on the client computer, including the communication resources, applications, operating system, and data.
The search criteria generated by PIC manager 212 is encapsulated in a data structure referred to as a PIC. A PIC is the data that is sent from the consumer computing system (e.g., PIC manager 212) to information sources 102. If the information provider determines that there is content that sufficiently conforms to the consumer's interest (e.g., matches all of the search criteria, or at least a threshold amount of the search criteria), an indication of a match, optionally with product description information and other commerce facilitating code and data, can be sent to the consumer.
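The "matches all of the search criteria, or at least a threshold amount" test mentioned above can be sketched as a simple threshold match over PIC fields. This is a hypothetical illustration; real sources would apply richer comparison logic than field equality.

```python
def matches(pic_fields: dict, content_fields: dict, threshold: float = 1.0) -> bool:
    """Return True if content satisfies at least `threshold` of the PIC criteria.

    threshold=1.0 requires every criterion to match; a lower value allows
    a partial match, per the information provider's own policy.
    """
    if not pic_fields:
        return False
    hits = sum(1 for key, value in pic_fields.items()
               if content_fields.get(key) == value)
    return hits / len(pic_fields) >= threshold
```

A provider that finds a match would then send the match indication, optionally with product description information, back to the consumer.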
A PIC can contain a variety of information for a variety of purposes. Table I illustrates exemplary information that may be included in a PIC. A PIC, however, need not include all of the information in Table I. Rather, different PICs can include different subsets of the information described in Table I.
PIC manager 212 is thus able to formulate search criteria (e.g., in the form of PICs) encompassing a wide variety of different information. This can include, for example, basic keyword inputs by the user which are then combined with other information (e.g., from context awareness model 214) by PIC manager 212 to formulate the search criteria.
Initially, user input is received (act 252). The current user context is then identified (act 254), and search criteria (e.g., a PIC) generated based on both the received user input and the identified user context (act 256). A search is then performed for information that satisfies the search criteria (act 258). The search may be performed by the component that generates the search criteria (in act 256), or alternatively the search may be performed by communicating the search criteria to a search component (e.g., an information source 102).
One example of soliciting information involves the user passing (walking, riding, driving, etc.) a store and submitting an advertisement search request. The search criteria include the advertisement request as well as context information indicating that the user is in close proximity to the store. The search results include an advertisement that the store is selling a product (e.g., a specific brand of cigarettes, including a brand that the user's context indicates the user has purchased in the past) at a price at which the user may be willing to purchase the item (e.g., the cigarettes are on sale, or cheaper than at other stores, or cheaper than the user's last purchase of cigarettes). The cigarette advertisement is thus presented to the user. In general terms, the user's context determines whether particular criteria are met, and if so an advertisement (in this case generated by the store, but not directed at specific consumers) is presented to the user.
Additionally, PIC manager 212 may optionally maintain a user profile(s) for each user. By using this detailed, automatically updated profile, which remains under user control, PICs can be further customized or personalized to the individual users. The information maintained in a profile can include, for example, the user's needs (explicitly defined by user or inferred by system), desires (courses on dog training, good bargains, etc.), preferences (red leather with prominent logos, German sedans, etc.), budget (current cash, monthly goals, shared funds, credit limits, etc.), legal constraints (age, criminal history, licenses, etc.), physical limitations (require wheelchair entry/exit ramps, need sign-language interpretation, location must not be subject to cold winds, etc.), time availability (does user have enough time in schedule to review information or get product?, is movie too late in evening?), route (is supplier of product convenient to my planned route?), access to transportation (when will family car be available, what is bus schedule, etc.), need (is product already included in a shopping list?, is a product currently being used going to be depleted soon?), and so forth. The preceding considerations can be derived from, or supplemented by, past individual or aggregate consumer behaviors.
To solicit information, or add data to a PIC for subsequent solicitation requests, the user interacts with PIC manager 212. PIC manager 212 includes multiple PIC functionality modules: a profile manager 306, a PIC builder 308, a PIC sender 310, a PIC receiver 312, and a presentation manager 314. The user 316 can interact, either directly or indirectly, with these functionality modules.
Profile manager 306 allows user access and control for individual functions and specific PICs, and also allows the user to access and modify portions of the user's context model pertinent to product interest. For example, this is where a user can modify his or her address, credit card numbers and authorizations, shirt size, and so forth.
Profile manager module 306 presents various choices to the user, including: use profile (allows the user to select his or her profile, or one of his or her profiles, for use); change profile (allows the user to change information stored in his or her profile); view active/inactive PIC results (view any search results that have been received and stored by the PIC manager, e.g., because they were not supposed to be presented to the user yet); change active/inactive PIC status (allows the user to have multiple PICs defined and toggle individual PICs between an active status, causing searches to be performed based on the data in the PIC, and an inactive status, for which searches are not performed); initiate new PICs (allows the user to create a new PIC, such as by entering search terms (key words)); and help (makes a user help function available to the user).
PIC builder module 308 allows the user to generate new PICs and modify existing PICs. Once the user has generated a PIC, he or she can set the PIC status to active, causing PIC manager 212 to share the PIC with specified agencies, or whomever is interested, and has a compatible information description data store. PIC builder module 308 provides interfaces to help the user create PICs. In one implementation, PIC builder module 308 provides both blank PIC forms and default PIC forms to aid the user in the PIC creation process.
Blank PIC forms can be built from scratch using menus, tool bars, and other UI elements providing prompts for elemental PIC form fields (common fields like time, location, price, store, quality, material, and so forth), query building logic (AND, OR, NOT, SIMILAR, ONLY, ALL, INCLUDING, wildcards, and so forth). Blank forms can include automatically visible or hidden fields with values included derived from the context model.
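The query building logic listed above (AND, OR, NOT, wildcards) can be sketched as a small recursive evaluator. This is a hypothetical illustration of such logic, not the disclosure's actual query engine; the nested-tuple query shape is an assumption of this sketch.

```python
import fnmatch

def evaluate(query, text: str) -> bool:
    """Evaluate a nested query against a text value.

    Queries are ("AND", q1, q2, ...), ("OR", q1, q2, ...), ("NOT", q),
    or a glob pattern string matched against whitespace-delimited words.
    """
    if isinstance(query, str):
        return any(fnmatch.fnmatch(word, query) for word in text.lower().split())
    op, *args = query
    if op == "AND":
        return all(evaluate(a, text) for a in args)
    if op == "OR":
        return any(evaluate(a, text) for a in args)
    if op == "NOT":
        return not evaluate(args[0], text)
    raise ValueError("unknown operator: " + op)

query = ("AND", "leather", ("NOT", "black"))
```

Evaluating this query against "red leather jacket" succeeds, while "black leather jacket" is rejected by the NOT clause.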
Default PIC Forms are forms that are at least partly filled in, relating to specific information categories. For example, there can be a default PIC form for “New Car”, which would present fields that are useful in specifying a car of interest.
By default, PIC forms do not show form fields that the context awareness model has values available for. These fields can be automatically filled in for the user, thereby freeing him or her of the time needed to do so (and even the knowledge that they are being filled in). Alternatively, these fields can be displayed, or displayed only under certain circumstances. For example, a context model provided by a company may include fields used for accounting, security, and performance measurement that cannot be displayed with default user privilege.
As there are many product area forms potentially useful to a user, organization and search capabilities such as keyword search, graphic information-tree traversal, and many other techniques as provided in file browsers, Internet search engines, and online broadcast program schedules may optionally be made available by PIC builder 308.
Additionally, PICs can include specification of when and how a PIC result should be presented. This specification of when and how PIC results should be presented is limited only by availability of criteria in the context model. However, since the context awareness model is user extensible, users are free to add new model attributes. For example, a user may purchase for his or her car a device that allows him or her to use an alternative fuel. The user could then add to his or her context model a new attribute/field, associated with other attributes related to his or her car, having an indication of interest/desirability/ability to use this alternative fuel. Now a PIC can be created that shows the user a list of sources of the fuel within the limits of the car's fuel-determined cruising range.
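The alternative-fuel example above can be sketched in code: the user extends the context model with a new attribute, and a PIC query then filters fuel sources against both the new attribute and the car's cruising range. All attribute names and data shapes here are illustrative assumptions.

```python
# A fragment of a user-extensible context model (hypothetical attribute names).
context_model = {
    "car.fuel_type": "gasoline",
    "car.cruising_range_km": 480,
}

# After retrofitting the car, the user adds a new attribute to the model.
context_model["car.alt_fuel.e85_capable"] = True

def fuel_sources_in_range(sources, model):
    """Keep only fuel sources the car can use and can actually reach."""
    return [
        s for s in sources
        if s["fuel"] == "E85"
        and model.get("car.alt_fuel.e85_capable")
        and s["distance_km"] <= model["car.cruising_range_km"]
    ]
```

A PIC built over the extended model can thus surface exactly the list of reachable alternative-fuel sources the passage describes.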
New PIC data store 352 is used to generate a unique PIC. Data store 352 can contain different types of information, such as information provided by the user to characterize new information (e.g., a new product) of interest. Data store 352 may also include information previously provided by the user to characterize other information (e.g., product(s)) of interest. This information may be included because the user indicated a desire to have PICs with similar fields share values as default. Additionally, system-suggested information may also be included. For example, based on previous PICs, the system can suggest PIC fields and values based on previous user behavior. A more detailed explanation of such predictive behavior can be found in a co-pending U.S. patent application Ser. No. 09/825,159, entitled “Thematic Response To A Computer User's Context, Such As By A Wearable Personal Computer” to James O. Robarts and Eric Matteson, which was filed Apr. 2, 2001, and is commonly assigned to Tangis Corporation. This application is hereby incorporated by reference.
Previous PIC data store 354 includes all PICs generated by the user, either active or inactive, until deleted by the user. These are available for modification (or change of status), as well as for reference when generating new PICs.
User profile PIC data store 356 contains product-independent information. Examples of the type of information contained include: user identification information (e.g., name, alias, anonymizing ID, etc.); financial transaction data (e.g., credit card data, bank account data, authorizations (such as list of trusted institutions, indication of whether explicit user verification is required), transaction limits, etc.); authorizations (e.g., indication of trust per external institution or person, default permissions, permission overrides, need for accounting logs, etc.); and so forth.
Generic product characterization data store 358 allows the user to rely on recognition rather than recall to create a PIC. This is valuable because the PIC fields required for the precise characterization of a product interest differ significantly across different types of products, are numerous, and can change over time. Therefore, a generalized taxonomy of generic products is provided, that can be navigated (e.g., hierarchically, graphically with pseudo-spatial relationships, keyword searched, and so forth) similarly to actual product catalogs (e.g., online Yellow Pages). As the user traverses the data store, he or she can both be learning about general product characteristics (new luxury SUVs are available from which manufacturers, in a bounded price range), and providing candidate fields and values for the PIC Builder (for storage in data store 358).
Navigation preferences data store 360 maintains a record of the explicit and inferred user preferences for using the generic product characterization data store 358. Examples of such records include: navigation preferences (e.g., showing a hierarchical tree organized by color, building Boolean logic with compositing transparent filter frames like conventional Magic Lens filters, etc.); previously explored areas of the data store (e.g., previously navigated links shown in a different color); and so forth.
Generic product preferences data store 362 records a user's indication that a particular generically described product is of interest.
Log 364 lists all previously used PIC fields. Log 364 can combine values from previous PICs 354, Generic Product Preferences 362, and inferred product interest from the Pattern Recognizer.
Returning to
Once sent, the PIC is compared to data in those information sources, and an indication of any match returned to PIC receiver 312. The matching information is then returned by PIC receiver 312 to presentation manager 314 for presentation to the user. PIC receiver 312 is responsible for handling the communications from the content sources, and is concerned with the content itself. For instance, PIC receiver 312 could stack rank the received data. Presentation manager 314, on the other hand, deals primarily with the presentation of the data (e.g., is there a display available? Can it handle color?).
What is received by PIC receiver 312 may be a completed PIC, a data packet associated with a PIC, or even a product itself (e.g., radio cablecast link, MPEG file, etc.). PIC receiver 312 can optionally combine the results of multiple active PICs. For instance, a user may send two similar PICs: one for a new car and one for used cars.
PIC receiver 312 handles all solicited information, optionally verifying that received information matches an active PIC, and storing the information for immediate or delayed presentation. PIC receiver 312 can be explicitly configured by the user and/or determined by rules embedded in the context model and PIC Manager. In one implementation, PIC manager 212 is an extension to the general context model 214 of
PIC receiver 312 may also optionally include an appropriateness filter that is used to determine whether the query results are returned to presentation manager 314, or the appropriateness filter may be a separate component of PIC manager 212. In some situations, the filter may not be needed. For example, a PIC may be submitted both to a broker trusted not to provide information inappropriate to children, and to other product information sources. It may not be necessary to have the trusted PIC results filtered for inappropriate content, while other results are filtered for inappropriate content.
Additionally, the appropriateness filter may be used by PIC receiver 312 to defer delivery of query results. For example, the user may have insufficient attention to attend to them because he or she is working on other tasks, is sleeping, etc. In this case the query results are available to the user if he or she decides to view them, provided that doing so does not violate some other context model rule (for example, it may be unsafe to do so because user is driving in heavy traffic, or system may have security schemes that only allow the use of PICs during certain hours, or at specified locations, or while performing particular tasks).
In addition, PIC receiver 312 may use the user context to determine how filters are applied to the content. For example, a user may generate a PIC for information about changing a flat tire. However, the search may take a long time, and the results may not be returned to PIC manager 212 until after the user has fixed the flat tire. In this situation, the appropriateness filter can filter out the search results and not have them presented to the user because, based on the user context (the flat tire having been fixed), the search results are no longer important to the user.
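The context-driven filtering and deferral described above can be sketched as follows. The context keys and filtering rules here are illustrative assumptions, not the patent's design: results are dropped when the triggering task is already complete (the flat-tire case), deferred when the user lacks attention, and withheld entirely when presentation would be unsafe.

```python
# Illustrative appropriateness filter. Context keys ("driving_heavy_traffic",
# "busy", "task_completed") are assumed names for this sketch.

def filter_results(results, context):
    """Split results into (deliver, deferred) according to user context."""
    if context.get("driving_heavy_traffic"):
        return [], results            # unsafe to present now: defer everything
    deliver, deferred = [], []
    for r in results:
        if context.get("task_completed", {}).get(r["topic"]):
            continue                  # stale: e.g., the flat tire is fixed
        if context.get("busy"):
            deferred.append(r)        # insufficient attention: hold for later
        else:
            deliver.append(r)
    return deliver, deferred

deliver, deferred = filter_results(
    [{"topic": "flat_tire", "text": "repair steps"}],
    {"task_completed": {"flat_tire": True}},
)
```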
PIC receiver 312 (or alternatively presentation manager 314) may also communicate with context awareness model 214 or characterization module 216 of
Returning to act 404, if the user is not a first time user of the system, then a check is made as to whether the user desires a particular user profile (act 412). If no user profile is desired, then processing is handed to the PIC builder for generation of an unpersonalized PIC (act 410). However, if a user profile is desired, then the user is verified as an authorized user of the profile (act 414), such as by user ID and password. Processing is then handed to the profile manager for generation of a personalized PIC (act 416).
When the PIC is created, an indication of what to do when a correlation is found can be included. Some of the options include:
Once generated, the PIC is communicated by PIC manager 452 to a PIC receiver 454 at PIC broker 450. The PICs 456 from this user, as well as other PICs 458 from other users, are made available to correlation logic 460. Correlation logic 460 compares the search criteria in the PICs 456 and 458 to multiple product characterizations 462(1), 462(2), 462(3), . . . , 462(X). Any of the product characterizations 462 that satisfy the search criteria are communicated to the product provider(s) 464 corresponding to the matching product characterization(s), which in turn provide the corresponding product information (or the product itself) to the user computing resources 466 (e.g., a client 106 of
Different components besides PIC broker 450 may be used to provide information or products. For example, the functionality of PIC broker 450 may alternatively be provided by product provider(s) 464. By way of another example, “agents” may be used. Agents are semi-autonomous software objects that are less constrained than a PIC broker in that they can theoretically reach a broader range of product description sources. They may therefore provide a more complete set of query results. However, unless they coordinate with PIC providers on the definition of product interests or descriptions, they may not be as precise. Further, since the source of the agent, and what it returns, may not be as controlled as with a PIC broker, the results may not be as appropriate.
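The correlation step performed at PIC broker 450, described above, can be sketched as a match of each PIC's search criteria against each product characterization. The field names and the exact-match rule are illustrative assumptions; a real broker could use richer predicates (ranges, fuzzy matches, affinity scores).

```python
# Sketch of correlation logic 460: find product characterizations that
# satisfy every search criterion in a PIC. Field names are assumptions.

def correlate(pics, product_characterizations):
    """Return (user, product id) pairs whose criteria are all satisfied."""
    matches = []
    for pic in pics:
        for product in product_characterizations:
            if all(product.get(k) == v for k, v in pic["criteria"].items()):
                matches.append((pic["user"], product["id"]))
    return matches

matches = correlate(
    [{"user": "u1", "criteria": {"category": "car seat", "new": True}}],
    [{"id": 1, "category": "car seat", "new": True},
     {"id": 2, "category": "stroller", "new": True}],
)
```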
Yet another example is a content aggregator. Much like a PIC broker, content aggregators can provide interfaces to their data stores that are compatible with the user's context model (or vice versa: any party can provide a dictionary and write the translation filter). In this scenario very tight control over the product descriptions, including availability, can be provided, ensuring timely and accurate product offers. There can also be cooperation between different user models. For example, a variety of affinity mechanisms may be used to suggest products that are similar to the ones requested. The user's context models can indicate, directly or through the PIC manager mechanism, whether this type of information is desired. The models can also cooperate by sharing the user's desire to have products with a high correlation (clearly satisfying the PIC) be automatically purchased. The information necessary for a purchase can also be exchanged securely.
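The automatic-purchase cooperation mentioned above amounts to a threshold rule over the correlation score. The threshold value, score scale, and function names below are assumptions for illustration only.

```python
# Illustrative auto-purchase decision: a product clearly satisfying the
# PIC (correlation above a user-set threshold) may be bought automatically
# when the user has opted in. The 0.95 threshold is an assumed value.

AUTO_PURCHASE_THRESHOLD = 0.95

def decide(correlation_score, auto_purchase_enabled):
    """Return the action for a matched product: purchase it or present it."""
    if auto_purchase_enabled and correlation_score >= AUTO_PURCHASE_THRESHOLD:
        return "purchase"
    return "present"
```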
Revenue can be generated in system 500 in a variety of different manners. For example, the customers or product suppliers 516 may pay for the ability to have their messages stored as part of messages 514, or may pay for each message forwarded to client 502. By way of another example, the user of client 502 may receive payment for the sale of his or her interest to remote service 504, or for his or her willingness to receive messages from remote service 504 (e.g., for each message presented to the user).
Various aspects of the solicitation of information described herein can be seen from the following examples. A first example is the purchase of a child car seat. Assume that a user has a computer that maintains an explicit, extensible, dynamic model of his or her context. The user has used this model to maintain a data store of personal, detailed, private information, as well as information that is shared with others and available publicly. He or she is about to become a first time parent, and wishes to purchase a car seat for the new infant, but does not have familiarity with car seats and does not have a lot of time for research. Further, he or she understands that there are likely trade-offs between product characteristics such as price and safety.
Using the Generic Product Description feature, the user can traverse a tree to locate a PIC form that gives the user the following blank fields:
If the user does not use the Generic Product Description feature, he or she can use a similar PIC if he or she had created one, or use a PIC form from some other source, or create one from scratch. If the user creates it from scratch, he or she could include the fields described above.
Regardless, the resulting PIC could have the following fields already filled out. They could be hidden by default, but can be viewed and modified if desired (assuming security authorizations permit modification):
Once the user is satisfied with his or her PIC, the PIC is submitted to outside product description data stores (information sources). This online distribution need not happen immediately; it can be delayed by some preset time or until a connection to remote computing/communication resources is available. Eventually, a match is found and the resulting car seat information is returned and presented to the user for him or her to make a purchase selection. Alternatively, the car seat could be automatically purchased on behalf of the user from a product provider and delivered to the user (e.g., via a mail service) or made available for the user's pickup.
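A PIC of the kind described in this example can be pictured as a small record combining user-entered fields, fields pre-filled from the context model (hidden by default but viewable and, security permitting, editable), and a submission policy allowing delayed distribution. All field names and the "submit_policy" value below are illustrative assumptions.

```python
# Sketch of a PIC record for the car-seat example. Field names, the
# hidden/pre-filled split, and the submit policy are assumptions.

pic = {
    "product_category": "infant car seat",
    "user_fields": {"max_price": None, "safety_rating": None},
    "prefilled": {                       # filled from the context model;
        "vehicle_model": "sedan",        # hidden by default, but viewable
        "delivery_address": "home",      # subject to security authorizations
    },
    "submit_policy": "on_next_connection",  # distribution may be delayed
}

def visible_fields(pic, show_hidden=False):
    """Return the fields a user sees, optionally revealing hidden ones."""
    fields = dict(pic["user_fields"])
    if show_hidden:
        fields.update(pic["prefilled"])
    return fields
```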
Another example of a PIC, which makes use of a context model, is searching for a movie. The fields in the PIC include:
Another example of a PIC which makes use of a context model is a PIC for obtaining a repair procedure. Assume the user is driving in a remote area and has an auto (vehicle) breakdown. The user's PIC is a request for assistance. Appropriate responses would be repair advice in written form from the manufacturer or another expertise publisher (e.g., Chiltons), advice from a remote expert (via cell phone), or a listing of the closest service stations/towing services (phone number, hours, rates). Fields in the PIC include:
Another example of a PIC, which can be sent without user interaction or verification (approval is given beforehand within rule logic), is a medical emergency PIC. Fields in the PIC include:
Although the description above uses language that is specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the invention.
This application claims priority to provisional application No. 60/194,000, filed Apr. 2, 2000, which is hereby incorporated by reference. This application also claims priority to provisional application No. 60/194,758, filed Apr. 9, 2000, which is hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
3973251 | Stephans | Aug 1976 | A |
4283712 | Goody | Aug 1981 | A |
4458331 | Amezcua et al. | Jul 1984 | A |
4569026 | Best | Feb 1986 | A |
4815030 | Cross et al. | Mar 1989 | A |
4905163 | Garber et al. | Feb 1990 | A |
4916441 | Gombrich | Apr 1990 | A |
4970683 | Harshaw et al. | Nov 1990 | A |
4991087 | Burkowski et al. | Feb 1991 | A |
5032083 | Friedman | Jul 1991 | A |
5133075 | Risch | Jul 1992 | A |
5201034 | Matsuura et al. | Apr 1993 | A |
5208449 | Eastman et al. | May 1993 | A |
5214757 | Mauney et al. | May 1993 | A |
5227614 | Danielson et al. | Jul 1993 | A |
5237684 | Record et al. | Aug 1993 | A |
5251294 | Abelow | Oct 1993 | A |
5267147 | Harshaw et al. | Nov 1993 | A |
5278946 | Shimada et al. | Jan 1994 | A |
5285398 | Janik | Feb 1994 | A |
5317568 | Bixby et al. | May 1994 | A |
5327529 | Fults et al. | Jul 1994 | A |
5335276 | Thompson et al. | Aug 1994 | A |
5339395 | Pickett et al. | Aug 1994 | A |
5349654 | Bond et al. | Sep 1994 | A |
5353399 | Kuwamoto et al. | Oct 1994 | A |
5388198 | Layman et al. | Feb 1995 | A |
5398021 | Moore | Mar 1995 | A |
5416730 | Lookofsky | May 1995 | A |
5454074 | Hartel et al. | Sep 1995 | A |
5470233 | Fruchterman et al. | Nov 1995 | A |
5471629 | Risch | Nov 1995 | A |
5481667 | Bieniek et al. | Jan 1996 | A |
5493692 | Theimer et al. | Feb 1996 | A |
5506580 | Whiting et al. | Apr 1996 | A |
5513646 | Lehrman et al. | May 1996 | A |
5522026 | Records et al. | May 1996 | A |
5535323 | Miller et al. | Jul 1996 | A |
5537618 | Boulton et al. | Jul 1996 | A |
5539665 | Lamming et al. | Jul 1996 | A |
5544321 | Theimer et al. | Aug 1996 | A |
5553609 | Chen et al. | Sep 1996 | A |
5555376 | Theimer et al. | Sep 1996 | A |
5559520 | Barzegar et al. | Sep 1996 | A |
5560012 | Ryu et al. | Sep 1996 | A |
5566337 | Szymanski et al. | Oct 1996 | A |
5568645 | Morris et al. | Oct 1996 | A |
5572401 | Carroll | Nov 1996 | A |
5592664 | Starkey | Jan 1997 | A |
5601435 | Quy | Feb 1997 | A |
5603054 | Theimer et al. | Feb 1997 | A |
5611050 | Theimer et al. | Mar 1997 | A |
5642303 | Small et al. | Jun 1997 | A |
5646629 | Loomis et al. | Jul 1997 | A |
5659746 | Bankert et al. | Aug 1997 | A |
5675358 | Bullock et al. | Oct 1997 | A |
5689619 | Smyth | Nov 1997 | A |
5689708 | Regnier et al. | Nov 1997 | A |
5701894 | Cherry et al. | Dec 1997 | A |
5704366 | Tacklind et al. | Jan 1998 | A |
5710884 | Dedrick | Jan 1998 | A |
5715451 | Marlin | Feb 1998 | A |
5717747 | Boyle, III et al. | Feb 1998 | A |
5719744 | Jenkins et al. | Feb 1998 | A |
5726660 | Purdy et al. | Mar 1998 | A |
5726688 | Siefert et al. | Mar 1998 | A |
5738102 | Lemelson | Apr 1998 | A |
5740037 | McCann et al. | Apr 1998 | A |
5742279 | Yamamoto et al. | Apr 1998 | A |
5745110 | Ertemalp | Apr 1998 | A |
5751260 | Nappi et al. | May 1998 | A |
5752019 | Rigoutsos et al. | May 1998 | A |
5754938 | Herz et al. | May 1998 | A |
5761662 | Dasan | Jun 1998 | A |
5769085 | Kawakami et al. | Jun 1998 | A |
5781913 | Felsenstein et al. | Jul 1998 | A |
5787234 | Molloy | Jul 1998 | A |
5787279 | Rigoutsos | Jul 1998 | A |
5790974 | Tognazzini | Aug 1998 | A |
5796952 | Davis et al. | Aug 1998 | A |
5798733 | Ethridge | Aug 1998 | A |
5806079 | Rivette et al. | Sep 1998 | A |
5812865 | Theimer et al. | Sep 1998 | A |
5818446 | Bertram et al. | Oct 1998 | A |
5826253 | Bredenberg | Oct 1998 | A |
5831594 | Tognazzini et al. | Nov 1998 | A |
5832296 | Wang et al. | Nov 1998 | A |
5835087 | Herz et al. | Nov 1998 | A |
5852814 | Allen | Dec 1998 | A |
5867171 | Murata et al. | Feb 1999 | A |
5873070 | Bunte et al. | Feb 1999 | A |
5878274 | Kono et al. | Mar 1999 | A |
5879163 | Brown et al. | Mar 1999 | A |
5881231 | Takagi et al. | Mar 1999 | A |
5899963 | Hutchings | May 1999 | A |
5902347 | Backman et al. | May 1999 | A |
5905492 | Straub et al. | May 1999 | A |
5910799 | Carpenter et al. | Jun 1999 | A |
5911132 | Sloane | Jun 1999 | A |
5913030 | Lotspiech et al. | Jun 1999 | A |
5924074 | Evans | Jul 1999 | A |
5930501 | Neil | Jul 1999 | A |
5937160 | Davis et al. | Aug 1999 | A |
5938721 | Dussell et al. | Aug 1999 | A |
5942986 | Shabot et al. | Aug 1999 | A |
5945988 | Williams et al. | Aug 1999 | A |
5948041 | Abo et al. | Sep 1999 | A |
5953718 | Wical | Sep 1999 | A |
5959611 | Smailagic et al. | Sep 1999 | A |
5963914 | Skinner et al. | Oct 1999 | A |
5966126 | Szabo | Oct 1999 | A |
5966533 | Moody | Oct 1999 | A |
5966710 | Burrows | Oct 1999 | A |
5971580 | Hall et al. | Oct 1999 | A |
5974262 | Fuller et al. | Oct 1999 | A |
5977968 | Le Blanc | Nov 1999 | A |
5980096 | Thalhammer-Reyero | Nov 1999 | A |
5983335 | Dwyer, III | Nov 1999 | A |
5991687 | Hale et al. | Nov 1999 | A |
5991735 | Gerace | Nov 1999 | A |
5995956 | Nguyen | Nov 1999 | A |
5999932 | Paul | Dec 1999 | A |
5999943 | Nori et al. | Dec 1999 | A |
5999975 | Kittaka et al. | Dec 1999 | A |
6003082 | Gampper et al. | Dec 1999 | A |
6006251 | Toyouchi et al. | Dec 1999 | A |
6012152 | Douik et al. | Jan 2000 | A |
6014638 | Burge et al. | Jan 2000 | A |
6023729 | Samuel et al. | Feb 2000 | A |
6031455 | Grube et al. | Feb 2000 | A |
6035264 | Donaldson et al. | Mar 2000 | A |
6041331 | Weiner et al. | Mar 2000 | A |
6041365 | Kleinerman | Mar 2000 | A |
6044415 | Futral et al. | Mar 2000 | A |
6047301 | Bjorklund et al. | Apr 2000 | A |
6047327 | Tso et al. | Apr 2000 | A |
6055516 | Johnson et al. | Apr 2000 | A |
6061610 | Boer | May 2000 | A |
6061660 | Eggleston et al. | May 2000 | A |
6064943 | Clark, Jr. et al. | May 2000 | A |
6067084 | Fado et al. | May 2000 | A |
6081814 | Mangat et al. | Jun 2000 | A |
6085086 | La Porta et al. | Jul 2000 | A |
6088689 | Kohn et al. | Jul 2000 | A |
6091411 | Straub et al. | Jul 2000 | A |
6092101 | Birrell et al. | Jul 2000 | A |
6094625 | Ralston | Jul 2000 | A |
6098065 | Skillen et al. | Aug 2000 | A |
6105063 | Hayes, Jr. | Aug 2000 | A |
6108197 | Janik | Aug 2000 | A |
6108665 | Bair et al. | Aug 2000 | A |
6112246 | Horbal et al. | Aug 2000 | A |
6122348 | French-St. George et al. | Sep 2000 | A |
6122657 | Hoffman, Jr. et al. | Sep 2000 | A |
6122960 | Hutchings et al. | Sep 2000 | A |
6127990 | Zwern | Oct 2000 | A |
6128663 | Thomas | Oct 2000 | A |
6131067 | Girerd et al. | Oct 2000 | A |
6134532 | Lazarus et al. | Oct 2000 | A |
6154745 | Kari et al. | Nov 2000 | A |
6155960 | Roberts et al. | Dec 2000 | A |
6164541 | Dougherty et al. | Dec 2000 | A |
6169976 | Colosso | Jan 2001 | B1 |
6185534 | Breese et al. | Feb 2001 | B1 |
6188399 | Voas et al. | Feb 2001 | B1 |
6195622 | Altschuler et al. | Feb 2001 | B1 |
6198394 | Jacobsen et al. | Mar 2001 | B1 |
6199099 | Gershman et al. | Mar 2001 | B1 |
6199102 | Cobb | Mar 2001 | B1 |
6215405 | Handley et al. | Apr 2001 | B1 |
6218958 | Eichstaedt et al. | Apr 2001 | B1 |
6230111 | Mizokawa | May 2001 | B1 |
6236768 | Rhodes et al. | May 2001 | B1 |
6256633 | Dharap | Jul 2001 | B1 |
6262720 | Jeffrey et al. | Jul 2001 | B1 |
6263268 | Nathanson | Jul 2001 | B1 |
6263317 | Sharp et al. | Jul 2001 | B1 |
6272470 | Teshima | Aug 2001 | B1 |
6272507 | Pirolli et al. | Aug 2001 | B1 |
6282517 | Wolfe et al. | Aug 2001 | B1 |
6282537 | Madnick et al. | Aug 2001 | B1 |
6285757 | Carroll et al. | Sep 2001 | B1 |
6285889 | Nykanen et al. | Sep 2001 | B1 |
6289316 | Aghili et al. | Sep 2001 | B1 |
6289513 | Bentwich | Sep 2001 | B1 |
6292796 | Drucker et al. | Sep 2001 | B1 |
6294953 | Steeves | Sep 2001 | B1 |
6301609 | Aravamudan et al. | Oct 2001 | B1 |
6305007 | Mintz | Oct 2001 | B1 |
6305221 | Hutchings | Oct 2001 | B1 |
6308203 | Itabashi et al. | Oct 2001 | B1 |
6311162 | Reichwein et al. | Oct 2001 | B1 |
6314384 | Goetz | Nov 2001 | B1 |
6317718 | Fano | Nov 2001 | B1 |
6321158 | DeLorme et al. | Nov 2001 | B1 |
6321279 | Bonola | Nov 2001 | B1 |
6324569 | Ogilvie et al. | Nov 2001 | B1 |
6327535 | Evans et al. | Dec 2001 | B1 |
6349307 | Chen | Feb 2002 | B1 |
6353398 | Amin et al. | Mar 2002 | B1 |
6353823 | Kumar | Mar 2002 | B1 |
6356905 | Gershman et al. | Mar 2002 | B1 |
6363377 | Kravets et al. | Mar 2002 | B1 |
6385589 | Trusheim et al. | May 2002 | B1 |
6392670 | Takeuchi et al. | May 2002 | B1 |
6401085 | Gershman et al. | Jun 2002 | B1 |
6405159 | Bushey et al. | Jun 2002 | B2 |
6405206 | Kayahara | Jun 2002 | B1 |
6418424 | Hoffberg et al. | Jul 2002 | B1 |
6421700 | Holmes et al. | Jul 2002 | B1 |
6427142 | Zachary et al. | Jul 2002 | B1 |
6430531 | Polish | Aug 2002 | B1 |
6438618 | Lortz et al. | Aug 2002 | B1 |
6442549 | Schneider | Aug 2002 | B1 |
6442589 | Takahashi et al. | Aug 2002 | B1 |
6442620 | Thatte et al. | Aug 2002 | B1 |
6446076 | Burkey et al. | Sep 2002 | B1 |
6446109 | Gupta | Sep 2002 | B2 |
6460036 | Herz | Oct 2002 | B1 |
6462759 | Kurtzberg et al. | Oct 2002 | B1 |
6466232 | Newell et al. | Oct 2002 | B1 |
6477117 | Narayanaswami et al. | Nov 2002 | B1 |
6483485 | Huang et al. | Nov 2002 | B1 |
6484200 | Angal et al. | Nov 2002 | B1 |
6487552 | Lei et al. | Nov 2002 | B1 |
6490579 | Gao et al. | Dec 2002 | B1 |
6505196 | Drucker et al. | Jan 2003 | B2 |
6507567 | Willars | Jan 2003 | B1 |
6507845 | Cohen et al. | Jan 2003 | B1 |
6513046 | Abbott, III et al. | Jan 2003 | B1 |
6519552 | Sampath et al. | Feb 2003 | B1 |
6526035 | Atarius et al. | Feb 2003 | B1 |
6529723 | Bentley | Mar 2003 | B1 |
6539336 | Vock et al. | Mar 2003 | B1 |
6542889 | Aggarwal et al. | Apr 2003 | B1 |
6546425 | Hanson et al. | Apr 2003 | B1 |
6546554 | Schmidt et al. | Apr 2003 | B1 |
6549915 | Abbott, III et al. | Apr 2003 | B2 |
6549944 | Weinberg et al. | Apr 2003 | B1 |
6553336 | Johnson et al. | Apr 2003 | B1 |
6563430 | Kemink et al. | May 2003 | B1 |
6568595 | Russell et al. | May 2003 | B1 |
6571279 | Herz et al. | May 2003 | B1 |
6578019 | Suda et al. | Jun 2003 | B1 |
6615197 | Chai | Sep 2003 | B1 |
6625135 | Johnson et al. | Sep 2003 | B1 |
6636831 | Profit, Jr. et al. | Oct 2003 | B1 |
6643684 | Malkin et al. | Nov 2003 | B1 |
6652283 | Van Schaack et al. | Nov 2003 | B1 |
6661437 | Miller et al. | Dec 2003 | B1 |
6672506 | Swartz et al. | Jan 2004 | B2 |
6697836 | Kawano et al. | Feb 2004 | B1 |
6704722 | Wang Baldonado | Mar 2004 | B2 |
6704785 | Koo et al. | Mar 2004 | B1 |
6704812 | Bakke et al. | Mar 2004 | B2 |
6707476 | Hochstedler | Mar 2004 | B1 |
6712615 | Martin | Mar 2004 | B2 |
6714977 | Fowler et al. | Mar 2004 | B1 |
6738040 | Jahn et al. | May 2004 | B2 |
6738759 | Wheeler et al. | May 2004 | B1 |
6741188 | Miller et al. | May 2004 | B1 |
6741610 | Volftsun et al. | May 2004 | B1 |
6747675 | Abbott et al. | Jun 2004 | B1 |
6751620 | Orbanes et al. | Jun 2004 | B2 |
6766245 | Padmanabhan | Jul 2004 | B2 |
D494584 | Schlieffers et al. | Aug 2004 | S |
6791580 | Abbott et al. | Sep 2004 | B1 |
6795806 | Lewis et al. | Sep 2004 | B1 |
6796505 | Pellaumail et al. | Sep 2004 | B2 |
6801223 | Abbott et al. | Oct 2004 | B1 |
6812937 | Abbott et al. | Nov 2004 | B1 |
6829639 | Lawson et al. | Dec 2004 | B1 |
6834195 | Brandenberg et al. | Dec 2004 | B2 |
6834208 | Gonzales et al. | Dec 2004 | B2 |
6837436 | Swartz et al. | Jan 2005 | B2 |
6842877 | Robarts et al. | Jan 2005 | B2 |
6850252 | Hoffberg | Feb 2005 | B1 |
6853966 | Bushey et al. | Feb 2005 | B2 |
6868525 | Szabo | Mar 2005 | B1 |
6874017 | Inoue et al. | Mar 2005 | B1 |
6874127 | Newell et al. | Mar 2005 | B2 |
6885734 | Eberle et al. | Apr 2005 | B1 |
6899539 | Stallman et al. | May 2005 | B1 |
6963899 | Fernandez et al. | Nov 2005 | B1 |
6968333 | Abbott et al. | Nov 2005 | B2 |
7000187 | Messinger et al. | Feb 2006 | B2 |
7010501 | Roslak et al. | Mar 2006 | B1 |
7010603 | Martin, Jr. et al. | Mar 2006 | B2 |
7040541 | Swartz et al. | May 2006 | B2 |
7046263 | Abbott et al. | May 2006 | B1 |
7055101 | Abbott et al. | May 2006 | B2 |
7058893 | Abbott et al. | Jun 2006 | B2 |
7058894 | Abbott et al. | Jun 2006 | B2 |
7062715 | Abbott et al. | Jun 2006 | B2 |
7063263 | Swartz et al. | Jun 2006 | B2 |
7076737 | Abbott et al. | Jul 2006 | B2 |
7080322 | Abbott et al. | Jul 2006 | B2 |
7089497 | Abbott et al. | Aug 2006 | B2 |
7096253 | Vinson et al. | Aug 2006 | B2 |
7103806 | Horvitz | Sep 2006 | B1 |
7107539 | Abbott et al. | Sep 2006 | B2 |
7110764 | Blair et al. | Sep 2006 | B1 |
7120558 | McIntyre et al. | Oct 2006 | B2 |
7124125 | Cook et al. | Oct 2006 | B2 |
7137069 | Abbott et al. | Nov 2006 | B2 |
7155456 | Abbott, III et al. | Dec 2006 | B2 |
7162473 | Dumais et al. | Jan 2007 | B2 |
7171378 | Petrovich et al. | Jan 2007 | B2 |
7195157 | Swartz et al. | Mar 2007 | B2 |
7203906 | Abbott et al. | Apr 2007 | B2 |
7225229 | Abbott et al. | May 2007 | B1 |
7231439 | Abbott et al. | Jun 2007 | B1 |
7260453 | Poier et al. | Aug 2007 | B2 |
7349894 | Barth et al. | Mar 2008 | B2 |
7360152 | Capps et al. | Apr 2008 | B2 |
7385501 | Miller et al. | Jun 2008 | B2 |
7386477 | Fano | Jun 2008 | B2 |
7392486 | Gyde et al. | Jun 2008 | B1 |
7395221 | Doss et al. | Jul 2008 | B2 |
7444594 | Abbott et al. | Oct 2008 | B2 |
7464153 | Abbott et al. | Dec 2008 | B1 |
7512889 | Newell et al. | Mar 2009 | B2 |
7533052 | Tilfors et al. | May 2009 | B2 |
7533082 | Abbott et al. | May 2009 | B2 |
7561200 | Garvey, III et al. | Jul 2009 | B2 |
7571218 | Tanaka et al. | Aug 2009 | B2 |
7614001 | Abbott et al. | Nov 2009 | B2 |
7647400 | Abbott et al. | Jan 2010 | B2 |
7689919 | Abbott et al. | Mar 2010 | B2 |
7734780 | Abbott et al. | Jun 2010 | B2 |
7739607 | Abbott et al. | Jun 2010 | B2 |
7779015 | Abbott et al. | Aug 2010 | B2 |
7827281 | Abbott et al. | Nov 2010 | B2 |
7877686 | Abbott et al. | Jan 2011 | B2 |
7945859 | Abbott et al. | May 2011 | B2 |
20010030664 | Shulman et al. | Oct 2001 | A1 |
20010040590 | Abbott et al. | Nov 2001 | A1 |
20010040591 | Abbott et al. | Nov 2001 | A1 |
20010043231 | Abbott et al. | Nov 2001 | A1 |
20010043232 | Abbott et al. | Nov 2001 | A1 |
20020032689 | Abbott et al. | Mar 2002 | A1 |
20020044152 | Abbott et al. | Apr 2002 | A1 |
20020052930 | Abbott et al. | May 2002 | A1 |
20020052963 | Abbott et al. | May 2002 | A1 |
20020054130 | Abbott et al. | May 2002 | A1 |
20020054174 | Abbott et al. | May 2002 | A1 |
20020078204 | Newell et al. | Jun 2002 | A1 |
20020080155 | Abbott et al. | Jun 2002 | A1 |
20020080156 | Abbott et al. | Jun 2002 | A1 |
20020083025 | Robarts et al. | Jun 2002 | A1 |
20020083158 | Abbott et al. | Jun 2002 | A1 |
20020087525 | Abbott et al. | Jul 2002 | A1 |
20020099817 | Abbott et al. | Jul 2002 | A1 |
20020147880 | Wang Baldonado | Oct 2002 | A1 |
20020191034 | Sowizral et al. | Dec 2002 | A1 |
20030046401 | Abbott et al. | Mar 2003 | A1 |
20030154476 | Abbott et al. | Aug 2003 | A1 |
20030186201 | Martin | Oct 2003 | A1 |
20030229900 | Reisman | Dec 2003 | A1 |
20040088328 | Cook et al. | May 2004 | A1 |
20040133600 | Homer | Jul 2004 | A1 |
20040186854 | Choi | Sep 2004 | A1 |
20040201500 | Miller et al. | Oct 2004 | A1 |
20040215663 | Liu et al. | Oct 2004 | A1 |
20040267700 | Dumais et al. | Dec 2004 | A1 |
20040267812 | Harris et al. | Dec 2004 | A1 |
20050027704 | Hammond et al. | Feb 2005 | A1 |
20050034078 | Abbott et al. | Feb 2005 | A1 |
20050066282 | Abbott et al. | Mar 2005 | A1 |
20050086243 | Abbott et al. | Apr 2005 | A1 |
20050160113 | Sipusic et al. | Jul 2005 | A1 |
20050165843 | Capps et al. | Jul 2005 | A1 |
20050193017 | Kim | Sep 2005 | A1 |
20050266858 | Miller et al. | Dec 2005 | A1 |
20050272442 | Miller et al. | Dec 2005 | A1 |
20060004680 | Robarts et al. | Jan 2006 | A1 |
20060019676 | Miller et al. | Jan 2006 | A1 |
20060136393 | Abbott et al. | Jun 2006 | A1 |
20060259494 | Watson et al. | Nov 2006 | A1 |
20070022384 | Abbott et al. | Jan 2007 | A1 |
20070043459 | Abbott et al. | Feb 2007 | A1 |
20070089067 | Abbott et al. | Apr 2007 | A1 |
20070130524 | Abbott et al. | Jun 2007 | A1 |
20070168502 | Abbott et al. | Jul 2007 | A1 |
20070185864 | Budzik et al. | Aug 2007 | A1 |
20070266318 | Abbott et al. | Nov 2007 | A1 |
20080090591 | Miller et al. | Apr 2008 | A1 |
20080091537 | Miller et al. | Apr 2008 | A1 |
20080147775 | Abbott et al. | Jun 2008 | A1 |
20080161018 | Miller et al. | Jul 2008 | A1 |
20080313271 | Abbott et al. | Dec 2008 | A1 |
20090013052 | Robarts et al. | Jan 2009 | A1 |
20090055752 | Abbott et al. | Feb 2009 | A1 |
20090094524 | Abbott et al. | Apr 2009 | A1 |
20090150535 | Abbott et al. | Jun 2009 | A1 |
20090228552 | Abbott et al. | Sep 2009 | A1 |
20090234878 | Herz et al. | Sep 2009 | A1 |
20100217862 | Abbott et al. | Aug 2010 | A1 |
20100257235 | Abbott et al. | Oct 2010 | A1 |
20100262573 | Abbott et al. | Oct 2010 | A1 |
Number | Date | Country |
---|---|---|
0 661 627 | Jul 1995 | EP |
0 759 591 | Feb 1997 | EP |
0 801 342 | Oct 1997 | EP |
0 823 813 | Feb 1998 | EP |
0 846 440 | Jun 1998 | EP |
0 924 615 | Jun 1999 | EP |
05-260188 | Oct 1993 | JP |
09-091112 | Apr 1997 | JP |
11-306002 | Nov 1999 | JP |
WO-9008361 | Jul 1990 | WO |
WO-9531773 | Nov 1995 | WO |
WO-9703434 | Jan 1997 | WO |
WO-9734388 | Sep 1997 | WO |
WO-9800787 | Jan 1998 | WO |
WO-9847084 | Oct 1998 | WO |
WO-9917228 | Apr 1999 | WO |
WO-9926180 | May 1999 | WO |
WO-9966394 | Dec 1999 | WO |
WO-9967698 | Dec 1999 | WO |
WO-0036493 | Jun 2000 | WO |
Number | Date | Country | |
---|---|---|---|
20090282030 A1 | Nov 2009 | US |
Number | Date | Country | |
---|---|---|---|
60194758 | Apr 2000 | US | |
60194000 | Apr 2000 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11179822 | Jul 2005 | US |
Child | 12464064 | US | |
Parent | 09824900 | Apr 2001 | US |
Child | 11179822 | US |