Methods and systems to process a selection of a browser back button

Information

  • Patent Grant
  • Patent Number
    11,455,680
  • Date Filed
    Friday, December 27, 2019
  • Date Issued
    Tuesday, September 27, 2022
Abstract
A method and system to process a selection of a browser back button at a client machine. The system receives a browser back button selection, associates the browser back button selection to a first user interface identifier, retrieves the first user interface based on the first user interface identifier, associates the first user interface identifier to a second user interface, and displays the second user interface responsive to selection of the browser back button.
Description
TECHNICAL FIELD

This application relates generally to the technical field of data communication and, in one example embodiment, methods and systems to process a selection of a browser back button.


BACKGROUND

A user that selects a browser back button is sometimes surprised to see their computer display a user interface that he or she did not expect. This problem may arise when a program (e.g., an applet) is loaded onto the user's computer (e.g., via a web page) and executes at the user's computer to update the display without communicating the update to the browser. Consequently, the browser may unwittingly retrieve a user interface that is not expected by the user in response to the user selecting the back button associated with the browser.


SUMMARY OF INVENTION

There is provided a system to process a selection of a browser back button at a client machine, the system including: at the client machine, a programmatic client to receive a browser back button selection, associate the browser back button selection to a first user interface identifier and retrieve the first user interface based on the first user interface identifier; and at the client machine, a client application program to associate the first user interface identifier to a second user interface and display the second user interface responsive to selection of the browser back button.


There is further provided a system to process a selection of a browser back button, the system including: at a server machine, a receiving module to receive a request for a user interface message that includes a client application program; and at the server machine, a communicating module to communicate the user interface message that includes the client application program to the client machine, the client application program to associate a first user interface identifier to a second user interface and to display the second user interface at the client machine responsive to a browser that receives a browser back button selection, associates the browser back button selection to the first user interface identifier and retrieves the first user interface based on the first user interface identifier.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 is a block diagram illustrating a system, according to an embodiment, to search a data resource;



FIG. 2 is a network diagram depicting a system, according to one embodiment, to search a data resource;



FIG. 3 is a block diagram illustrating multiple applications that, in one example embodiment of the present invention, are provided as part of the computer-based system;



FIG. 4 is a high-level entity-relationship diagram, illustrating various tables that are utilized by and support the network-based applications, according to an example embodiment of the present invention;



FIG. 5 is a block diagram illustrating a system, according to an embodiment, to facilitate a search of a data resource;



FIG. 6 is a block diagram illustrating search applications and search related data structures, according to an embodiment, to classify an information item;



FIG. 7 is a block diagram illustrating search metadata, according to an embodiment;



FIG. 8 is a block diagram illustrating a method, according to an embodiment, to facilitate searching a data resource;



FIG. 9 is a block diagram illustrating a method, according to an embodiment, to evaluate information with a classification rule;



FIG. 10 is a block diagram illustrating a method, according to an embodiment, to evaluate information with an inference rule;



FIG. 11 is a block diagram illustrating a system, according to an embodiment, to generate a query;



FIG. 12 is a block diagram illustrating search applications and search metadata, according to an embodiment;



FIG. 13 is a block diagram illustrating a method, according to an embodiment, to generate a query to search a data resource;



FIG. 14 is a block diagram illustrating a method, according to an embodiment, to determine domains based on a keyword query;



FIG. 15 is a block diagram illustrating a method, according to an embodiment, to determine selected characteristics based on a keyword query and a domain;



FIG. 16 is a block diagram illustrating a system, according to an embodiment, to identify data items for browsing;



FIG. 17 is a block diagram illustrating search applications and search metadata, according to an embodiment;



FIG. 18 is a block diagram illustrating a classification engine, according to an embodiment;



FIG. 19 is a block diagram illustrating a method, according to an embodiment, to identify data items for browsing;



FIG. 20 is a block diagram illustrating a method to generate a user interface based on selected characteristics, according to an embodiment;



FIG. 21 is a block diagram illustrating a method, according to an example embodiment, to determine a set of items based on selected characteristics;



FIG. 22 is a block diagram illustrating a method, according to an embodiment, to determine browsing sets;



FIG. 23 is a block diagram illustrating a method, according to an embodiment, to generate counts for browsing values;



FIG. 24 is a block diagram illustrating user interfaces and browser controls, according to an embodiment;



FIG. 25 is a block diagram illustrating a system, according to an embodiment, to process a browser back button;



FIG. 26 is a block diagram further illustrating software components associated with the client machine, according to an embodiment;



FIG. 27 is an interactive flow chart illustrating a method, according to an embodiment, to process a back button at a client machine;



FIG. 28 is an interactive flow chart illustrating a method, according to an embodiment, to request a user interface;



FIG. 29A illustrates a method, according to an embodiment, to process a user selection of the “more” user interface element;



FIG. 29B illustrates a method, according to an embodiment, to process a user selection of the “ALL” user interface element;



FIG. 29C illustrates a method, according to an embodiment, to process a user selection of the back button;



FIGS. 30-40 illustrate user interface screens, according to an example embodiment of the present invention; and



FIG. 41 illustrates a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.





DETAILED DESCRIPTION

Methods and systems to process a selection of a browser back button are described. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be evident, however, to one skilled in the art that the present invention may be practiced without these specific details.


According to a first aspect there is provided a method and system to facilitate searching of a data resource. The system receives information that is associated with an item from a seller and evaluates the received information in real time with a rule that includes an expression (e.g., Boolean) and supplemental information. If the expression evaluates true, the system associates and stores, in real time, the supplemental information with the received information in a data resource. The supplemental information may include classification information and inferential information. The classification information may be utilized to structure the information according to concepts that may be later utilized to search the information. The inferential information may be inferred from the received information (e.g., the color red may be inferred from the color ruby because ruby is a type of red) and may also be later utilized to search the information.


According to a second aspect there is provided a method and system to generate a query to search a data resource utilized by a seller and a buyer that transact an item in a network-based marketplace. The system receives a keyword query, from a buyer, and evaluates the keywords in the keyword query with a rule that includes an expression (e.g., Boolean) and classification information. If the expression evaluates true, then the system generates a concept query that corresponds to the keyword query and includes the classification information. The concept query may subsequently be used to search (e.g., query) information in a data resource that includes the classification information (e.g., according to the first aspect described above) that is generated from information that is entered by the seller.


According to a third aspect there is provided a method and system to enable a buyer to browse listings that are listed by sellers on a network-based marketplace. The system generates a user interface that displays a concept and multiple values that are associated with the concept. The system may receive two or more selections from the buyer that correspond to values that may be utilized to identify the listings on the network-based marketplace. For example, a user interface for shoes may include brands (e.g., concept) and brand names including Nike, Reebok, Keds, etc. (e.g., values). The buyer may then select two or more brand names that are received by the system and utilized by the system to identify and display shoes that exhibit the selected values (e.g., Nike and Reebok) based on information entered by the seller.


According to a fourth aspect there is provided a method and system to cancel characteristics used to identify data listings that are listed by sellers on a network-based marketplace. The system communicates a set of characteristics, previously selected by a user, that are utilized to identify data items. The user may then cancel a characteristic other than the most recently selected characteristic. In response, the system may utilize the remaining characteristics to identify data items that are determined to exhibit the remaining characteristics based on an evaluation of information that is entered by the seller.
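
The following TypeScript sketch illustrates one way such cancelable characteristics might be represented; the type and function names are illustrative assumptions, not elements recited by the claims.

```typescript
// A minimal sketch, assuming a characteristic is a concept-value pair;
// the names below are illustrative and not part of the disclosure.
type Characteristic = { concept: string; value: string };

// Any characteristic may be canceled, not merely the most recently
// selected one; the remaining characteristics would drive a fresh query.
function cancelCharacteristic(
  selected: Characteristic[],
  toCancel: Characteristic
): Characteristic[] {
  return selected.filter(
    (c) => !(c.concept === toCancel.concept && c.value === toCancel.value)
  );
}

const selected: Characteristic[] = [
  { concept: "Brand", value: "Nike" },
  { concept: "Color", value: "Black" },
  { concept: "Shoe Size", value: "8" },
];

// Cancel the first (not the most recent) selection.
console.log(cancelCharacteristic(selected, selected[0]));
// [ { concept: 'Color', value: 'Black' }, { concept: 'Shoe Size', value: '8' } ]
```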


According to a fifth aspect there is provided a method and system to determine the size of an area associated with a user interface that is utilized to display data items to a user. The system receives a request for a user interface that includes two areas that are complementary in size. The system uses the first area to display data items and the second area to display browsing options that may be selected by the user to identify data items in a data source. The system automatically determines a size associated with the area to display data items by computing a number of data items that are to be displayed in the first area. If the number of data items to be displayed in the first area exceeds a predetermined threshold then the system decreases the area to display data items and increases the area associated with the browsing options. Accordingly, a high count of data items may trigger the generation of user interface that emphasizes browsing options that may be selected by the user to identify data items in a data source.
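
A minimal TypeScript sketch of the complementary-area computation follows; the threshold and the percentage splits are assumed values for illustration only, not values disclosed by the patent.

```typescript
// A minimal sketch: one count threshold decides which of the two
// complementary areas is emphasized. All numbers are assumptions.
const ITEM_COUNT_THRESHOLD = 50;

function computeAreaSplit(itemCount: number): { itemsPct: number; browsePct: number } {
  if (itemCount > ITEM_COUNT_THRESHOLD) {
    // A high item count shrinks the item area and emphasizes browsing options.
    return { itemsPct: 40, browsePct: 60 };
  }
  return { itemsPct: 70, browsePct: 30 };
}

console.log(computeAreaSplit(12));  // { itemsPct: 70, browsePct: 30 }
console.log(computeAreaSplit(500)); // { itemsPct: 40, browsePct: 60 }
```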


According to a sixth aspect there is provided a method and system to process a selection of a browser back button at a client computer. The system, at the client computer, receives a browser back button selection that is processed by a browser that retrieves a user interface that does not include displayable user interface elements. The identity of the retrieved user interface is monitored by a client application program (e.g., script, applet, etc.) that utilizes the identity of the requested user interface to identify a user interface that the user expects to be displayed responsive to selection of the back button.
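
The TypeScript sketch below shows one way a client application program might keep such a mapping from user interface identifiers to expected user interfaces; all identifiers and names are illustrative assumptions.

```typescript
// A minimal sketch, assuming the client application program records, for
// each history entry's user interface identifier, the interface the user
// actually expects to see again.
const expectedUiById = new Map<string, string>();

// Called whenever the client application updates the display without the
// browser's knowledge, so the history entry can later be interpreted.
function registerHistoryEntry(uiId: string, expectedUi: string): void {
  expectedUiById.set(uiId, expectedUi);
}

// Called when the browser processes a back button selection and retrieves
// a user interface (with no displayable elements) identified by uiId.
function onBackButton(uiId: string): string {
  return expectedUiById.get(uiId) ?? "default view";
}

registerHistoryEntry("ui-42", "search results: Nike shoes, page 2");
console.log(onBackButton("ui-42")); // "search results: Nike shoes, page 2"
```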


Overview



FIG. 1 is a block diagram illustrating a computer-based system 11, according to an embodiment, to search a data resource. The system 11 is described to provide an example context for what follows. At operation 13, an author or publisher (e.g., a seller) enters information including information items (e.g., an item description) into a client computer. The client computer communicates the information to the system 11 (e.g., a computer-based system), where it is stored in a database. The item description (or listing) may include a title, a description, one or more listing categories, etc.


At operation 15, a classification engine determines a domain for the received information (e.g., whether the item description relates to a shoe, toy, book, etc.), adds classification and inference tags to the received information and stores the classification tags and the inference tags with the received information in a data resource (e.g., memory, database, storage device, etc.). The classification engine determines a domain for the item by applying domain specific queries to the item. The classification engine may add classification tags to the received information responsive to the application of classification rules to the received information. For example, the classification engine may read “ruby” (e.g., item information) and respond by generating “color=ruby” (e.g., classification tag). Accordingly, the item information “ruby” is structured under the concept “color.” In addition, the classification engine may add inference tags responsive to the application of inference rules to the item information and the classification tags. For example, the classification engine may read “color=ruby” (e.g., classification tag) and respond by generating “color=red” (e.g., inference tag). Accordingly, the inference tag “color=red” adds information to the item information by inferring that “ruby” is a type of “red.”
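
By way of illustration, the following TypeScript sketch shows the two rule passes described above on the "ruby" example, assuming simple substring rules; the rule contents are assumptions.

```typescript
// A minimal sketch of classification followed by inference.
type Tag = { concept: string; value: string };

// Classification pass: "ruby" in the item text yields color=ruby.
function classify(itemText: string): Tag[] {
  const tags: Tag[] = [];
  if (itemText.toLowerCase().includes("ruby")) {
    tags.push({ concept: "color", value: "ruby" });
  }
  return tags;
}

// Inference pass: color=ruby implies the broader tag color=red.
function infer(classificationTags: Tag[]): Tag[] {
  return classificationTags
    .filter((t) => t.concept === "color" && t.value === "ruby")
    .map(() => ({ concept: "color", value: "red" }));
}

const classificationTags = classify("ruby slippers, size 8");
console.log(classificationTags);        // [ { concept: 'color', value: 'ruby' } ]
console.log(infer(classificationTags)); // [ { concept: 'color', value: 'red' } ]
```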


At operation 17, a user enters a keyword query that is received by a client computer that communicates the keyword query to the computer-based system 11.


At operation 19, the keyword query is received and utilized by search applications to generate a domain and a conceptual query. For example, the keyword query "Nike Black Size 8" may be utilized to generate the domain "shoes" and the conceptual query "Brand=Nike", "Color=black", "Size=Size 8".


At operation 21, the domain and the conceptual query are received by the classification engine and utilized to find information (e.g., item listings) for presentation to the buyer. Continuing with the present example, the classification engine may search for item listings in the domain "shoes" that include classification tags or inference tags that match the conceptual query "Brand=Nike" or "Color=black" or "Size=Size 8".
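
The TypeScript sketch below illustrates operations 19 and 21 under an assumed rule table: keywords map to concept-value pairs (the conceptual query), and listings whose tags match any pair are retrieved (OR semantics, as in the example above). The rule table and listing shape are assumptions.

```typescript
type Pair = { concept: string; value: string };

const keywordRules: Record<string, Pair> = {
  "size 8": { concept: "Size", value: "Size 8" },
  nike: { concept: "Brand", value: "Nike" },
  black: { concept: "Color", value: "black" },
};

// Operation 19: derive the conceptual query from the keyword query.
function toConceptQuery(keywordQuery: string): Pair[] {
  const q = keywordQuery.toLowerCase();
  return Object.keys(keywordRules)
    .filter((k) => q.includes(k))
    .map((k) => keywordRules[k]);
}

type Listing = { title: string; tags: Pair[] };

// Operation 21: any matching tag selects a listing (OR semantics).
function findListings(listings: Listing[], query: Pair[]): Listing[] {
  return listings.filter((listing) =>
    query.some((p) =>
      listing.tags.some((t) => t.concept === p.concept && t.value === p.value)
    )
  );
}

const listings: Listing[] = [
  { title: "Nike Air", tags: [{ concept: "Brand", value: "Nike" }] },
  { title: "Red pumps", tags: [{ concept: "Color", value: "red" }] },
];
console.log(findListings(listings, toConceptQuery("Nike Black Size 8")));
// [ { title: 'Nike Air', ... } ]
```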


At operation 23, the search applications may determine browsing sets to enable the user to further refine their search. A browsing set may include a browsing concept (e.g., Price Range) and multiple browsing values (e.g., $1.00 to $5.00, $5.00 to $10.00, $10.00 to $15.00). The user may select a browsing value, which effectively specifies a browsing characteristic, a browsing concept-browsing value pair (e.g., Price Range—$1.00 to $5.00). Accordingly, the search applications may determine multiple browsing characteristics that may be selected by the user.
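
The following TypeScript sketch shows one way a browsing set might be built, assuming each listing carries concept-value tags: the values occurring under one browsing concept are grouped, with a count per value.

```typescript
type Pair = { concept: string; value: string };

// Group the values seen under one browsing concept, counting listings.
function browsingSet(listingTags: Pair[][], concept: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const tags of listingTags) {
    for (const t of tags) {
      if (t.concept === concept) {
        counts.set(t.value, (counts.get(t.value) ?? 0) + 1);
      }
    }
  }
  return counts;
}

const tagsPerListing: Pair[][] = [
  [{ concept: "Price Range", value: "$1.00 to $5.00" }],
  [{ concept: "Price Range", value: "$1.00 to $5.00" }],
  [{ concept: "Price Range", value: "$5.00 to $10.00" }],
];
console.log(browsingSet(tagsPerListing, "Price Range"));
// Map { '$1.00 to $5.00' => 2, '$5.00 to $10.00' => 1 }
```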


At operation 25, the computer-based system 11 presents the concept query, the domain, the multiple browsing characteristics, and the list of items to the user.


Definitions

The word “value” in this document means numerical information or textual information or numerical information signifying textual information (e.g., 1=Red, 2=Blue, etc.) or textual information signifying numerical information or any combination thereof.


The phrase “real time” in this document means with little or no delay.


Platform Architecture



FIG. 2 is a network diagram depicting a system 10, according to one embodiment, having a client-server architecture. A computer-based system platform 12 (e.g., a computer-based system) provides server-side functionality, via a network 14 (e.g., the Internet) to one or more clients. FIG. 2 illustrates, for example, a web client 16 (e.g., a browser, such as the Internet Explorer browser developed by Microsoft Corporation of Redmond, Wash. State), and a programmatic client 18 executing on respective client machines 20 and 22.


Turning specifically to the computer-based system 12, an Application Program Interface (API) server 24 and a web server 26 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 28. The application servers 28 host one or more applications 30. The application servers 28 are, in turn, shown to be coupled to one or more database servers 34 that facilitate access to one or more databases 36. The computer-based system 12 is further shown to include an administrator 33 that may enter metadata (e.g., search metadata) that may be stored via the database servers 34 in the database 36.


The applications 30 provide a number of commerce functions and services to users that access the computer-based system 12.


Further, while the system 10 shown in FIG. 2 employs a client-server architecture, the aspects described in this application are, of course, not limited to such an architecture, and could equally well find application in a distributed, or peer-to-peer, architecture system. The various applications 30 could also be implemented as standalone software programs, which do not necessarily have networking capabilities.


The web client 16, it will be appreciated, accesses the various applications 30 via the web interface supported by the web server 26. Similarly, the programmatic client 18 accesses the various services and functions provided by the applications 30 via the programmatic interface provided by the API server 24. The programmatic client 18 may, for example, be a seller application (e.g., the TurboLister application developed by eBay Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the computer-based system 12 in an off-line manner, and to perform batch-mode communications between the programmatic client 18 and the computer-based system 12.



FIG. 2 also illustrates a third party application 38, executing on a third party server machine 40, as having programmatic access to the computer-based system 12 via the programmatic interface provided by the API server 24. For example, the third party application 38 may, utilizing information retrieved from the computer-based system 12, support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more promotional, commerce or payment functions that are supported by the relevant applications of the computer-based system 12.


Applications



FIG. 3 is a block diagram illustrating multiple applications 30 that, in one example embodiment, are provided as part of the computer-based system 12. In an exemplary embodiment in which the computer-based system may support a network-based marketplace, the computer-based system 12 may provide a number of listing and price-setting mechanisms whereby a seller may list goods or services for sale, a buyer can express interest in or indicate a desire to purchase such goods or services, and a price can be set for a transaction pertaining to the goods or services. To this end, the applications 30 are shown to include one or more auction applications 44 which support auction-format listing and price setting mechanisms (e.g., English, Dutch, Vickrey, Chinese, Double, Reverse auctions, etc.). The various auction applications 44 may also provide a number of features in support of such auction-format listings, such as a reserve price feature whereby a seller may specify a reserve price in connection with a listing and a proxy-bidding feature whereby a bidder may invoke automated proxy bidding.


A number of fixed-price applications 46 support fixed-price listing formats (e.g., the traditional classified advertisement-type listing or a catalogue listing) and buyout-type listings. Specifically, buyout-type listings (e.g., including the Buy-It-Now (BIN) technology developed by eBay Inc., of San Jose, Calif.) may be offered in conjunction with an auction-format listing, and allow a buyer to purchase goods or services, which are also being offered for sale via an auction, for a fixed-price that is typically higher than the starting price of the auction.


Store applications 48 allow sellers to group their listings within a “virtual” store, which may be branded and otherwise personalized by and for the sellers. Such a virtual store may also offer promotions, incentives and features that are specific and personalized to a relevant seller.


Reputation applications 50 allow parties that transact utilizing the computer-based system 12 to establish, build and maintain reputations, which may be made available and published to potential trading partners. Consider that where, for example, the computer-based system 12 supports person-to-person trading, users may have no history or other reference information whereby the trustworthiness and credibility of potential trading partners may be assessed. The reputation applications 50 allow a user, for example through feedback provided by other transaction partners, to establish a reputation within the computer-based system 12 over time. Other potential trading partners may then reference such a reputation for the purposes of assessing credibility and trustworthiness.


Personalization applications 52 allow users of the computer-based system 12 to personalize various aspects of their interactions with the computer-based system 12. For example a user may, utilizing an appropriate personalization application 52, create a personalized reference page at which information regarding transactions to which the user is (or has been) a party may be viewed. Further, a personalization application 52 may enable a user to personalize listings and other aspects of their interactions with the computer-based system 12 and other parties.


In one embodiment, the computer-based system 12 may support a number of commerce systems that are customized, for example, for specific geographic regions. A version of the computer-based system 12 may be customized for the United Kingdom, whereas another version of the computer-based system 12 may be customized for the United States. Each of these versions may operate as an independent commerce system, or may be customized (or internationalized) presentations of a common underlying commerce system.


Navigation and searching of a service (e.g., the network-based marketplace) supported by the computer-based system 12 may be facilitated by one or more search applications 57. For example, the search applications 57 may enable the classification of information (e.g., item listings) published via the computer-based system 12, and may also enable the subsequent searching of the items with keyword queries, concept queries, and multi-path browsing.


In order to make information available via the computer-based system 12 as visually informative and attractive as possible, the applications 30 may include one or more imaging applications 58 with which users may upload images for inclusion within listings. An imaging application 58 also operates to incorporate images within viewed information. The imaging applications 58 may also support one or more promotional features, such as image galleries that are presented to potential buyers. For example, sellers may pay an additional fee to have an image included within a gallery of images for promoted item information.


Authoring/publishing applications, in the example form of the listing creation applications 60, allow authors/publishers (e.g., sellers) conveniently to author information (e.g., listings pertaining to goods or services that they wish to transact via the computer-based system 12), and application management applications (e.g., listing management applications 62) allow authors/publishers to manage such published information. For example, where a particular seller has authored and/or published a large number of listings, the management of such listings may present a challenge. The listing management applications 62 provide a number of features (e.g., auto-relisting, inventory level monitors, etc.) to assist the seller in managing such listings. One or more post-listing management applications 64 also assist sellers with a number of activities that typically occur post-listing. For example, upon completion of an auction facilitated by one or more auction applications 44, a buyer may wish to leave feedback regarding a particular seller. To this end, a post-listing management application 64 may provide an interface to one or more reputation applications 50, so as to allow the buyer conveniently to provide feedback regarding a seller to the reputation applications 50. Feedback may take the form of a review that is registered as a positive comment, a neutral comment or a negative comment. Further, points may be associated with each form of comment (e.g., +1 point for each positive comment, 0 for each neutral comment, and −1 for each negative comment) and summed to generate a rating for the seller.
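
A minimal TypeScript sketch of the rating computation just described (+1 per positive comment, 0 per neutral, −1 per negative, summed) follows; the names are illustrative.

```typescript
type FeedbackComment = "positive" | "neutral" | "negative";

// Sum per-comment points into a seller rating.
function sellerRating(comments: FeedbackComment[]): number {
  const points: Record<FeedbackComment, number> = {
    positive: 1,
    neutral: 0,
    negative: -1,
  };
  return comments.reduce((sum, c) => sum + points[c], 0);
}

console.log(sellerRating(["positive", "positive", "negative"])); // 1
```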


Dispute resolution applications 66 provide mechanisms whereby disputes arising between transacting parties may be resolved. For example, the dispute resolution applications 66 may provide guided procedures whereby the parties are guided through a number of steps in an attempt to settle a dispute. In the event that the dispute cannot be settled via the guided procedures, the dispute may be escalated to a third party mediator or arbitrator.


A number of outlying behavior applications 68 implement various fraud detection and prevention mechanisms to reduce the occurrence of fraud within the computer-based system 12, and customer segmentation mechanisms to identify and classify high value users.


Messaging applications 70 are responsible for the generation and delivery of messages to users of the computer-based system 12, such messages, for example, advising users regarding the status of listings at the computer-based system 12 (e.g., providing “outbid” notices to bidders during an auction process or to provide promotional and merchandising information to users).


Merchandising applications 72 support various merchandising functions that are made available to sellers to enable sellers to increase sales via the computer-based system 12. The merchandising applications 72 also operate the various merchandising features that may be invoked by sellers, and may monitor and track the success of merchandising strategies employed by sellers.


The computer-based system 12 itself, or one or more parties that transact via the computer-based system 12, may operate loyalty programs that are supported by one or more loyalty/promotions applications 74. For example, a buyer may earn loyalty or promotions points for each transaction established and/or concluded with a particular seller, and be offered a reward for which accumulated loyalty points can be redeemed.


Data Structures



FIG. 4 is a high-level entity-relationship diagram, illustrating various tables 90 that may be maintained within the databases 36, and that are utilized by and support the applications 30. While the example embodiment of the present invention is described as being at least partially implemented utilizing a relational database, other embodiments may utilize other database-architectures (e.g., an object-oriented database schema) or data organization structures.


A user table 92 contains a record for each registered user of the computer-based system 12, and may include identifier, address and financial instrument information pertaining to each such registered user. In one embodiment, a user may operate as an author/publisher (e.g., a seller), as an information consumer (e.g., a buyer), or as both, within the computer-based system 12. In one example embodiment of the present invention, a buyer may be a user that has accumulated value (e.g., commercial or proprietary currency), and is then able to exchange the accumulated value for items that are offered for sale by the computer-based system 12.


The tables 90 also include an items table 94 in which are maintained item records for goods and services (e.g., items) that are available to be, or have been, transacted via the computer-based system 12. Each item record within the items table 94 may furthermore be linked to one or more user records within the user table 92, so as to associate a seller and one or more actual or potential buyers with each item record.


A search metadata table 152 includes search metadata to classify item information and search information (e.g., classification rules and inference rules) and to display browsing characteristics (e.g., display instructions).


A transaction table 96 contains a record for each transaction (e.g., a purchase transaction) pertaining to items for which records exist within the items table 94.


An order table 98 is populated with order records, each order record being associated with an order. Each order, in turn, may be with respect to one or more transactions for which records exist within the transactions table 96.


Bid records within a bids table 100 each relate to a bid received at the computer-based system 12 in connection with an auction-format listing supported by an auction application 44. A feedback table 102 is utilized by one or more reputation applications 50, in one example embodiment, to construct and maintain reputation information concerning users. A history table 104 maintains a history of transactions to which a user has been a party. One or more attributes tables include an item attributes table 105 that records attribute information pertaining to items for which records exist within the items table 94 and a user attributes table 106 that records attribute information pertaining to users for which records exist within the user table 92.


Searching a Data Resource



FIG. 5 is a block diagram illustrating a system 81, according to an embodiment, to facilitate a search of a data resource. The system 81 is described to provide an example overview for what follows. The system 81 includes a classification engine 83, classification rules 89, and inference rules 91. The classification engine 83 is shown to receive information (e.g., an item listing 85) from an author/publisher (e.g., a seller 87), generate tagged item information 93 (e.g., a tagged item listing 93) that includes classification tags 97 and inference tags 99, and store the tagged item information 93 in the classification engine 83. The classification engine 83 utilizes the classification rules 89 and the inference rules 91 to generate and apply the classification tags 97 and the inference tags 99 to the information.



FIG. 6 is a block diagram illustrating search applications 57 and search related data structures, according to an embodiment, to classify information (e.g., item information). The search applications 57 include a receiving module 422 and a classification engine 83. The receiving module 422 may receive information or information items (e.g., item information 120) that may have been entered by a user (e.g., a seller) from a client machine. The receiving module 422 may add catalog information to the item information 120, store the item information 120 in a database, and communicate the item information 120 to the classification engine 83. The classification engine 83 includes a processing module 116, a rule application module 118 and tagged information (e.g., tagged item information 93). The tagged item information 93 includes item information 120 and item classification information 131.


The processing module 116 associates one or more domains 130 with the item information 120 and generates a set of item classification information 131 for each domain 130. Finally, the processing module 116 stores the item information 120, item classification information 131, and domains 130 in the classification engine 83.


The rule application module 118 applies classification rules and inference rules to generate classification tags 97 and/or inference tags 99 that are stored in the item classification information 131.


The item information 120 includes a title 122, a description 124, one or more listing categories 126, one or more optional item specifics 128, price information 101, selling format 103, payment method 121, shipping information 123, item location 125, buyer requirements 127, and miscellaneous information 145. The title 122 may include information in the form of alphanumeric strings that are entered by the user to provide a title for the item information 120. The description 124 may include information in the form of alphanumeric strings, pictures (e.g., JPEG, MPEG, etc.), illustrations, etc. The listing category 126 may include one or more listing categories selected by the user within which to list the item information 120 on the computer-based system 12. The item specific 128 is shown to include an attribute 132 and a value 134. The value 134 may be entered by a user from a pull down menu. For instance, item information 120 relating to a "shoe" may be associated with an item specific 128 "brand" that includes a pull down menu that lists different values 134 that correspond to brands of shoe manufacturers (e.g., Reebok, Nike, etc.). The price information 101 may include a starting price for an auction, an optional reserve price for an auction (e.g., a price below which the seller refuses to sell their item), a price for which the seller will immediately sell the item (e.g., a buyout-type listing), or other pricing related information. The selling format 103 may include information that specifies how the item is to be sold (e.g., a fixed-price selling format, an auction-format, auction types including English, Dutch, Vickrey, Chinese, Double, Reverse auctions, etc.), the duration of time the item may be available for sale or for auction, and other selling format information. The payment method 121 may include information that specifies payment method(s) the seller will accept (e.g., payment service(s), credit card(s), checks, money orders, etc.). The shipping information 123 may include information that specifies the seller's shipping terms (e.g., who pays, locations to which the seller may or may not ship the item, etc.). The item location 125 may include information that specifies the physical location from which the item may be shipped or picked up. The buyer requirements 127 may include information that specifies which buyers are blocked from bidding on or purchasing the listed item based on criteria such as whether the buyer utilizes a specific payment service, whether the buyer utilizes a specific credit card, whether the buyer is registered in a specific country, the buyer's reputation (e.g., the buyer has a feedback score of 1, 2, 3, or lower, the buyer has been identified as purchasing or winning an item in an auction and not paying for the item) and other related information.
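
For orientation, the following type-only TypeScript sketch paraphrases the shape of item information 120; the field names are assumptions for illustration, not a published schema.

```typescript
// Illustrative shape for item information 120 and an item specific 128.
interface ItemSpecific {
  attribute: string; // e.g., "brand"
  value: string; // e.g., "Nike", often chosen from a pull-down menu
}

interface ItemInformation {
  title: string;
  description: string;
  listingCategories: string[];
  itemSpecifics: ItemSpecific[];
  priceInformation: {
    startingPrice?: number;
    reservePrice?: number;
    buyoutPrice?: number;
  };
  sellingFormat: string; // e.g., "auction" or "fixed-price"
  paymentMethods: string[];
  shippingInformation: string;
  itemLocation: string;
  buyerRequirements: string[];
  miscellaneous?: string;
}
```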


The received information (e.g., item information 120) is supplemented with supplemental information (e.g., item classification information 131). Instances of item classification information 131 may include a domain 130, classification tags 97, and inference tags 99. Example domains 130 may include "shoes", "toys", "books", etc. Each classification tag 97 may include a tagged concept 136 and a tagged value 138. For example, a tagged concept 136 for the domain 130 "shoes" may include "brand" and corresponding tagged values 138 may include "Nike", "Reebok" and "Adidas." Adding classification tags 97 (e.g., classification information) to the tagged item information 93 structures the item information 120 and, in one embodiment, enables a conceptual search of the item information 120 (e.g., from the point of view of the buyer, in the language of a buyer, etc.).


Each inference tag 99 may include an inference concept 141 and an inference value 143 (e.g., inferential information). The inference tag 99 may be added to the item classification information 131 based on the item information 120 or the classification tags 97. For example, the classification engine 83 may infer from item information 120 that a glass item made by Corning has a "region of origin" in the United States because Corning makes glass in the United States (e.g., inference concept 141="region of origin", inference value 143="North America"). It should also be appreciated that an inference tag 99 may be utilized to broaden a tagged concept 136 or tagged value 138 and thereby bring users (e.g., a buyer or seller) together that may not otherwise be speaking the same language even though they share a common interest in information (e.g., an item listing described by the item information 120). For example, a seller may describe an item within the item information 120 as being "ruby slippers". However, the buyer may be searching for "red slippers." In this instance the classification engine 83 may add an inference tag 99 with the inference concept 141 "color" and the inference value 143 "red" based on a classification tag 97 with a tagged concept 136 "color" and a tagged value 138 "ruby."



FIG. 7 is a block diagram illustrating search metadata 152, according to an embodiment. The search metadata 152 is shown to include an entry for each domain 130 that may be defined in the computer-based system 12. Each domain 130 is associated with a set of classification rules 89, a set of inference rules 91 and a domain query 158. Each classification rule 89 includes a classification clause 133 that may include an expression (e.g., Boolean) and a classification predicate 135 that may be executed if the classification clause 133 evaluates TRUE. The classification predicate 135 is shown to include a classification concept 140 and a classification value 142 (e.g., classification information), as previously described. The classification rule 89 may be utilized by the classification engine 83 to apply a classification tag 97 (e.g., classification concept 140 and a classification value 142). For example, the classification engine 83 may search the item information 120 based on the classification clause 133 and, if the classification clause 133 evaluates TRUE (e.g., if title contains “ruby”), the classification engine 83 may then execute the classification predicate 135. In the present example, the classification predicate 135 tags the corresponding item information 120 with the classification concept 140 and the classification value 142 (e.g., color=ruby). Henceforth the classification concept 140 and the classification value 142 may respectively be referred to as the tagged concept 136 and the tagged value 138 (e.g., color=ruby) with regard to the tagged item information 93.
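
The clause-predicate structure just described (which inference rules share) can be sketched in TypeScript as follows; the representations are assumptions for illustration.

```typescript
// A minimal sketch: a rule pairs a Boolean clause with a predicate
// (concept, value) that is applied when the clause evaluates TRUE.
type Item = { title: string; description: string };
type Rule = {
  clause: (item: Item) => boolean; // the expression (e.g., Boolean)
  predicate: { concept: string; value: string }; // applied when TRUE
};

const rubyRule: Rule = {
  clause: (item) => item.title.toLowerCase().includes("ruby"),
  predicate: { concept: "color", value: "ruby" },
};

// Each rule whose clause evaluates TRUE contributes its predicate as a tag.
function applyRules(item: Item, rules: Rule[]) {
  return rules.filter((r) => r.clause(item)).map((r) => r.predicate);
}

console.log(applyRules({ title: "Ruby slippers", description: "" }, [rubyRule]));
// [ { concept: 'color', value: 'ruby' } ]
```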


Each inference rule 91 includes an inference clause 137 that may include an expression (e.g., Boolean) and an inference predicate 139 that may be executed if the inference clause 137 evaluates TRUE. The inference predicate 139 is shown to include an inference concept 141 and an inference value 143 (e.g., inferential information), as previously described. The inference rule 91 may be utilized by the classification engine 83 to apply an inference tag 99 (e.g., inference concept 141 and an inference value 143). For example, the classification engine 83 may evaluate the item information 120 and classification tags 97 by utilizing the inference clause 137. If the inference clause 137 evaluates TRUE (e.g., if the description 124 contains ‘ruby’ OR a tagged concept 136-tagged value 138 pair contains ‘color=ruby’), the inference predicate 139 may be executed, which in the present example tags the corresponding item information 120 with additional information (e.g., the inference concept 141 and the inference value 143) (e.g., color=red). Henceforth the added inference concept 141 and inference value 143 may collectively be referred to as an inference tag 99 with regard to the tagged item information 93.


The domain query 158 may be utilized to identify item information 120 as included in the corresponding domain 130. The domain query 158 may include an expression (e.g., Boolean) and a domain 130 that may be associated with corresponding tagged item information 93, if the expression (e.g., Boolean) evaluates TRUE. The domain query 158 may be designed by a computer program or an administrator. For example, the expression (e.g., Boolean) associated with the domain "shoes" may require a description 124 that contains ‘Nike’ and a title 122 that contains ‘shoes.’ Another embodiment may include an expression (e.g., Boolean) that also requires an item specific 128 associated with a value 134 that indicates "cross-training" or a listing category 126 that indicates "athletic shoes."



FIG. 8 is a flowchart illustrating a method 160, according to an embodiment, to facilitate searching of a data resource. Operations performed by the client machine 22 are illustrated on the left and operations performed by the application server 28 are illustrated on the right.


Commencing at operation 162, the seller at the client machine 22 enters item information 120 (e.g., an item listing) that is communicated to the application server 28, which then receives the item information 120 (e.g., at operation 164). FIG. 30 illustrates a user interface screen 165, according to an embodiment, that displays example item information 120. The item information 120 includes a title 122, a listing category 126, an item specific 128, and a description 124 that includes an ISBN number (e.g., 123456). For example, the title 122 may be the user selected title, “The Cat in the Hat Strikes Back.” The listing category 126 shows that the user selected the listing category “Children's Books”. Other embodiments may show the user as having entered or selected multiple listing categories 126 (e.g., books, toys, children's classics, etc.). The item specific 128 further illustrates the condition of the book as “new.” The value “new” may have been selected from a pull down menu that includes multiple values including “old”, “used”, “good”, etc. The ISBN number (e.g., 123456) may be utilized as a trigger to add additional information.


Returning to FIG. 8, at operation 166, the receiving module 422 searches the item information 120 (e.g., title 122, the description 124, the listing category 126, item specific 128, etc.) to identify strings, values, or other information items that may trigger the addition of catalog information to the item information 120. For example, the ISBN number may trigger the addition of information (e.g., alphanumeric text, graphics, pictures, audio, multi-media, etc.) from an appropriate catalog. Indeed, the ISBN number may uniquely identify the book "The Cat in the Hat Strikes Back" and accordingly provide a trigger to include information from a catalog that may further describe the book (e.g., name of the author, number of pages, publisher, list price in new condition, picture of author, audio recording of the first chapter, etc.). Other embodiments may include other types of catalogs that may be utilized to identify information (e.g., Universal Product Numbers, Universal Product Codes, Proper Nouns, etc.) that may provide a trigger to add additional information. In another embodiment the addition of catalog information may be performed before the item has been submitted, at the time the seller is entering information for the item.
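
A minimal TypeScript sketch of this catalog trigger follows, assuming an in-memory catalog keyed by ISBN; the catalog entry and the matching pattern are illustrative assumptions.

```typescript
// Hypothetical catalog keyed by ISBN.
const bookCatalog: Record<string, { author: string; pages: number }> = {
  "123456": { author: "Example Author", pages: 64 },
};

function addCatalogInformation(description: string): string {
  // A recognized ISBN in the description triggers the addition of catalog data.
  const isbn = description.match(/\b\d{6,13}\b/)?.[0];
  const entry = isbn !== undefined ? bookCatalog[isbn] : undefined;
  return entry !== undefined
    ? `${description}\nAuthor: ${entry.author}; Pages: ${entry.pages}`
    : description;
}

console.log(addCatalogInformation("The Cat in the Hat Strikes Back, ISBN 123456"));
```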


At operation 168, the receiving module 422 stores the item information 120 in the database 36 and communicates the item information 120 to the classification engine 83. At operation 170, the processing module 116 generates tagged item information 93 in the classification engine 83 and stores the item information 120 in the tagged item information 93. Next, the processing module 116 reads a domain query 158 from the search metadata 152.


At decision operation 172, the processing module 116 determines if the item described by the item information 120 entered by the user has been found with the domain query 158 by evaluating the expression (e.g., Boolean) associated with the domain query 158 against the item information 120. If the expression (e.g., Boolean) evaluates TRUE then a branch is made to operation 174. Otherwise a branch is made to decision operation 180.


At operation 174, the processing module 116 registers the item information 120 as included in the current domain 130. For example, the processing module 116 may register the item information 120 by storing the domain 130 in the item classification information 131 associated with the tagged item information 93.


At operation 176, the rule application module 118 applies classification rules 89 to the item information 120 associated with the tagged item information 93.



FIG. 9 illustrates a method 186, according to an embodiment, to evaluate information with classification rules. The method 186 commences at operation 188 with the rule application module 118 reading or selecting the next classification rule 89 from the search metadata 152 based on the current domain 130.


At decision operation 190, the rule application module 118 utilizes the classification clause 133 (e.g., ‘if title contains “ruby”’) associated with the classification rule 89 to evaluate the item information 120 (e.g., title 122, description 124, listing category 126, item specific 128, etc.). If the classification clause 133 evaluates to TRUE then a branch is made to operation 200. Otherwise a branch is made to decision operation 202.


At operation 200, the rule application module 118 executes the classification predicate 135 (e.g., color=ruby) associated with the classification rule 89 against the tagged item information 93. For example, the rule application module 118 may attach or store the classification predicate 135 as tagged item information 93. The classification predicate 135 may henceforth be referred to as a classification tag 97 (e.g., color=ruby) with regard to the tagged item information 93.


At decision operation 202, the rule application module 118 determines if there are more classification rules 89 in the current domain 130. If there are more classification rules 89, then a branch is made to operation 188. Otherwise the method 186 ends.


Returning to FIG. 8, at operation 178, the rule application module 118 applies the inference rules 91 to the classification tags 97 associated with the tagged item information 93.



FIG. 10 illustrates a block diagram of a method 204, according to an embodiment, to evaluate information with inference rules 91. Commencing at operation 206, the rule application module 118 reads or selects the next inference rule 91 from the search metadata 152 based on the current domain 130.


At operation 208, the rule application module 118 reads the next tagged item information 93 (e.g., including classification tags 97) associated with the current domain 130. At decision operation 210, the rule application module 118 utilizes the inference clause 137 (e.g., “if description contains ‘ruby’ OR color=ruby”) associated with the inference rule 91 to evaluate the item information 120 (e.g., title 122, description 124, listing category 126, item specific 128) and the classification tags 97 (e.g., color=ruby). If the inference clause 137 evaluates to TRUE, then a branch is made to operation 212. Otherwise a branch is made to decision operation 214.


At operation 212, the rule application module 118 executes the inference predicate 139 (e.g., color=red) associated with the inference rule 91 against the tagged item information 93. For example, the inference predicate 139 may be added or attached to the tagged item information 93. Henceforth the inference predicate 139 may be referred to as an inference tag 99 with regard to the tagged item information 93.


At decision operation 214, the rule application module 118 determines if more tagged item information 93 is associated with the current domain 130. If there is more tagged item information 93, then a branch is made to operation 208. Otherwise a branch is made to decision operation 216.


At decision operation 216, the rule application module 118 determines if more inference rules 91 may be associated with the current domain 130. If the rule application module 118 determines there are more inference rules 91, then a branch is made to operation 206. Otherwise processing ends.


Returning to FIG. 8, at operation 180, the processing module 116 determines if there are more domains 130. If the processing module 116 determines there are more domains 130, then a branch is made to operation 170. Otherwise the method 160 ends.


Another embodiment of the classification engine 83 may include a single Boolean evaluation graph. The Boolean evaluation graph may be utilized by the classification engine 83 to enhance the performance of Boolean evaluations. For example, a Boolean evaluation graph may evaluate a large set of classification rules 89 and inference rules 91 against a large set of information (e.g., item listings 85) while minimizing the total number of evaluation events computed by the classification engine 83.


Generating a Query



FIG. 11 is a block diagram illustrating a system 107, according to an embodiment, to generate a query to search a data resource. The system 107 is described to provide an example overview for what follows. The system 107 includes search applications 57 and classification rules 89. The search applications 57 are shown to receive a keyword query 109 from a buyer 119 and respond by utilizing the classification rules 89 to determine a domain 130, generate a concept query 111 and possibly determine keywords, each of which is communicated back to the buyer 119. The concept query 111 includes one or more selected characteristics 113 (e.g., classification information) that correspond to the keywords in the keyword query 109 as determined by the classification rules 89. In some instances keywords in the keyword query 109 may not correspond to a selected characteristic 113 and may be communicated back to the buyer as such. Each selected characteristic 113 includes a selected concept 115 and a selected value 117.



FIG. 12 is a block diagram illustrating search applications 57 and search metadata 152, according to an example embodiment. The search applications 57 include a computing module 221 and a query generation module 223. The computing module 221 receives the keyword query 109 from the buyer 119, and communicates a user interface back to the buyer 119 that includes the concept query 111 and the domain 130. The query generation module 223 determines the domain 130 of the keyword query 109 and applies classification rules 89 to the keyword query 109 to generate a concept query 111 and possibly identify keywords.


The search metadata 152 may include all of the defined domains 130 for the computer-based system 12, as previously described. Each domain 130 may be associated with a domain clause 129 and classification rules 89. The domain clause 129 includes an expression (e.g., Boolean) that may be utilized to evaluate the keyword query 109. If the domain clause 129 evaluates TRUE then the keyword query 109 may be associated with the domain 130. Each classification rule 89 includes a classification clause 133 and a classification predicate 135, as previously described. The classification clause 133 includes an expression (e.g., Boolean) that may be utilized to evaluate keywords in the keyword query 109. If the classification clause 133 evaluates TRUE then the classification predicate 135 (e.g., classification concept 140 and the classification value 142) may be executed against the keyword query 109 thereby associating the classification concept 140 and the classification value 142 (e.g., classification information) with the keyword(s) in the keyword query 109.



FIG. 13 illustrates a method 220, according to an embodiment, to generate a query to search a data resource. Operations performed by a client machine 22 are illustrated on the left and operations performed by an application server 28 are illustrated on the right. The method 220 commences at operation 222 where the user enters the keyword query 109.



FIG. 31 illustrates a user interface 224, according to an embodiment, to receive a keyword query. The user interface 224 includes a dialogue box 226 that may be utilized by the buyer 119 to enter a keyword query 109. The dialogue box 226 is shown to include the keyword query 109, “Nike black size 8.” The keyword query 109 includes keywords 228, “Nike”, “black” and “size 8.” It will be appreciated that keywords 228 may include one or more words or alphanumeric expressions (e.g., size 8). The present example user interface does not require the user to manually identify a domain 130; however, it will be appreciated that other embodiments may include user interfaces that require the user to manually identify a domain 130 that may be associated with the keyword query 109 entered by the user. For example, in one embodiment a user may be required to navigate a tree structure to locate a dialogue box 226 to enter a keyword query 109 that may be associated with a specific domain 130.


Returning to FIG. 13, at operation 230, the computing module 221 receives the keyword query 109 and communicates the keyword query 109 to the query generation module 223 which determines whether the keyword query 109 may be associated with one or more domains 130.



FIG. 14 illustrates a method 230, according to an embodiment, to determine domains 130 based on the keyword query 109. The method 230 commences at operation 233 with the query generation module 223 reading the next domain clause 129 from the search metadata 152. The domain clause 129 may contain an expression (e.g., Boolean).


At decision operation 236, the query generation module 223 evaluates the keyword query 109 with the domain clause 129 that may include an expression (e.g., Boolean). If the expression (e.g., Boolean) evaluates TRUE, then a branch is made to operation 238. Otherwise a branch is made to decision operation 242.


At operation 238, the query generation module 223 associates the domain 130 with the concept query 239 by registering the domain 130 to the concept query 239.


At decision operation 242, the query generation module 223 determines if there are more domain clauses 129 to process. If there are more domain clauses 129 to process then a branch is made to operation 233. Otherwise the processing ends.


Returning to FIG. 13, at decision operation 249, the computing module 221 determines if the keyword query 109 may be associated with more than one domain 130. If the keyword query 109 may be associated with more than one domain then a branch is made to operation 250. Otherwise a branch is made to operation 252.


At operation 250, the computing module 221 communicates a request to the user to select one domain 130 from the domains 130 that were associated with the keyword query 109.


At operation 254, at the client machine 22, a user interface may be displayed to enable user-selection of a domain 130. FIG. 32 illustrates the user interface 256, according to an example embodiment, to select a domain 130. The user interface 256 includes the keyword query 109 and domains 130 (e.g., “shoes”, “running suits” and “golf equipment”) that may be selected by the user.


Returning to FIG. 13, at the client machine 22, at operation 260, the user selects the “shoes” domain 130, the selection being communicated to the application server 28.


At operation 252, at the application server 28, the query generation module 223 receives the "shoes" domain 130 and utilizes the "shoes" domain 130 and the keyword query 109 "Nike Black Size 8" to determine the selected characteristics 113.



FIG. 15 illustrates a method 252, according to an embodiment, to determine selected characteristics 113 based on the keyword query 109 and the domain 130. The method 252 commences at operation 262 with the query generation module 223 utilizing the domain 130 that is associated with the keyword query 109 to read a classification rule 89 from the search metadata 152.


At decision operation 264, the query generation module 223 utilizes the classification clause 133 associated with classification rule 89 to evaluate the longest set of keywords (e.g., words) in the keyword query 109. If the classification clause 133 evaluates TRUE, then a branch is made to operation 266. Otherwise a branch is made to operation 265.


At operation 265, the query generation module 223 removes the first keyword from the keyword query 109.


At operation 266, the query generation module 223 registers the classification predicate 135 (e.g., color=ruby) associated with the classification rule 89 to the concept query 239. Henceforth the classification predicate 135 may be referred to as a selected characteristic 113.


At operation 267, the query generation module 223 removes the keyword(s) 228 that evaluated TRUE from the keyword query 109.


At decision operation 269, the query generation module 223 determines if there are more keywords in the keyword query 109. If there are more words, a branch is made to decision operation 264. Otherwise a branch is made to decision operation 268.


At decision operation 268, the query generation module 223 determines if there are more classification rules 89. If there are more classification rules 89 then a branch is made to operation 262 to evaluate the entire keyword query 109. Otherwise the method 252 ends.
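
A minimal sketch of the method 252 follows, under the assumption that a classification clause 133 can report which keywords it matched; the exact bookkeeping for the longest set of keywords is not specified above, so this consumption model is illustrative.

```typescript
// Hypothetical representation: a classification clause 133 returns the keywords
// it matched, or null when it evaluates FALSE.
interface ClassificationRule {
  clause: (keywords: string[]) => string[] | null; // classification clause 133
  predicate: { concept: string; value: string };   // classification predicate 135, e.g., Color=Black
}

// Method 252 (sketch): each rule evaluates the entire keyword query 109; a TRUE
// evaluation registers the predicate as a selected characteristic 113 and removes
// the matched keyword(s) 228, while a FALSE evaluation removes the first keyword
// (operations 262-269).
function determineSelectedCharacteristics(
  query: string[],
  rules: ClassificationRule[],
): Map<string, string> {
  const selected = new Map<string, string>();     // selected characteristics 113
  for (const rule of rules) {                     // operation 262 / decision operation 268
    let keywords = [...query];                    // re-evaluate the entire keyword query per rule
    while (keywords.length > 0) {                 // decision operation 269
      const matched = rule.clause(keywords);      // decision operation 264
      if (matched !== null && matched.length > 0) {
        selected.set(rule.predicate.concept, rule.predicate.value); // operation 266
        keywords = keywords.filter((k) => !matched.includes(k));    // operation 267
      } else {
        keywords.shift();                         // operation 265: remove the first keyword
      }
    }
  }
  return selected;
}
```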


Returning to FIG. 13, at operation 270, at the application server 28, the computing module 221 communicates a user interface including the keyword query 109, the domain 130, and the concept query 239 to the buyer 119 at the client machine 22.


At operation 272, at the client machine 22, the user interface is displayed to the user. FIG. 33 illustrates a user interface 278, according to an example embodiment, to display the keyword query 109, the domain 130, and the concept query 239. The user interface 278 is shown to include the keyword query 109 “Nike Black Size 8” and the concept query 111 including three selected characteristics 113, “Color—Black”, “Brand—Nike”, and “Shoe Size—8.” The selected characteristics 113 are respectively shown to include selected concepts 115 (e.g., “Color”, “Brand”, and “Shoe Size”) and selected values 117 (e.g., “Black”, “Nike”, and “8”). Another example of the user interface 278 may also include keyword(s) 228 (e.g., keywords 228 included in the keyword query 109 that may not have evaluated TRUE with regard to any classification clause 133).


Another embodiment of a system that receives a keyword query and generates a concept query, a domain, and keywords may include a single Boolean evaluation graph. The Boolean evaluation graph may be utilized by the system to enhance the performance of Boolean evaluations. For example, the system may utilize the Boolean evaluation graph to evaluate a large set of classification rules 89 against a keyword query 109 while minimizing the total number of evaluation events computed by the system 107.


Identify Data Items & Canceling Characteristics



FIG. 16 is a block diagram illustrating a system 293, according to an embodiment, that receives a keyword query and generates a user interface that includes the keyword query, a concept query, browsing characteristics and information (e.g., item listings 85). The system 293 is described here to provide an overview of what follows.


The system 293 includes search applications 57, classification rules 89, and display instructions 302. The search applications 57 are shown to receive a keyword query 109 “Nike Black Size 8” that includes keywords 228 that may be entered by a buyer 119 with a user interface 295. The search applications 57 receive the keyword query 109 and utilize the classification rules 89 and the display instructions 302 to generate the user interface 297.


The user interface 297 includes the keyword query 109, the domain 130 “shoes”, the concept query 111 “Color—Black, Brand—Nike, Shoe Size—8”, multiple browsing sets 303 (e.g., “Product Type”, “Shoe Style”, “Price Range”) and information (e.g., item listings 85) found based on the concept query 111. The keyword query 109, the domain 130 and the concept query 111 have been previously described. The concept query 111 is shown to include multiple selected characteristics 113 (e.g., “Color—Black”, “Brand—Nike”, and “Shoe Size—8”). Each selected characteristic 113 includes a selected concept 115 (e.g., “Color”) and a selected value 117 (e.g., “Black”). The buyer 119 may add selected characteristics 113 to the concept query 111 and/or cancel selected characteristics 113 from the concept query 111. The buyer 119 may add a selected characteristic 113 to the concept query 111 by selecting a browsing characteristic as described below. The buyer 119 may cancel selected characteristics 113 by selecting one or more “cancel” buttons (not shown), each of which may be associated with a particular selected characteristic 113. The browsing sets 303 are selected by the search applications 57 based on the cumulative selected characteristics 113 (e.g., generated from the keyword query 109, the selected browsing characteristics, and the cancellations) according to a specified order. In other words, the most interesting browsing sets 303 may be presented before the least interesting browsing sets 303, the level of interest being determined from the point of view of the buyer 119 by an administrator. Other embodiments may determine the level of buyer interest for a particular browsing set 303 by monitoring user selections of browsing sets 303. Some embodiments may determine the level of buyer interest for a particular browsing set 303 by monitoring the previous selections of browsing sets 303 made by the buyer. Each browsing set 303 is shown to include a browsing concept 284 (e.g., “Product Type”) and multiple browsing values 286 (e.g., “Men's Shoes”, “Women's Shoes”, etc.). The buyer 119 may select one or more browsing values 286 (e.g., “Men's Shoes”), thereby effectively selecting one or more browsing characteristics 287 (e.g., “Product Type—Men's Shoes”). Henceforth the selected browsing characteristic 287 may be referred to as a selected characteristic 113 that may be included in the cumulative selected characteristics 113 that may be utilized to select the browsing sets 303, compute counts, and find information (e.g., item listings 85).
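
The relationships among these elements may be summarized with a few illustrative type definitions; the field names are assumptions, and only the reference numerals tie them to the description above.

```typescript
// Illustrative types only; reference numerals tie the fields to the description above.
interface SelectedCharacteristic {
  concept: string; // selected concept 115, e.g., "Color"
  value: string;   // selected value 117, e.g., "Black"
}

interface BrowsingSet {
  concept: string;  // browsing concept 284, e.g., "Product Type"
  values: string[]; // browsing values 286, e.g., "Men's Shoes", "Women's Shoes"
}

interface GeneratedUserInterface {
  keywordQuery: string;                   // keyword query 109
  domain: string;                         // domain 130
  conceptQuery: SelectedCharacteristic[]; // concept query 111 (the cumulative selected characteristics 113)
  browsingSets: BrowsingSet[];            // ordered by the configured level of interest
  itemListings: string[];                 // item listings 85 found from the cumulative characteristics
}
```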



FIG. 17 is a block diagram illustrating search applications 57 and search metadata 152, according to an embodiment. The search applications 57 include a determining module 298 and a generating module 300. The determining module 298 determines selected characteristics 113, determines the size of areas on a user interface to display information (e.g., item listings 85, browsing sets 303, etc.), determines information (e.g., item listings 85) to display, and determines browsing sets 303 to display. The determining module 298 determines the selected characteristics 113 based on the concept query 111 (e.g., generated from a keyword query 109), browsing characteristic(s) 287 that may have been selected, and/or selected characteristics 113 that may have been cancelled. In addition, the determining module 298 determines or finds information (e.g., item listings 85) and determines or finds browsing sets 303 based on the determined selected characteristics 113. Finally, the generating module 300 may generate count values that may be associated with browsing values 286.


The search metadata 152 is shown to be organized by domain 130, as previously described. Each domain 130 includes a set of display instructions 302 that include multiple browsing sets 303. Each browsing set 303 includes a browsing concept 284 and multiple browsing values 286. The browsing set 303 may be presented to a buyer 119 who may select a single browsing value 286 thereby effectively selecting a browsing characteristic 287 (e.g., a browsing concept 284 and browsing value 286). The browsing sets 303 may be ordered according to the interest of most users. For example, a user may be most interested in the browsing sets 303 that appear at the top of the display instructions 302 and least interested in the browsing sets 303 that appear at the bottom of the display instructions 302. Accordingly, the display instructions 302 may be utilized by the determining module 298 to determine which browsing sets 303 to present to the user based on the selected characteristics 113 and the limited area on the display, which precludes the ability to present all browsing sets 303 on a single display.



FIG. 18 illustrates a classification engine 114, according to an embodiment. The classification engine 114 includes tagged item information 93 entries that include item information 120 and item classification information 131 including classification tags 97 and inference tags 99, as previously described. The determining module 298 utilizes the selected characteristic(s) 113 associated with the concept query 111 and, in some instances, keywords 228 (e.g., keywords 228 contained in the keyword query 109 that may not have evaluated TRUE with any classification clause 133) to determine or find information (e.g., item listings 85) (e.g., “Items Found”).



FIG. 19 illustrates a method 304, according to an embodiment, to identify data items for browsing. Operations for the client machine 22 appear on the left and operations for the application server 28 appear on the right. At operation 306, the user enters a keyword query 109 that is communicated to the application server 28.


At operation 308, at the application server 28, the search applications 57 receive the keyword query 109 and generate a concept query 111 that includes one or more selected characteristics 113. For example, the search applications 57 may receive the keyword query “Black Nike Size 8” and generate a concept query 111 for the domain 130 “shoes” that includes three selected characteristics 113 (e.g., “Color—Black”, “Brand—Nike”, and “Shoe Size—8”). Next, the search applications 57 generate a user interface based on the selected characteristics 113 associated with the concept query 111.



FIG. 20 illustrates a method 310 to generate a user interface based on selected characteristics 113 and keyword(s) 228, according to an embodiment. The method 310 commences at operation 312 where the determining module 298 determines a set of information (e.g., item listings 85).



FIG. 21 illustrates a method 312, according to an example embodiment, to determine a set of item listings 85 based on the selected characteristics 113 and keyword(s) 228. The method 312 commences at operation 314 where the determining module 298 reads the item (e.g., tagged item information 93) from the classification engine 114 that may be associated with the domain 130 “shoes.”


At decision operation 318, the determining module 298 utilizes the keyword(s) 228 and the selected characteristics 113 associated with the concept query 111 to form an expression and to determine if the expression evaluates TRUE. For example, the determining module 298 may utilize “‘Color=Black’ AND ‘Brand=Nike’ AND ‘Shoe Size=8’” to evaluate the classification tags 97 and/or inference tags 99. In addition, the determining module 298 may utilize the keywords 228 (e.g., keywords 228 contained in the keyword query 109 that may not have evaluated TRUE with any classification clause 133) to evaluate the item information 120. If the expression (e.g., Boolean) evaluates TRUE then a branch is made to operation 324. Otherwise a branch is made to decision operation 322.


At operation 324, the determining module 298 registers the item as found (e.g., “Item Found”).


At decision operation 322, the determining module 298 determines if there are more items associated with the domain 130 “shoes” in the classification engine 114. If there are more items then a branch is made to operation 314. Otherwise the method ends.
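
A minimal sketch of the method 312 follows, assuming that tags are stored as concept/value pairs and that leftover keywords are matched against the item information 120 as substrings; both assumptions are illustrative.

```typescript
// Illustrative item shape: tags stand in for the classification tags 97 and
// inference tags 99, and description for the item information 120.
interface TaggedItem {
  domain: string;
  tags: Map<string, string>; // concept -> value, e.g., "Color" -> "Black"
  description: string;
}

// Method 312 (sketch): an item is registered as found (operation 324) when it
// belongs to the domain 130, every selected characteristic 113 matches one of
// its tags, and every leftover keyword 228 appears in its item information
// (decision operation 318).
function findItems(
  items: TaggedItem[],
  domain: string,
  selected: { concept: string; value: string }[],
  leftoverKeywords: string[],
): TaggedItem[] {
  return items.filter(
    (item) =>
      item.domain === domain &&                                    // operation 314
      selected.every((c) => item.tags.get(c.concept) === c.value) &&
      leftoverKeywords.every((k) =>
        item.description.toLowerCase().includes(k.toLowerCase())), // substring match is an assumption
  );
}
```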


Returning to FIG. 20, at operation 326, the determining module 298 determines the browsing sets 303 to display to the user based on the selected characteristics 113. For example, the determining module 298 may access the appropriate display instructions 302 to determine the most interesting browsing sets 303 sufficient to occupy the available room on the user interface.



FIG. 22 illustrates a method 326, according to an embodiment, to determine browsing sets 303. The method 326 commences at operation 313 with the determining module 298 reading the next browsing set 301 from the search metadata 152 based on the appropriate domain 130. For example, the determining module 298 may read the browsing set 301 that may be associated with the display instructions 302 that may be associated with the domain 130 “shoes.”


At operation 315, the determining module 298 reads the next selected characteristic 113. At decision operation 317, the determining module 298 compares the selected concept 115 associated with the selected characteristic 113 with the browsing concept 284 associated with the browsing set 301. If the selected concept 115 and the browsing concept 284 match, then the determining module 298 branches to operation 321 (e.g., do not display a browsing set that corresponds to a selected concept). Otherwise the determining module 298 branches to decision operation 319.


At decision operation 319, the determining module 298 determines if there are more selected characteristics 113. If there are more selected characteristics 113 then a branch is made to operation 315. Otherwise a branch is made to operation 321.


At operation 321, the determining module 298 registers the browsing set 301 to be displayed on the user interface.


At decision operation 323, the determining module 298 determines if another browsing set 303 may be displayed on the user interface. If another browsing set 303 may be displayed then a branch is made to decision operation 325. Otherwise processing ends.


At decision operation 325, the determining module 298 determines if there are more browsing sets 303. If there are more browsing sets 303 then a branch is made to operation 313. Otherwise processing ends.
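
The method 326 may be sketched as follows; the maxSets parameter is a hypothetical stand-in for the limited display area described above.

```typescript
interface BrowsingSet {
  concept: string;  // browsing concept 284
  values: string[]; // browsing values 286
}

// Method 326 (sketch): walk the browsing sets in the order given by the display
// instructions 302, skip any set whose concept is already a selected concept 115
// (decision operation 317), and stop when the display is full (decision
// operation 323).
function chooseBrowsingSets(
  orderedSets: BrowsingSet[],
  selectedConcepts: Set<string>,
  maxSets: number,
): BrowsingSet[] {
  const toDisplay: BrowsingSet[] = [];
  for (const set of orderedSets) {           // operation 313: read the next browsing set
    if (selectedConcepts.has(set.concept)) { // decision operation 317: concept already selected
      continue;                              // do not display a browsing set for a selected concept
    }
    toDisplay.push(set);                     // operation 321: register the set for display
    if (toDisplay.length >= maxSets) break;  // decision operation 323: no more room
  }
  return toDisplay;
}
```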


The above-described embodiment selects browsing sets 301 for presentation to the user based on the order in which the browsing sets 301 appear in the display instructions 302. Accordingly, the display instructions 302 determine a fixed order of interest for the display of browsing sets 301 to the user. In other embodiments, the fixed order of interest may be temporarily overridden by the cancellation of a selected characteristic 113. In this instance, the cancelled selected characteristic 113 may temporarily be considered to be of maximum interest to the user, and the corresponding browsing set 301 may therefore be displayed to the user subsequent to cancellation of the selected characteristic 113. Accordingly, the fixed order of interest may be temporarily overridden to accommodate a user that cancels a selected characteristic 113 and may want to select a different browsing value 286 from the corresponding browsing set 301.


Returning to FIG. 20, at operation 328 the generating module 300 generates a count for each of the browsing values 286 associated with the browsing sets 303 that may be displayed on the user interface.



FIG. 23 illustrates the method 328, according to an embodiment, to generate counts for browsing values 286. At operation 330, the generating module 300 reads the next item that may have been found based on the selected characteristics 113 and keyword(s) 228 (e.g., based on operation 324).


At operation 332, the generating module 300 reads the next browsing set 303 from the appropriate display instructions 302. For example, the appropriate display instructions 302 may be associated with a domain 130 that matches the domain 130 associated with the concept query 111.


At operation 333, the generating module 300 reads the next browsing value 286 associated with the current browsing set 303.


At decision operation 334, the generating module 300 evaluates the current item with an expression (e.g., Boolean) including the current browsing concept 284 and the current browsing value 286 (e.g., color=black). If the expression (e.g., Boolean) evaluates TRUE, then a branch is made to operation 336. Otherwise a branch is made to decision operation 337.


At operation 336, the generating module 300 increments the appropriate counter (e.g., the counter corresponding to the current browsing concept 284 (e.g., color) and the current browsing value 286 (e.g., black)).


At decision operation 337, the generating module 300 determines if there are more browsing values 286 associated with the current browsing set 303. If there are more browsing values 286, then a branch is made to operation 333. Otherwise a branch is made to decision operation 338.


At decision operation 338, the generating module 300 determines if there are more browsing sets 303. If there are more browsing sets 303 then a branch is made to operation 332. Otherwise a branch is made to decision operation 340.


At decision operation 340, the generating module 300 determines if there are more found items (e.g., found based on the selected characteristics 113, operation 324). If there are more found items then a branch is made to operation 330. Otherwise processing ends.
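
The counting loops of the method 328 may be sketched as follows, under the same illustrative tag representation used above.

```typescript
// Method 328 (sketch): nested loops over the found items (operation 330), the
// displayed browsing sets (operation 332), and their browsing values 286
// (operation 333), incrementing a counter whenever the item matches the
// concept/value pair (decision operation 334, operation 336).
function generateCounts(
  foundItems: { tags: Map<string, string> }[],
  browsingSets: { concept: string; values: string[] }[],
): Map<string, number> {
  const counts = new Map<string, number>(); // keyed as "concept=value", e.g., "Color=Black"
  for (const item of foundItems) {
    for (const set of browsingSets) {
      for (const value of set.values) {
        if (item.tags.get(set.concept) === value) {
          const key = `${set.concept}=${value}`;
          counts.set(key, (counts.get(key) ?? 0) + 1);
        }
      }
    }
  }
  return counts;
}
```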


Returning to FIG. 19, at operation 360, at the application server 28, the search applications 57 communicate the generated user interface to the client machine 22.


At operation 362, the client machine 22 displays the generated user interface to the user. FIG. 34 illustrates a generated user interface 363, according to an embodiment. The user interface 363 includes a keyword query 109 (e.g., “Black Nike Size 8”), a domain 130 (“Shoes”), a concept query 111 (e.g., Color=Black, Brand=Nike, Shoe Size=8), browsing concepts 284 (e.g., “Product Type”, “Shoe Style”, “Price Range”), browsing values 286 (e.g., “Men's Shoes”, “Women's Shoes”, etc.), counts 365 associated with each of the browsing values 286, and information (e.g., item listings 85) that may have been found based on the selected characteristics 113.


At the client machine 22, the user selects “Men's Shoes”, thereby indicating the selection of a browsing characteristic 287 (e.g., “Product Type—Men's Shoes”). Returning to FIG. 19, at operation 364, the client machine 22 communicates the browsing characteristic 287 selection to the application server 28.


At operation 372, at the application server 28, the determining module 298 receives the selected characteristics 113 associated with the concept query 111 and the browsing characteristic 287 and determines the cumulative selected characteristics 113. For example, the determining module 298 may determine the cumulative selected characteristics 113 to include “Color—Black”, “Brand—Nike”, “Shoe Size—8”, “Product Type—Men's Shoes.” Next, the determining module 298 and the generating module 300 may utilize the cumulative selected characteristics 113 and keyword(s) 228 to generate a user interface, as previously described in method 310 on FIG. 20.


At operation 374, the generated user interface is communicated to the client machine 22.


At operation 376, the client machine 22 receives and displays the generated user interface. FIG. 35 illustrates a generated user interface 378, according to an embodiment. The user interface 378 illustrates the additional selected characteristic 113, “Product Type—Men's Shoes.” In addition, the browsing set 303 associated with the browsing concept 284 “Shoe Width” has been added to the user interface 378, thereby providing the user with the three most interesting browsing sets 303 (e.g., “Shoe Width”, “Shoe Style”, “Price Range”) based on the cumulative selected characteristics 113. Each browsing set 303 is shown to be associated with a “select more” button 305 that may be selected to present additional browsing values 286 that may be associated with the browsing set 303. In addition, the user interface 378 is shown to include multiple browsing set buttons 307 (e.g., “Condition”, “Shoe Sub-Style”, “Buying Options”) that may be selected by the user to select the corresponding named browsing sets 303. It will be appreciated that the browsing set buttons 307, reading from left to right, provide the user with the next three most interesting browsing sets 303.


It should be noted that the counts 365 have been recomputed and the information (e.g., item listings 85) (e.g., “Items Found”) updated based on the cumulative selected characteristics 113 and keyword(s) 228. The user interface 378 further includes “Cancel” buttons 381 associated with each of the selected characteristics 113, thereby enabling the user to cancel a particular selected characteristic 113 without removing the remaining selected characteristics 113. In the present example, the user selects the “Cancel” button 381 associated with the selected characteristic 113 “Shoe Size—8”; however, it should be noted that the user may have selected a “Cancel” button 381 associated with any of the selected characteristics 113 (e.g., “Color—Black”, “Brand—Nike”, “Shoe Size—8”, or “Product Type—Men's Shoes”) and the remaining selected characteristics 113 may have been utilized to find information (e.g., item listings 85), determine the most interesting browsing sets 301 for display, and generate counts for the associated browsing values 286. Returning to FIG. 19, at operation 390, the client machine 22 communicates the concept query 111, the selected browsing characteristic 287 (e.g., “Product Type—Men's Shoes”) and the cancelled selected characteristic (e.g., “Shoe Size—8”) to the application server 28.


At operation 392, at the application server 28, the determining module 298 receives the concept query 111, the selected browsing characteristic 287 (e.g., “Product Type—Men's Shoes”) and the cancelled selected characteristic (e.g., “Shoe Size—8”) and determines the cumulative selected characteristics 113. For example, the determining module 298 may determine the cumulative selected characteristics 113 to include “Color—Black”, “Brand—Nike”, “Product Type—Men's Shoes.” Next, the determining module 298 and the generating module 300 may utilize the cumulative selected characteristics 113 to generate a user interface, as previously described in method 310 on FIG. 20.


At operation 394, the generated user interface is communicated to the client machine 22.


At operation 396, the client machine 22 receives and displays the generated user interface. FIG. 36 illustrates a generated user interface 398, according to an embodiment. The user interface 398 is shown to no longer include the cancelled selected characteristic 113 “Shoe Size—8.” In addition, the browsing set 303 that was associated with the browsing concept 284 “Shoe Width” has been displaced by the browsing set 303 associated with the browsing concept “Shoe Size” (e.g., thereby providing the user with the most interesting browsing sets 303 in accordance with the cumulative selected characteristics 113). Finally, the counts 365 have been recomputed and the information (e.g., item listings 85) (e.g., “Items Found”) have been updated based on the updated selected characteristics 113 (“Color—Black”, “Brand—Nike”, “Product Type—Men's Shoes”). In another embodiment the browsing options (e.g., browsing sets 301 and browsing set buttons 307) may be minimized to display additional information (e.g., item listings 85).


Dynamic Display



FIG. 37 illustrates a user interface 400, according to an embodiment, to minimize the display of browsing options. The user interface 400 minimizes the display of browsing options and maximizes the display of information (e.g., item listings 85) based on an item count dropping below a threshold level. For example, the user interface 400 may include a count of “20” for the items found 472, the count being determined by the determining module 298 to be below a configurable threshold, resulting in the minimization of browsing options on the user interface 400. To this end, the browsing sets 301 may not be displayed on the user interface 400, though the browsing set buttons 307 (e.g., “Condition”, “Shoe Sub-Style”, “Buying Options”) may continue to be displayed on the user interface 400. In place of the browsing sets 301, additional information (e.g., item listings 85) may be displayed. Accordingly, a count of item listings 85 that falls below a threshold may trigger the generation of a user interface that emphasizes found information (e.g., item listings 85) rather than browsing options.
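
The threshold test may be sketched as follows; the threshold value shown is an assumption, as the description states only that the threshold is configurable.

```typescript
// Hypothetical threshold; the description states only that it is configurable.
const ITEM_COUNT_THRESHOLD = 25;

// A count below the threshold suppresses the browsing sets 301 so that
// additional item listings 85 may be displayed in their place.
function shouldMinimizeBrowsingOptions(itemsFoundCount: number): boolean {
  return itemsFoundCount < ITEM_COUNT_THRESHOLD; // e.g., 20 < 25 triggers minimization
}
```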


Another embodiment of a system that receives a keyword query and generates a user interface that includes the keyword query, a concept query, browsing characteristics and information (e.g., item listings 85) may include a single Boolean evaluation graph. The Boolean evaluation graph may be utilized by the system to enhance the performance of Boolean evaluations. For example, the system may utilize the Boolean evaluation graph to evaluate a large set of selected characteristics 113 and keywords 228 against information (e.g., item listings 85) while minimizing the total number of evaluation events computed by the system. In yet another embodiment, the system may utilize a Boolean evaluation graph to evaluate a large set of browsing characteristics 287 against information (e.g., item listings 85).


Processing Back Button Selection



FIG. 24 is a block diagram illustrating user interfaces 401 and browser controls 403, according to an embodiment. The user interfaces 401 may be displayed at a client machine and include a user interface 407, a user interface 409, and a user interface 411. The user interface 409 may include a client application program (e.g., applet, JavaScript, etc.) that may execute at the client machine to generate and display the user interfaces 409 and 411. The browser controls 403 include a back button 405 that may be selected by a user thereby triggering a browser to display the previous user interface to the user.


The user interfaces 401 illustrate a problem due to interference between the client application program and the browser. For example, a user at the client machine may select a button 415 (e.g., “A”) from the user interface 407, thereby triggering the browser at the client machine to request the user interface 409 from a server. In response, the server communicates the user interface 409, including a client application program (e.g., JavaScript), to the client machine where the client application program executes to display the user interface 409 to the user. Next, the user, at the client machine, may select a button 415 (e.g., “B”) from the user interface 409 that may be processed by the client application program at the client machine to generate and display the user interface 411. If the user now selects the back button 405, then the browser may respond by accessing the server to retrieve and display the user interface 407 instead of the user interface 409 that the user expected. The browser responds in this manner because the browser operates without knowing that the client application program (e.g., JavaScript) has executed to update the display with the user interface 411.



FIG. 25 is a block diagram illustrating a system 420, according to an embodiment, to process a browser back button. The system 420 includes the network-based, computer-based system 12 that includes an application server 28 or server machine that communicates over a network 14 with a client machine 22, as previously described. The client machine 22 is shown to include a programmatic client 18 (e.g., browser), a hidden frame 432, hidden user interfaces 425, 427, 429, a visible frame 430, and a visible user interface 426 that includes a client application program 428 (e.g., script, program, applet, etc.) and user interface elements 448. The programmatic client 18 (e.g., browser) may be utilized to request the visible user interface 426 and hidden user interfaces 425, 427, 429 from the application server 28. In addition, the programmatic client 18 may be executed to generate additional user interfaces (not shown) for display in the visible frame 430 at the client machine 22. To this end, the visible and hidden frames 430, 432 may be respectively associated with data structures that are utilized by the programmatic client 18 and the client application program 428.


A frame is a browser construct that may be utilized to partition a particular area of the display. In the present example, the hidden frame 432 is not allocated an area of the display. Accordingly, the programmatic client 18 may request the hidden user interfaces 425, 427, 429 from the application server 28; however, displaying the hidden user interfaces 425, 427, 429 does not result in generating user interface elements that are visible to the user. In the present application, the hidden user interfaces 425, 427, 429 are utilized merely to enable proper processing of the back button 405. Further, the hidden user interfaces 425, 427, 429 are identified as static, thereby triggering the programmatic client 18 to store the hidden user interfaces 425, 427, 429 in a cache (not shown) at the client machine 22.


The computer-based system 12 is shown to include an application server 28 that includes search applications 57 that include a receiving module 422 and a communication module 424. The receiving module 422 receives requests for the visible user interface 426 and the hidden user interfaces 425, 427, 429 and generates the requested user interfaces 426, 425, 427, 429 or reads the requested user interfaces 426, 425, 427, 429 from a database 36. The communication module 424 communicates the visible and hidden user interfaces 426, 425, 427, 429 to the client machine 22.



FIG. 26 is a block diagram further illustrating software components associated with the client machine 22, according to an embodiment. The client machine 22 is shown to include the programmatic client 18 (e.g., browser), a cache 434, user interface history 436, a visible frame 430 and a hidden frame 432.


The cache 434 may be utilized by the programmatic client 18 to store and retrieve static user interfaces (e.g., hidden user interfaces 425, 427, 429) for the purpose of minimizing the delay between a request for a static user interface and an update of the display. Accordingly, a static user interface may be retrieved from the cache 434 instead of from the application server 28.


The user interface history 436 includes frame 438 and URL 431 pairs that may be stored by the programmatic client 18 to record user interfaces that have been displayed in the respective visible and hidden frames 430, 432. For example, in one embodiment, the user interface history 436 may operate like a stack whereby the programmatic client 18 may push a frame 438 and URL 431 pair onto the stack responsive to the user requesting a user interface (e.g., corresponding to the URL 431) to be displayed in the frame 438 (e.g., visible frame 430, hidden frame 432, etc.). Conversely, the programmatic client 18 may pop one or more frame 438 and URL 431 pairs off the stack responsive to the user selecting the back button, the programmatic client 18 redisplaying the previous user interface in the designated frame. Accordingly, the user interface history 436 may operate as a first-in last-out buffer thereby providing a mechanism that preserves the order of user interfaces selected by the user and enabling the user to review the user interfaces in backwards order responsive to the user repeatedly selecting the back button 405.
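
The stack behavior of the user interface history 436 may be sketched as follows; the names are illustrative, and the two-entry pop mirrors the back button processing described below with reference to FIG. 29C.

```typescript
// Illustrative names; entries correspond to the frame 438 and URL 431 pairs.
type FrameName = "visible" | "hidden";

interface HistoryEntry {
  frame: FrameName; // frame 438
  url: string;      // URL 431
}

class UserInterfaceHistory {
  private entries: HistoryEntry[] = [];

  // Push a pair when a user interface is requested for a frame (operation 457).
  push(frame: FrameName, url: string): void {
    this.entries.push({ frame, url });
  }

  // On a back button selection, pop the top two entries; the second identifies
  // the previous user interface to redisplay in the designated frame, and it is
  // pushed again when that user interface is re-requested.
  back(): HistoryEntry | undefined {
    this.entries.pop();        // the user interface being left
    return this.entries.pop(); // the previous user interface
  }
}
```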


The visible frame 430 and hidden frame 432 include programmatic client visible and hidden frame statuses 435, 437, visible and hidden frame objects 443, 445, client application program visible and hidden frame statuses 439, 441, the visible user interfaces 426, 466, 492 and the hidden user interfaces 425, 427, 429.


The programmatic client visible and hidden frame statuses 435, 437 respectively include URLs 447, 449. The programmatic client 18 may utilize the visible and hidden frame statuses 435, 437 to determine if the client application program 428 may have requested the programmatic client 18 to request a user interface from the application server 28 to be displayed in the respective frame 430, 432.


The visible and hidden frame objects 443, 445 each include URLs 451, 453 that may be monitored and updated by the programmatic client 18 and the client application program 428. The URLs 451, 453 indicate the requested or actual user interface displayed in the respective visible frames 430 and hidden frames 432.


The client application program visible and hidden frame statuses 439, 441 respectively include URLs 455, 467. The client application program 428 may utilize the visible and hidden frame statuses 439, 441 to determine if the programmatic client 18 may have updated the user interface associated with the respective visible frame 430 or hidden frame 432.


The visible user interfaces include the visible user interface 426, mode=Default, the visible user interface 466, mode=More, and the visible user interface 492, mode=All. The hidden user interfaces include the hidden user interface 425, mode=Default, the hidden user interface 427, mode=More, and the hidden user interface 429, mode=All.


The visible user interface 426 includes the client application program 428 and user interface elements 448 previously described. The user interface elements 448 may include graphics, text, or alphanumeric strings that may be displayed on the client machine 22 and when selected by the user may result in the communication of an event to the client application program 428. For example, the client application program 428 may receive an event that triggers the generation and display of the visible user interface 466 or 492 responsive to the user selecting a “MORE” or “ALL” user interface element 448, respectively.


The programmatic client 18 monitors the back button 405 and the URLs 451, 453 associated with the respective visible and hidden frame objects 443, 445. The programmatic client 18 may respond to a selection of the back button 405 or a change in the URLs 451, 453. The programmatic client 18 may respond to the selection of the back button 405 by utilizing the user interface history 436 to retrieve the requested user interface from the cache 434 or the application server 28. The programmatic client 18 may respond to a change in the URL 451 by retrieving the visible user interface identified by the URL 451 including the visible user interface 426, mode=Default. The programmatic client 18 may respond to a change in the URL 453 by retrieving the hidden user interface identified by the URL 453 including the hidden user interface 425, mode=Default, the hidden user interface 427, mode=More, or the hidden user interface 429, mode=All.


The client application program 428 responds to the user selecting the user interface elements 448 and monitors the URLs 451, 453 associated with the respective visible and hidden frame objects 443, 445. The client application program 428 may respond to the selection of a user interface element 448 by generating and displaying the visible user interface 426, mode=Default, or the visible user interface 466, mode=More, or the visible user interface 492, mode=All, in the visible frame 430 and by updating the corresponding URL 453 in the hidden frame object 445, thereby triggering the programmatic client 18 to retrieve the corresponding requested hidden user interface 425, 427, or 429.
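
The monitoring described above reduces to two URL comparisons, sketched here with illustrative names; the description specifies the comparisons, not a concrete interface.

```typescript
// Illustrative state; only the comparisons are described above, not a concrete API.
interface HiddenFrameState {
  browserStatusUrl: string; // URL 449 of the programmatic client hidden frame status 437
  appStatusUrl: string;     // URL 467 of the client application program hidden frame status 441
  frameObjectUrl: string;   // URL 453 of the hidden frame object 445
}

// Decision operation 477: the programmatic client detects a forward change when
// the client application program has advanced the hidden frame object URL.
function browserDetectsForwardChange(s: HiddenFrameState): boolean {
  return s.browserStatusUrl !== s.frameObjectUrl;
}

// Decision operation 481: the client application program detects a backward
// change (a processed back button selection) when the programmatic client has
// rewound the hidden frame object URL.
function appDetectsBackwardChange(s: HiddenFrameState): boolean {
  return s.appStatusUrl !== s.frameObjectUrl;
}
```

These comparisons allow the two programs to coordinate through the hidden frame 432 even though the browser is otherwise unaware of the user interfaces generated by the client application program 428.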



FIG. 27 is an interactive flow chart illustrating a method 450, according to an embodiment, to process a back button at a client machine 22. Illustrated on the right are operations performed by the programmatic client 18 and illustrated on the left are operations performed by the client application program 428. The method 450 commences, at operation 452, at the client machine 22 with the programmatic client 18 (e.g., browser) responding to a user entering a keyword query by communicating a request to the application server 28 for the corresponding visible user interface 426. For example, the request, without keywords, may include a URL for a visible user interface (mode=Default).



FIG. 28 is an interactive flow chart illustrating a method 452, according to an embodiment, to request a user interface. Illustrated on the right are operations performed at the application server 28 and illustrated on the left are operations performed at the client machine 22. The method 452 commences, at operation 457, at the client machine 22 where the programmatic client 18 pushes an entry onto the top of the user interface history 436 by storing the requested URL 431 (e.g., URL for the visible user interface (mode=Default)) and an associated frame 438. Further, the programmatic client 18 stores the URL for the visible user interface (mode=Default) in the appropriate frame, which in this example is the visible frame 430.


At decision operation 459, the programmatic client 18 determines if the requested user interface (e.g., corresponding to the URL) may be present in the cache 434. If the user interface is present in the cache 434, then a branch is made to operation 460. Otherwise a branch is made to operation 461.


At operation 461, the programmatic client 18 may communicate the request for the user interface to the application server 28.


At operation 463, at the application server 28, the receiving module 422 receives the request and generates the requested user interface or reads the requested user interface from a database 36.


At operation 467, the communication module 424 communicates the requested user interface to the client machine 22 where it may be stored in the cache 434 at operation 469.


At operation 471, the programmatic client 18 displays the user interface elements in the appropriate frame (e.g., hidden or visible) and the method 452 ends. In the present example, the programmatic client 18 displays the user interface elements 448 associated with the visible user interface 426 at the client machine 22. FIG. 38 illustrates the visible user interface 426 and browser controls 403, according to an embodiment. The browser controls 403 include a back button 405 that may be selected by the user to return to the previous user interface. The visible user interface 426 includes a concept query 111, browsing sets 303, and items found 472 including information (e.g., item listings 85), as previously described. In addition, each browsing set 303 includes browsing values 286, each of which may be associated with a count as previously described, and a more button 470 (e.g., “MORE”). The browsing values 286 may be selected to further narrow the search of items found 472. For example, selecting the price range $30.00-$40.00 may result in finding items that match the concept query (e.g., Color—Black, Brand—Nike, Size—8) in the selected price range (Price—$30.00-$40.00). The more button 470 may be selected by a user to display additional browsing values 286 with regard to a particular browsing set 303 (e.g., TOE TYPE, SHOE STYLE, PRICE RANGE).


Returning to FIG. 27, at the client machine 22, at operation 454, the programmatic client 18 invokes the client application program 428 (e.g., script).


At operation 474, the client application program 428 communicates a request to the programmatic client 18 to request the hidden user interface 425. For example, the request may include a URL for a hidden user interface (mode=Default). Next, the client application program 428 stores the above-described URL in the URL 467 of the client application program hidden frame status 441.


At operation 476, the programmatic client 18 requests the hidden user interface 425 by communicating the above described URL to the application server 28. For example, the method 452 may be utilized as previously described. Accordingly, after retrieval of the hidden user interface 425 the visible and hidden frame statuses 435, 437, the visible and hidden frame objects 443, 445, and the visible and hidden frame statuses 439, 441 each include a URL designating the “DEFAULT” mode.


At the operations 460, the client application program 428 and the programmatic client 18 monitor the URLs 451, 453 associated with the respective visible and hidden frame objects 443, 445; however, the monitoring may be preempted by user selections at the client machine 22.



FIG. 29A illustrates a method 490, according to an embodiment, to process a user selection of the “more” user interface element 448. The method 490 commences at the client machine 22, at operation 465, with the user selecting the “MORE” button 470 associated with the “Price Range” browsing set 303 on the user interface 426. In response, the client application program 428 generates and displays the visible user interface 466 (e.g., Mode=MORE). It should be noted that the client application program 428 generates and displays the visible user interface 466 without accessing the application server 28 or the cache 434.



FIG. 39 illustrates a visible user interface 466 and browser controls 403, according to an embodiment. The browser controls 403 include a back button 405. The visible user interface 466 includes a concept query 111, a single browsing set 303 associated with “Price Range” including additional (e.g., “MORE”) browsing values 286 and items found 472 that includes information (e.g., item listings 85). Each browsing value 286 may be associated with a check box 468 and a count. The user may select one or more check boxes 468 thereby further narrowing the search for information (e.g., item listings 85). For example, the user may select the check box 468 associated with the price range $5.00-$10.00 and the check box 468 associated with the price range $35.00-$40.00. Accordingly, the search for information (e.g., item listings 85) may include the following search criteria “color=black” and “brand=Nike” and “size=8” and ((price range=$5.00-$10.00) or (price range=$35.00-$40.00)).


Returning to FIG. 29A, at operation 474, the client application program 428 updates the URL 453 associated with the hidden frame object 445 and the URL 467 associated with the client application program hidden frame status 441 and the process ends. For example, the client application program 428 may store a URL for a hidden user interface (mode=More) in the URLs 453, 467.


Returning to FIG. 27, at the client machine 22, at decision operation 477, the programmatic client 18 determines if there has been a hidden frame 432 forward change. For example, the programmatic client 18 may compare the URL 449 associated with the programmatic client hidden frame status 437 with the URL 453 associated with the hidden frame object 445 to determine if the client application program 428 may be requesting a forward change of the hidden frame 432. If the URL 449 is different from the URL 453, then the client application program 428 may be requesting a forward change of the user interface associated with the hidden frame 432 and a branch is made to operation 478. Otherwise a branch is made to decision operation 481.


At operation 478, the programmatic client 18 requests the user interface identified with the URL 453 associated with the hidden frame object 445. For example, the programmatic client may utilize the method 452, as previously described on FIG. 28. Accordingly, the user does not perceive any change to the display at the client machine 22 because the hidden frame 432 does not include displayable user interface elements.



FIG. 29B illustrates a method 491, according to an embodiment, to process a user selection of the “ALL” user interface element 448. The method 491 commences at the client machine 22, at operation 480, with the user selecting the “ALL” button 473 on the visible user interface 466. In response, the client application program 428 generates and displays the visible user interface 492 without accessing the application server 28 or the cache 434.



FIG. 40 illustrates a visible user interface 492 and browser controls 403, according to an embodiment. The browser controls 403 include a back button 405. The visible user interface 492 includes the concept query 111, the browsing set 303 associated with “Price Range”, and the items found 472. The browsing set 303 includes “ALL” browsing values 286 associated with “Price Range.” Each browsing value 286 may be associated with a check box 468 and a count. The user may select one or more check boxes 468 thereby further narrowing the search for the items found 472.


Returning to FIG. 29B, at operation 484, the client application program 428 updates the URL 453 associated with the hidden frame object 445 and the URL 467 associated with the client application program hidden frame status 441 and the process ends. For example, the client application program 428 may store a URL for hidden user interface (mode=All) in the URLs 453, 467.


Returning to FIG. 27, at the client machine 22, at decision operation 477, the programmatic client 18 determines if there has been a forward change of the URL 453 associated with the hidden frame object 445, as previously described. If the programmatic client 18 determines there has been a forward change of the URL 453, then a branch is made to operation 478. Otherwise a branch is made to decision operation 481.


At operation 478, the programmatic client 18 requests the hidden user interface 429 based on the URL stored in the URL 453 associated with the hidden frame object 445. For example, the programmatic client 18 may utilize the method 452, as previously described on FIG. 28.



FIG. 29C illustrates a method 462, according to an embodiment, to process a user selection of the back button 405. The method 462 commences at the client machine 22, at operation 486, with the user selecting the back button 405 from the browser controls 403. In response, the programmatic client 18 may pop the top two entries off the user interface history 436, the second entry including the frame 438 and the URL 431 of the previous user interface displayed by the programmatic client 18. For example, the programmatic client 18 may determine that the previous user interface displayed in the hidden frame 432 may be identified with a URL for a hidden user interface (mode=More).


At operation 488, the programmatic client 18 requests the user interface 427 identified by the URL described above. For example, the method 452 may be utilized to request the user interface 427, as previously described.


Returning to FIG. 27, at the client machine 22, at decision operation 481, the client application program 428 determines if there has been a backward change of the URL 453 associated with the hidden frame object 445. For example, the client application program 428 may compare the URL 467 (e.g., associated with the client application program hidden frame status 441) with the URL 453 (e.g., associated with the hidden frame object 445) to determine if the programmatic client 18 processed a back button 405 request associated with the hidden frame 432. If the URL 467 is different from the URL 453 then a branch is made to operation 483. Otherwise a branch is made to decision operation 477.


At operation 483, the programmatic client 18 updates the visible frame 430 based on the user interface elements 448 identified by the URL 453. For example, a URL for a hidden user interface (mode=More) in the URL 453 may signal the programmatic client 18 to update the visible frame 430 with the visible user interface 466 that corresponds to the “MORE” mode.


For example, the visible frame may be updated with the visible user interface 466 as illustrated on FIG. 39, as previously described.



FIG. 41 shows a diagrammatic representation of a machine in the example form of a computer system 500 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example computer system 500 includes a processor 502 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 504, and a static memory 506, which communicate with each other via a bus 508. The computer system 500 may further include a video display unit 510 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 500 also includes an alphanumeric input device 512 (e.g., a keyboard), a cursor control device 514 (e.g., a mouse), a disk drive unit 516, a signal generation device 518 (e.g., a speaker) and a network interface device 520.


The disk drive unit 516 includes a machine-readable medium 522 on which is stored one or more sets of instructions (e.g., software 524) embodying any one or more of the methodologies or functions described herein. The software 524 may also reside, completely or at least partially, within the main memory 504 and/or within the processor 502 during execution thereof by the computer system 500, the main memory 504 and the processor 502 also constituting machine-readable media.


The software 524 may further be transmitted or received over a network 526 via the network interface device 520.


While the machine-readable medium 522 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media.


Thus, methods and systems to process a selection of a browser back button have been described. Although the present invention has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A method comprising: responsive to selection of a first element that is displayed in a browser as part of a first visible user interface, receiving, by the browser, a second visible user interface that includes a client application program; invoking, by the browser, the client application program to cause display of the second visible user interface in the browser; responsive to selection of a second element of the second visible user interface, generating a third visible user interface by the client application program and causing display of the third visible user interface by the client application program in the browser, the third visible user interface further having different graphical user interface elements from the second visible user interface; comparing, by the client application program, a user interface identifier that identifies the second visible user interface to an additional user interface identifier that identifies the third visible user interface generated by the client application program and further having the different graphical user interface elements, the additional user interface identifier being obtained from a hidden frame for the comparing, and the additional user interface identifier being included in the hidden frame prior to the selection of the second element to generate the third visible user interface; and displaying, by the client application program and based on the comparing, the second visible user interface in the browser by utilizing the user interface identifier.
  • 2. The method as described in claim 1, further comprising determining that the user interface identifier indicates a backward change from the additional user interface identifier based on the comparing.
  • 3. The method as described in claim 2, further comprising displaying the second visible user interface in the browser responsive to the determining.
  • 4. The method as described in claim 2, wherein the determining is performed by the client application program.
  • 5. The method as described in claim 2, wherein the determining is based on a hidden frame object of the hidden frame being updated to include the user interface identifier, the hidden frame object identifying a current user interface displayed in the hidden frame.
  • 6. The method as described in claim 1, wherein the comparing is performed based on receiving a selection of a single back navigation element of the browser.
  • 7. The method as described in claim 1, wherein the second visible user interface includes a set of the graphical user interface elements that facilitate narrowing searches for listed items.
  • 8. The method as described in claim 7, wherein the third visible user interface includes an additional set of the graphical user interface elements that further facilitate narrowing the searches for the listed items.
  • 9. The method as described in claim 8, wherein elements of the set of graphical user interface elements and the additional set of graphical user interface elements are individually selectable to specify a characteristic of the listed items to be returned in connection with a search.
  • 10. The method as described in claim 1, wherein the first visible user interface is received via a network from an application server and wherein the second visible user interface is received from the application server, the application server having generated the second visible user interface that includes the client application program.
  • 11. The method as described in claim 1, wherein the user interface identifier and the additional user interface identifier are URLs.
  • 12. The method as described in claim 1, wherein the additional user interface identifier is included in the hidden frame prior to the third visible user interface being navigated to.
  • 13. An apparatus comprising: a processor and executable instructions accessible on a computer-readable medium that, when executed, cause the processor to perform operations comprising: responsive to selection of a first element that is displayed in a browser as part of a first visible user interface, receiving, by the browser, a second visible user interface that includes a client application program; invoking, by the browser, the client application program to cause display of the second visible user interface in the browser; responsive to selection of a second element of the second visible user interface, generating a third visible user interface by the client application program and causing display of the third visible user interface by the client application program in the browser, the third visible user interface further having different graphical user interface elements from the second visible user interface; comparing, by the client application program, a user interface identifier that identifies the second visible user interface to an additional user interface identifier that identifies the third visible user interface generated by the client application program and further having the different graphical user interface elements, the additional user interface identifier being obtained from a hidden frame for the comparing, and the additional user interface identifier being included in the hidden frame prior to the selection of the second element to generate the third visible user interface; and displaying, by the client application program and based on the comparing, the second visible user interface in the browser by utilizing the user interface identifier.
  • 14. The apparatus as described in claim 13, wherein the operations further comprise determining that the user interface identifier indicates a backward change from the additional user interface identifier based on the comparing.
  • 15. The apparatus as described in claim 14, wherein the determining is performed by the client application program.
  • 16. The apparatus as described in claim 13, wherein the comparing is performed based on receiving a selection of a single back navigation element of the browser.
  • 17. The apparatus as described in claim 13, wherein the second visible user interface includes a set of the graphical user interface elements that facilitate narrowing searches for listed items.
  • 18. The apparatus as described in claim 17, wherein the third visible user interface includes an additional set of the graphical user interface elements that further facilitate narrowing the searches for the listed items.
  • 19. A machine readable medium having no transitory signals and storing instructions that, when executed by at least one processor, cause the at least one processor to perform actions comprising: responsive to selection of a first element that is displayed in a browser as part of a first visible user interface, receiving, by the browser, a second visible user interface that includes a client application program; invoking, by the browser, the client application program to cause display of the second visible user interface in the browser; responsive to selection of a second element of the second visible user interface, generating a third visible user interface by the client application program and causing display of the third visible user interface by the client application program in the browser, the third visible user interface further having different graphical user interface elements from the second visible user interface; comparing, by the client application program, a user interface identifier that identifies the second visible user interface to an additional user interface identifier that identifies the third visible user interface generated by the client application program and further having the different graphical user interface elements, the additional user interface identifier being obtained from a hidden frame for the comparing, and the additional user interface identifier being included in the hidden frame prior to the selection of the second element to generate the third visible user interface; and displaying, by the client application program and based on the comparing, the second visible user interface in the browser by utilizing the user interface identifier.
  • 20. The machine readable medium as described in claim 19, wherein the second visible user interface includes a set of the graphical user interface elements that facilitate narrowing searches for listed items.
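For illustration, claims 13 and 19 recite a hidden-frame technique in which each visible user interface has a user interface identifier (per claim 11, a URL), identifiers are written into a hidden frame so that browser history entries exist for interfaces the client application program drew itself, and a comparison between the identifier held by the hidden frame and the identifier of the interface currently on screen determines which earlier interface to redisplay after a back button selection. The TypeScript sketch below is a minimal, hypothetical rendering of that idea, not code from the specification; the names HiddenFrameHistory, push, render, and the /ui-state path are assumptions introduced solely for this example.

```typescript
// Hypothetical sketch of the hidden-frame back-button technique described in
// the claims. All names here (HiddenFrameHistory, push, render, "/ui-state")
// are illustrative assumptions, not taken from the specification. Assumes a
// same-origin page containing:
//   <iframe id="hidden-frame" style="display:none"></iframe>

type UiId = string; // a user interface identifier, e.g. a URL (see claim 11)

class HiddenFrameHistory {
  private current: UiId;

  constructor(
    private frame: HTMLIFrameElement,
    initial: UiId,
    private render: (id: UiId) => void, // redisplays the UI for an identifier
  ) {
    this.current = initial;
    // Every (re)load of the hidden frame fires "load", including reloads
    // triggered by the browser back button stepping through history.
    this.frame.addEventListener("load", () => this.onFrameLoad());
  }

  // Called by the client application whenever it displays a new visible UI.
  // Setting a new src on the hidden frame pushes a real browser history
  // entry, so the back button has states to step through even though the
  // visible UI was updated without the browser navigating.
  push(id: UiId): void {
    this.current = id;
    this.frame.src = `/ui-state?id=${encodeURIComponent(id)}`;
  }

  private onFrameLoad(): void {
    // Read the identifier the hidden frame now holds (same-origin access).
    const href = this.frame.contentWindow?.location.href;
    if (!href) return;
    const frameId = new URL(href).searchParams.get("id");
    // If the frame's identifier differs from the identifier of the UI on
    // screen, the browser navigated (e.g. the back button was selected):
    // redisplay the earlier user interface that the frame identifies.
    if (frameId && frameId !== this.current) {
      this.current = frameId;
      this.render(frameId);
    }
  }
}

// Hypothetical usage: push an identifier each time the user narrows a search.
// const frame = document.getElementById("hidden-frame") as HTMLIFrameElement;
// const nav = new HiddenFrameHistory(frame, "results", redrawSearchPage);
// nav.push("results/refined"); // user selects a narrowing element
```

In this sketch the hidden frame's load event serves as the navigation signal: because the client application program updates the visible interface without the browser navigating, the identifier comparison in onFrameLoad is what associates a back button selection with the earlier user interface identifier and triggers redisplay of that interface.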
RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. patent application Ser. No. 14/809,496, filed Jul. 27, 2015, which is a continuation of and claims priority to U.S. patent application Ser. No. 11/241,824, filed Sep. 29, 2005, which claims the benefit of priority to U.S. Provisional Application Ser. No. 60/666,549, filed Mar. 30, 2005. The benefit of priority of each of these applications is claimed hereby and each is incorporated by reference herein in its entirety.

US Referenced Citations (413)
Number Name Date Kind
5175681 Iwai et al. Dec 1992 A
5537586 Amram et al. Jul 1996 A
5596554 Hagadorn et al. Jan 1997 A
5740425 Povilus Apr 1998 A
5787417 Hargrove Jul 1998 A
5822123 Davis et al. Oct 1998 A
5842218 Robinson Nov 1998 A
5844554 Geller et al. Dec 1998 A
5845278 Kirsch et al. Dec 1998 A
5848424 Scheinkman et al. Dec 1998 A
5950173 Perkowski Sep 1999 A
5969717 Ikemoto Oct 1999 A
6026413 Challenger Feb 2000 A
6029141 Bezos et al. Feb 2000 A
6038560 Wical Mar 2000 A
6044363 Mori et al. Mar 2000 A
6055569 O'Brien Apr 2000 A
6076088 Paik et al. Jun 2000 A
6083276 Davidson et al. Jul 2000 A
6119101 Peckover Sep 2000 A
6122648 Roderick Sep 2000 A
6128600 Imamura et al. Oct 2000 A
6144958 Ortega et al. Nov 2000 A
6151601 Papierniak et al. Nov 2000 A
6154213 Rennison et al. Nov 2000 A
6185558 Bowman et al. Feb 2001 B1
6216264 Maze et al. Apr 2001 B1
6230156 Hussey May 2001 B1
6233571 Egger et al. May 2001 B1
6237030 Adams et al. May 2001 B1
6256623 Jones Jul 2001 B1
6266649 Linden et al. Jul 2001 B1
6279016 De et al. Aug 2001 B1
6311194 Sheth et al. Oct 2001 B1
6313854 Gibson Nov 2001 B1
6321179 Glance et al. Nov 2001 B1
6345315 Mishra Feb 2002 B1
6351755 Najork et al. Feb 2002 B1
6366914 Stern Apr 2002 B1
6385602 Tso et al. May 2002 B1
6397217 Melbin May 2002 B1
6400996 Hoffberg et al. Jun 2002 B1
6401084 Ortega et al. Jun 2002 B1
6410084 Klare et al. Jun 2002 B1
6434556 Levin et al. Aug 2002 B1
6453312 Goiffon et al. Sep 2002 B1
6453339 Schultz et al. Sep 2002 B1
6460058 Koppolu et al. Oct 2002 B2
6460060 Maddalozzo, Jr. Oct 2002 B1
6469719 Kino et al. Oct 2002 B1
6470383 Leshem et al. Oct 2002 B1
6476833 Moshfeghi Nov 2002 B1
6490567 Gregory Dec 2002 B1
6493000 Wynn et al. Dec 2002 B1
6498795 Zhang et al. Dec 2002 B1
6510434 Anderson et al. Jan 2003 B1
6526479 Rosenzweig Feb 2003 B2
6539377 Culliss Mar 2003 B1
6549935 Lapstun et al. Apr 2003 B1
6553404 Stern Apr 2003 B2
6567797 Schuetze et al. May 2003 B1
6574609 Downs et al. Jun 2003 B1
6587837 Spagna et al. Jul 2003 B1
6597381 Eskridge et al. Jul 2003 B1
6598026 Ojha et al. Jul 2003 B1
6606619 Ortega et al. Aug 2003 B2
6611812 Hurtado et al. Aug 2003 B2
6631522 Erdelyi Oct 2003 B1
6633316 Maddalozzo, Jr. et al. Oct 2003 B1
6641037 Williams Nov 2003 B2
6647383 August et al. Nov 2003 B1
6665379 Brown et al. Dec 2003 B1
6667751 Wynn et al. Dec 2003 B1
6668273 Rust Dec 2003 B1
6697800 Jannink et al. Feb 2004 B1
6697823 Otsuka et al. Feb 2004 B2
6697824 Bowman-Amuah Feb 2004 B1
6727927 Dempski et al. Apr 2004 B1
6732088 Glance May 2004 B1
6760720 De Bellis Jul 2004 B1
6760890 Makinen Jul 2004 B2
6763347 Zhang et al. Jul 2004 B1
6766352 McBrearty Jul 2004 B1
6785671 Bailey et al. Aug 2004 B1
6801229 Tinkler Oct 2004 B1
6810333 Adedeji et al. Oct 2004 B2
6820111 Rubin et al. Nov 2004 B1
6853982 Smith et al. Feb 2005 B2
6856967 Woolston et al. Feb 2005 B1
6856970 Campbell et al. Feb 2005 B1
6871198 Neal et al. Mar 2005 B2
6876997 Rorex et al. Apr 2005 B1
6892193 Bolle et al. May 2005 B2
6904410 Weiss et al. Jun 2005 B1
6907574 Xu et al. Jun 2005 B2
6920459 Dedhia et al. Jul 2005 B2
6925444 Mccollom et al. Aug 2005 B1
6928425 Grefenstette et al. Aug 2005 B2
6948120 Delgobbo et al. Sep 2005 B1
6963867 Ford et al. Nov 2005 B2
6976053 Tripp et al. Dec 2005 B1
6978264 Chandrasekar et al. Dec 2005 B2
6978445 Laane Dec 2005 B2
6990534 Mikhailov et al. Jan 2006 B2
7013423 Blaschke et al. Mar 2006 B2
7039709 Beadle et al. May 2006 B1
7085736 Keezer et al. Aug 2006 B2
7099885 Hellman et al. Aug 2006 B2
7114128 Koppolu et al. Sep 2006 B2
7124041 Johnson et al. Oct 2006 B1
7127416 Tenorio Oct 2006 B1
7146627 Ismail et al. Dec 2006 B1
7197475 Lorenzen et al. Mar 2007 B1
7203615 Mao et al. Apr 2007 B2
7203675 Papierniak et al. Apr 2007 B1
7225199 Green et al. May 2007 B1
7228298 Raines Jun 2007 B1
7228301 Meyerzon et al. Jun 2007 B2
7277879 Varadarajan Oct 2007 B2
7295995 York et al. Nov 2007 B1
7328216 Hofmann et al. Feb 2008 B2
7349913 Clark et al. Mar 2008 B2
7360174 Grossman et al. Apr 2008 B2
7362311 Filner et al. Apr 2008 B2
7366721 Bennett et al. Apr 2008 B1
7392293 Leonik Jun 2008 B2
7395502 Gibbons, Jr. et al. Jul 2008 B2
7421421 Newbold et al. Sep 2008 B2
7429987 Leah et al. Sep 2008 B2
7430739 Vellanki et al. Sep 2008 B2
7447646 Agarwal et al. Nov 2008 B1
7480872 Ubillos Jan 2009 B1
7526425 Marchisio et al. Apr 2009 B2
7536672 Ruehle May 2009 B1
7546539 Kibilov et al. Jun 2009 B2
7564377 Kimchi et al. Jul 2009 B2
7613687 Nye Nov 2009 B2
7627561 Pell et al. Dec 2009 B2
7634461 Oral et al. Dec 2009 B2
7640267 Spivack et al. Dec 2009 B2
7673340 Cohen et al. Mar 2010 B1
7685074 Linden et al. Mar 2010 B2
7720315 Mirtich et al. May 2010 B2
7761791 Kobashi et al. Jul 2010 B2
7831601 Oral et al. Nov 2010 B2
7882447 Chandler et al. Feb 2011 B2
7886156 Franchi Feb 2011 B2
7890533 Pollara Feb 2011 B2
7996282 Scott et al. Aug 2011 B1
7996385 Rowney et al. Aug 2011 B2
8001003 Robinson et al. Aug 2011 B1
8086502 Krishnamurthy et al. Dec 2011 B2
8090698 Billingsley et al. Jan 2012 B2
8126907 Knighton et al. Feb 2012 B2
8194985 Grigsby et al. Jun 2012 B2
8245128 Ahad Aug 2012 B1
8260656 Harbick et al. Sep 2012 B1
8260764 Gruber Sep 2012 B1
8261196 Oral et al. Sep 2012 B2
8296666 Wright et al. Oct 2012 B2
8301512 Hamilton et al. Oct 2012 B2
8306872 Inoue et al. Nov 2012 B2
8347211 Rogers, III et al. Jan 2013 B1
8402068 Clendinning et al. Mar 2013 B2
8489438 Ibrahim Jul 2013 B1
8688535 Yuan Apr 2014 B2
8768937 Clendinning et al. Jul 2014 B2
8819039 Grove et al. Aug 2014 B2
8849693 Koyfman et al. Sep 2014 B1
8863002 Chandler et al. Oct 2014 B2
8924377 Wade et al. Dec 2014 B2
8965788 Gonsalves et al. Feb 2015 B2
8972899 Carlsson et al. Mar 2015 B2
9058332 Darby et al. Jun 2015 B1
9076173 Hamilton et al. Jul 2015 B2
9134884 Baird-Smith Sep 2015 B2
9171056 Clendinning et al. Oct 2015 B2
9262056 Leon et al. Feb 2016 B2
9286412 Maslovskis Mar 2016 B2
9412128 Clendinning et al. Aug 2016 B2
9436660 Carreno-Fuentes et al. Sep 2016 B2
9710841 Ainsworth et al. Jul 2017 B2
9740996 Gao Aug 2017 B2
10395306 Shahabi et al. Aug 2019 B1
10497051 Leon et al. Dec 2019 B2
10559027 Baird-Smith Feb 2020 B2
10691868 Lawson et al. Jun 2020 B1
20010011276 Durst, Jr. et al. Aug 2001 A1
20010031251 Bologna et al. Oct 2001 A1
20010032130 Gabos et al. Oct 2001 A1
20010032163 Fertik et al. Oct 2001 A1
20010037251 Nojima et al. Nov 2001 A1
20020004735 Gross Jan 2002 A1
20020010625 Smith et al. Jan 2002 A1
20020010794 Stanbach, Jr. et al. Jan 2002 A1
20020015057 Park et al. Feb 2002 A1
20020016839 Smith et al. Feb 2002 A1
20020026353 Porat et al. Feb 2002 A1
20020040359 Green et al. Apr 2002 A1
20020052894 Bourdoncle et al. May 2002 A1
20020055878 Burton et al. May 2002 A1
20020057297 Grimes et al. May 2002 A1
20020060702 Sugimoto et al. May 2002 A1
20020063738 Chung May 2002 A1
20020068500 Gabai et al. Jun 2002 A1
20020077900 Thompson et al. Jun 2002 A1
20020080157 Chickles et al. Jun 2002 A1
20020082893 Barts et al. Jun 2002 A1
20020082953 Batham et al. Jun 2002 A1
20020083448 Johnson Jun 2002 A1
20020087558 Bailey et al. Jul 2002 A1
20020103789 Turnbull et al. Aug 2002 A1
20020103791 Janz et al. Aug 2002 A1
20020103797 Goel et al. Aug 2002 A1
20020107861 Clendinning et al. Aug 2002 A1
20020120554 Vega Aug 2002 A1
20020120619 Marso et al. Aug 2002 A1
20020123912 Subramanian et al. Sep 2002 A1
20020129002 Alberts et al. Sep 2002 A1
20020135567 Chan Sep 2002 A1
20020138624 Esenther Sep 2002 A1
20020152001 Knipp et al. Oct 2002 A1
20020152137 Lindquist et al. Oct 2002 A1
20020152222 Holbrook Oct 2002 A1
20020154157 Sherr et al. Oct 2002 A1
20020156688 Horn et al. Oct 2002 A1
20020161662 Bredow et al. Oct 2002 A1
20020184111 Swanson Dec 2002 A1
20020188527 Dillard et al. Dec 2002 A1
20020194081 Perkowski Dec 2002 A1
20020194087 Spiegel et al. Dec 2002 A1
20030009385 Tucciarone et al. Jan 2003 A1
20030009495 Adjaoute Jan 2003 A1
20030014442 Shiigi Jan 2003 A1
20030018652 Heckerman et al. Jan 2003 A1
20030020449 Smith Jan 2003 A1
20030025737 Breinberg Feb 2003 A1
20030033161 Walker et al. Feb 2003 A1
20030033336 Gremmert Feb 2003 A1
20030036964 Boyden et al. Feb 2003 A1
20030050893 Hirabayashi Mar 2003 A1
20030050916 Ortega et al. Mar 2003 A1
20030055806 Wong et al. Mar 2003 A1
20030058271 Van Der Meulen Mar 2003 A1
20030061147 Fluhr et al. Mar 2003 A1
20030061449 Beyda Mar 2003 A1
20030066031 Laane Apr 2003 A1
20030071832 Branson Apr 2003 A1
20030074368 Schuetze et al. Apr 2003 A1
20030074369 Schuetze et al. Apr 2003 A1
20030078747 Sutton Apr 2003 A1
20030083961 Bezos et al. May 2003 A1
20030093285 Colace et al. May 2003 A1
20030093585 Allan May 2003 A1
20030095141 Shah et al. May 2003 A1
20030105682 Dicker et al. Jun 2003 A1
20030105853 Morito et al. Jun 2003 A1
20030115167 Sharif et al. Jun 2003 A1
20030123850 Jun et al. Jul 2003 A1
20030132911 Narioka et al. Jul 2003 A1
20030149706 Neal et al. Aug 2003 A1
20030151621 McEvilly Aug 2003 A1
20030154044 Lundstedt et al. Aug 2003 A1
20030158893 Komatsu et al. Aug 2003 A1
20030167213 Jammes et al. Sep 2003 A1
20030171911 Fairweather Sep 2003 A1
20030172060 Uchikado Sep 2003 A1
20030172357 Kao et al. Sep 2003 A1
20030182310 Charnock et al. Sep 2003 A1
20030195877 Ford et al. Oct 2003 A1
20030200156 Roseman et al. Oct 2003 A1
20030204449 Kotas et al. Oct 2003 A1
20030216971 Sick et al. Nov 2003 A1
20030217052 Rubenczyk et al. Nov 2003 A1
20030233350 Dedhia et al. Dec 2003 A1
20040001104 Sommerer et al. Jan 2004 A1
20040004633 Perry et al. Jan 2004 A1
20040006602 Bess et al. Jan 2004 A1
20040015416 Foster et al. Jan 2004 A1
20040019536 Ashkenazi et al. Jan 2004 A1
20040030692 Leitermann Feb 2004 A1
20040054672 Tsuchitani et al. Mar 2004 A1
20040061706 Cronin et al. Apr 2004 A1
20040070627 Shahine et al. Apr 2004 A1
20040078457 Tindal Apr 2004 A1
20040083206 Wu et al. Apr 2004 A1
20040107194 Thorpe Jun 2004 A1
20040117271 Knight et al. Jun 2004 A1
20040128320 Grove et al. Jul 2004 A1
20040133876 Sproule Jul 2004 A1
20040139059 Conroy et al. Jul 2004 A1
20040148611 Manion et al. Jul 2004 A1
20040153378 Perkowski Aug 2004 A1
20040163039 Gorman Aug 2004 A1
20040168118 Wong et al. Aug 2004 A1
20040177147 Joshi Sep 2004 A1
20040177319 Horn Sep 2004 A1
20040187027 Chan Sep 2004 A1
20040189708 Larcheveque et al. Sep 2004 A1
20040205334 Rennels Oct 2004 A1
20040210479 Perkowski et al. Oct 2004 A1
20040225647 Connelly Nov 2004 A1
20040243478 Walker et al. Dec 2004 A1
20040243485 Borenstein et al. Dec 2004 A1
20040249794 Nelson et al. Dec 2004 A1
20040254851 Himeno et al. Dec 2004 A1
20040254855 Shah Dec 2004 A1
20040254950 Musgrove et al. Dec 2004 A1
20040260621 Foster et al. Dec 2004 A1
20040260677 Malpani et al. Dec 2004 A1
20040261016 Glass et al. Dec 2004 A1
20040267731 Gino Monier et al. Dec 2004 A1
20050010871 Ruthfield et al. Jan 2005 A1
20050028083 Kircher et al. Feb 2005 A1
20050030322 Gardos Feb 2005 A1
20050033849 Matz Feb 2005 A1
20050039111 Abe et al. Feb 2005 A1
20050044066 Hooper et al. Feb 2005 A1
20050060324 Johnson et al. Mar 2005 A1
20050060663 Arkeketa et al. Mar 2005 A1
20050065981 Blinn et al. Mar 2005 A1
20050071251 Linden et al. Mar 2005 A1
20050080769 Gemmell et al. Apr 2005 A1
20050114682 Zimmer et al. May 2005 A1
20050114782 Klinger May 2005 A1
20050132297 Milic-frayling et al. Jun 2005 A1
20050144073 Morrisroe et al. Jun 2005 A1
20050149269 Thomas et al. Jul 2005 A1
20050149972 Knudson Jul 2005 A1
20050171627 Funk et al. Aug 2005 A1
20050197141 Jiang et al. Sep 2005 A1
20050201562 Becker et al. Sep 2005 A1
20050204292 Kibilov Sep 2005 A1
20050222981 Lawrence et al. Oct 2005 A1
20050235310 Bies Oct 2005 A1
20050251409 Johnson et al. Nov 2005 A1
20050257131 Lim et al. Nov 2005 A1
20050267869 Horvitz et al. Dec 2005 A1
20050289039 Greak Dec 2005 A1
20050289105 Cosic Dec 2005 A1
20050289168 Green et al. Dec 2005 A1
20060020360 Wu Jan 2006 A1
20060031209 Ahlberg et al. Feb 2006 A1
20060031216 Semple et al. Feb 2006 A1
20060031778 Goodwin Feb 2006 A1
20060036639 Bauerle et al. Feb 2006 A1
20060036715 Ghattu Feb 2006 A1
20060047752 Hornby Mar 2006 A1
20060059440 Pry Mar 2006 A1
20060085477 Phillips et al. Apr 2006 A1
20060106683 Fisher et al. May 2006 A1
20060106859 Eugene et al. May 2006 A1
20060143158 Ruhl et al. Jun 2006 A1
20060155711 Jackson et al. Jul 2006 A1
20060167931 Bobick et al. Jul 2006 A1
20060190352 Zeidman Aug 2006 A1
20060200066 Fischer et al. Sep 2006 A1
20060224406 Leon Oct 2006 A1
20060224571 Leon et al. Oct 2006 A1
20060224938 Fikes et al. Oct 2006 A1
20060224954 Chandler et al. Oct 2006 A1
20060224960 Baird-Smith Oct 2006 A1
20060265391 Posner et al. Nov 2006 A1
20060277212 Error et al. Dec 2006 A1
20070011304 Error Jan 2007 A1
20070027887 Baldwin Feb 2007 A1
20070112968 Schwab May 2007 A1
20070118441 Chatwani et al. May 2007 A1
20070130207 Pate et al. Jun 2007 A1
20070150365 Bolivar Jun 2007 A1
20070179848 Jain et al. Aug 2007 A1
20070192362 Caballero et al. Aug 2007 A1
20070203902 Bauerle et al. Aug 2007 A1
20070214431 Amadio et al. Sep 2007 A1
20070252603 Restrepo et al. Nov 2007 A1
20070255810 Shuster Nov 2007 A1
20070260495 Mace et al. Nov 2007 A1
20080007625 Reid et al. Jan 2008 A1
20080027830 Johnson et al. Jan 2008 A1
20080059331 Schwab Mar 2008 A1
20080071829 Monsarrat Mar 2008 A1
20080077642 Carbone et al. Mar 2008 A1
20080208857 Liu et al. Aug 2008 A1
20080278598 Greenberg et al. Nov 2008 A1
20080313060 Damodaran Dec 2008 A1
20090082882 Parfitt Mar 2009 A1
20090113475 Li Apr 2009 A1
20090172517 Kalicharan Jul 2009 A1
20090216627 Carden et al. Aug 2009 A1
20090304267 Tapley et al. Dec 2009 A1
20100076612 Robertson Mar 2010 A1
20100076867 Inoue et al. Mar 2010 A1
20100086192 Grigsby et al. Apr 2010 A1
20100214302 Melcher et al. Aug 2010 A1
20100250399 Williams et al. Sep 2010 A1
20100287382 Gyorffy et al. Nov 2010 A1
20110012930 Davis et al. Jan 2011 A1
20110093494 Chandler et al. Apr 2011 A1
20110099085 Hamilton et al. Apr 2011 A1
20120110515 Abramoff et al. May 2012 A1
20120265744 Berkowitz et al. Oct 2012 A1
20130198183 Clendinning et al. Aug 2013 A1
20130297437 Hamilton et al. Nov 2013 A1
20140304220 Clendinning et al. Oct 2014 A1
20140372244 Grove et al. Dec 2014 A1
20150020017 Chandler et al. Jan 2015 A1
20150331591 Baird-Smith Nov 2015 A1
20160027086 Clendinning et al. Jan 2016 A1
20160155185 Leon et al. Jun 2016 A1
20180144392 Johnson et al. May 2018 A1
20180232757 Leffelman Aug 2018 A1
20200065886 Leon et al. Feb 2020 A1
20200104908 Shahabi et al. Apr 2020 A1
Foreign Referenced Citations (17)
Number Date Country
1293785 May 2001 CN
1333896 Jan 2002 CN
1335949 Feb 2002 CN
1351302 May 2002 CN
106021435 Oct 2016 CN
1841195 Oct 2007 EP
2002-99772 Apr 2002 JP
2001037540 May 2001 WO
2002054292 Jul 2002 WO
2002054292 Nov 2003 WO
2004097562 Nov 2004 WO
2006107333 Oct 2006 WO
2006107335 Oct 2006 WO
2007061975 May 2007 WO
2007078560 Jul 2007 WO
2007061975 Oct 2007 WO
2007078560 Oct 2007 WO
Non-Patent Literature Citations (378)
Entry
“eBay Australia: Help New to ebay Selling”, Available online on URL: <https://web.archive.org/web/19991128111541/http://pages.ebay.com.au/help/basics/nselling.html>, Nov. 28, 1999.
“Flamenco Search Interface Project”, Retrieved from the Internet: <URL: http://flamenco.berkeley.edu/pubs.html>, Oct. 2008, 2 pages.
“gumtree.com.au”, Available Online on URL: <https://web.archive.org/web/20040413091413/http://www.gumtree.com.au/>, Apr. 6, 2004, 2 pages.
“Internet Explorer Screen Dumps”, Microsoft.com, version 6.0 SP2, Dec. 2004, 6 pages.
Appeal filed for European Patent Application No. 05802369.8 on Apr. 22, 2014, 2 pages.
Decision to Refuse for European Patent No. 05802369.8 dated Feb. 20, 2014, 10 pages.
Grounds of Appeal filed on Jun. 30, 2014, for European Patent Application No. 05802369.8, 17 pages.
Office Action received for European Patent Application No. 05802369.8, dated Dec. 22, 2009, 4 pages.
Office Action received for European Patent Application No. 05802369.8, dated Feb. 14, 2014, 15 pages.
Office Action received for European Patent Application No. 05802369.8, dated Jan. 13, 2014, 2 pages.
Office Action received for European Patent Application No. 05802369.8, dated Dec. 6, 2011, 7 pages.
Office Action received for European Patent Application No. 05802369.8, dated Oct. 23, 2012, 5 pages.
Response to Communication pursuant to Articles 94(3) EPC filed on Feb. 22, 2013, for European Patent Application No. 05802369.8, dated Oct. 23, 2012, 10 pages.
Response to Office Action for European Patent Application No. 05802369.8, filed on Oct. 8, 2010, 4 pages.
Response to Summons filed on Dec. 12, 2013 for European Patent No. 05802369.8, 11 pages.
Summons to Attend Oral Proceedings received for European Patent Application No. 05802369.8, mailed on Sep. 27, 2018, 9 pages.
Summons to Attend Oral Proceedings for European Patent No. 05802369.8 mailed on Sep. 20, 2013, 5 pages.
Office Action received for Korean Patent Application No. 10-2007-7025209, dated Dec. 2, 2009, 6 pages (with English Translation of Claims).
Response to Office Action for Korean Patent Application No. 10-2007-7025209, filed on Aug. 3, 2009, 48 pages.
Advisory Action received for U.S. Appl. No. 10/648,125, dated Jun. 9, 2005, 4 pages.
Advisory Action received for U.S. Appl. No. 10/648,125, dated Feb. 11, 2009, 3 Pages.
Appeal Brief filed on Jul. 16, 2010, for U.S. Appl. No. 10/648,125, 35 Pages.
Appeal Decision received for U.S. Appl. No. 10/648,125, mailed on Jan. 30, 2014, 12 Pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 10/648,125, dated Jun. 12, 2014, 1 Page.
Applicant Initiated Interview Summary received for U.S. Appl. No. 10/648,125, dated Jan. 6, 2006, 3 pages.
Decision on Pre-Appeal Brief Request received for U.S. Appl. No. 10/648,125, mailed on Apr. 16, 2010, 2 pages.
Examiner Initiated Interview Summary received for U.S. Appl. No. 10/648,125, dated Mar. 28, 2014, 2 Pages.
Examiner Initiated Interview Summary received for U.S. Appl. No. 10/648,125, dated Oct. 16, 2008, 2 Pages.
Examiner's Answer to Appeal Brief for U.S. Appl. No. 10/648,125, mailed on Oct. 14, 2010, 47 Pages.
Final Office Action received for U.S. Appl. No. 10/648,125, dated Dec. 4, 2008, 33 pages.
Final Office Action received for U.S. Appl. No. 10/648,125, dated Jan. 4, 2010, 39 Pages.
Final Office Action received for U.S. Appl. No. 10/648,125, dated Jan. 9, 2006, 29 Pages.
Final Office Action received for U.S. Appl. No. 10/648,125, dated Oct. 10, 2008, 56 pages.
Final Office Action received for U.S. Appl. No. 10/648,125 dated Mar. 24, 2005, 19 Pages.
Final Office Action received for U.S. Appl. No. 10/648,125, dated Jul. 24, 2007, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 10/648,125, dated Aug. 25, 2005, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 10/648,125, dated Feb. 21, 2008, 53 Pages.
Non-Final Office Action received for U.S. Appl. No. 10/648,125, dated May 13, 2009, 36 Pages.
Non-Final Office Action received for U.S. Appl. No. 10/648,125, dated Dec. 7, 2006, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 10/648,125, dated Sep. 1, 2004, 22 Pages.
Notice of Allowance received for U.S. Appl. No. 10/648,125, dated Apr. 21, 2014, 14 pages.
Pre-Appeal Brief Request received for U.S. Appl. No. 10/648,125, mailed on Mar. 4, 2010, 5 pages.
Reply Brief filed on Mar. 26, 2014, for U.S. Appl. No. 10/648,125, 65 Pages.
Reply Brief filed on Nov. 2, 2010, for U.S. Appl. No. 10/648,125, 10 pages.
Response to Advisory Action filed on Jul. 25, 2005, for U.S. Appl. No. 10/648,125, dated Jun. 9, 2005, 9 Pages.
Response to Final Office Action filed on Feb. 4, 2009, for U.S. Appl. No. 10/648,125, dated Dec. 4, 2008, 13 pages.
Response to Final Office Action filed on Jul. 12, 2006, for U.S. Appl. No. 10/648,125, dated Jan. 9, 2006, 14 pages.
Response to Final Office Action filed on Mar. 4, 2009, for U.S. Appl. No. 10/648,125, dated Dec. 4, 2008, 14 pages.
Response to Final Office Action filed on May 26, 2005, for U.S. Appl. No. 10/648,125, dated Mar. 24, 2005, 21 Pages.
Response to Final Office Action filed on Oct. 30, 2007, for U.S. Appl. No. 10/648,125, dated Jul. 24, 2007, 13 pages.
Notice of Allowance received for U.S. Appl. No. 14/994,355, dated Apr. 17, 2019, 7 pages.
Notice of Allowance received for U.S. Appl. No. 14/994,355, dated Jul. 11, 2019, 7 pages.
Response to Final Office Action filed on Feb. 25, 2019, for U.S. Appl. No. 14/994,355, dated Jan. 2, 2019, 23 pages.
Response to Non-Final Office Action filed on Aug. 20, 2018, for U.S. Appl. No. 14/994,355, dated May 18, 2018, 23 pages.
Response to Restriction Requirement filed on Jan. 29, 2018 for U.S. Appl. No. 14/994,355, dated Dec. 14, 2017, 11 pages.
Restriction Requirement received for U.S. Appl. No. 14/994,355, dated Dec. 14, 2017, 7 pages.
Preliminary Amendment filed on Nov. 18, 2019, for U.S. Appl. No. 16/664,578, 7 pages.
Response to Office Action filed on Nov. 30, 2009 for Australian Patent Application No. 2005330296, dated Apr. 2, 2009, 12 pages.
Search Report received for Australian Patent Application No. 2005330296 dated Apr. 2, 2009, 2 pages.
Office Action received for Chinese Patent Application No. 200580049965.9, dated Dec. 1, 2015, 14 pages (7 pages of Official Copy and 7 pages of English Translation).
Office Action received for Chinese Patent Application No. 200580049965.9, dated Dec. 5, 2011, 19 pages (10 pages of Official Copy and 9 pages of English Translation).
Office Action received for Chinese Patent Application No. 200580049965.9, dated Jul. 14, 2015, 7 pages (3 pages of English Translation and 4 pages of Official Copy).
Office Action received for Chinese Patent Application No. 200580049965.9, dated Mar. 11, 2015, 12 pages.
Office Action received for Chinese Patent Application No. 200580049965.9, dated Apr. 25, 2013, 18 pages.
Office Action received for Chinese Patent Application No. 200580049965.9 dated Aug. 2, 2013, 18 pages.
Office Action received for Chinese Patent Application No. 200580049965.9, dated Aug. 7, 2012, 9 pages.
Reexamination Decision for Chinese Patent Application No. 200580049965.9, dated Jan. 15, 2013, 2 pages.
Reexamination Decision for Chinese Patent Application No. 200580049965.9 dated Dec. 3, 2014, 2 pages.
Response to Office Action filed on Feb. 16, 2016 for Chinese Patent Application No. 200580049965.9 dated Dec. 1, 2015, 5 pages (2 pages of English Translation and 3 pages of Official Copy).
Response to Office action filed on Jul. 5, 2013 for Chinese Patent Application No. 200580049965.9, dated Apr. 25, 2013, 6 pages (3 pages of Official Copy and 3 pages of English Translation).
Response to Office Action filed on May 25, 2015 for Chinese Patent Application No. 200580049965.9 dated Mar. 11, 2015, 6 pages (3 pages of Original Copy and 3 pages of English Claims).
Response to Office Action filed on Sep. 24, 2015 for Chinese Patent Application No. 200580049965.9, dated Jul. 14, 2015, 5 pages (2 pages of Official Copy and 3 pages of English Translation).
Response to Office Action for Chinese Patent Application No. 200580049965.9, filed on Jan. 28, 2010, 31 pages (21 pages of Official Copy and 10 pages of English Claims).
Response to Office Action filed on Oct. 10, 2012 for Chinese Patent Application No. 200580049965.9, dated Aug. 7, 2012, 47 pages (40 pages of Official Copy and 7 pages of English Claims).
Response to Reexamination filed on Nov. 15, 2013 for Chinese Patent Application No. 201280049377.5, 13 pages.
Office Action received for Chinese Patent Application No. 200580049965.9, dated Sep. 18, 2009, 8 pages (4 pages of Official Copy and 4 pages of English Copy).
First Examiner Report received for Australian Patent Application No. 2010201697, dated Mar. 21, 2011, 2 pages.
Notice of Acceptance received for Australian Patent Application No. 2010201697 dated May 7, 2012, 4 pages.
Office Action received for Australian Application No. 2010201697, dated Feb. 27, 2012, 2 pages.
Response to Examiner Report filed on Jan. 31, 2012 for Australian Patent Application No. 2010201697, dated Mar. 21, 2011, 24 pages.
Response to Office Action filed on Apr. 19, 2012 for Australian Patent Application No. 2010201697, dated Feb. 27, 2012, 13 pages.
Examination Report received for Australian Patent Application No. 2012216254, dated Feb. 17, 2015, 2 pages.
First Examiner's Report received for Australian Patent No. 2012216254, dated Jan. 25, 2013, 3 pages.
Office Action received for Australian Patent No. 2012216254, dated Oct. 3, 2014, 3 pages.
Response to Examination Report filed on Aug. 26, 2015 for Australian Patent Application No. 2012216254, dated Feb. 17, 2015, 6 pages.
Response to Examination Report filed on Aug. 28, 2013 for Australian Patent No. 2012216254, 3 pages.
Response to Examination Report filed on Sep. 24, 2014 for Australian Patent No. 2012216254, 17 pages.
Response to Subsequent Examiner Report filed on Jan. 6, 2014, for Australian Patent Application No. 2012216254, dated Sep. 27, 2013, 13 pages.
Response to Office Action filed on Oct. 13, 2014 for Australian Patent No. 2012216254, dated Oct. 3, 2014, 13 pages.
Subsequent Examiner Report received for Australian Patent No. 2012216254, dated Feb. 3, 2014, 3 pages.
Subsequent Examiner Report received for Australian Patent No. 2012216254, dated Sep. 27, 2013, 3 pages.
Subsequent Examiner's Report received for Australian Patent No. 2012216254, dated May 21, 2014, 3 pages.
First Examiner's Report received for Australian Patent No. 2013203508, dated Dec. 8, 2014, 4 pages.
Response to Office Action for Australian Patent Application No. 2013203508, filed on May 26, 2015, 14 pages.
First Examination Report received for Australian Patent Application No. 2015230730 dated Jun. 1, 2016, 4 pages.
Response to First Examination Report filed on Oct. 11, 2016 for Australian Patent Application No. 2015230730 dated Jun. 1, 2016, 6 pages.
Response to Second Examiner Report filed on Mar. 27, 2017 for Australian Patent Application No. 2015230730, dated Nov. 9, 2016, 17 pages.
Second Examination Report received for Australian Patent Application No. 2015230730 dated Nov. 9, 2016, 3 pages.
Office Action received for Chinese Patent Application No. 201610321885.4, dated Dec. 12, 2018, 12 pages (6 pages of English Translation and 6 pages of Official Copy).
Response to Office Action filed on Apr. 28, 2019 for Chinese Patent Application No. 201610321885.4, dated Dec. 12, 2018, 9 pages (6 pages of Official copy and 3 pages of English Pending claims).
Decision on Appeal for U.S. Appl. No. 11/548,098 mailed on Jul. 20, 2012, 6 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 11195425.1, dated Feb. 10, 2014, 5 pages.
Extended European Search Report received for European Patent Application No. 11195425.1, dated Mar. 27, 2012, 6 pages.
Response to Extended European Search report received filed on Oct. 24, 2012, for European Patent Application No. 11195425.1, dated Mar. 27, 2012, 18 pages.
Summons to Attend Oral Proceedings received for European Patent Application No. 11195425.1, mailed on Feb. 8, 2017, 6 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 11195437.6, dated Jun. 24, 2014, 5 pages.
Extended Search Report received for European Patent Application No. 11195437.6, dated Jun. 12, 2012, 5 pages.
Response to Communication pursuant to Articles 94(3) EPC filed on Jan. 5, 2015, for European Patent Application No. 11195437.6, 10 pages.
Response to Extended European Search report received filed on Jan. 21, 2013, for European Patent Application No. 11195437.6, dated Jun. 12, 2012, 10 pages.
Summons to Attend Oral Proceedings received for European Patent Application No. 11195437.6, mailed on Feb. 10, 2017, 6 pages.
Non Final office Action received for U.S. Appl. No. 12/605,212 dated Feb. 16, 2012, 11 pages.
Notice of Allowance received for U.S. Appl. No. 12/605,212, dated Jun. 28, 2012, 12 pages.
Response to Non-Final Office Action filed on May 16, 2012, for U.S. Appl. No. 12/605,212, dated Feb. 16, 2012, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 12/897,587, dated May 30, 2013, 10 pages.
Notice of Allowance received for U.S. Appl. No. 12/897,587, dated May 16, 2014, 14 pages.
Preliminary Amendment for U.S. Appl. No. 12/897,587, filed Oct. 4, 2010, 3 pages.
Response to Non-Final Office Action filed on Aug. 30, 2013 for U.S. Appl. No. 12/897,587, dated May 30, 2013, 11 pages.
Applicant-Initiated Interview Summary received for U.S. Appl. No. 13/663,178, dated Nov. 28, 2014, 3 pages.
Non-Final Office Action received for U.S. Appl. No. 13/663,178, dated Oct. 3, 2014, 18 pages.
Notice of Allowance received for U.S. Appl. No. 13/663,178 dated Mar. 3, 2015, 10 pages.
Response to Non-Final Office Action filed on Jan. 5, 2015, for U.S. Appl. No. 13/663,178 dated Oct. 3, 2014, 14 pages.
Final Office Action received for U.S. Appl. No. 14/318,525, dated May 28, 2015, 20 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/318,525, dated Feb. 12, 2015, 19 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/318,525, dated Sep. 24, 2015, 31 Pages.
Preliminary Amendment for U.S. Appl. No. 14/318,525, filed Jun. 30, 2014, 8 pages.
Response to Non-Final Office Action filed on Aug. 28, 2015, for U.S. Appl. No. 14/318,525, dated May 28, 2015, 16 Pages.
Response to Non-Final Office Action filed on May 12, 2015, for U.S. Appl. No. 14/318,525, dated Feb. 12, 2015, 14 pages.
Supplemental Preliminary Amendment filed on Aug. 14, 2014, for U.S. Appl. No. 14/318,525, 3 Pages.
Advisory Action Received for U.S. Appl. No. 14/500,884 dated Nov. 21, 2019, 5 pages.
Applicant Interview summary received for U.S. Appl. No. 14/500,884, dated Oct. 17, 2019, 3 pages.
Applicant Initiated Interview Summary Received for U.S. Appl. No. 14/500,884 dated Jul. 12, 2016, 4 pages.
Applicant Initiated Interview Summary Received for U.S. Appl. No. 14/500,884, dated May 23, 2019, 3 pages.
Applicant Interview Summary received for U.S. Appl. No. 14/500,884, dated Mar. 9, 2017, 3 pages.
Final Office Action received for U.S. Appl. No. 14/500,884, dated Mar. 28, 2018, 23 pages.
Final Office Action received for U.S. Appl. No. 14/500,884, dated Nov. 28, 2016, 20 pages.
Final Office Action received for U.S. Appl. No. 14/500,884, dated Aug. 22, 2019, 23 Pages.
Non-Final Office Action received for U.S. Appl. No. 14/500,884, dated Apr. 20, 2016, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 14/500,884, dated Feb. 19, 2019, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 14/500,884, dated Jun. 30, 2017, 22 pages.
Preliminary Amendment received for U.S. Appl. No. 14/500,884, filed Oct. 20, 2014, 6 pages.
Response to Final Office action filed on Feb. 28, 2017 for U.S. Appl. No. 14/500,884, dated Nov. 28, 2016, 11 pages.
Response to Final Office Action filed on Jun. 28, 2018, for U.S. Appl. No. 14/500,884, dated Mar. 28, 2018, 10 pages.
Response to Final Office Action Filed on Oct. 22, 2019, for U.S. Appl. No. 14/500,884, dated Aug. 22, 2019, 11 pages.
Response to Non-Final Office Action filed on Aug. 22, 2016 for U.S. Appl. No. 14/500,884, dated Apr. 20, 2016, 14 pages.
Response to Non-Final Office Action filed on May 17, 2019 for U.S. Appl. No. 14/500,884, dated Feb. 19, 2019, 12 pages.
Response to Non-Final Office Action filed on Nov. 29, 2017 for U.S. Appl. No. 14/500,884, dated Jun. 30, 2017, 18 pages.
Supplemental Amendment for U.S. Appl. No. 14/500,884, filed Aug. 17, 2018, 6 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 14/994,355, dated Jan. 18, 2018, 3 pages.
Final Office Action received for U.S. Appl. No. 14/994,355, dated Jan. 2, 2019, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 14/994,355, dated May 18, 2018, 11 pages.
Response to Final office action filed on Mar. 18, 2014 for U.S. Appl. No. 11/241,883, dated Dec. 16, 2013, 11 pages.
Response to Non-Final office action filed on Jan. 26, 2015 for U.S. Appl. No. 11/241,883, dated Sep. 24, 2014, 21 pages.
Response to Non-Final office action filed on Sep. 22, 2009 for U.S. Appl. No. 11/241,883, dated Jun. 22, 2009, 11 pages.
Advisory Action received for U.S. Appl. No. 11/426,993 dated Feb. 12, 2014, 3 pages.
Final Office Action received for U.S. Appl. No. 11/426,993, dated Sep. 16, 2011, 16 pages.
Final Office Action received for U.S. Appl. No. 11/426,993, dated Jul. 22, 2010, 17 pages.
Final Office Action received for U.S. Appl. No. 11/426,993, dated Nov. 7, 2012, 14 pages.
Final Office Action received for U.S. Appl. No. 11/426,993, dated Sep. 4, 2013, 14 pages.
Final Office Action received for U.S. Appl. No. 11/426,993, dated Jun. 30, 2015, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 11/426,993, dated Jun. 21, 2012, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 11/426,993, dated Dec. 15, 2014, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 11/426,993, dated Feb. 24, 2010, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 11/426,993, dated Mar. 29, 2013, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 11/426,993, dated May 23, 2014, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 11/426,993, dated Dec. 23, 2010, 18 pages.
Response to Final Office Action filed on Dec. 15, 2011, for U.S. Appl. No. 11/426,993, dated Sep. 16, 2011, 9 pages.
Response to Final Office Action filed on Feb. 6, 2013, for U.S. Appl. No. 11/426,993, dated Nov. 7, 2012, 9 pages.
Response to Final Office Action filed on Mar. 4, 2014, for U.S. Appl. No. 11/426,993, dated Sep. 4, 2013, 11 pages.
Response to Final Office Action filed on Nov. 4, 2013, for U.S. Appl. No. 11/426,993, dated Sep. 4, 2013, 10 pages.
Response to Final Office Action filed on Oct. 15, 2010, for U.S. Appl. No. 11/426,993, dated Jul. 22, 2010, 43 pages.
Response to Non-Final Office Action filed on Apr. 15, 2015, for U.S. Appl. No. 11/426,993, dated Dec. 15, 2014, 21 pages.
Response to Non-Final Office Action filed on Jun. 23, 2011, for U.S. Appl. No. 11/426,993, dated Dec. 23, 2010, 8 pages.
Response to Non-Final Office Action filed on Apr. 28, 2010, for U.S. Appl. No. 11/426,993, dated Feb. 24, 2010, 10 pages.
Response to Non-Final Office Action filed on Jun. 27, 2013, for U.S. Appl. No. 11/426,993, dated Mar. 29, 2013, 8 pages.
Response to Non-Final Office Action filed on Sep. 18, 2012, for U.S. Appl. No. 11/426,993, dated Jun. 21, 2012, 10 pages.
Response to Non-Final Office Action filed on Sep. 23, 2014, for U.S. Appl. No. 11/426,993, dated May 23, 2014, 13 pages.
Response to Restriction Requirement filed on Jan. 7, 2010, for U.S. Appl. No. 11/426,993, dated Dec. 7, 2009, 5 pages.
Restriction Requirement received for U.S. Appl. No. 11/426,993, dated Dec. 7, 2009, 8 pages.
Final Office Action received for U.S. Appl. No. 11/523,991, dated Sep. 2, 2010, 26 pages.
Final Office Action received for U.S. Appl. No. 11/523,991, dated Apr. 27, 2009, 30 pages.
Final Office Action received for U.S. Appl. No. 11/523,991, dated Mar. 28, 2012, 16 pages.
Final Office Action received for U.S. Appl. No. 11/523,991, dated Oct. 10, 2013, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 11/523,991, dated Oct. 17, 2011, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 11/523,991, dated Mar. 21, 2011, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 11/523,991, dated Apr. 11, 2013, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 11/523,991, dated Mar. 18, 2010, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 11/523,991, dated Nov. 10, 2008, 30 pages.
Non-Final Office Action received for U.S. Appl. No. 11/523,991, dated Mar. 14, 2014, 12 pages.
Notice of Non-compliant Amendment received for U.S. Appl. No. 11/523,991, dated Dec. 8, 2009, 3 pages.
Response to Final Office Action filed on Jan. 8, 2014, for U.S. Appl. No. 11/523,991, dated Oct. 10, 2013, 13 pages.
Response to Final Office Action filed on May 15, 2012, for U.S. Appl. No. 11/523,991, dated Mar. 28, 2012, 15 pages.
Response to Final Office Action filed on Oct. 29, 2010, for U.S. Appl. No. 11/523,991, dated Sep. 2, 2010, 21 pages.
Response to Final Office Action filed on Sep. 10, 2009, for U.S. Appl. No. 11/523,991, dated Apr. 27, 2009, 19 pages.
Response to Non-Final Office Action filed on Feb. 10, 2009, for U.S. Appl. No. 11/523,991, dated Nov. 10, 2008, 17 pages.
Response to Non-Final Office Action filed on Jun. 3, 2011, for U.S. Appl. No. 11/523,991, dated Mar. 21, 2011, 9 pages.
Response to Non-Final Office Action filed on Dec. 15, 2011, for U.S. Appl. No. 11/523,991, dated Oct. 17, 2011, 14 pages.
Response to Non-Final Office Action filed on Jul. 11, 2013, for U.S. Appl. No. 11/523,991, dated Apr. 11, 2013, 16 pages.
Response to Non-Final Office Action filed on Jun. 16, 2014, for U.S. Appl. No. 11/523,991, dated Mar. 14, 2014, 13 pages.
Response to Non-Final Office Action filed on Jun. 18, 2010, for U.S. Appl. No. 11/523,991, dated Mar. 18, 2010, 15 pages.
Response to Notice of Non Compliant filed on Jan. 8, 2010, for U.S. Appl. No. 11/523,991 dated Dec. 8, 2009, 8 pages.
Deadline for filing Written Submissions filed on Dec. 12, 2018, for European Patent Application No. 05802369.8, mailed on Sep. 27, 2018, 19 pages.
European Search Report received for Patent Application No. 05802369.8, dated Oct. 28, 2009, 3 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,766, dated Apr. 2, 2009, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,766, dated Jan. 14, 2008, 8 pages.
Notice of Allowability received for U.S. Appl. No. 11/241,766 dated Jul. 28, 2010, 10 pages.
Response to Non-Final Office Action filed on Apr. 18, 2008 for U.S. Appl. No. 11/241,766, dated Jan. 14, 2008, 14 pages.
Advisory Action received for U.S. Appl. No. 11/241,801 dated Aug. 2, 2012, 5 pages.
Notice of Allowance Received for U.S. Appl. No. 11/241,801, dated Oct. 1, 2015, 8 pages.
Response to Amendment under Rule 312 for U.S. Appl. No. 11/241,801 mailed on Dec. 31, 2015, 2 pages.
Final Office Action received for U.S. Appl. No. 11/241,824, dated Dec. 26, 2008, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,824, dated Jul. 10, 2008, 10 pages.
Advisory Action received for U.S. Appl. No. 11/241,883, dated Mar. 12, 2014, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,883, dated Jun. 22, 2009, 14 pages.
312 Amendment filed for U.S. Appl. No. 11/241,824, on Jun. 30, 2015, 9 pages.
Applicant Initiated Interview Summary Received for U.S. Appl. No. 14/809,496, dated Jul. 8, 2019, 3 pages.
Final Office Action received for U.S. Appl. No. 14/809,496, dated May 8, 2019, 12 pages.
First Action Interview—Pre-Interview Communication received for U.S. Appl. No. 14/809,496, dated Jun. 14, 2018, 9 pages.
First Action Interview—Pre-Interview Communication received for U.S. Appl. No. 14/809,496 dated Nov. 8, 2018, 12 pages.
Notice of Allowance received for U.S. Appl. No. 14/809,496, dated Sep. 30, 2019, 8 pages.
Preliminary Amendment for U.S. Appl. No. 14/809,496, filed Aug. 14, 2015, 15 pages.
Response to Final Office Action filed on Aug. 5, 2019 for U.S. Appl. No. 14/809,496, dated May 8, 2019, 17 pages.
Response to First Action Interview—Pre-Interview Communication filed on Dec. 13, 2018, for U.S. Appl. No. 14/809,496, dated Nov. 8, 2018, 13 pages.
Response to First Action Interview—Pre-Interview Communication filed on Jul. 3, 2018, for U.S. Appl. No. 14/809,496, dated Jun. 14, 2018, 15 pages.
Supplemental Amendment filed on Jul. 10, 2018 for U.S. Appl. No. 14/809,496, mailed on Jun. 14, 2018, 13 pages.
Notice of Acceptance received for Australian Patent Application No. 2005330296 dated Dec. 23, 2009, 3 pages.
Notice of Acceptance received for Australian Patent Application No. 2013203508, dated Jun. 18, 2015, 2 pages.
Notice of Acceptance received for Australian Patent Application No. 2015230730, dated Apr. 28, 2017, 3 pages.
Response to Non-Final Office Action filed on Feb. 6, 2012 for U.S. Appl. No. 11/241,801, dated Nov. 4, 2011, 13 pages.
Response to Non-Final Office Action filed on Nov. 10, 2014 for U.S. Appl. No. 11/241,801, dated Sep. 9, 2014, 26 pages.
Response to Non-Final Office Action filed on Oct. 21, 2010, for U.S. Appl. No. 11/241,801, dated Jul. 22, 2010, 13 pages.
Response to Non-Final Office Action filed on Sep. 11, 2009 for U.S. Appl. No. 11/241,801, dated Jun. 12, 2009, 10 pages.
Response to Notice of Non-Compliant Amendment filed on Sep. 4, 2012, for U.S. Appl. No. 11/241,801, mailed on Aug. 2, 2012, 15 pages.
Response to Restriction Requirement filed on Apr. 13, 2009, for U.S. Appl. No. 11/241,801, dated Mar. 13, 2009, 7 pages.
Restriction Requirement received for U.S. Appl. No. 11/241,801, dated Mar. 13, 2009, 6 pages.
Rule 312 Amendment under Notice of Allowance for U.S. Appl. No. 11/241,801, filed Dec. 15, 2015, 9 pages.
Corrected Notice of Allowance received for U.S. Appl. No. 11/241,824, dated May 7, 2015, 6 pages.
Advisory Action received for U.S. Appl. No. 11/241,824 dated Feb. 26, 2013, 4 pages.
Advisory Action received for U.S. Appl. No. 11/241,824 dated Mar. 17, 2009, 3 pages.
Decision on Pre-Appeal Brief Request for U.S. Appl. No. 11/241,824 mailed on May 31, 2013, 2 pages.
Examiner Initiated Interview Summary received for U.S. Appl. No. 11/241,824 dated Nov. 6, 2014, 3 pages.
Final Office Action received for U.S. Appl. No. 11/241,824, dated Apr. 15, 2008, 19 pages.
Final Office Action received for U.S. Appl. No. 11/241,824, dated Dec. 29, 2009, 12 pages.
Final Office Action received for U.S. Appl. No. 11/241,824, dated Oct. 26, 2012, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,824, dated Jun. 12, 2009, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,824, dated Jun. 20, 2014, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,824, dated Oct. 31, 2007, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,824, dated Mar. 29, 2012, 10 pages.
Notice of Allowance received for U.S. Appl. No. 11/241,824, dated Mar. 30, 2015, 5 pages.
Pre-Appeal Brief Request for U.S. Appl. No. 11/241,824, filed Mar. 26, 2013, 5 pages.
Response to Final office action filed on Feb. 26, 2009 for U.S. Appl. No. 11/241,824, dated Dec. 26, 2008, 13 pages.
Response to Final Office Action filed on Jan. 3, 2013 for U.S. Appl. No. 11/241,824, dated Oct. 26, 2012, 16 pages.
Response to Final Office Action filed on Jul. 31, 2013 for U.S. Appl. No. 11/241,824, dated Oct. 26, 2012, 21 pages.
Response to Final Office Action filed on Jun. 16, 2008 for U.S. Appl. No. 11/241,824, dated Apr. 15, 2008, 15 pages.
Response to Final office action filed on Mar. 1, 2010 for U.S. Appl. No. 11/241,824, dated Dec. 29, 2009, 13 pages.
Response to Non-Final Office Action filed on Jul. 30, 2012 for U.S. Appl. No. 11/241,824, dated Mar. 29, 2012, 14 pages.
Response to Non-Final Office Action filed on Mar. 3, 2008 for U.S. Appl. No. 11/241,824, dated Oct. 31, 2007, 23 pages.
Response to Non-Final Office Action filed on Nov. 20, 2014 for U.S. Appl. No. 11/241,824, dated Jun. 20, 2014, 18 pages.
Response to Non-Final Office Action filed on Oct. 10, 2008 for U.S. Appl. No. 11/241,824, dated Jul. 10, 2008, 13 pages.
Response to Non-Final Office Action filed on Sep. 11, 2009 for U.S. Appl. No. 11/241,824, dated Jun. 12, 2009, 16 pages.
Advisory Action received for U.S. Appl. No. 11/241,883, dated Mar. 17, 2010, 3 pages.
Amendment for U.S. Appl. No. 11/241,883, filed Aug. 15, 2012, 10 pages.
Appeal Brief for U.S. Appl. No. 11/241,883, filed Mar. 2, 2016, 33 pages.
Appeal Brief for U.S. Appl. No. 11/241,883, filed May 24, 2010, 24 pages.
Appeal Decision received for U.S. Appl. No. 11/241,883, mailed on Dec. 10, 2018, 14 pages.
Decision on Appeal received for U.S. Appl. No. 11/241,883, mailed on Jun. 15, 2012, 5 pages.
Decision on Pre-Appeal Brief Request received for U.S. Appl. No. 11/241,883, mailed on Apr. 6, 2010, 2 pages.
Examiner's Answer received for U.S. Appl. No. 11/241,883, dated Sep. 21, 2016, 9 pages.
Examiner's Answer to Appeal Brief for U.S. Appl. No. 11/241,883, dated Aug. 4, 2010, 10 pages.
Final Office Action received for U.S. Appl. No. 11/241,883, dated Dec. 16, 2013, 12 pages.
Final Office Action received for U.S. Appl. No. 11/241,883, dated Dec. 22, 2009, 8 pages.
Final Office Action received for U.S. Appl. No. 11/241,883, dated May 15, 2015, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,883, dated Sep. 24, 2014, 13 pages.
Pre-Appeal Brief Request received for U.S. Appl. No. 11/241,883, mailed on Mar. 22, 2010, 5 pages.
Reply Brief for U.S. Appl. No. 11/241,883, filed Sep. 30, 2010, 6 pages.
Reply Brief for U.S. Appl. No. 11/241,883, filed Nov. 21, 2016, 6 pages.
Response to Final office action filed on Feb. 18, 2014 for U.S. Appl. No. 11/241,883, dated Dec. 16, 2013, 11 pages.
Response to Final Office Action filed on Feb. 22, 2010 for U.S. Appl. No. 11/241,883, dated Dec. 22, 2009, 11 pages.
Response to Non-Final Office Action filed on Dec. 16, 2005, for U.S. Appl. No. 10/648,125, dated Aug. 25, 2005, 9 Pages.
Response to Non-Final Office Action filed on Jun. 23, 2008, for U.S. Appl. No. 10/648,125, dated Feb. 21, 2008, 19 Pages.
Response to Non-Final Office Action filed on Jan. 3, 2005, for U.S. Appl. No. 10/648,125, dated Sep. 1, 2004, 18 pages.
Response to Non-Final Office Action filed on May 10, 2007, for U.S. Appl. No. 10/648,125, dated Dec. 7, 2006, 14 pages.
Response to Non-Final Office Action filed on Sep. 14, 2009, for U.S. Appl. No. 10/648,125, dated May 13, 2009, 14 pages.
Notice of Allowance received for Korean Patent Application No. 10-2007-7025209, dated Apr. 22, 2010, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2007-7025209, dated Dec. 8, 2008, 8 pages (with English Translation of Claims).
Office Action received for Korean Patent Application No. 10-2007-7025209, dated Jun. 2, 2009, 6 pages.
Response to Office Action filed on Feb. 2, 2010 for Korean Patent Application No. 10-2007-7025209 dated Jun. 2, 2009, 22 pages (16 pages of Official Copy and 6 pages of English Translation).
Notice of Allowance received for Korean Patent Application No. 10-2009-7016255, dated Apr. 16, 2010, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2009-7016255, dated Nov. 3, 2009, 6 pages (3 pages of Official Copy and 3 pages of English Translation).
Response to Office Action filed on Jan. 4, 2010 for Korean Patent Application No. 10-2009-7016255 dated Nov. 3, 2009, 20 pages (18 pages of Official Copy and 2 pages of English Translation).
Notice of Allowance received for Korean Patent Application No. 10-2010-7002438, dated Apr. 22, 2010, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Advisory Action received for U.S. Appl. No. 11/241,766, dated Feb. 19, 2010, 3 pages.
Advisory Action received for U.S. Appl. No. 11/241,766 dated Nov. 6, 2008, 3 pages.
Decision on Pre-Appeal Brief for U.S. Appl. No. 11/241,766, mailed on May 4, 2010, 2 pages.
Decision on Pre-Appeal Brief Request for U.S. Appl. No. 11/241,766, mailed on Dec. 5, 2009, 2 pages.
Final Office Action received for U.S. Appl. No. 11/241,766, dated Dec. 4, 2009, 10 pages.
Final Office Action received for U.S. Appl. No. 11/241,766, dated Aug. 12, 2008, 10 pages.
Notice of Allowance received for U.S. Appl. No. 11/241,766, dated Aug. 31, 2010, 12 pages.
Pre-Appeal Brief Request filed on Mar. 4, 2010 for U.S. Appl. No. 11/241,766, 1 page.
Pre-Appeal Brief Request for U.S. Appl. No. 11/241,766, filed Nov. 12, 2008, 5 pages.
Response to Final Office Action filed on Feb. 4, 2010 for U.S. Appl. No. 11/241,766 dated Dec. 4, 2009, 16 pages.
Response to Final Office Action filed on Jan. 26, 2009 for U.S. Appl. No. 11/241,766, dated Aug. 12, 2008, 12 pages.
Response to Final Office Action filed on Oct. 16, 2008 for U.S. Appl. No. 11/241,766, dated Aug. 12, 2008, 14 pages.
Response to Non-Final Office Action filed on Aug. 3, 2009 for U.S. Appl. No. 11/241,766, dated Apr. 2, 2009, 13 pages.
PTO Response to Rule 312 Amendment for U.S. Appl. No. 11/241,766 mailed on Jan. 5, 2011, 2 pages.
312 Amendment filed for U.S. Appl. No. 11/241,766 on Nov. 30, 2010, 6 pages.
Applicant Initiated Interview Summary received for U.S. Appl. No. 11/241,801 dated Oct. 28, 2014, 3 pages.
Final Office Action received for U.S. Appl. No. 11/241,801, dated Apr. 20, 2012, 12 pages.
Final Office Action received for U.S. Appl. No. 11/241,801, dated Dec. 29, 2009, 10 pages.
Final Office Action received for U.S. Appl. No. 11/241,801, dated Jan. 6, 2011, 10 pages.
Final Office Action received for U.S. Appl. No. 11/241,801, dated Aug. 2, 2013, 14 pages.
Final Office Action received for U.S. Appl. No. 11/241,801, dated Dec. 13, 2012, 11 pages.
Final Office Action received for U.S. Appl. No. 11/241,801, dated Jul. 20, 2015, 8 pages.
Final Office Action received for U.S. Appl. No. 11/241,801, dated Mar. 12, 2015, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,801, dated Jul. 22, 2010, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,801, dated Nov. 4, 2011, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,801, dated Dec. 19, 2013, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,801, dated Jun. 12, 2009, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 11/241,801, dated Sep. 9, 2014, 12 pages.
Response to Final Office Action filed on Apr. 29, 2010, for U.S. Appl. No. 11/241,801, dated Dec. 29, 2009, 15 pages.
Response to Final Office Action filed on Jun. 12, 2015 for U.S. Appl. No. 11/241,801 dated Mar. 12, 2015, 10 pages.
Response to Final Office Action filed on Jun. 20, 2012 for U.S. Appl. No. 11/241,801, dated Apr. 20, 2012, 15 pages.
Response to Final Office Action filed on Mar. 7, 2011 for U.S. Appl. No. 11/241,801 dated Jan. 6, 2011, 14 pages.
Response to Final Office Action filed on Oct. 2, 2013 for U.S. Appl. No. 11/241,801, dated Aug. 2, 2013, 14 pages.
Response to Final Office Action filed on Sep. 21, 2015 for U.S. Appl. No. 11/241,801, dated Mar. 12, 2015, 21 pages.
Response to Non-Final Office Action filed on Apr. 15, 2013 for U.S. Appl. No. 11/241,801, dated Dec. 13, 2012, 15 pages.
Response to Non-Final Office Action filed on Apr. 21, 2014 for U.S. Appl. No. 11/241,801, dated Dec. 19, 2013, 15 pages.
First Examination Report received for Indian Patent Application No. 7927/DELNP/2007, dated Nov. 8, 2013, 2 pages.
Response to Office Action filed on Sep. 17, 2014 for Indian Patent Application No. 7927/DELNP/2007, 13 pages.
Cai et al., “Study on Classification of Product Data in Electronic Business Market”, Journal of Hangzhou Electronic Industry College, vol. 23, No. 4, with English abstract, Aug. 2003, 4 pages.
Cohen et al., “Automatic strategies in the Siemens RTL tiled window manager”, Proceedings of the 2nd IEEE Conference on Computer Workstations, Santa Clara, Mar. 7, 1988, pp. 111-119.
Craigslist, “Wayback Machine”, Available online on URL: <https://web.archive.org/web/20041230183317/http://sydney.craigslist.org/>, Sep. 25, 2001, 1 page.
English et al., “Examining the Usability of Web Site Search”, Retrieved from the Internet: <URL: http://citeseer.ist.psu.edu/viewdoc/download?doi=10.1.1.137.9092&rep=rep1&type=pdf>, 2002, 10 pages.
English et al., “Hierarchical Faceted Metadata in Site Search Interfaces”, Available Online on URL: <http://flamenco.berkeley.edu/papers/chi02_short_paper.pdf>, Sep. 2, 2012, 2 pages.
Hearst et al., “Finding the Flow in Web Site Search”, Communications of the ACM 45(9), Sep. 2002, pp. 42-49.
Hearst et al., “Finding the Flow in Web Search”, Available Online on URL; <http://flamenco.berkeley.edu/papers/cacm02.pdf>, Sep. 9, 2002, pp. 42-49.
Irvine, “New Text and Social Messaging Features Expand Frucall's Voice-Based Mobile Comparison Shopping Service”, Retrieved from the Internet: URL:<http://globenewswire.com/news-release/2006/08/31/347746/104605/en/New-Text-and-Social-Messaging-Features-Expand-Frucall-s-Voice-Based-Mobile-Comparison-Shopping-Service.html?lang=en-us>, Aug. 31, 2006, pp. 1-2.
Kandogan et al., “Elastic Windows: Improved Spatial Layout and Rapid Multiple Window Operations”, Published in Proceedings of the Workshop on Advanced Visual Interfaces AVI, Jan. 1, 1996, pp. 29-38.
Kotas et al., “Use of Electronic Catalog to Facilitate User-to-User Sales”, U.S. Appl. No. 60/336,409, filed Oct. 31, 2001, 45 pages.
Loney, “Faceted Preference Matching in Recommender Systems”, Proceedings of the Second International Conference on Electronic Commerce and Web Technologies, 2001, pp. 295-304.
mycomicshop.com, “Mad Comic Books”, Retrieved from the Internet: URL:<https://www.mycomicshop.com/search?tid=353271&pgi=i151>, 1955, 26 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2003/041543, dated Apr. 27, 2005, 6 pages.
International Search Report received for PCT Application No. PCT/US2003/041543, dated Apr. 14, 2004, 7 pages.
Written Opinion received for PCT Application No. PCT/US2003/041543, dated Aug. 26, 2004, 9 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2004/012678, dated Nov. 10, 2015, 4 pages.
International Search Report received for Application No. PCT/US2004/012678, dated Oct. 27, 2004, 5 pages.
International Written Opinion received for Application No. PCT/US2004/012678, dated Oct. 27, 2004, 5 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2005/035308, dated Oct. 3, 2007, 5 pages.
International Search Report received for PCT Application No. PCT/US2005/035308, dated Apr. 18, 2006, 5 pages.
Written Opinion received for PCT Application No. PCT/US2005/035308, dated Apr. 18, 2006, 3 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2005/035543, dated Oct. 11, 2007, 5 pages.
International Search Report received for PCT Application No. PCT/US2005/035543, dated Mar. 16, 2006, 7 pages.
Written Opinion received for PCT Application No. PCT/US2005/035543, dated Mar. 16, 2006, 3 pages.
International Preliminary Report on Patentability received for PCT Application No. PCT/US2006/044944, dated Jun. 5, 2008, 5 pages.
Written Opinion received for PCT Application No. PCT/US2006/044944, dated May 10, 2007, 3 pages.
International Search Report received for PCT Application No. PCT/US2006/044944, dated May 10, 2007, 2 pages.
Pruitt, “A Large-Area Flat Panel Multi-function Display for Military and Space Applications”, Proceedings of the IEEE 1992 National Aerospace and Electronics Conference (NAECON), May 18-22, 1992, 2 pages.
Office Action received for Indian Patent Application No. 7927/DELNP/2007, dated Jan. 19, 2017, 15 pages.
Office Action received for Indian Patent Application No. 7927/DELNP/2007, dated Mar. 16, 2017, 2 pages.
Response to Office Action filed on Jun. 23, 2014, for Indian Patent Application No. 7927/DELNP/2007, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 14/500,884, dated May 19, 2020, 26 pages.
Applicant Interview Summary received for U.S. Appl. No. 14/500,884, dated Aug. 10, 2020, 3 pages.
Appeal Decision received for European Patent Application No. 05802369.8, mailed on Apr. 15, 2019, 17 pages.
Communication Pursuant to Article 94(3) EPC received for European Patent Application No. 05802369.8, dated Apr. 20, 2011, 2 pages.
Advisory Action received for U.S. Appl. No. 14/500,884, dated Feb. 10, 2021, 4 pages.
Final Office Action received for U.S. Appl. No. 14/500,884, dated Nov. 23, 2020, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 16/664,578, dated Dec. 30, 2020, 11 pages.
McCown et al., “A navigation system for personalized databases ‘StarMap’”, Proceedings of the 33rd Annual Hawaii International Conference on System Sciences, Maui, HI, USA, vol. 1, doi: 10.1109/HICSS.2000.926902, 2000, 9 pages.
Seeling et al., “Analysing associations of textual and relational data with a multiple views system”, Proceedings of the Second International Conference on Coordinated and Multiple Views in Exploratory Visualization, London, UK, 2004, doi: 10.1109/CMV.2004.1319527, pp. 61-70.
Sharnowski et al., “A distributed, multimedia environmental information system”, Proceedings of the International Conference on Multimedia Computing and Systems, Washington, DC, USA, 1995, doi: 10.1109/MMCS.1995.484918, pp. 142-149.
Final Office Action received for U.S. Appl. No. 16/664,578, dated Jul. 9, 2021, 9 pages.
Final Office Action received for U.S. Appl. No. 14/500,884, dated Mar. 9, 2022, 16 pages.
Final Office Action received for U.S. Appl. No. 16/664,578, dated Mar. 31, 2022, 5 pages.
Non-Final Office Action received for U.S. Appl. No. 14/500,884, dated Oct. 25, 2021, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 16/664,578, dated Dec. 9, 2021, 5 pages.
Response to Non-Final Office Action filed on Aug. 20, 2020 for U.S. Appl. No. 14/500,884, dated May 19, 2020, 13 pages.
Notice of Allowance received for U.S. Appl. No. 14/500,884, dated Aug. 5, 2022, 10 pages.
Notice of Allowance received for U.S. Appl. No. 16/664,578, dated Jun. 28, 2022, 7 pages.
Related Publications (1)
- Number: 20200134707 A1; Date: Apr. 2020; Country: US

Provisional Applications (1)
- Number: 60/666,549; Date: Mar. 2005; Country: US

Continuations (2)
- Parent: U.S. Appl. No. 14/809,496, Jul. 2015, US; Child: U.S. Appl. No. 16/728,636, US
- Parent: U.S. Appl. No. 11/241,824, Sep. 2005, US; Child: U.S. Appl. No. 14/809,496, US