Methods and systems for reducing item selection error in an e-commerce environment

Information

  • Patent Grant
  • Patent Number
    11,935,103
  • Date Filed
    Wednesday, December 29, 2021
  • Date Issued
    Tuesday, March 19, 2024
Abstract
Methods and systems for displaying, to a user interface, categories of items for the user to select from for querying the system, in order to enable the system to return to the user images of items of the type desired by the user for possible purchase. The categories of items can be presented to the user interface visually in silhouette form so that the user can select a brand and the silhouette image of the type of item desired. Upon selection of the silhouette image, a query is sent to the system and images of the desired type of item are returned to the user interface for presentation to the user, along with certain attributes of the items.
Description
TECHNICAL FIELD

The present disclosure generally relates to data processing techniques. More specifically, the present disclosure relates to methods and systems for displaying, to a user interface, item listings for the user to select from for querying the system, in order to enable the system to return to the user images of the type of item desired by the user for possible purchase.


BACKGROUND

Advancements in computer and networking technologies have enabled persons to conduct commercial and financial transactions “on-line” via computer-based applications. This has given rise to a new era of electronic commerce (often referred to as e-commerce). A number of well-known retailers have expanded their presence and reach by operating websites that facilitate e-commerce. In addition, many new retailers, which operate exclusively online, have come into existence. The business models utilized by enterprises operating online are almost as varied as the products and services offered. For instance, some products and services are offered at fixed prices, while others are offered via various transaction methods, and still others are offered via a system of classified ad listings. Some enterprises specialize in the selling of a specific category of product (e.g., books) or a specific service (e.g., tax preparation), while others provide a myriad of categories of items and services from which to choose. Some enterprises serve only as an intermediary, connecting sellers and buyers, while others sell directly to consumers.


Despite the many technical advances that have improved the state of e-commerce, a great number of technical challenges and problems remain. One such problem involves determining how best to present products and services (e.g., items) that are being offered for sale, so as to maximize the likelihood that a transaction (e.g., the sale of a product or service) will occur. For instance, when a potential buyer performs a search for a product or service, it may often be the case that the number of item listings that satisfy the potential buyer's query far exceeds the number of item listings that can practically be presented on a search results page. Furthermore, when a buyer selects an item of interest from a user interface by the textual name of that item, a selection error can occur. That is, the buyer might select the incorrect name of the product, such as selecting a clutch handbag when an evening handbag is really desired. Preventing that error and providing the buyer with an image of the precise type of item he or she is looking for enhances the buyer's experience and is more likely to lead to an executed transaction.





DESCRIPTION OF THE DRAWINGS

Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which:



FIG. 1 is a block diagram of a network environment including a network-connected client system and server system, with which an embodiment of the invention might be implemented.



FIG. 2 is a database diagram illustrating an exemplary database for the transaction facility.



FIGS. 3A-3F illustrate an example of a method for displaying images of item listings in a user interface, for one brand of product.



FIGS. 4A-4F illustrate an example of a method for displaying images of item listings in a user interface, for another brand of product.



FIGS. 5A-5D illustrate an example of a method for displaying images of item listings, for yet another brand of product.



FIGS. 6A-6D illustrate an example of a method of displaying images of item listings, for still another brand of product.



FIG. 7 is a block diagram of a machine in the form of a computing device, mobile or otherwise, within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.





DETAILED DESCRIPTION

Methods and systems are described for displaying, at a user interface, item listings for the user to select from for querying the system, in order to enable the system to return to the user interface items of the type desired by the user for possible purchase. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the various aspects of different embodiments of the present invention. It will be evident, however, to one skilled in the art, that the present invention may be practiced without these specific details.


Terminology


For the purposes of the present specification, the term “transaction” shall be taken to include any communications between two or more entities and shall be construed to include, but not be limited to, commercial transactions including sale and purchase transactions, auctions and the like.


Transaction Facility


To better understand the invention, an embodiment of an electronic transaction facility is shown in FIGS. 1 and 2.



FIG. 1 is a block diagram illustrating an exemplary network-based transaction facility in the form of an Internet-based transaction facility 10. While an exemplary embodiment of the present invention is described within the context of a transaction facility, it will be appreciated by those skilled in the art that the invention will find application in many different types of computer-based, and network-based, commerce facilities. It will also be appreciated by those skilled in the art that the invention may be used in transaction facilities of other architectures. The instructions stored in the transaction facility (which can be executed by a processor) can be stored on a machine-readable medium including, but not limited to, read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or electrical, optical, acoustical or other forms of propagated signals.


The transaction facility 10 within which an embodiment can be implemented includes one or more of a number of types of front-end servers, namely page servers 12 that deliver web pages (e.g., markup language documents), picture servers 14 that dynamically deliver images to be displayed within Web pages, listing servers 16, CGI servers 18 that provide an intelligent interface to the back-end of transaction facility 10, and search servers 20 that handle search requests to the transaction facility 10. E-mail servers 21 provide, inter alia, automated e-mail communications to users of the transaction facility 10.


The back-end servers include a database engine server 22, a search index server 24 and a credit card database server 26, each of which maintains and facilitates access to a respective database.


An on-line trading application may form a part of the database engine server 22 discussed below. The on-line trading application may include an on-line trading module and an item listing presentation management module, and is associated with database 23.


The Internet-based transaction facility 10 may be accessed by a client program 30, such as a browser (e.g., the Internet Explorer browser distributed by Microsoft Corp. of Redmond, Washington) that executes on a client machine 32 and accesses the transaction facility 10 via a network such as, for example, the Internet 34. Other examples of networks that a client may utilize to access the transaction facility 10 include a wide area network (WAN), a local area network (LAN), a wireless network (e.g., a cellular network), or the Plain Old Telephone Service (POTS) (or PSTN) network.


Database Structure



FIG. 2 is a database diagram illustrating an exemplary database 23, maintained by and accessed via the database engine server 22, which at least partially implements and supports the transaction facility 10. The database 23 may, in one embodiment, be implemented as a relational database, and includes a number of tables having entries, or records, that are linked by indices and keys. In an alternative embodiment, the database 23 may be implemented as a collection of blocks in a block-oriented database. While FIG. 2 shows one embodiment of a database, it will be appreciated by those skilled in the art that the invention can be used with other database structures.


Central to the database 23 is a user table 40, which contains a record for each user of the transaction facility 10. A user may operate as a seller, buyer, or both, within transaction facility 10. The database 23 also includes item tables 42 that may be linked to the user table 40. Specifically, the item tables 42 include a seller items table 44 and a buyer items table 46. A user record in the user table 40 may be linked to multiple items that are being, or have been, auctioned or otherwise marketed via the transaction facility 10. A link indicates whether the user is a seller or a buyer with respect to items for which records exist within the item tables 42. While offerings by the seller are referred to as “items” in the specification, “items” includes any product or service offered by the seller. The database 23 also includes a note table 48 populated with note records that may be linked to one or more item records within the item tables 42 and/or to one or more user records within the user table 40. Each note record within the note table 48 may include, inter alia, a comment, description, history or other information pertaining to an item being auctioned or otherwise sold via the transaction facility 10 or to a user of the transaction facility 10.


A number of other tables are also shown to be linked to the user table 40, namely a user past aliases table 50, a feedback table 52, a bids table 54, an accounts table 56, and an account balances table 58.


The on-line trading system includes a user interface, which may be the browser 30, whereby a user can enter or select a search term describing an item the user is interested in seeing for possible purchase. Usually a buyer would like to browse a category of images, for example handbags, by style and/or brand so that the buyer can quickly find something of interest. The buyer selects one of the choices for the desired item, and images of one or more items of the selected category are returned for the buyer to see for possible purchase. For instance, a number of product types can be presented via the user interface by name, such as shoes, handbags, clothes, and the like. If the category of item the user desires is a handbag, the handbag selection presented to the buyer in textual form by the user interface might be the words “clutch,” “evening bag,” “messenger style,” and “satchel,” among others. It could occur that the user might confuse, as one example, “clutch” with “evening bag” and enter “clutch” as the category of bag desired when the user is actually looking for an evening bag. This would result in one or more images of a clutch being returned to the user, when the user actually wanted one or more images of an evening bag to be returned. Thus an item image is returned to the user that is not the actual category of item the user desires, and the user's experience is therefore less than optimum. This in itself may result in the transaction not being executed.


If, on the other hand, the selection presented to the user is in visual form, such as a silhouette of the product, then there is much less room for error. The user will see an image of the category of handbag. Using the example above, the user would see images of “clutch,” “evening bag,” “messenger style,” and “satchel.” These could be, for example, presented by brand. The user could then, as only one example of an embodiment, mouseover each image to display the name of each of the various types of handbag offered. Mouseover is a well-known function, and an example, taken from Wikipedia, is seen in the appendix to this specification. The user could then click on the bag of the user's choice to view possible attributes of the bag such as styling, color, and fabric, among others. Since the buyer will see images of the category of bag desired, with very little opportunity for error, the experience is more nearly optimum, and a transaction is more likely to be executed. This can be implemented, in one embodiment, by the images of types of handbags being presented to the user in silhouette so that the user could, for example, mouseover the silhouette image that is most like the category of bag desired, and the title of the bag would be presented. For example, the user interface can present the prospective buyer with silhouettes of a clutch, an evening bag, a messenger style bag, and a satchel, among others. Seeing the silhouette images, the buyer can mouseover them for the name and simply select the silhouette of the category of bag desired, for example an evening bag, and the system displays the types of evening bags offered for sale. One of ordinary skill in the art will see that displaying the title is not necessary for the invention; the user might simply select the silhouette based on recognizing it. This provides a manner in which a buyer can find products or services that they wish to purchase. An example of general ways for a buyer to find products is seen in U.S. patent application Ser. No. 11/618,503, filed on Dec. 29, 2006, and incorporated herein by reference.


In response to the selection, information, including images and attributes of the selected silhouette, can then be returned to the user interface for the user. In one embodiment this is accomplished by the system mapping the selected evening bag image information of this example to a textual value and making a query to the transaction facility 10, which will undertake a search using the query and will then obtain and return the foregoing images and attributes. In another embodiment, attributes of the images can be returned separately for presentation to the user by way of the user interface. The presentation of a silhouette of a category of product to the buyer, since it is visual, drastically reduces the opportunity for error, or error rate, makes the buyer's experience more nearly optimum, and is more likely to result in an executed transaction.
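

As a non-limiting illustration of this flow, the following minimal sketch, in the style of the JavaScript in the appendix, maps a selected silhouette to a textual value, queries the system, and presents the returned images and attributes. All names here (silhouetteToQuery, the /search endpoint, renderResults, the response fields) are illustrative assumptions, not the system's actual interface.

// Sketch of the selection-to-query flow described above; all identifiers
// and the /search endpoint are hypothetical assumptions.
var silhouetteToQuery = {
 'evening-bag': 'Handbags - Evening Bag',
 'clutch': 'Handbags - Clutch'
};

function renderResults(items) {
 // Display each returned image along with its attributes.
 items.forEach(function (item) {
  var img = document.createElement('img');
  img.src = item.imageUrl;
  img.title = item.attributes.join(', '); // e.g., color, fabric, price
  document.body.appendChild(img);
 });
}

function onSilhouetteSelected(silhouetteId) {
 // Map the selected silhouette image to a textual value for the query.
 var queryText = silhouetteToQuery[silhouetteId];
 var xhr = new XMLHttpRequest();
 xhr.open('GET', '/search?q=' + encodeURIComponent(queryText));
 xhr.onload = function () {
  renderResults(JSON.parse(xhr.responseText).items);
 };
 xhr.send();
}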


As another example, if the buyer is interested in women's shoes, various types of women's shoes will be presented in silhouette, such as high heels, pumps, flats, sandals, and the like. The user mouses over the silhouettes, again to display the name, as only one example, and clicks to select the category of shoe of interest from the silhouettes, for example, high heels. High heel shoes of various types and attributes can be returned to the buyer for possible purchase, much the same way handbags were returned in the above example. The various types of heels could be three-inch heels, stilettos, Cuban heels, and the like. Attributes such as fabric, color, size, price, and availability, among others, can be displayed for the user.


In some embodiments, a user operates a web browser application 30 on a client machine 32 to interact with the transaction facility 10. A user may be presented with a search interface on client browser 30, with items in silhouette, as described generally above, and in more detail below, from which the user can select an item to be used in generating a search request submitted to the transaction facility. In some embodiments users themselves may be able to select certain item attributes. For example, the buyer may be interested in women's shoes. Certain types of women's shoes are provided to the user interface in visual representation as silhouettes. The buyer selects a category of shoe in silhouette and, as a result, shoes of that category, with certain attributes, such as color, fabric, size, price, and the like, will be returned for the user to see and possibly purchase. This can be implemented by the transaction facility 10, after receiving and processing the search request, communicating a response to the web browser application 30. The response could be obtained from a system of the type seen in U.S. patent application Ser. No. 12/749,458, filed Mar. 29, 2010, entitled “METHODS AND SYSTEMS FOR IMPROVING THE CATEGORIZATION OF ITEMS FOR WHICH ITEM LISTINGS ARE MADE BY A USER OF AN ECOMMERCE SYSTEM” (now abandoned), incorporated herein by reference. The response could be, for example, an Internet document or web page that, when rendered by the browser application 30, displays a search results page showing one or more item listings, possibly with attributes, that satisfy the user's search request. The item listings are, in some embodiments, presented by a presentation module, which may be a web server or an application server.


In some embodiments, a search engine module, not shown but of a type well known in the industry, could provide the actual search function. For instance, the search engine module, in some embodiments, receives and processes a search request to identify the item listings that satisfy the search request. It will be appreciated by those skilled in the art that a variety of search techniques might be implemented to identify item listings that satisfy a search request. In general, however, the item attributes of item listings are analyzed for the presence of the user-provided search terms. For instance, in some embodiments, a selected silhouette can be converted to textual information and used to query system storage.
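

By way of illustration only, such term-against-attribute matching might look like the following sketch; the listing data shape and function names are assumptions for illustration rather than the system's actual implementation.

// Sketch of matching user-provided search terms against the item
// attributes of item listings; the data shape is an assumption.
var allListings = [
 { title: 'Evening Bag', attributes: ['black', 'satin', '$120'] },
 { title: 'Clutch', attributes: ['red', 'leather', '$95'] }
];

function matchesSearchTerms(listing, searchTerms) {
 // Concatenate the listing's title and attributes into one searchable string.
 var haystack = (listing.title + ' ' + listing.attributes.join(' ')).toLowerCase();
 // The listing satisfies the request only if every term is present.
 return searchTerms.every(function (term) {
  return haystack.indexOf(term.toLowerCase()) !== -1;
 });
}

var results = allListings.filter(function (listing) {
 return matchesSearchTerms(listing, ['evening', 'bag']);
}); // results contains only the 'Evening Bag' listing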


In an alternative embodiment, the search engine module may represent an interface to a search engine implemented as an external component or module, for example, as part of transaction facility 10, or as a separate external module. In such a scenario, the search engine module 48 may simply receive the set of item listings that satisfy a search query.


Turning now to FIG. 3A, there is shown one embodiment in which a screen is presented to a prospective buyer, for example at the user interface 30. The screen in FIG. 3A shows types of shoes in silhouette form. A particular brand and category might be used as a default brand and category, as one example, for beginning the process. In this case the default brand is Cole Haan and the default category of shoe is Heels and Pumps. Those of ordinary skill in the art will recognize that a default category need not be used, and that other ways of beginning the process can be used. In silhouette form across the top of the page are the various other types of shoes offered by Cole Haan—heels and pumps 101, flats 103, sandals 105, loafers 107, boots 109, clogs 111, and platform heels 113. Mousing over each of these triggers a selection indicator, such as, in one embodiment, turning the silhouette blue, which can give the user the name of the type of shoe. The user has the opportunity of selecting the shoe type illustrated by that silhouette. Clicking on the silhouette will then show all shoes of that type offered by Cole Haan. For example, clicking on one of these types, say boots 109, would show the types of boots offered by Cole Haan. This is seen in FIG. 3B. These images are, in one embodiment, retrieved from system storage by mapping the selected silhouette information to textual information to be used as, or as part of, a query for the storage. An example of mapping the image information to textual information for forming a query is seen below. Any of the “See All” shoes in FIG. 3B can be clicked on and purchased. Examples of purchasing, and of seeing attributes such as size, material, price, and the like, are seen below.


As seen in FIGS. 3A and 3B, there are other brands offered. Clicking on another brand operates likewise. For example, clicking on Steve Madden in FIG. 3C shows in silhouette form the types of shoes offered by that brand—heels and pumps 115, sandals 117, boots 119, platforms 121, and flats 123. Mousing over each of these silhouettes can cause, in this embodiment, the silhouette to turn blue in color and to list the type of shoe in words, for ease of viewing and understanding, ready for selection. As mentioned above, while this color change and listing in words is used in this embodiment, it is not necessary for the practice of the invention. Other ways of providing this function are within the ability of one of ordinary skill in the art without departing from the spirit or the scope of the invention. The appropriate silhouette is then selected, for example, boots 119, and the system returns images of the boots offered by the Steve Madden brand, as in FIG. 3D. Clicking then on an image of a desired boot can show attributes such as size, color, and the like, according to the designer's choice and implemented as is known in the art. An improved way of doing this is seen in FIG. 3E, where the screen can also show silhouettes of boot types such as ankle, knee-high, mid-calf, cowboy/western, classics, thigh-high, and snow/winter. Selecting, for example, size, color, heel type, and the desired silhouette would show an image of the boots offered for sale in that size, color, and heel type. One could then click on one of the desired type of boots, such as at 125, and obtain other sales information of interest, including attributes such as size, price, color, material, and the like, as seen in FIG. 3F. In each case, selection of an image could cause the image data to be converted to textual data that can be used for, or as part of, the query to system storage to retrieve the information viewed by the user. A purchase or bid can then be made if desired and the purchase registered, or recorded, by the transaction facility.


This same silhouette process can be used for other types of products, such as handbags, women's clothing, men's clothing, and men's shoes, among others. The silhouette process above operates similarly for each. For example, FIG. 4A illustrates the silhouette process for an embodiment involving women's clothes. Brands are seen listed across the top, such as Ann Taylor, St. John, BCBG Max Azria, Banana Republic, Calvin Klein, Theory, Bebe, and other brands. The screen can show silhouettes of women's clothing types such as blazers, dresses, jeans, pants, shirts/tops, and skirts. One can select, for example, Calvin Klein and select dresses as in FIG. 4B. A type of clothing, for example, dresses 127, can be selected as in FIG. 4C, and all dresses of the type offered by the Calvin Klein brand will be displayed. Silhouettes can also be used for selecting attributes of dresses such as, for example, sleeve type, size, length, and the like. This is seen in FIG. 4D, where the types of dresses—sleeveless, short sleeve, strapless, spaghetti strap, one shoulder, long sleeve, halter, cap sleeve, and ¾ sleeve—are displayed as at 129. Again, if desired, the title of the type of dress can be seen by mousing over the appropriate type of dress and clicking, as one example. This is seen in FIG. 4E. One can also select size, color, length 133, and desired style in silhouette, such as short sleeve 135. The resulting images of the dresses offered for sale in that size, color, and dress length are then presented to the browser. One could then select one of the desired dresses, such as cocktail dress 137, and obtain other sales information of interest for the cocktail dress, with the result shown in FIG. 4F. Again, a purchase or bid can be made if desired.


Men's clothing operates similarly, as in FIG. 5A through FIG. 5D, where men's clothes generally are seen in FIG. 5A. Tommy Hilfiger jeans are selected in FIG. 5B, and attributes such as style and size are returned to the user in response to the user selecting a style and size, as in FIG. 5C. A particular selection, such as 141, can be selected. As seen in FIG. 5D, the particular type of jeans could be purchased, as described in the examples above. The system functions similarly for handbags, as seen in FIGS. 6A through 6D, where a clutch type of handbag is selected.


Silhouettes can be generated for use in the system in several ways. For example, a silhouette of, say, a man's jacket can be generated using computer applications such as Adobe Illustrator and/or Adobe Photoshop. A jacket outline can be created from the photo of a man wearing a jacket. That is, the outline is traced from a photograph. A second way of obtaining silhouettes is by using baseline silhouette vector images bought from iStockphoto (www.istockphoto.com) and, using the above applications, modifying the image to represent the context of navigation. That is, a black silhouette vector stock photo of a man wearing a jacket is purchased, and the jacket portion of it is drawn out and painted in color to represent the jackets category. Several silhouettes representing several different categories can be created side by side in one large image called a sprite. This sprite is uploaded into the picture server in a desired format, for example as a transparent Portable Network Graphics (PNG) file. The transparency aids in applying a background color using, for example, Cascading Style Sheets (CSS) as necessary to imply different states of the same image, for example, hover state and selected state. The developer could then point to each category using pixel co-ordinates as locations and assign a URL to each one to make each silhouette a link to a certain category. Hence one long image replaces the need to upload several images and multiple states of these images, which in turn helps save page weight.
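

A minimal sketch of wiring up such a sprite follows, assuming hypothetical element IDs, pixel co-ordinates, sprite file name, and URLs; shifting the sprite with background-position is the standard CSS technique the description implies.

// Each category points into one large sprite image by pixel co-ordinates,
// and each silhouette is made a link to its category. All values here are
// illustrative assumptions.
var categories = [
 { id: 'jackets', x: 0, y: 0, url: '/browse/jackets' },
 { id: 'pants', x: 120, y: 0, url: '/browse/pants' }
];

categories.forEach(function (cat) {
 var el = document.getElementById(cat.id); // assumes these elements exist
 el.style.backgroundImage = 'url(/images/silhouette-sprite.png)';
 // Shift the single sprite so only this category's silhouette is visible.
 el.style.backgroundPosition = '-' + cat.x + 'px -' + cat.y + 'px';
 el.onclick = function () { window.location.href = cat.url; };
 // The transparent PNG lets a background color imply the hover state.
 el.onmouseover = function () { el.style.backgroundColor = '#cce6ff'; };
 el.onmouseout = function () { el.style.backgroundColor = 'transparent'; };
});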


An example of mapping silhouette image information selected by the buyer to textual information to be used as, or as part of, a query for the system to obtain images of the type of item selected by the user can be seen in FIG. 7. For each silhouette, there could be a value associated with it on the web page. For example, “Thigh-high” is associated with the silhouette illustrated. This value (“Thigh-high”) could be used to retrieve the search parameter value (such as Fashion—Thigh-High) from a static hashmap which contains all the mappings. This could easily be modified to look the value up from a database table or any other data structure.
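

A sketch of such a static map and look-up, with entries modeled on the Thigh-high example above (the second entry and the function name are illustrative assumptions):

// Static map from the value associated with each silhouette on the web
// page to the corresponding search parameter value.
var silhouetteSearchParams = {
 'Thigh-high': 'Fashion - Thigh-High',
 'Knee-high': 'Fashion - Knee-High' // illustrative assumption
};

function lookUpSearchParam(silhouetteValue) {
 // As noted above, this could instead look the value up from a database
 // table or any other data structure.
 return silhouetteSearchParams[silhouetteValue];
}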


Hardware Operation


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.


The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., Application Program Interfaces (APIs)).


Example Computer System



FIG. 8 is a block diagram of a machine in the form of a mobile device within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in server-client network environments, or as a peer machine in peer-to-peer (or distributed) network environments. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


The example computer system 1500 includes a processor 1502 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 1501 and a static memory 1506, which communicate with each other via a bus 1508. The computer system 1500 may further include a display unit 1510, an alphanumeric input device 1517 (e.g., a keyboard), and a user interface (UI) navigation device 1511 (e.g., a mouse). In one embodiment, the display, input device and cursor control device are a touch screen display. The computer system 1500 may additionally include a storage device (e.g., drive unit 1516), a signal generation device 1518 (e.g., a speaker), a network interface device 1520, and one or more sensors 1521, such as a global positioning system sensor, compass, accelerometer, or other sensor.


The drive unit 1516 includes a machine-readable medium 1522 on which is stored one or more sets of instructions and data structures (e.g., software) 1523 embodying or utilized by any one or more of the methodologies or functions described herein. The software 1523 may also reside, completely or at least partially, within the main memory 1501 and/or within the processor 1502 during execution thereof by the computer system 1500, the main memory 1501 and the processor 1502 also constituting machine-readable media.


While the machine-readable medium 1522 is illustrated in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.


The software 1523 may further be transmitted or received over a communications network 1526 using a transmission medium via the network interface device 1520 utilizing any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), the Internet, mobile telephone networks, Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Wi-Fi® and WiMax® networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.


Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Appendix


A Mouseover or hover box refers to a GUI event that is raised when the user moves or “hovers” the cursor over a particular area of the GUI. The technique is particularly common in web browsers where the URL of a hyperlink can be viewed in the status bar. Site designers can easily define their own mouseover events using Javascript[1] and Cascading Style Sheets.[2] In case of multiple layers the mouseover event is triggered by the uppermost layer.


Mouseover events are not limited to web design and are commonly used in modern GUI programming. Their existence might not even be known to the user as the events can be used to call any function and might affect only the internal workings of the program.


Tooltip


A special usage of mouseover event is a tooltip showing a short description of the GUI object under the cursor. The tooltip generally appears only after the mouse is held over the object for a certain amount of time.


Examples

















<!-- Direct usage not recommended | does not conform with web standards -->
<img id="myImage" src="/images/myImage.jpg"
onMouseOver="alert('your message');">

// javascript without any framework
var myImg = document.getElementById('myImage');

function myMessage() {
 alert('your message');
}

if (myImg.addEventListener) { // addEventListener is the standard method to add events to objects
 myImg.addEventListener('mouseover', myMessage, false);
}
else if (myImg.attachEvent) { // for Internet Explorer
 myImg.attachEvent('onmouseover', myMessage);
}
else { // for other browsers
 myImg.onmouseover = myMessage;
}

// jQuery example | degrades well if javascript is disabled in client browser
$("img").mouseover(function() {
 alert('your message');
});









Claims
  • 1. A computer implemented method performed by a system, the method comprising: causing, by one or more processors of the system, display via a user interface of a plurality of selectable brands associated with a product category and a plurality of selectable product styles associated with the product category; detecting, by the one or more processors, a first selection comprising a selected brand of the plurality of selectable brands and a second selection comprising a selected product style of the plurality of selectable product styles; responsive to detecting the first selection and the second selection, causing, by the one or more processors, display via the user interface of a plurality of selectable product attributes associated with products associated with the selected brand and the selected product style, wherein the plurality of selectable product attributes includes a plurality of images, wherein each of the plurality of images includes a visual representation of a product having the selected product style and a corresponding different attribute of the plurality of selectable product attributes; detecting, by the one or more processors, a third selection of a selected product attribute of the plurality of selectable product attributes; and responsive to detecting the third selection, causing, by the one or more processors, display via the user interface of one or more product listings related to one or more of the products associated with the selected brand, the selected product style, and the selected product attribute.
  • 2. The computer implemented method of claim 1, further comprising: responsive to detecting the first selection and the second selection, determining a mapping between the selected product style and a textual value; and causing display of the plurality of selectable product attributes corresponding to the textual value.
  • 3. The computer implemented method of claim 1, wherein displaying of the plurality of selectable brands is further based on a user-provided search term.
  • 4. The computer implemented method of claim 1, further comprising: responsive to detecting selection of a selected product listing of the one or more product listings, enabling purchase of a product represented by the selected product listing.
  • 5. The computer implemented method of claim 1, further comprising: detecting a mouseover or hover associated with at least one of the plurality of selectable product styles; and causing display of a text description of the at least one of the plurality of selectable product styles in response to detecting the mouseover or the hover.
  • 6. The computer implemented method of claim 1, wherein the plurality of images includes at least one image of at least one product associated with the selected brand and the selected product style.
  • 7. The computer implemented method of claim 1, wherein an attribute of the plurality of selectable product attributes comprises a color, size, style, price, material, fabric, product availability, sleeve type, or length, or a combination thereof.
  • 8. The computer implemented method of claim 1, wherein each brand of the plurality of selectable brands represents a respective brand of products within the product category and each product style of the plurality of selectable product styles visually indicates a respective product style within the product category.
  • 9. A system comprising: at least one hardware processor; and a computer-readable storage device storing instructions which, when executed by the at least one hardware processor, cause the system to perform operations comprising: causing display via a user interface of a plurality of selectable brands associated with a product category and a plurality of selectable product styles associated with the product category; detecting a first selection comprising a selected brand of the plurality of selectable brands and a second selection comprising a selected product style of the plurality of selectable product styles; responsive to detecting the first selection and the second selection, causing display via the user interface of a plurality of selectable product attributes associated with products associated with the selected brand and the selected product style, wherein the plurality of selectable product attributes includes a plurality of images, wherein each of the plurality of images includes a visual representation of a product having the selected product style and a corresponding different attribute of the plurality of selectable product attributes; detecting a third selection of a selected product attribute of the plurality of selectable product attributes; and responsive to detecting the third selection, causing display via the user interface of one or more product listings related to one or more of the products associated with the selected brand, the selected product style, and the selected product attribute.
  • 10. The system of claim 9, the operations further comprising: responsive to detecting the first selection and the second selection, determining a mapping between the selected product style and a textual value; and causing display of the plurality of selectable product attributes corresponding to the textual value.
  • 11. The system of claim 9, wherein displaying of the plurality of selectable brands is further based on a user-provided search term.
  • 12. The system of claim 9, the operations further comprising: responsive to detecting selection of a selected product listing of the one or more product listings, enabling purchase of a product represented by the selected product listing.
  • 13. The system of claim 9, the operations further comprising: detecting a mouseover or hover associated with at least one of the plurality of selectable product styles; and causing display of a text description of the at least one of the plurality of selectable product styles in response to detecting the mouseover or the hover.
  • 14. The system of claim 9, wherein the selected product style of the plurality of selectable product styles comprises a silhouette of a product style.
  • 15. The system of claim 9, wherein the plurality of images includes at least one image of at least one product associated with the selected brand and the selected product style.
  • 16. The system of claim 9, wherein an attribute of the plurality of selectable product attributes comprises a color, size, style, price, material, fabric, product availability, sleeve type, or length, or a combination thereof.
  • 17. The system of claim 9, wherein each brand of the plurality of selectable brands represents a respective brand of products within the product category and each product style of the plurality of selectable product styles visually indicates a respective product style within the product category.
  • 18. A non-transitory computer-readable medium storing instructions which, when executed by at least one hardware processor, cause a system to perform operations comprising: causing display via a user interface of a plurality of selectable brands associated with a product category and a plurality of selectable product styles associated with the product category; detecting a first selection comprising a selected brand of the plurality of selectable brands and a second selection comprising a selected product style of the plurality of selectable product styles; responsive to detecting the first selection and the second selection, causing display via the user interface of a plurality of selectable product attributes associated with products associated with the selected brand and the selected product style, wherein the plurality of selectable product attributes includes a plurality of images, wherein each of the plurality of images includes a visual representation of a product having the selected product style and a corresponding different attribute of the plurality of selectable product attributes; detecting a third selection of a selected product attribute of the plurality of selectable product attributes; and responsive to detecting the third selection, causing display via the user interface of one or more product listings related to one or more of the products associated with the selected brand, the selected product style, and the selected product attribute.
  • 19. The non-transitory computer-readable medium of claim 18, the operations further comprising: responsive to detecting the first selection and the second selection, determining a mapping between the selected product style and a textual value; and causing display of the plurality of selectable product attributes corresponding to the textual value.
  • 20. The non-transitory computer-readable medium of claim 18, wherein displaying of the plurality of selectable brands is further based on a user-provided search term.
CLAIM OF PRIORITY

This application is a continuation of and claims the benefit of priority to U.S. application Ser. No. 12/749,467, filed Mar. 29, 2010, which is hereby incorporated by reference in its entirety.

US Referenced Citations (170)
Number Name Date Kind
5852823 De Bonet Dec 1998 A
6393427 Vu et al. May 2002 B1
6446061 Doerre et al. Sep 2002 B1
6546309 Gazzuolo Apr 2003 B1
6704725 Lee Mar 2004 B1
6751343 Ferrell et al. Jun 2004 B1
6751600 Wolin Jun 2004 B1
6763148 Sternberg et al. Jul 2004 B1
6804683 Matsuzaki et al. Oct 2004 B1
6865302 Chang Mar 2005 B2
6925196 Kass et al. Aug 2005 B2
6941321 Schuetze et al. Sep 2005 B2
7035440 Kaku Apr 2006 B2
7035467 Nicponski Apr 2006 B2
7260568 Zhang et al. Aug 2007 B2
7277572 Macinnes et al. Oct 2007 B2
7315833 Schrenk Jan 2008 B2
7437321 Hanechak Oct 2008 B2
7580867 Nykamp Aug 2009 B2
7603367 Kanter Oct 2009 B1
7617016 Wannier et al. Nov 2009 B2
7620539 Gaussier et al. Nov 2009 B2
7657126 Gokturk et al. Feb 2010 B2
7783622 Vandermolen et al. Aug 2010 B1
7882156 Wykes Feb 2011 B2
7930546 Rhoads et al. Apr 2011 B2
7996282 Scott Aug 2011 B1
8073818 Duan et al. Dec 2011 B2
8081158 Harris Dec 2011 B2
8121902 Desjardins Feb 2012 B1
8180690 Mayle et al. May 2012 B2
8295854 Osann Oct 2012 B2
8306872 Inoue et al. Nov 2012 B2
8335784 Gutt Dec 2012 B2
8412594 Kundu Apr 2013 B2
8429173 Rosenberg et al. Apr 2013 B1
8467613 Baker et al. Jun 2013 B2
8520979 Conwell Aug 2013 B2
8543580 Chen et al. Sep 2013 B2
8595651 Kenemer et al. Nov 2013 B2
8719075 Macdonald Korth et al. May 2014 B2
8732151 Ali et al. May 2014 B2
8738630 Lin May 2014 B2
8781231 Kumar et al. Jul 2014 B1
8861844 Chittar et al. Oct 2014 B2
8903816 Dumon et al. Dec 2014 B2
8949252 Chittar et al. Feb 2015 B2
9043828 Jing et al. May 2015 B1
9092458 Perona et al. Jul 2015 B1
9280563 Chittar et al. Mar 2016 B2
9405773 Chittar et al. Aug 2016 B2
9471604 Chittar et al. Oct 2016 B2
9715510 Chittar et al. Jul 2017 B2
9792638 Liu et al. Oct 2017 B2
9846903 Kundu Dec 2017 B2
10528615 Chittar et al. Jan 2020 B2
20010016077 Oki Aug 2001 A1
20020035518 Kano Mar 2002 A1
20020087558 Bailey Jul 2002 A1
20020106111 Kass et al. Aug 2002 A1
20020143636 Carignani Oct 2002 A1
20020156694 Christensen et al. Oct 2002 A1
20030083850 Schmidt et al. May 2003 A1
20030130910 Pickover et al. Jul 2003 A1
20030187844 Li et al. Oct 2003 A1
20040030578 Cross et al. Feb 2004 A1
20040083203 Kemp Apr 2004 A1
20040182413 De Sep 2004 A1
20040228526 Lin et al. Nov 2004 A9
20050022106 Kawai et al. Jan 2005 A1
20050022708 Lee Feb 2005 A1
20050071256 Singhal Mar 2005 A1
20050164273 Stoughton et al. Jul 2005 A1
20050196016 Sato et al. Sep 2005 A1
20060080182 Thompson et al. Apr 2006 A1
20060212362 Donsbach et al. Sep 2006 A1
20070005571 Brewer et al. Jan 2007 A1
20070074110 Miksovsky et al. Mar 2007 A1
20070112640 Grove et al. May 2007 A1
20070162328 Reich Jul 2007 A1
20070168357 Yeong Jul 2007 A1
20070172155 Guckenberger Jul 2007 A1
20070185865 Budzik et al. Aug 2007 A1
20070257792 Gold Nov 2007 A1
20070260597 Cramer Nov 2007 A1
20080040219 Kim et al. Feb 2008 A1
20080040671 Reed Feb 2008 A1
20080071553 Hamadi et al. Mar 2008 A1
20080109327 Mayle et al. May 2008 A1
20080154488 Silva et al. Jun 2008 A1
20080160956 Jackson et al. Jul 2008 A1
20080162032 Wuersch et al. Jul 2008 A1
20080168401 Boule et al. Jul 2008 A1
20080226119 Candelore et al. Sep 2008 A1
20080229225 Kaye Sep 2008 A1
20080243837 Davis et al. Oct 2008 A1
20080281814 Calistri et al. Nov 2008 A1
20080301128 Gandert et al. Dec 2008 A1
20090018932 Evans et al. Jan 2009 A1
20090061884 Rajan et al. Mar 2009 A1
20090094138 Sweitzer et al. Apr 2009 A1
20090094260 Cheng et al. Apr 2009 A1
20090112830 Denoue et al. Apr 2009 A1
20090132943 Minsky May 2009 A1
20090138376 Smyers et al. May 2009 A1
20090150791 Garcia Jun 2009 A1
20090172730 Schiff et al. Jul 2009 A1
20090182612 Challener et al. Jul 2009 A1
20090193675 Sieber Aug 2009 A1
20090271293 Parkhurst et al. Oct 2009 A1
20090276453 Trout et al. Nov 2009 A1
20090287655 Bennett Nov 2009 A1
20090313239 Wen et al. Dec 2009 A1
20100023407 Grady et al. Jan 2010 A1
20100030578 Siddique et al. Feb 2010 A1
20100036711 Shenfield et al. Feb 2010 A1
20100036883 Valencia-campo et al. Feb 2010 A1
20100094935 Svendsen et al. Apr 2010 A1
20100105370 Kruzeniski et al. Apr 2010 A1
20100135597 Gokturk et al. Jun 2010 A1
20100138295 Caron et al. Jun 2010 A1
20100159904 Colligan et al. Jun 2010 A1
20100191770 Cho Jul 2010 A1
20100203901 Dinoff et al. Aug 2010 A1
20100217667 Mo Aug 2010 A1
20100241512 Tirpak et al. Sep 2010 A1
20100257104 Bundy Oct 2010 A1
20100299132 Dolan et al. Nov 2010 A1
20100332283 Ng et al. Dec 2010 A1
20100332324 Khosravy et al. Dec 2010 A1
20110004522 Lee Jan 2011 A1
20110040602 Kurani Feb 2011 A1
20110055238 Slaney et al. Mar 2011 A1
20110085697 Clippard et al. Apr 2011 A1
20110093361 Morales Apr 2011 A1
20110106594 Shirey May 2011 A1
20110106805 Bao et al. May 2011 A1
20110161182 Racco Jun 2011 A1
20110184831 Dalgleish Jul 2011 A1
20110191374 Bengio et al. Aug 2011 A1
20110196724 Fenton et al. Aug 2011 A1
20110231278 Fries Sep 2011 A1
20110235902 Chittar et al. Sep 2011 A1
20110238534 Yakkala Sep 2011 A1
20110238536 Liu et al. Sep 2011 A1
20110238659 Chittar et al. Sep 2011 A1
20110295711 Mazmanyan Dec 2011 A1
20110314031 Chittar et al. Dec 2011 A1
20120054041 Williams Mar 2012 A1
20120054059 Rele Mar 2012 A1
20120054060 Kundu Mar 2012 A1
20120126998 Morgan et al. May 2012 A1
20120159294 Gonsalves et al. Jun 2012 A1
20120265635 Forsblom Oct 2012 A1
20120276928 Shutter Nov 2012 A1
20120302258 Pal et al. Nov 2012 A1
20130085860 Summers et al. Apr 2013 A1
20130085900 Williams Apr 2013 A1
20130226743 Kundu Aug 2013 A1
20130262455 Cramer et al. Oct 2013 A1
20140105489 Chittar et al. Apr 2014 A1
20140156410 Wuersch et al. Jun 2014 A1
20140324836 Chittar et al. Oct 2014 A1
20150039393 Jain Feb 2015 A1
20160012124 Ruvini et al. Jan 2016 A1
20170004632 Chittar et al. Jan 2017 A1
20170322951 Chittar et al. Nov 2017 A1
20180012277 Liu et al. Jan 2018 A1
20180130121 Kundu May 2018 A1
20200117685 Chittar et al. Apr 2020 A1
Foreign Referenced Citations (24)
Number Date Country
2011299401 Jan 2013 AU
2014206199 Aug 2014 AU
2012318961 Mar 2016 AU
1361494 Jul 2002 CN
101206749 Jun 2008 CN
101441651 May 2009 CN
101546406 Sep 2009 CN
101556584 Oct 2009 CN
103430202 Dec 2013 CN
104040577 Sep 2014 CN
106651521 May 2017 CN
1220129 Jul 2002 EP
10-2002-0069767 Sep 2002 KR
10-2010-0020041 Feb 2010 KR
2586028 Jun 2016 RU
2001011491 Feb 2001 WO
2009094724 Aug 2009 WO
2011044497 Apr 2011 WO
2012030672 Mar 2012 WO
2012030674 Mar 2012 WO
2012033654 Mar 2012 WO
2013052316 Apr 2013 WO
2014085657 Jun 2014 WO
2016007382 Jan 2016 WO
Non-Patent Literature Citations (130)
Entry
“EBay Asks Its Users for Help Building New Search Tools,” New York Times Company, Feb. 9, 2010 (Year: 2010).
“Simplicity Itself,” by James Clark, Commercial Motor, Mar. 11, 2010 (Year: 2010).
Notice of Allowance received for U.S. Appl. No. 15/841,805, dated Nov. 23, 2021, 8 Pages.
Japan's Amana Web Site Will Boast Graphic-Based Image Search, AsiaPulse News: NA, Asia Pulse Pty Ltd, Dec. 15, 2006, 2 pages.
Shop It To Me: About Us, https://www.shopittome.com/about_us, Accessed on Nov. 13, 2010, pp. 1-2.
Notice of Allowance received for U.S. Appl. No. 12/749,467, dated Jun. 14, 2017, 14 pages.
Advisory Action received for U.S. Appl. No. 12/749,458, dated Jul. 14, 2015, 3 pages.
Advisory Action received for U.S. Appl. No. 12/749,458, dated May 10, 2013, 3 pages.
Final Office Action received for U.S. Appl. No. 12/749,458, dated Apr. 8, 2015, 36 pages.
Final Office Action received for U.S. Appl. No. 12/749,458, dated Feb. 13, 2013, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,458, dated Feb. 3, 2012, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,458, dated Jan. 13, 2016, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,458, dated Jul. 20, 2012, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,458, dated Sep. 30, 2014, 40 pages.
Final Office Action received for U.S. Appl. No. 12/749,467, dated Jul. 8, 2015, 32 pages.
Final Office Action received for U.S. Appl. No. 12/749,467, dated Jul. 19, 2012, 23 pages.
Final Office Action received for U.S. Appl. No. 12/749,467, dated Mar. 8, 2017, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,467, dated Mar. 1, 2012, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,467, dated May 8, 2014, 27 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,467, dated Nov. 7, 2014, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 12/749,467, dated Sep. 9, 2016, 13 pages.
Definition of silhouette, https://www.dictionary.com/browse/silhouette, 6 pages.
TheFind.com Launches Next-Generation Shopping Search Engine that Helps Consumers Find Any Product Sold Online; TheFind.com Strives to Search Every Online Store to Deliver Unbiased, Relevant Shopping Search Results, PR Newswire (New York), Oct. 31, 2006, 3 Pages.
Advisory Action Received for U.S. Appl. No. 15/655,003, dated Nov. 3, 2020, 3 pages.
Final Office Action Received for U.S. Appl. No. 15/655,003, dated Jun. 30, 2021, 11 pages.
Final Office Action received for U.S. Appl. No. 15/655,003, dated Aug. 17, 2020, 20 pages.
First Action Interview Pre-Interview Communication Received for U.S. Appl. No. 15/655,003, dated Oct. 29, 2019, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 15/655,003, dated Jan. 21, 2021, 23 Pages.
Notice Of Allowability Received for U.S. Appl. No. 15/655,003, dated Feb. 3, 2022, 2 Pages.
Notice of Allowance Received for U.S. Appl. No. 15/655,003, dated Jan. 7, 2022, 17 pages.
Advisory Action Received for U.S. Appl. No. 15/841,805, dated Jun. 12, 2020, 2 pages.
Final Office Action Received for U.S. Appl. No. 15/841,805, dated Apr. 17, 2020, 11 pages.
Final Office Action Received for U.S. Appl. No. 15/841,805, dated Feb. 8, 2021, 16 Pages.
Non Final Office Action Received for U.S. Appl. No. 15/841,805, dated Dec. 19, 2019, 10 pages.
Non Final Office Action Received for U.S. Appl. No. 15/841,805, dated Jun. 25, 2021, 5 pages.
Non-Final Office Action Received for U.S. Appl. No. 15/841,805, dated Aug. 13, 2020, 15 Pages.
Admiral Metals, “Reflecting on the History of Customer Service”, Retrieved from the Internet URL: <https://www.admiralmetals.com/admiral-care/reflecting-history-customer-service/>, Dec. 10, 2014, 2 pages.
Boshoff et al., “The mediating effect of brand image and information search intentions on the perceived risks associated with online purchasing on a generically-branded website”, Management Dynamics 18.4: 18-28, 2009, 9 Pages.
Canadian Application Serial No. 2,804,052, Examiner's Report dated Nov. 25, 2014, 4 pages.
Canadian Application Serial No. 2,804,052, Office Action dated Oct. 26, 2015, 3 pages.
Chinese Application Serial No. 201180033079.2, Office Action dated Jun. 2, 2016, (With English Translation), 13 pages (5 pages of English Translation, 4 pages of Pending Claims and 4 pages of Official Copy).
Decision of Rejection received for Chinese Patent Application No. 201611209571.1 dated Feb. 5, 2021, 2 Pages (1 page of Official Copy & 1 page of English Translation).
Extended European Search Report received for European Patent Application No. 11755177.0, dated Dec. 23, 2015, 5 pages.
Final Office Action received for U.S. Appl. No. 13/011,374, dated Apr. 1, 2016, 15 pages.
Final Office Action received for U.S. Appl. No. 13/011,374, dated Jul. 15, 2013, 12 pages.
Final Office Action received for U.S. Appl. No. 13/011,436, dated Jun. 24, 2015, 21 pages.
Final Office Action received for U.S. Appl. No. 13/011,436, dated Mar. 7, 2014, 13 pages.
Final Office Action received for U.S. Appl. No. 13/073,926, dated Feb. 26, 2015, 25 pages.
Final Office Action received for U.S. Appl. No. 13/073,926, dated Oct. 17, 2013, 13 pages.
Final Office Action received for U.S. Appl. No. 13/073,936, dated Aug. 4, 2014, 9 pages.
Final Office Action received for U.S. Appl. No. 13/073,936, dated Oct. 30, 2013, 12 pages.
Final Office Action received for U.S. Appl. No. 13/624,599, dated Apr. 24, 2015, 19 pages.
Final Office Action received for U.S. Appl. No. 13/624,599, dated Jun. 14, 2016, 26 pages.
First Action Interview Office Action Summary received for U.S. Appl. No. 15/655,003, dated Mar. 3, 2020, 9 pages.
First Action Interview-Office Action Summary received for U.S. Appl. No. 15/656,410, dated Jun. 4, 2019, 8 pages.
First Action Interview-Pre-Interview Communication received for U.S. Appl. No. 15/656,410, dated Feb. 26, 2019 4 pages.
First Examination Report received for Australian Patent Application No. 2011299401, dated Sep. 27, 2013, 3 pages.
First Examination Report received for Australian Patent Application No. 2014206199, dated Feb. 26, 2016, 5 pages.
First Examiner Report received for Australian Patent Application No. 2012318961, dated Mar. 5, 2015, 3 pages,.
Huang et al., “Segmentation of Color Textures Using K-Means Cluster Based Wavelet Image Fusion”, Applied Mechanics and Materials, Jan. 12, 2010, pp. 209-214.
International Application Serial No. PCT/US11/49454, International Search Report dated Nov. 28, 2014, 2 pgs.
International Application Serial No. PCT/US2011/49449, Search Report dated Jan. 19, 2012, 2 pgs.
International Application Serial No. PCT/US2012/058101, Search Report dated Nov. 29, 2012, 2 pgs.
International Search Report and Written Opinion of the International Searching Authority, issued in connection with Int'l Appl. No. PCT/US2011/049444, dated Nov. 18, 2014, 7 pages.
International Search Report received for PCT Application No. PCT/US2012/057110, dated Nov. 30, 2012, 2 pages.
International Search Report received for PCT Application No. PCT/US2013/072339, dated Apr. 28, 2014, 4 pages.
Invitation to Pay Additional Fees and Partial Search Report PCT Application No. PCT/US2013/072339, dated Feb. 14, 2014, 2 pages.
Meyer et al., “Multiscale Morphological Segmentations Based on Watershed, Flooding, and Eikonal PDE”, Scale-Space '99, LNCS 1682, 1999, pp. 351-362.
Moovit-Real-Time Transit Info—Android Apps on Google Play, Accessed on Jul. 24, 2013, 2 pages.
Non-Final Office Action received for U.S. Appl. No. 10/979,604, dated Dec. 20, 2010, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 13/011,374, dated Dec. 14, 2012, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 13/011,374, dated Mar. 3, 2017, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 13/011,374, dated Nov. 9, 2015, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 13/011,436, dated Jan. 14, 2015, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 13/011,436, dated Jun. 20, 2013, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 13/073,926, dated Aug. 18, 2015, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 13/073,926, dated Jul. 15, 2014, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 13/073,926, dated May 21, 2013, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 13/073,936, dated Apr. 4, 2014, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 13/073,936, dated May 16, 2013, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 13/624,599, dated Dec. 24, 2015, 23 pages.
Non-Final Office Action received for U.S. Appl. No. 13/624,599, dated Jun. 6, 2014, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 14/133,455, dated Dec. 5, 2014, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 14/329,837, dated Apr. 8, 2016, 7 pages.
Non-Final Office Action Received for U.S. Appl. No. 16/707,467, dated Mar. 4, 2021, 10 pages.
Notice of Acceptance received for Australian Patent Application No. 2011299401, dated Apr. 17, 2014, 2 pages.
Notice of Acceptance received for Australian Patent Application No. 2014206199, dated Aug. 12, 2016, 2 pages.
Notice of Allowance received for Canadian Patent Application No. 2,804,052, dated Sep. 8. 2016, 1 page.
Notice of Allowance received for Russian Federation Application No. 2012155513, dated Feb. 10, 2016, 12 pages (9 pages of Official Copy and 3 pages of English Claims).
Notice of Allowance received for U.S. Appl. No. 13/073,911, dated Jan. 21, 2014, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/073,911, dated Jul. 19, 2013, 8 pages.
Notice of Allowance received for U.S. Appl. No. 13/073,911, dated Jun. 10, 2014, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/073,911, dated Sep. 18, 2013, 6 pages,.
Notice of Allowance received for U.S. Appl. No. 13/073,926, dated Mar. 22, 2016, 14 pages.
Notice of Allowance received for U.S. Appl. No. 13/073,936, dated Sep. 24, 2014, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/133,455, dated Apr. 27, 2015, 6 pages.
Notice of Allowance received for U.S. Appl. No. 14/133,455, dated Sep. 24, 2015, 6 pages.
Notice of Allowance received for U.S. Appl. No. 14/329,837, dated Jun. 21, 2016, 7 pages.
Notice of Allowance received for U.S. Appl. No. 15/266,093, dated Mar. 10. 2017, 9 pages.
Notice of Allowance received for U.S. Appl. No. 15/656,410, dated Aug. 30, 2019, 7 pages.
Notice of Allowance Received for U.S. Appl. No. 16/707,467, dated May 12, 2021, 8 pages.
Notice of Allowance received for U.S. Appl. No. 15/655,003, dated Aug. 4, 2022, 9 pages.
Notice of Allowance received for U.S. Appl. No. 15/655,003, dated Nov. 10, 2022, 9 pages.
Office Action received for Canadian Patent Application No. 2,849,970, dated Oct. 5, 2015, 3 pages.
Office Action received for Chinese Patent Application No. 201180033079.2, dated Dec. 22, 2015, 22 pages (9 pages of Official Copy and 13 pages of English Translation).
Office Action received for Chinese Patent Application No. 201280059512.4, dated May 27, 2016, 10 pages (Without English Translation).
Office Action Received for Chinese Patent Application No. 201611209571.1, dated Mar. 24, 2020, 10 pages (6 pages of Official Copy and 4 pages of English Translation).
Office Action received for Chinese Patent Application No. 201611209571.1, dated May 28, 2020, 12 pages (5 pages of Official Copy & 7 pages of English Translation).
Office Action received for European Patent Application No. 11755177.0, dated Apr. 19, 2017, 6 pages.
Office Action received for European Patent Application No. 11755177.0, dated Feb. 16, 2015, 3 pages.
Office Action received for Korean Patent Application No. 10-2014-7012145, dated Jul. 22, 2015, 7 pages (3 pages of English Translation & 4 pages of Official Copy).
Office Action received for Korean Patent Application No. 10-2016-7009512, dated Jun. 1, 2016, 11 pages (6 pages of English Translation & 5 pages of Official Copy). Claims).
Office Action received for Russian Patent Application No. 2012155513, dated Jul. 16, 2015, 7 pages.
Office Action received for Russian Patent Application No. 2016117291, dated Mar. 29, 2017, 12 pages.
Restriction Requirement received for U.S. Appl. No. 13/011,436, dated May 14, 2013, 6 pages.
Restriction Requirement received for U.S. Appl. No. 13/691,390, dated Feb. 26, 2015, 9 pages.
Rui et al., “A Novel Relevance Feedback Technique in Image Retrieval”, In Proceedings of the seventh ACM International conference on Multimedia, Oct. 1999, pp. 67-70.
Subsequent Examiner Report received for Australian Patent Application No. 2012318961, dated Jul. 21, 2015, 3 pages.
U.S. Appl. No. 13/250,490, Final Office Action dated Apr. 1, 2015, 17 pgs.
U.S. Appl. No. 13/250,490, Final Office Action dated Nov. 29, 2013, 13 pgs.
U.S. Appl. No. 13/250,490, Non Final Office Action dated Aug. 11, 2014, 14 pgs.
U.S. Appl. No. 13/250,490, Non Final Office Action dated Jan. 12, 2016, 17 pgs.
U.S. Appl. No. 13/250,490, Non Final Office Action dated Jun. 27, 2013, 9 pgs.
U.S. Appl. No. 13/250,490, Restriction Requirement dated Feb. 28, 2013, 6 pgs.
Written Opinion received for PCT Application No. PCT/US2013/072339, dated Apr. 28, 2014, 5 pages.
Written Opinion received for PCT Application No. PCT/US2012/057110, dated Nov. 30, 2012, 5 pages.
Wayback Machine, “www.macys.com”, [Online]. Retrieved from the Internet: <URL: <www.archive.org>, (Accessed Dec. 31, 2010), 11 pgs.
Wikipedia, “Recommender System”, Accessed on Jul. 10, 2014, 8 pages.
Wikipedia, “The Watershed Transformation”, Accessed on Jul. 3, 2018, 11 pages.
DEVTSOB, Wondering How TSO Mobile & GPS Tracking Benefit Passengers and the Public Transportation Industry?, Retrieved from the Internet: URL: <http://www.tsomobile.com/tso-mobile-public-transportation/>, Aug. 2014, 2 pages.
Related Publications (1)

Publication Number 20220383380 A1, Dec. 2022, US

Continuations (2)

Parent: U.S. Appl. No. 15/655,003, filed Jul. 2017, US; Child: U.S. Appl. No. 17/565,307
Parent: U.S. Appl. No. 12/749,467, filed Mar. 2010, US; Child: U.S. Appl. No. 15/655,003