1. Field of Disclosure
The present disclosure generally relates to a color palette suggestion system, and, more particularly, to systems and methods for conducting a keyword, a color and a trend based image search in response to a user presenting a search query.
2. Background of the Invention
Searching for pertinent images over the Internet can be a particularly challenging task for artists and designers. Commercial systems may require an artist to select a single image from a pool of millions of images that may be retrieved for a keyword search query. In marketing, precise image selection may be directly linked to the commercial success of a product. An artist may be required to conduct a complex search query to perform an image search based on both a keyword and a color. In that case, artists may have to create and discover color palettes to meet their business needs or aesthetic tastes. When an artist is searching for the keyword “ocean,” the Internet search engine may return a large number of images associated with the keyword ocean in various colors and shades.
The artist may then have to run a second query to locate a desired image in a desired color combination from the pool of retrieved images. For example, the artist may have to specifically configure a query to fetch all images that are associated with the keyword “ocean” and that are blue in color. The above-mentioned query may be run by existing image search systems to perform a two-layered image search, i.e., images are first searched by keyword and the retrieved images are then filtered by color. Thus, the existing image search systems do not offer a palette selection option coupled with a keyword search option. Further, the artist using existing image search systems may have to reinvent a palette for each keyword search since there does not exist a way to save and export preferred color palettes. Furthermore, the artist does not have a way to take advantage of image selection trends reflected by the image retrieval and download data gathered from previous searches performed on the particular keyword and color selection. Accordingly, a need exists for a palette selection system that can present images based at least on the image trend, the color criteria and the keyword criteria.
By way of introduction only, the present embodiments provide a method and system for conducting an image search comprising: searching a first database to locate a set of pertinent images; iteratively performing the following operations for each image in the set of pertinent images: (a) extracting the histogram of red, green, and blue colors (RGB colors) from a given image; (b) distilling the extracted RGB colors down to create a reduced color palette for the given image; (c) segmenting the extracted RGB colors into a set of segments representing distinct parts of the color spectrum; (d) selecting a subset from the set of segments to assemble a color palette for the given image; and (e) updating the assembled color palette and a customer behavior score for the given image in the first database; and generating a ranked display of suggested color palettes for the search query by aggregating the customer behavior scores for the search query across all images.
The disclosed system may present users with multicolor palettes that best match the searched concept and the keyword. Among the presented multicolor palettes, a specific multicolor palette can be selected by the user in order to retrieve all images in the given image depository that match the specific palette and the user specified keyword. Alternatively, the disclosed system may allow the user to create a palette and retrieve all images in the image depository that match the user specified palette and the user specified keyword.
According to one aspect of the present invention, systems and methods are disclosed for generating color palettes from the given image depository in response to a user query. In one embodiment of the disclosed system, a high volume of color combinations can be generated automatically for a search query.
According to another aspect of the present invention, systems and methods are disclosed for designing a color palette. When an artist is searching for the keyword “ocean,” the Internet search engine may return a large number of images associated with the keyword ocean in various colors and shades. For example, the above-mentioned query may return color palettes containing various shades of blue from photos of water, various shades of brown from photos of beaches, various shades of red from photos of ocean sunsets, and the like.
In another embodiment, the system has an ability to display trends for a given image. Thus, the user can optionally select an image that was downloaded by many other users. Accordingly, the system may define popular palettes for certain groups of customers. For example, a popular palette may indicate specific images that are popular among college students just in time for Valentine's Day. In another embodiment, the system may have the ability to suggest color themes based on seasonal or popular trends, for example certain red and green palettes may be especially popular around Christmas time.
In yet another embodiment, the system has the ability to help customers configure their desired palettes. Additionally, the system may allow customers to discover palettes configured by others as well as save or export a particular palette that is of interest to the customer. This may save the customer the time, money, and other resources required to reinvent a palette that works for the customer.
Although the features and advantages of the invention are generally described in this summary section and the following detailed description section in the context of embodiments, it shall be understood that the scope of the invention should not be limited to these particular embodiments. Many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims hereof.
In another embodiment, relevant search queries are further sorted by using user search data to score color palettes by the keyword. Thus, a high volume of color combinations can be applied to locate relevant images in response to the search queries.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The present disclosure describes a computer implemented method for conducting a keyword, a color and a trend based image search in response to a user presenting a search query on a user interface of a computing device, the method comprising: (1) searching a first database to locate a set of pertinent images corresponding to the search query, the first database comprising an image representation (image), a set of keywords associated with the image, a customer behavior score for each keyword in the set of keywords, and a list of keywords previously used to locate the image; (2) iteratively performing the following operations for each image in the searched set of pertinent images: (a) extracting the histogram of red, green, and blue colors (RGB colors) from a given image, (b) distilling the extracted RGB colors down to create a reduced color palette for the given image based on the proximity of the extracted RGB colors in the RGB color space, (c) segmenting the extracted RGB colors into a set of segments representing different, visually distinct parts of the color spectrum, and black and white colors, (d) selecting a subset of segments from the set of segments satisfying a predetermined criterion to assemble a color palette for the given image, and (e) updating the assembled color palette and the customer behavior score for the given image in the first database; and (3) generating a ranked display of suggested color palettes for the search query by aggregating the customer behavior scores for the search query across all images from which the color palette was extracted.
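By way of illustration only, the overall flow of steps (2) and (3) can be sketched as a short, non-limiting Python example. The callables `extract_palette` and `behavior_score`, and the toy data, are assumptions standing in for operations (a) through (e) and the stored customer behavior scores; they are not the actual implementation.

```python
from collections import defaultdict

def suggest_palettes(query, images, extract_palette, behavior_score):
    """Sketch of steps (2)-(3): derive one palette per image, then rank
    palettes for the query by summing customer behavior scores."""
    palette_scores = defaultdict(int)
    for image in images:                                   # step (2): per-image loop
        palette = extract_palette(image)                   # operations (a)-(d)
        palette_scores[palette] += behavior_score(image, query)  # operation (e)
    # step (3): ranked display of suggested color palettes for the query
    return sorted(palette_scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy usage: two images share a palette, a third has its own.
palettes = {"img1": ("blue", "teal"), "img2": ("blue", "teal"), "img3": ("red",)}
scores = {("img1", "ocean"): 2, ("img2", "ocean"): 3, ("img3", "ocean"): 1}
print(suggest_palettes("ocean", palettes,
                       lambda i: palettes[i],
                       lambda i, q: scores[(i, q)]))
# [(('blue', 'teal'), 5), (('red',), 1)]
```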
Now referring to
The process may at block 120 search a first database to fetch all the images that satisfy the keyword criterion specified in the search query. All the images that meet the searched keyword criterion are herein referred to as a set of pertinent images, since these images may possibly be of interest to the user. In other words, the process may search a first database to locate a set of pertinent images corresponding to the search query. At block 130, the process may generate a set of pertinent images for the user after searching the first database at block 120. The first database may comprise several records; each record may have an associated image representation (image) and a set of keywords associated with the image. Furthermore, each record may also have a customer behavior score for each keyword in the set of keywords, and a list of keywords that were previously used by other users to locate the image.
In one embodiment of the disclosed invention, the customer behavior score is based on aggregated activity for an image (user activity received in a computing device). This may include searches that were conducted in the past which led the user to indicate interest in the given image. User interest may be indicated by the user purchasing the image, the user clicking on the image, or the user displaying other signals of interest in that image. In this context, the other signals may include, for example, the user adding the image to a lightbox or favorites list, hovering the cursor over the thumbnail image to view a larger preview, or performing any of the above mentioned activities on images contributed by the same artist from the same photo shoot or illustrated image set. In this embodiment of the disclosed invention, the customer behavior scores may be aggregated per search query and used to identify the most relevant keywords or categories for an image.
In another embodiment of the disclosed invention, the customer behavior scores may be aggregated over a period of time and may represent customer activity for a specific period of interest such as a season. For example, an abstract image of a red and green background may generate more customer interest in the Christmas season. As a result, the image is likely to receive a disproportionate number of its annual purchases during the three month period prior to Christmas as compared to other images having color schemes that are unrelated to Christmas. If a particular image received a higher volume of customer activity during a specific time of the year, then the disclosed search system can rank the particular image higher in results during the same time of year. For example, if 30% of the total image downloads for image A occurred in the month of November, and 10% of the total image downloads for image B occurred in the month of November, then image A will be ranked higher than image B in search results during the month of November.
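By way of illustration only, the seasonal ranking described above can be sketched as follows; the share-of-annual-downloads weighting is an assumption chosen to match the 30%/10% November example, not necessarily the claimed scoring.

```python
def seasonal_share(downloads_by_month, month):
    """Fraction of an image's annual downloads that occurred in `month`."""
    total = sum(downloads_by_month.values())
    return downloads_by_month.get(month, 0) / total if total else 0.0

def rank_for_month(images, month):
    """Rank images by their download share for the given month, highest first."""
    return sorted(images, key=lambda name: seasonal_share(images[name], month),
                  reverse=True)

# Image A: 30% of downloads in November; image B: 10% in November.
images = {
    "A": {"Nov": 30, "other": 70},
    "B": {"Nov": 10, "other": 90},
}
print(rank_for_month(images, "Nov"))  # ['A', 'B']
```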
At block 140, the process may determine whether all of the pertinent images generated at block 130 have been processed. If the process determines that not all pertinent images have been processed, then the process iteratively performs the operations described in conjunction with blocks 150-190 for each image in the searched set of pertinent images. In other words, the process iteratively performs the operations described in conjunction with blocks 150-190 until every image in the searched set of pertinent images has been processed. In this context, the term “given image” indicates a particular image from the set of pertinent images upon which the operations described in conjunction with blocks 150-190 are being performed.
First, the process may move to block 150 to extract the histogram of red, green, and blue colors (RGB colors) from a given image. The process of extracting the histogram of red, green, and blue colors (RGB colors) from a given image is further illustrated in conjunction with
Then the process may move to block 160 to distill the extracted RGB colors down to create a reduced color palette for the given image based on the proximity of the extracted RGB colors in the RGB color space. Then the process may move to block 170 to segment the extracted RGB colors into a set of segments representing different, visually distinct parts of the color spectrum, and black and white colors before moving to block 180 to select a subset of segments from the set of segments that satisfy a predetermined criterion to assemble a reduced color palette for the given image. Finally, the process may update the assembled reduced color palette and the customer behavior score for the given image in the first database before returning to block 140 to determine whether all pertinent files are processed.
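Blocks 150 and 160 can be illustrated with a minimal, non-limiting Python sketch. It assumes a plain list of RGB pixel tuples and a greedy merge of colors by Euclidean distance in RGB space; the actual extraction and distillation may differ.

```python
from collections import Counter
from math import dist  # Euclidean distance, Python 3.8+

def rgb_histogram(pixels):
    """Count occurrences of each (R, G, B) value in the image (block 150)."""
    return Counter(pixels)

def reduce_palette(histogram, threshold=40):
    """Block 160 sketch: greedily merge colors that lie within `threshold`
    of an already selected color in RGB space; counts are pooled."""
    reduced = {}  # representative color -> pooled pixel count
    for color, count in histogram.most_common():
        for rep in reduced:
            if dist(color, rep) <= threshold:
                reduced[rep] += count
                break
        else:
            reduced[color] = count
    return reduced

# Toy "image": two nearby blues collapse to one entry, plus a brown.
pixels = [(10, 60, 200)] * 6 + [(15, 70, 210)] * 3 + [(140, 100, 60)]
print(reduce_palette(rgb_histogram(pixels)))
# {(10, 60, 200): 9, (140, 100, 60): 1}
```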
Alternatively, at block 140, if the process determines that all pertinent images have been processed, then the process may move to block 195 to generate a ranked display of suggested color palettes for the search query by aggregating the customer behavior scores for the search query across all images from which the color palette was extracted. Then the process may exit at block 197.
The process may iteratively process each color in the reduced color palette by performing a set of operations for each color in the reduced color palette. In particular, the process may iteratively perform, for each color in the reduced color palette, the operations of assigning (1) a color volume score to the given color, (2) a color weight score to the given color, and (3) the percentage of a set of proximate colors. The process may assign the color volume score to each color in the reduced color palette at block 229. The color volume score may be assigned to each color in the reduced color palette based on the amount of that individual color present in the given image.
In other words, the color volume score for a given color may indicate the percentage of the given image covered by the given color. Thus, in an image of the sky, where the blue color covers the entire image, the color volume score for the blue color will be 100%. In contrast, in a blue and green colored image depicting birds, where some birds are flying high in the blue sky and some birds are pecking seeds in the green grass below, the color volume score for the color blue will be less than 100% since the green grass will occupy at least some percentage of the image. Likewise, the colors of the birds will also occupy some percentage of the image.
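Following the definition above, the color volume score can be sketched as the percentage of pixels covered by each color; the pixel-list representation is an assumption made for illustration.

```python
from collections import Counter

def color_volume_scores(pixels):
    """Color volume score: percentage of the image covered by each color."""
    counts = Counter(pixels)
    total = len(pixels)
    return {color: 100.0 * count / total for color, count in counts.items()}

# A 4-pixel "sky" image that is entirely blue scores 100% for blue;
# a mixed sky-and-grass image splits the score.
print(color_volume_scores([(0, 0, 255)] * 4))                  # {(0, 0, 255): 100.0}
print(color_volume_scores([(0, 0, 255)] * 3 + [(0, 128, 0)]))  # blue 75.0, green 25.0
```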
This phenomenon is described further in detail in conjunction with
Furthermore, the process may iteratively assign a color weight score to each color in the reduced color palette. The color weight score may indicate the percentage of the given color in the given image, weighted by the occurrence of other visually similar colors in the given image. Notably, the term “given image” is an image from the set of pertinent images that is currently being processed, and the term “given color” is a color from the reduced color palette that is being assigned the color weight score. The color weight score may indicate a numeric score based on the percentage of the given color in the given image, the percentage of other colors located within a certain predefined distance in the RGB color space from the given color, and the measure of distance between the given color and the other colors situated within that predefined distance in the RGB color space.
A set of mathematical operations may be performed to compute the color weight score. One example of a formula is illustrated in
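Because the formula itself is shown in the referenced figure, the following Python sketch is only one plausible instantiation: it adds to the given color's own coverage the coverage of nearby palette colors, discounted by their RGB distance, so that a green surrounded by other greens is weighted up while an isolated color is not. The distance cutoff and the linear discount are assumptions made for illustration.

```python
from math import dist

def color_weight_score(color, volume_pct, max_distance=60):
    """One plausible color weight score: the color's own coverage plus the
    coverage of nearby colors, discounted by RGB distance. `volume_pct`
    maps each palette color to its percentage of the image."""
    score = volume_pct[color]
    for other, pct in volume_pct.items():
        if other == color:
            continue
        d = dist(color, other)
        if d <= max_distance:
            score += pct * (1.0 - d / max_distance)
    return score

# A mid green surrounded by another green is weighted up; an isolated red is not.
volumes = {(30, 160, 40): 40.0, (50, 180, 60): 30.0, (200, 20, 20): 30.0}
print(round(color_weight_score((30, 160, 40), volumes), 1))  # ~52.7
print(round(color_weight_score((200, 20, 20), volumes), 1))  # 30.0
```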
Further, for each given color, the process may also identify a set of proximate colors such that each color in the set of proximate colors is located within a predefined distance from the given color in the RGB color space. Next, the process may ascertain the percentage of a set of proximate colors in the given image. The process may also identify the measure of the distance between the given color and the set of proximate colors in the RGB color space. Finally, the process may perform the indexing operations on the color volume score, the color weight score, the customer behavior score and the percentage of the set of proximate colors for the given color in a search engine. The indexing operations may be helpful in a subsequent image search of the given image via the search engine.
Now referring back to
Now referring to
For each segment in the set of segments, the process may iteratively assign a color score to each color in a given segment to ascertain a color that represents the given segment as a whole. In this context, the term “given segment” indicates a particular segment from the set of segments, wherein each color in the particular segment is being assigned a color score. The color score may be determined by a product of the color volume score, a color lightness measure, and a color chroma measure, and the color having the highest color score in a given segment is designated to represent the given segment.
To compute the color score for a specific color, the specific color is converted to the LCH color space, which produces three values: a lightness value, a chroma value, and a hue value for the specific color. The color score is determined by computing the product of the color volume score, the lightness value, and the chroma value. As a result, the color with the highest color score is the brightest, most saturated, and most abundant color in that segment. Accordingly, the color with the highest color score in a particular segment may be used to represent the particular segment.
Likewise, for each segment in the set of segments, the process may iteratively assign a segment score, wherein the segment score for a particular segment is computed by aggregating the color volume scores for each color in the particular segment. In other words, the segment score for a specific segment may be based on the sum of the color volume scores of each color in the given image within the specific segment. This phenomenon is further described in conjunction with
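The color score and segment score described above can be sketched together in a short, non-limiting Python example. The lightness and chroma values are assumed to come from an upstream RGB-to-LCH conversion (for example via a color library), so they are passed in as data here.

```python
def representative_color(segment):
    """Pick the color with the highest color score in a segment, where
    color score = volume * lightness * chroma (lightness and chroma are
    assumed to come from an upstream RGB-to-LCH conversion)."""
    return max(segment, key=lambda c: c["volume"] * c["lightness"] * c["chroma"])

def segment_score(segment):
    """Segment score: sum of the color volume scores of the segment's colors."""
    return sum(c["volume"] for c in segment)

# A toy "blue" segment: the brighter, more saturated blue wins even though
# a duller blue covers slightly more of the image.
blues = [
    {"rgb": (20, 60, 180), "volume": 22.0, "lightness": 35.0, "chroma": 60.0},
    {"rgb": (90, 160, 230), "volume": 18.0, "lightness": 65.0, "chroma": 55.0},
]
print(representative_color(blues)["rgb"])  # (90, 160, 230)
print(segment_score(blues))                # 40.0
```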
Referring now to
An example of the four color palettes derived from the image 600 is illustrated in
In yet another embodiment of the disclosed invention, the top five segments having the highest segment scores may be selected in order to assemble a color palette that best represents the given image. After generating the color palette that best represents the given image, the generated palette is then stored in a database along with the customer behavior scores for the given image from which the colors in the color palette were extracted. Notably, the disclosed invention may be used to derive any number of color palettes that best represent the given image. In another embodiment of the disclosed invention, the process may select a predefined number of segments having the highest segment scores for assembling a color palette representation of a given image.
A process may then iterate through all the color palettes and derive the aggregate customer behavior score for each unique color palette by summing up all the customer behavior scores per search query for each unique color palette. For example, if 10 images with the same color palette have a customer behavior score of 2 for the search query “water”, then the palette will have an aggregate customer behavior score of 20 for the query “water”. Each image may have customer behavior scores for multiple search queries. In that case, the palette may include a sum of customer behavior scores for each individual search query across all images. This phenomenon is illustrated in
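A minimal Python sketch of this aggregation, using the “water” example above; the record layout is an assumption made for illustration.

```python
from collections import defaultdict

def aggregate_palette_scores(image_records):
    """Sum customer behavior scores per (palette, query) across all images
    that share the same palette."""
    totals = defaultdict(int)
    for record in image_records:
        for query, score in record["behavior_scores"].items():
            totals[(record["palette"], query)] += score
    return totals

# Ten images share a palette and each has a score of 2 for "water",
# so the palette accumulates an aggregate score of 20 for that query.
records = [{"palette": ("navy", "teal"), "behavior_scores": {"water": 2}}
           for _ in range(10)]
print(aggregate_palette_scores(records)[(("navy", "teal"), "water")])  # 20
```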
Also displayed in the area 700 is the color palette 708 along with the customer behavior scores 716, 718, and 720. The color palette 708 is extracted from image 702, which is an image of a parasailing person. As described above, the first database may comprise an image representation (image), a set of keywords associated with the given image, a customer behavior score for each keyword in the set of keywords, and a list of keywords previously used to locate the given image.
In
Similarly, for image 704, the color palette 708 is extracted from image 704, which is an image of a boardwalk. In
The process may aggregate the customer behavior scores per search query and use the customer behavior scores to identify the most relevant keywords or categories for a given image. For example, as shown in area 712 of
In one embodiment of the disclosed invention, the process may aggregate the customer behavior scores over a period of time and use the customer behavior scores to represent customer activity for a specific period of interest. In another embodiment of the disclosed invention, the customer behavior score may be based on previous searches indicating user interest in a given image. In this embodiment, the user interest may be indicated by at least one of the following: the user purchasing the given image, the user hovering a cursor over a thumbnail version of the given image to view an enlarged version of the given image, the user adding the given image to a favorites list, and the like.
Now referring to
Displayed in the area 800 are the color palettes 804, 806, and 808 along with the customer behavior scores 810, 812, and 814. The color palette 804 has three associated keywords: ocean, beach, and sky, where the customer behavior scores 810 for these keywords are 90, 100, and 80, respectively. Likewise, the color palette 806 has four associated keywords: ocean, underwater, fish, and beach, where the customer behavior scores 812 for these keywords are 100, 90, 50, and 30, respectively. Finally, the color palette 808 has four associated keywords: ocean, sunset, beach, and sky, where the customer behavior scores 814 for these keywords are 80, 140, 150, and 40, respectively.
Now referring to the display area 802 in
Likewise for the search query 842 where images for the keyword beach are being searched, the customer behavior score for this keyword is 100 for palette A, 30 for palette B, and 150 for palette C. Accordingly, the first rank 830 is allocated to the suggested color palette C referenced by the reference numeral 836. The second rank 832 and the third rank 834 are allocated to the suggested color palette A referenced by the reference numeral 838 and the suggested color palette B referenced by the reference numeral 840 respectively.
Similarly, for the search query 850, where images for the keyword sky are being searched, the customer behavior score for this keyword is 80 for palette A and 40 for palette C. Therefore, the first rank 842 is allocated to the suggested color palette A referenced by the reference numeral 846, and the second rank 844 is allocated to the suggested color palette C referenced by the reference numeral 848. Notably, because the keyword sky is not associated with the images from which the color palette B is derived, the color palette B referenced by the reference numeral 806 is not listed in the ranked display of suggested color palettes for the search query for the keyword sky.
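The per-keyword ranking illustrated above can be reproduced with a short, non-limiting Python sketch using the scores from palettes A, B, and C; palettes without a score for the keyword are simply omitted, as palette B is for the keyword sky.

```python
def rank_palettes_for_keyword(palette_scores, keyword):
    """Rank palettes that have a score for `keyword`, highest score first."""
    candidates = [(name, scores[keyword])
                  for name, scores in palette_scores.items() if keyword in scores]
    return sorted(candidates, key=lambda kv: kv[1], reverse=True)

palette_scores = {
    "A": {"ocean": 90, "beach": 100, "sky": 80},
    "B": {"ocean": 100, "underwater": 90, "fish": 50, "beach": 30},
    "C": {"ocean": 80, "sunset": 140, "beach": 150, "sky": 40},
}
print(rank_palettes_for_keyword(palette_scores, "beach"))
# [('C', 150), ('A', 100), ('B', 30)]
print(rank_palettes_for_keyword(palette_scores, "sky"))
# [('A', 80), ('C', 40)]  -- palette B is omitted
```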
Now referring to
In
The claimed system and method may also be configured to receive data including a list of keyword tags provided with the image and a list of keywords used by customers to find said image along with a score based on customer behavior.
The customer behavior score may be based on aggregate user activity for an image. This includes searches that lead to purchases of said image, clicks on said image, or other signals indicating that the user was interested in said image, for example, adding said image to a lightbox or favorites list, hovering the cursor over a thumbnail of said image to view a larger preview, or doing any of the above activities on images contributed by the same artist from the same photo shoot or illustrated image set. Customer behavior scores are aggregated per search term and used to identify the most relevant keywords or categories for an image.
The claimed system and method aggregate customer behavior scores over a period of time. Said customer behavior scores may represent customer activity for a specific period of interest. For example, said specific period may include one season of a year. An abstract image of a red and green background may generate more customer interest in a winter holiday season. As a result, the image will receive a disproportionate number of its annual purchases during the three month period prior to Christmas compared to other images with color schemes unrelated to Christmas. In years that follow, in one embodiment, the claimed system and method weigh that image higher in results during the time of year in which it received a higher volume of customer activity. If one image received 30% of its downloads during November, and another image received 10% of its downloads during November, then the claimed system and method will rank the former image higher in search results during the month of November.
The claimed system and method extract the histogram 220 of red, green, and blue, hereinafter, “RGB,” colors from the image thumbnail 210 and distill said extracted colors down to a reduced palette 230 based on their proximity in a RGB color space.
The claimed system and method assign each of the reduced colors a score 310, also referred to as a “color volume score,” based on a percentage of an image that is covered by that individual color.
The claimed system and method then give each color a second weight score, referred to as a color weight score, which is based on occurrence of other visually similar colors in an image. As reflected in
This weight is a numeric score based on percentage of a color in the image, and the percentage of other colors within a certain predefined distance in the RGB color space, and a measure of distance between those colors in the RGB color space. In this way, the claimed system and method will score a shade of green higher if a wide variety of greens cover an image but the claimed system and method will weight said shade lower if shades of red completely cover said image.
As noted previously the claimed system and method extract colors from an image and distill said colors to a palette.
The claimed system and method divide the extracted colors into buckets representing different visually distinct parts of the color spectrum, a spectrum which includes black and white. In one embodiment, the claimed system and method define the buckets manually by segmenting the color spectrum into sections that are visually distinct from each other. In another embodiment, the claimed system and method define the buckets algorithmically using a variety of techniques.
In one embodiment, the three buckets whose bucket scores rank highest are selected in order to assemble a color palette that best represents the corresponding image. The claimed system and method score each bucket based on an aggregation of the volume of each color in the image that falls within that bucket.
The claimed system and method index each piece of data: color volume score, color weight score, and customer behavior score in a search engine to serve user queries.
The claimed system and method then store a palette for an image in addition to customer behavior scores from an image from which colors were extracted.
The claimed system and method can iterate the above operations over the entire collection of images.
As demonstrated in
The claimed system and method produce a data set containing color palettes linked to a sum 712 of all customer behavior scores (e.g. 716, 718 and 720) from all images (e.g. 702, 704 and 706) that match a given color palette 714.
By way of example, if a user runs a query for “sky,” the claimed system and method will display the proper palette (e.g. 804, 806 and 808) according to customer behavior score (e.g. 810, 812 and 814) for a specified search term—in this example, “sky.” In one embodiment, the claimed system and method use this data to rank color palettes based on a search term 828. For example, if palette A has a customer behavior score of 90 for “ocean,” a score of 100 for “beach,” and a score of 80 for “sky,” and palette B has a score of 100 for ocean and a score of 30 for beach, then the claimed system and method will rank palette A first if the user enters a query for “beach” and palette B will be ranked first if the user enters a query for “ocean.”
The claimed system and method provide a user with a web interface where said user can enter at least one keyword as a search term 910.
If a search yields multiple palettes that are visually similar, for example, said palettes sharing two colors while a third color is a near match, then the claimed system and method will remove said multiple palettes that are visually similar in order to provide better variety to a user. The claimed system and method filters palettes based on the following logic: (1) If a palette has N colors, create an index of all combinations of N−1 colors, (2) Count each set of N−1 colors present in the index, (3) If a set of N−1 colors occurs more than once then compare Nth colors for each of them, and (4) If distance between two of said Nth colors is below a predefined threshold, then remove one color palette from said results.
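A non-limiting Python sketch of the filtering logic in items (1) through (4) above; Euclidean distance in RGB space and the numeric threshold are assumptions made for illustration.

```python
from itertools import combinations
from math import dist
from collections import defaultdict

def dedupe_similar_palettes(palettes, threshold=40):
    """Drop palettes that share N-1 colors with an earlier palette while the
    remaining Nth colors are closer than `threshold` in RGB space."""
    index = defaultdict(list)  # frozenset of N-1 colors -> list of Nth colors seen
    keep = []
    for palette in palettes:
        n = len(palette)
        # (1) all combinations of N-1 colors, each paired with its Nth color
        candidates = [(frozenset(combo), (set(palette) - set(combo)).pop())
                      for combo in combinations(palette, n - 1)]
        duplicate = False
        for key, nth in candidates:
            # (2)-(4): if the N-1 set was seen before and the Nth colors are
            # within the threshold, treat this palette as a near duplicate
            if any(dist(nth, other) < threshold for other in index[key]):
                duplicate = True
                break
        if not duplicate:
            keep.append(palette)
            for key, nth in candidates:
                index[key].append(nth)
    return keep

# Two palettes share two colors and their third colors are near matches,
# so only the first survives; the third palette is kept.
p1 = ((10, 60, 200), (240, 240, 240), (30, 30, 30))
p2 = ((10, 60, 200), (240, 240, 240), (45, 40, 35))
p3 = ((200, 30, 30), (240, 200, 40), (30, 30, 30))
print(len(dedupe_similar_palettes([p1, p2, p3])))  # 2
```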
If the user selects a single color option, the claimed system and method update search results to show only images that match a keyword and a given color. The claimed system and method match said given color primarily with the weighted colors retrieved from each image and secondarily on the volume of the color in each image. The claimed system and method update palette suggestions to only display palettes which match selected colors.
If a user clicks on a palette suggestion, the claimed system and method update search results to only display images that match a given color combination. The claimed system and method update palette suggestions to show multicolor palettes that contain colors of the given color combination. If the user selected a multicolor palette with three colors, the claimed system and method will display palette suggestions with four colors. If the user selected a palette with four colors, the claimed system and method will display palette suggestions with five colors.
If a user hovers over an image result, the claimed system and method will display a color palette that was extracted from said image result. If a user clicks on said palette, (1) the claimed system and method will retrieve the four top colors for said image result along with color volume scores for said image result, (2) the claimed system and method will enter said four colors into a search query along with the color volume scores as boost parameters, (3) the returned image results will match all given colors and search terms, and (4) the claimed system and method will rank said results based on how closely the volume of each color in the returned image results matches the volume of each color in said image result.
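The volume-proximity ranking in item (4) can be sketched as follows; ranking by the total absolute difference between color volumes is an assumption, since the actual boost-parameter behavior depends on the underlying search engine.

```python
def volume_proximity(candidate_volumes, reference_volumes):
    """Smaller is better: total absolute difference between the candidate's
    color volumes and the reference (clicked) image's color volumes."""
    return sum(abs(candidate_volumes.get(color, 0.0) - ref)
               for color, ref in reference_volumes.items())

def rank_by_palette_match(candidates, reference_volumes):
    """Order candidate images so the closest color-volume match comes first."""
    return sorted(candidates,
                  key=lambda name: volume_proximity(candidates[name],
                                                    reference_volumes))

reference = {"blue": 50.0, "white": 25.0, "sand": 15.0, "green": 10.0}
candidates = {
    "img_x": {"blue": 48.0, "white": 27.0, "sand": 14.0, "green": 11.0},
    "img_y": {"blue": 20.0, "white": 40.0, "sand": 30.0, "green": 10.0},
}
print(rank_by_palette_match(candidates, reference))  # ['img_x', 'img_y']
```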
The claimed system and method enable a user to input a query with at least one color and specify a weight for each color in order to retrieve a set of images that match the proportions of each color in said query. In one embodiment, the claimed system enables the user to define said proportions manually via a control on the interface. In one embodiment, the claimed system and method (1) enable the user to specify the volume of each color used in said query by providing or choosing any image from which the claimed system and method will extract a color histogram, and (2) enable the user to specify an assembled query which will include the proportions of each color from said image.
The claimed system and method enable a user to save a palette s/he created and retrieve it later to apply it to a search query.
The claimed system and method display trending or popular palettes to users based on customer behavior scores. Customer behavior scores indicate popularity of a given palette for specific searches and at specific times.