The present disclosure generally relates to a color palette suggestion system, and, more particularly, to systems and methods for conducting a keyword, a color and a trend based image search in response to a user presenting a search query.
Searching for pertinent images over the Internet can be a particularly challenging task for artists and designers. Commercial systems may require an artist to select a single image from a pool of millions of images retrieved for a keyword search query. In marketing, selecting precisely the right image may be directly linked to the commercial success of a product. An artist may therefore need to construct a complex search query to perform an image search based on both a keyword and a color, and may have to create and discover color palettes to meet business needs or aesthetic tastes. When an artist searches for the keyword "ocean," the Internet search engine may return a large number of images associated with the keyword ocean in various colors and shades.
The artist may then have to run a second query to locate a desired image in a desired color combination from the pool of several images. For example, the artist may have to specifically configure a query to fetch all images that are associated with the keyword "ocean" and that are blue in color. Existing image search systems run such a query as a two-layered image search: images are first searched by keyword, and the retrieved images are then searched by color. Thus, existing image search systems do not offer a palette selection option coupled with a keyword search option. Further, an artist using existing image search systems may have to reinvent a palette for each keyword search, since there is no way to save and export preferred color palettes. Furthermore, the artist has no way to take advantage of image selection trends reflected in the image retrieval and download data gathered from previous searches performed on a particular keyword and color selection. Accordingly, a need exists for a palette selection system that can present images based at least on the image trend, the color criteria and the keyword criteria.
By way of introduction only, the present embodiments provide methods and systems for conducting image searches comprising: searching a first database to locate a set of pertinent images; iteratively performing the following operations for each image in the set of pertinent images: (a) extracting a histogram of red, green, and blue colors (RGB colors) from a given image, (b) distilling the extracted RGB colors down to create a reduced color palette for the given image, (c) segmenting the extracted RGB colors into a set of segments representing distinct parts of the color spectrum, (d) selecting a subset from the set of segments to assemble a color palette for the given image, and (e) updating the assembled color palette and a customer behavior score for the given image in the first database; and generating a display of suggested color palettes for the search query.
The disclosed systems may present users with multicolor palettes that best match the searched concept and the keyword. Among the presented multicolor palettes, a specific multicolor palette can be selected by the user in order to retrieve all images in the given image depository that match the specific palette and the user specified keyword. Alternatively, the disclosed system may allow the user to create a palette and retrieve all images in the image depository that match the user specified palette and the user specified keyword.
According to one aspect of the present invention, systems and methods are disclosed for generating color palettes from the given image depository in response to a user query. In one embodiment of the disclosed system, a high volume of color combinations can be generated automatically for a search query.
According to another aspect of the present invention, systems and methods are disclosed for designing a color palette. When an artist searches for the keyword "ocean," the Internet search engine may return a large number of images associated with the keyword ocean in various colors and shades. For example, such a query may return color palettes containing various shades of blue from photos of water, various shades of brown from photos of beaches, various shades of red from photos of ocean sunsets, and the like.
In another embodiment, the system has an ability to display trends for a given image. Thus, the user can optionally select an image that was downloaded by many other users. Accordingly, the system may define popular palettes for certain groups of customers. For example, a popular palette may indicate specific images that are popular among college students just in time for Valentine's Day. In another embodiment, the system may have the ability to suggest color themes based on seasonal or popular trends; for example, certain red and green palettes may be especially popular around Christmas time.
In yet another embodiment, the system has an ability to help customers configure their desired palettes. Additionally, the system may allow the customers to discover palettes configured by others as well as save or export a particular palette that is of interest to the customer. This may save the customer the time, money, and other resources required to reinvent a palette that works for the customer.
Although the features and advantages of the invention are generally described in this summary section and the following detailed description section in the context of embodiments, it shall be understood that the scope of the invention should not be limited to these particular embodiments. Many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims hereof.
In another embodiment, relevant search queries are further sorted by using user search data to score color palettes by the keyword. Thus, a high volume of color combinations can be applied to locate relevant images in response to the search queries.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The present disclosure describes a computer implemented method for conducting a keyword, a color and a trend based image search in response to a user presenting a search query on a user interface of a computing device, the method comprising: (1) searching a first database to locate a set of pertinent images corresponding to the search query, the first database comprising an image representation (image), a set of keywords associated with the image, a customer behavior score for each keyword in the set of keywords, and a list of keywords previously used to locate the image; (2) iteratively performing the following operations for each image in the searched set of pertinent images: (a) extracting a histogram of red, green, and blue colors (RGB colors) from a given image, (b) distilling the extracted RGB colors down to create a reduced color palette for the given image based on the proximity of the extracted RGB colors in the RGB color space, (c) segmenting the extracted RGB colors into a set of segments representing different, visually distinct parts of the color spectrum, and black and white colors, (d) selecting a subset of segments from the set of segments satisfying a predetermined criterion to assemble a color palette for the given image, and (e) updating the assembled color palette and the customer behavior score for the given image in the first database; and (3) generating a display of suggested color palettes for the search query.
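The overall flow can be summarized in a compact sketch. The following Python example is illustrative only: it assumes the Pillow imaging library, stands in a simple grid-snapping quantization for the distillation step (b) and a hue-based bucketing for the segmentation step (c), and picks each segment's most frequent color as its representative, whereas the disclosure uses a lightness- and chroma-weighted color score discussed later. All function names and parameter values are hypothetical.

```python
# Minimal illustrative sketch of steps (a) through (d); not the disclosed implementation.
from collections import Counter
from colorsys import rgb_to_hsv
from PIL import Image  # assumes Pillow is installed

def extract_rgb_histogram(path, thumb=64):
    """Step (a): histogram of RGB colors from a downscaled copy of the image."""
    img = Image.open(path).convert("RGB").resize((thumb, thumb))
    return Counter(img.getdata())  # {(r, g, b): pixel count}

def distill(histogram, step=32):
    """Step (b), simplified: merge nearby RGB colors by snapping to a coarse grid."""
    reduced = Counter()
    for (r, g, b), n in histogram.items():
        reduced[(r // step * step, g // step * step, b // step * step)] += n
    return reduced

def segment_by_hue(reduced, n_segments=12):
    """Step (c), simplified: group colors into distinct parts of the spectrum,
    plus a separate bucket for near-black and near-white colors."""
    segments = {}
    for (r, g, b), n in reduced.items():
        h, s, v = rgb_to_hsv(r / 255, g / 255, b / 255)
        key = "black/white" if s < 0.08 or v < 0.08 else int(h * n_segments)
        segments.setdefault(key, Counter())[(r, g, b)] += n
    return segments

def palette_for(path, top_n=5):
    """Step (d), simplified: take the most frequent color of each of the
    top-n segments, ranked by total pixel count."""
    segments = segment_by_hue(distill(extract_rgb_histogram(path)))
    ranked = sorted(segments.values(), key=lambda c: sum(c.values()), reverse=True)
    return [seg.most_common(1)[0][0] for seg in ranked[:top_n]]

print(palette_for("ocean.jpg"))  # e.g. five RGB triples suggesting a palette
```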
The process may, at block 120, search a first database to fetch all the images that satisfy the keyword criterion specified in the search query. All the images that meet the searched keyword criterion are herein referred to as a set of pertinent images, since these images may possibly be of interest to the user. In other words, the process may search a first database to locate a set of pertinent images corresponding to the search query. At block 130, the process may generate a set of pertinent images for the user after searching the first database at block 120. The first database may comprise several records; each record may have an associated image representation (image) and a set of keywords associated with the image. Furthermore, each record may also have a customer behavior score for each keyword in the set of keywords, and a list of keywords that were previously used by other users to locate the image.
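As a rough illustration of the kind of record the first database might hold, the sketch below uses a Python dataclass; the field names and structure are assumptions, not taken from the disclosure.

```python
# Hypothetical sketch of a first-database record; field names are illustrative.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ImageRecord:
    image_id: str                                       # image representation / reference
    keywords: List[str]                                 # keywords associated with the image
    behavior_scores: Dict[str, float] = field(default_factory=dict)      # per-keyword customer behavior score
    previous_search_keywords: List[str] = field(default_factory=list)    # keywords previously used to locate the image
    palette: List[Tuple[int, int, int]] = field(default_factory=list)    # assembled color palette, filled in later

record = ImageRecord(
    image_id="img-001",
    keywords=["ocean", "beach", "sunset"],
    behavior_scores={"ocean": 0.82, "beach": 0.41},
    previous_search_keywords=["ocean", "blue water"],
)
print(record)
```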
In one embodiment of the disclosed invention, the customer behavior score is based on aggregate user activity for an image. This may include searches that were conducted in the past which led the user to indicate interest in the given image. User interest is indicated by the user purchasing the image, the user clicking on the image, or by other signals indicating the user's interest in that image. In this context, the other signals may include, for example, the user adding the image to a lightbox or favorites list, hovering the cursor over the thumbnail image to view a larger preview, or performing any of the above-mentioned activities on images contributed by the same artist from the same photo shoot or illustrated image set. In this embodiment of the disclosed invention, the customer behavior scores may be aggregated per search query and used to identify the most relevant keywords or categories for an image.
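One plausible way to turn these activity signals into a single customer behavior score is a weighted sum of event counts per search query and image; the event names and weights below are purely illustrative assumptions.

```python
# Illustrative aggregation of a customer behavior score from user activity signals.
# The signal names and weights are assumptions, not values from the disclosure.
from collections import defaultdict

SIGNAL_WEIGHTS = {
    "purchase": 5.0,       # strongest indication of interest
    "lightbox_add": 2.0,   # saved to a lightbox / favorites list
    "click": 1.0,
    "hover_preview": 0.5,  # hovered the thumbnail to view a larger preview
}

def behavior_scores(events):
    """events: iterable of (search_query, image_id, signal) tuples."""
    scores = defaultdict(float)
    for query, image_id, signal in events:
        scores[(query, image_id)] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return scores

events = [
    ("ocean", "img-001", "click"),
    ("ocean", "img-001", "purchase"),
    ("ocean", "img-002", "hover_preview"),
]
print(behavior_scores(events))  # {('ocean', 'img-001'): 6.0, ('ocean', 'img-002'): 0.5}
```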
In another embodiment of the disclosed invention, the customer behavior scores may be aggregated over a period of time and may represent customer activity for a specific period of interest, such as a season. For example, an abstract image of a red and green background may generate more customer interest in the Christmas season. As a result, the image is likely to receive a disproportionate number of its annual purchases during the three-month period prior to Christmas time as compared to other images having color schemes that are unrelated to Christmas. If a particular image received a higher volume of customer activity during a specific time of the year, then the disclosed search system can rank the particular image higher in results during the same time of year. For example, if 30% of the total image downloads for image A had occurred in the month of November, and 10% of the total image downloads for image B had occurred in the month of November, then image A will be ranked higher than image B in search results during the month of November.
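The November example can be expressed as a seasonal boost equal to the share of an image's annual downloads that falls in the current month. The disclosure does not specify how such a boost is computed or combined with the base relevance score, so the sketch below is only one hedged possibility.

```python
# Illustrative seasonal boost: the share of an image's annual downloads that
# occurred in the current month. How the boost is combined with the base
# relevance score is an assumption.
def seasonal_boost(monthly_downloads, current_month):
    """monthly_downloads: dict mapping month (1-12) to download count."""
    total = sum(monthly_downloads.values())
    return monthly_downloads.get(current_month, 0) / total if total else 0.0

image_a = {11: 30, 12: 10, 1: 60}   # 30% of its downloads in November
image_b = {11: 10, 6: 50, 7: 40}    # 10% of its downloads in November

# In November, image A outranks image B on the seasonal component.
print(seasonal_boost(image_a, 11))  # 0.3
print(seasonal_boost(image_b, 11))  # 0.1
```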
At block 140, the process may determine whether all of the pertinent images generated at block 130 have been processed. If the process determines that not all pertinent images have been processed, then the process iteratively performs the operations described in conjunction with blocks 150-190 for each image in the searched set of pertinent images. In other words, the process iteratively performs the operations described in conjunction with blocks 150-190 until every image in the searched set of pertinent images has been processed. In this context, the term "given image" indicates a particular image from the set of pertinent images upon which the operations described in conjunction with blocks 150-190 are being performed.
First, the process may move to block 150 to extract a histogram of red, green, and blue colors (RGB colors) from a given image. The process of extracting the histogram of RGB colors from a given image is further illustrated in conjunction with
Then the process may move to block 160 to distill the extracted RGB colors down to create a reduced color palette for the given image based on the proximity of the extracted RGB colors in the RGB color space. Then the process may move to block 170 to segment the extracted RGB colors into a set of segments representing different, visually distinct parts of the color spectrum, and black and white colors before moving to block 180 to select a subset of segments from the set of segments that satisfy a predetermined criterion to assemble a reduced color palette for the given image. Finally, the process may update the assembled reduced color palette and the customer behavior score for the given image in the first database before returning to block 140 to determine whether all pertinent files are processed.
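For the distillation step at block 160, one plausible reading of "based on the proximity of the extracted RGB colors in the RGB color space" is a greedy merge in which each color is folded into an already-kept palette entry if it lies within a fixed Euclidean distance; the threshold value and the greedy strategy below are assumptions.

```python
# Illustrative greedy distillation of an RGB histogram: colors within a fixed
# Euclidean distance of an already-kept color are merged into it. The distance
# threshold (64) is an arbitrary example value.
import math

def distill_by_proximity(histogram, max_distance=64):
    """histogram: dict mapping (r, g, b) -> pixel count. Returns a reduced dict."""
    reduced = {}  # representative color -> aggregated pixel count
    # Visit colors from most to least frequent so representatives are common colors.
    for color, count in sorted(histogram.items(), key=lambda kv: kv[1], reverse=True):
        for rep in reduced:
            if math.dist(color, rep) <= max_distance:
                reduced[rep] += count
                break
        else:
            reduced[color] = count
    return reduced

histogram = {(10, 20, 200): 500, (12, 25, 190): 300, (200, 40, 30): 150}
print(distill_by_proximity(histogram))  # the two nearby blues collapse into one entry
```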
Alternatively, at block 140, if the process determines that all pertinent files are processed, then the process may move to block 195 to generate a display of suggested color palettes for the search query. Then the process may exit at block 197.
The process may iteratively process each color in the reduced color palette by performing a set of operations for each color in the reduced color palette. In particular, the process may iteratively perform, for each color in the reduced color palette, the operations of: (1) assigning a color volume score to the given color, (2) assigning a color weight score to the given color, and (3) ascertaining the percentage of a set of proximate colors for the given color. The process may assign the color volume score to each color in the reduced color palette at block 229. The color volume score may be assigned to each color in the reduced color palette based on the amount of that individual color present in the given image.
In other words, the color volume score for a given color may indicate the percentage of the given image covered by the given color. Thus, in an image of sky, where the blue color covers the entire image, the color volume score for the blue color will be 100%. In contrast, in a blue and green colored image depicting birds, where some birds are flying high in the blue sky and some birds are pecking seeds in the green grass below, the color volume score for the color blue will be less than 100%, since the green grass will occupy at least some percentage of the image. Likewise, the colors of the birds will also occupy some percentage of the image.
This phenomenon is described further in detail in conjunction with
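As a supplement, a minimal sketch of the color volume score, assuming it is computed directly from the reduced histogram as the fraction of image pixels attributed to each palette color:

```python
# Illustrative color volume score: the fraction of the image covered by each
# color in the reduced palette, computed from the reduced histogram.
def color_volume_scores(reduced_histogram):
    total_pixels = sum(reduced_histogram.values())
    return {color: count / total_pixels for color, count in reduced_histogram.items()}

# A sky image that is entirely blue scores 1.0 (100%) for blue;
# splitting the image between blue sky and green grass lowers both scores.
print(color_volume_scores({(80, 150, 230): 4096}))                       # {(80, 150, 230): 1.0}
print(color_volume_scores({(80, 150, 230): 2500, (60, 160, 70): 1596}))  # blue ~0.61, green ~0.39
```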
Furthermore, the process may iteratively assign a color weight score to each color in the reduced color palette. Notably, the term "given image" is an image from the set of pertinent images that is currently being processed, and the term "given color" is a color from the reduced color palette that is being assigned the color weight score. The color weight score may be based on the occurrence of other visually similar colors in the given image. Specifically, the color weight score may indicate a numeric score based on the percentage of the given color in the given image, the percentage of other colors located within a certain predefined distance from the given color in the RGB color space, and the measure of distance between the given color and those other colors.
A set of mathematical operations may be performed to compute the color weight score. One example of a formula is illustrated in
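The actual formula is given in the referenced figure. As a rough stand-in consistent with the description above, the sketch below credits a color with its own coverage plus the coverage of each proximate color, discounted linearly by distance in RGB space; the functional form, distance threshold, and discount are all assumptions.

```python
# Illustrative color weight score: the color's own coverage plus the coverage of
# visually similar colors, discounted by their distance in RGB space. The linear
# distance discount and the threshold are assumptions, not the disclosed formula.
import math

def color_weight_score(color, volume_scores, max_distance=64):
    """volume_scores: dict mapping (r, g, b) -> fraction of the image it covers."""
    score = volume_scores.get(color, 0.0)
    for other, coverage in volume_scores.items():
        if other == color:
            continue
        d = math.dist(color, other)
        if d <= max_distance:                        # "other" is a proximate color
            score += coverage * (1.0 - d / max_distance)
    return score

volumes = {(10, 20, 200): 0.5, (12, 25, 190): 0.3, (200, 40, 30): 0.2}
print(color_weight_score((10, 20, 200), volumes))    # > 0.5: boosted by the nearby blue
```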
Further, for each given color, the process may also identify a set of proximate colors such that each color in the set of proximate colors is located within a predefined distance from the given color in the RGB color space. Next, the process may ascertain the percentage of a set of proximate colors in the given image. The process may also identify the measure of the distance between the given color and the set of proximate colors in the RGB color space. Finally, the process may perform the indexing operations on the color volume score, the color weight score, the customer behavior score and the percentage of the set of proximate colors for the given color in a search engine. The indexing operations may be helpful in a subsequent image search of the given image via the search engine.
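The disclosure does not name a particular search engine, so the indexing step is sketched below simply as the assembly of one per-color document holding the scores described above; the field names are hypothetical.

```python
# Hypothetical per-color index document combining the scores described above.
# Field names and structure are illustrative; the actual search-engine schema
# is not specified in the disclosure.
import math

def build_color_document(image_id, color, volume_scores, behavior_score, max_distance=64):
    proximate = {
        other: math.dist(color, other)
        for other in volume_scores
        if other != color and math.dist(color, other) <= max_distance
    }
    return {
        "image_id": image_id,
        "color_rgb": color,
        "color_volume_score": volume_scores[color],
        "proximate_color_percentage": sum(volume_scores[c] for c in proximate),
        "proximate_color_distances": proximate,
        "customer_behavior_score": behavior_score,
    }

volumes = {(10, 20, 200): 0.5, (12, 25, 190): 0.3, (200, 40, 30): 0.2}
print(build_color_document("img-001", (10, 20, 200), volumes, behavior_score=6.0))
```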
Now referring back to
For each segment in the set of segments, the process may iteratively assign a color score to each color in a given segment to ascertain a color that represents the given segment as a whole. In this context, the term "given segment" indicates a particular segment from the set of segments, wherein each color in the particular segment is being assigned a color score. The color score may be determined by a product of the color volume score, a color lightness measure and a color chroma measure, and the color having the highest color score in a given segment is designated to represent the given segment.
To compute the color score for a specific color, the specific color is converted to the LCh color space, which produces three values: a lightness value, a chroma value, and a hue value for the specific color. The color score is determined by computing a product of the color volume score, the lightness value, and the chroma value. As a result, the color with the highest color score is the brightest, most saturated, and most abundant color in that segment. Accordingly, the color with the highest color score in a particular segment may be used to represent the particular segment.
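A minimal sketch of the color score follows, assuming a standard sRGB-to-CIELAB-to-LCh conversion with a D65 white point; the conversion constants are the commonly published ones, and taking the product of the volume score, lightness, and chroma literally as written above is an interpretation of the text rather than the disclosed implementation.

```python
# Illustrative color score: convert an sRGB color to LCh (via CIE XYZ and Lab,
# D65 white point) and multiply the color volume score by lightness and chroma.
import math

def srgb_to_lch(rgb):
    # sRGB (0-255) -> linear RGB
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    # linear RGB -> CIE XYZ (D65)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # XYZ -> CIE Lab (D65 reference white)
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    lightness = 116 * fy - 16
    a, b2 = 500 * (fx - fy), 200 * (fy - fz)
    # Lab -> LCh: chroma is the radial distance in the a/b plane, hue the angle.
    chroma = math.hypot(a, b2)
    hue = math.degrees(math.atan2(b2, a)) % 360
    return lightness, chroma, hue

def color_score(rgb, volume_score):
    lightness, chroma, _ = srgb_to_lch(rgb)
    return volume_score * lightness * chroma

# A bright, saturated, abundant blue outscores a dark, muted, sparse one.
print(color_score((40, 120, 235), 0.5))
print(color_score((20, 30, 60), 0.1))
```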
Likewise, for each segment in the set of segments, the process may iteratively assign a segment score, wherein the segment score for a particular segment is computed by aggregating the color volume scores for each color in the particular segment. In other words, the segment score for a specific segment may be based on the sum of the color volume scores of each color in the given image within the specific segment. This phenomenon is further described in conjunction with
An example of the four color palettes derived from the image 600 is illustrated in
In yet another embodiment of the disclosed invention, the top five segments having the highest segment scores may be selected in order to assemble a color palette that best represents the given image. After generating the color palette that best represents the given image, the generated palette is then stored in a database along with the customer behavior scores for the given image from which the colors in the color palette were extracted. Notably, the disclosed invention may be used to derive any number of color palettes that best represent the given image. In another embodiment of the disclosed invention, the process may select a predefined number of segments having the highest segment scores for assembling a color palette representation for a given image.
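Putting the segment score and the representative color together, a minimal sketch of palette assembly might look like the following; the input structure (per-color volume and color scores grouped by segment) and the ranking behavior are assumptions.

```python
# Illustrative palette assembly: each segment's score is the sum of its colors'
# volume scores; the color with the highest color score represents the segment;
# the top five segments form the suggested palette. Input structure is assumed.
def assemble_palette(segments, top_n=5):
    """segments: dict mapping segment name -> {rgb: (volume_score, color_score)}."""
    scored = []
    for name, colors in segments.items():
        segment_score = sum(vol for vol, _ in colors.values())
        representative = max(colors, key=lambda c: colors[c][1])
        scored.append((segment_score, name, representative))
    scored.sort(reverse=True)
    return [(name, rep) for _, name, rep in scored[:top_n]]

segments = {
    "blues":  {(30, 90, 200): (0.40, 1800.0), (60, 120, 230): (0.15, 2100.0)},
    "browns": {(150, 110, 70): (0.25, 900.0)},
    "reds":   {(200, 60, 50): (0.20, 1500.0)},
}
print(assemble_palette(segments))
# [('blues', (60, 120, 230)), ('browns', (150, 110, 70)), ('reds', (200, 60, 50))]
```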
In
This application is a continuation of U.S. patent application Ser. No. 14/701,549, filed on May 1, 2015, which claims the benefit of priority under 35 U.S.C. § 119 from U.S. Provisional Application No. 61/988,962, entitled "COLOR PALETTE SUGGESTIONS," filed on May 6, 2014. The disclosures of both applications are hereby incorporated by reference in their entirety for all purposes.
Provisional application:

Number | Date | Country
---|---|---
61/988,962 | May 2014 | US

Continuation data:

Relation | Number | Date | Country
---|---|---|---
Parent | 14/701,549 | May 2015 | US
Child | 15/862,303 | | US