Apparatus and method for supplying content aware photo filters

Information

  • Patent Grant
  • Patent Number
    11,496,673
  • Date Filed
    Friday, May 22, 2020
  • Date Issued
    Tuesday, November 8, 2022
Abstract
A mobile client device includes a photo controller to identify when a client device captures a picture. Photo filters are designated based upon attributes of the mobile client device. The picture with a selected photo filter is sent to a server for routing to other client devices.
Description
FIELD OF THE INVENTION

This invention relates generally to photographs taken by a mobile device operative in a networked environment. More particularly, this invention relates to supplying such a mobile device with content aware photo filters.


BACKGROUND OF THE INVENTION

The number of digital photographs taken with mobile wireless devices is increasingly outnumbering photographs taken with dedicated digital and film-based cameras. Thus, there are growing needs to improve the experience associated with mobile wireless digital photography.





BRIEF DESCRIPTION OF THE FIGURES

The invention is more fully appreciated in connection with the following detailed description taken in conjunction with the accompanying drawings, in which:



FIG. 1 illustrates an electronic device utilized in accordance with an embodiment of the invention.



FIG. 2 illustrates a networked system utilized in accordance with an embodiment of the invention.



FIG. 3 illustrates processing operations associated with an embodiment of the invention.



FIG. 4 illustrates a photograph taken by a digital mobile device.



FIG. 5 illustrates a general filter applied to the photograph.



FIG. 6 illustrates a feature specific filter applied to the photograph.



FIG. 7 illustrates a different feature specific filter with a branded element applied to the photograph.





Like reference numerals refer to corresponding parts throughout the several views of the drawings.


DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 illustrates an electronic device 100 utilized in accordance with an embodiment of the invention. In one embodiment, the electronic device 100 is a Smartphone with a processor 102 in communication with a memory 104. The processor 102 may be a central processing unit and/or a graphics processing unit. The memory 104 is a combination of flash memory and random access memory. The memory 104 stores a photo controller 106. The photo controller 106 includes executable instructions to coordinate the capture, display and archiving of digital photographs. The photo controller 106 may include the photo filter processing disclosed herein, which augments or replaces the photo filter processing described below in connection with a server-based photo filter module.


The processor 102 is also coupled to image sensors 115. The image sensors 115 may be known digital image sensors, such as charge coupled devices. The image sensors capture visual media, which is presented on display 116, as coordinated by the photo controller 106.


A touch controller 118 is connected to the display 116 and the processor 102. The touch controller 118 is responsive to haptic signals applied to the display 116. In one embodiment, the photo controller 106 monitors signals from the touch controller 118 to coordinate the capture, display and archiving of digital photographs. The electronic device 100 may also include other components commonly associated with a Smartphone, such as a wireless signal processor 120 to support wireless communications, a power control circuit 122 and a global positioning system processor 124.



FIG. 2 illustrates a system 200 configured in accordance with an embodiment of the invention. The system 200 includes a set of client devices 100_1 through 100_N. The client devices 100 are connected to a network 206, which is any combination of wireless and wired network communication devices. A server 204 is also connected to the network 206. The server 204 includes standard components, such as a central processing unit 210 and input/output devices 212 connected via a bus 214. The input/output devices 212 may include a keyboard, mouse, display and the like. A network interface circuit 216 is also connected to the bus 214 to provide connectivity to network 206. A memory 220 is also connected to the bus 214. The memory 220 includes modules with executable instructions, such as a photo filter module 222. The photo filter module 222 implements photo evaluation and filter selection operations, as discussed below.



FIG. 3 illustrates processing operations associated with an embodiment of the invention. The operations are performed by the photo filter module 222 of server 204 in combination with one or more client devices 100. Initially, the photo filter module 222 serves a photo prompt 300. For example, the photo filter module 222 may form a segment of a network-executed application that coordinates taking photographs and appending messages to such photographs for delivery from one user to another. In this context, client 100_1 accesses the photo filter module 222 over network 206 to activate the application, which serves the photo prompt to the client 100_1. A user at the client 100_1 takes a photo 302.



FIG. 4 illustrates client device 100_1 with a display 400 that presents a photo prompt 402. Activation of the photo prompt 402 results in a picture 404. The photo filter module 222 monitors the client device activity to determine if a photo is taken 304. If so, the attributes of the photograph and client device are evaluated 305. Photo filters are selected and supplied 306 based upon the evaluation.


By way of example, the attributes of the client device may include geolocation of the client device, which is collected from the GPS processor 124. The geolocation may be used to designate photo filters relevant to the geolocation. For example, if the geolocation is proximate to a beach, then photo filters to augment a beach setting (e.g., a color filter for water, sand and/or sky) may be supplied. The geolocation may be used to select a filter with a brand associated with an establishment proximate to the geolocation. For example, a restaurant or store may sponsor a photo filter that includes a brand associated with the restaurant or store. In this case, in addition to the brand, the photo filter may include other indicia associated with the restaurant (e.g., an image of a hamburger or taco) or store (e.g., an image of a surfboard or sunglasses).
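The geolocation-based designation described above can be sketched as follows. This is a minimal illustrative example, not part of the patent disclosure: the venue catalog, coordinates, radii, and function names are all hypothetical, and a deployed system would query a venue database rather than a hard-coded list.

```python
import math

# Hypothetical catalog mapping sponsored venues to photo filters (illustrative data).
VENUE_FILTERS = [
    {"name": "beach_colors", "lat": 34.0100, "lon": -118.4960, "radius_km": 2.0},
    {"name": "taco_shop_brand", "lat": 34.0119, "lon": -118.4910, "radius_km": 0.5},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def filters_for_location(lat, lon):
    """Designate filters whose sponsoring venue lies within its radius of the device."""
    return [f["name"] for f in VENUE_FILTERS
            if haversine_km(lat, lon, f["lat"], f["lon"]) <= f["radius_km"]]
```

A device reporting coordinates near the catalog's beach entry would be offered the beach filter; a device elsewhere would receive no venue-specific filters from this catalog.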


The attributes associated with the client device may include established preferences associated with the client device. The established preferences may be defined by explicitly stated preferences supplied by a user. Alternately, the established preferences may be derived from prior use patterns. For example, explicitly stated or derived preferences may indicate that photo filters with a temperature overlay, date and/or time overlay be supplied.
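Deriving preferences from prior use patterns, as described above, can be sketched as a frequency count over past filter selections. This is an illustrative simplification; the function name and log format are hypothetical.

```python
from collections import Counter

def derived_preferences(usage_log, top_n=3):
    """Infer preferred filter overlays from prior selections, most frequent first.

    usage_log is a list of filter names the user has applied in the past.
    """
    counts = Counter(usage_log)
    return [name for name, _ in counts.most_common(top_n)]
```

A user who has repeatedly applied a time overlay would thereby have time-overlay filters ranked first among the supplied options.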


The attributes of the photograph may include the physical environment captured in the photograph. For example, the photograph may be evaluated to identify an urban setting, a rural setting, a sunset, a seascape and the like. Filters applicable to the physical environment may then be supplied.


The attributes of the photograph may include an object depicted in the photograph. For example, the evaluation may identify a building, a building feature (e.g., door or roof), a flower, an individual, an animal and the like. Filters applicable to such objects may then be supplied.
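The object- and environment-based designation described in the two paragraphs above can be sketched as a lookup from detected labels to applicable filters. This is an illustrative sketch: the mapping and labels are hypothetical, and a real system would obtain the labels from an image classifier rather than take them as input.

```python
# Hypothetical mapping from detected objects or settings to applicable filters.
OBJECT_FILTERS = {
    "door": ["door_red", "door_blue"],
    "roof": ["roof_tile", "roof_brand"],
    "seascape": ["water_color", "sky_color"],
}

def filters_for_objects(detected_labels):
    """Collect the filters applicable to each object or setting detected in the photo."""
    supplied = []
    for label in detected_labels:
        supplied.extend(OBJECT_FILTERS.get(label, []))
    return supplied
```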


The next operation of FIG. 3 is to apply the photo filters 308. For example, a swipe across the display of a client device 100_1 may cause a photo filter to slide across the original photo. FIG. 5 illustrates the result of a first swipe motion, which results in a darkening filter 500 being applied to the original photo. Another swipe motion may result in another filter being presented. For example, FIG. 6 illustrates the result of a second swipe motion, which results in an object specific filter 600 being presented. In this case, the object specific filter 600 relates to the identification of a door in the photo. The identification of the door may result in the supply of a variety of filters for different door colors. Another swipe of the display may result in still another filter, such as shown in FIG. 7. The filter of FIG. 7 includes an object specific filter 700, in this case for a roof of a building. The filter also includes a brand component 702. This filter also includes an overlay of the temperature 704 when the photo was taken. A time overlay 706 and date overlay 708 are also supplied.
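The swipe-driven sequential presentation described above can be sketched as a cyclic index over the designated filters, returning to the unfiltered photo after the last filter. The class and method names are illustrative assumptions, not part of the disclosure.

```python
class FilterCarousel:
    """Present designated filters one at a time, advancing on each swipe gesture."""

    def __init__(self, filters):
        # Position 0 is the original, unfiltered photo.
        self.filters = ["none"] + list(filters)
        self.index = 0

    def swipe(self):
        """Advance to the next filter, wrapping back to the unfiltered photo."""
        self.index = (self.index + 1) % len(self.filters)
        return self.filters[self.index]
```

With the filters of FIGS. 5-7 designated, successive swipes would present the darkening filter, then the door filter, then the branded roof filter, then the original photo again.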


Returning to FIG. 3, the next operation is to select a photo filter 310. Selection of a photo filter may include selection of one or more available filters. The photo may then be saved with the applicable filter or filters. The photo and filter may also be sent to another user 312. In this case, the server 204 routes 314 the photo to another client 100_2, which displays the photo with the filter 316.


Photo filters may also be selected based upon filter popularity. Branded filters may be supplied based upon an auction mechanism. For example, vendors may bid on photo filters to be supplied based upon characteristics of a user, location of a user, content of a photograph and the like.
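The auction mechanism mentioned above can be sketched as selecting the highest bid among branded filters whose targeting criteria match the current context. This is an illustrative sketch under stated assumptions: the bid record format, the set-based targeting test, and the function name are all hypothetical.

```python
def winning_branded_filter(bids, context):
    """Pick the sponsored filter with the highest bid whose targeting matches the context.

    bids is a list of records like {"name": ..., "target": set_of_attrs, "bid": amount};
    context is the set of attributes observed for this photo (location, content, user).
    """
    # A bid is eligible only if every attribute it targets is present in the context.
    eligible = [b for b in bids if b["target"] <= set(context)]
    if not eligible:
        return None
    return max(eligible, key=lambda b: b["bid"])["name"]
```

A beach-targeted filter would thus win for a beach photo even if an unrelated vendor bid more on a different context.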


An embodiment of the present invention relates to a computer storage product with a non-transitory computer readable storage medium having computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media, optical media, magneto-optical media and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment of the invention may be implemented using JAVA®, C++, or other object-oriented programming language and development tools. Another embodiment of the invention may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.


The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that specific details are not required in order to practice the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.

Claims
  • 1. A mobile client device comprising: an image sensor configured to capture a picture; a user interface configured to present information and receive user selections; a photo controller configured to designate a plurality of photo filters relevant to at least one attribute of the mobile client device; a processor; and a memory including instructions for execution by the processor, wherein the instructions, when executed by the processor, cause the mobile client device to: present a photo prompt on the user interface of the mobile client device; capture a picture with the image sensor in response to an activation of the photo prompt; display the captured picture on the user interface; collect at least one attribute of the mobile client device; designate, using the photo controller, the plurality of photo filters relevant to the at least one attribute of the mobile client device, wherein the plurality of photo filters are configured to be individually selectable for sequential presentation on the user interface of the mobile client device, wherein the plurality of photo filters are independently selectable in sequence by a user in response to a gesture on the displayed picture as presented by the user interface of the mobile client device, and wherein each of the plurality of photo filters is an overlay for presentation on top of the captured picture to augment the captured picture; receive on the user interface the gesture to select one of the plurality of photo filters in sequence; apply the selected photo filter to the picture; and send the picture with the applied photo filter to a server for routing to another mobile client device.
  • 2. The mobile client device of claim 1, wherein the instructions, when executed by the processor, further cause the mobile client device to: display a first of the plurality of photo filters on the picture; and transition from display of the first of the plurality of photo filters on the picture to a second of the plurality of photo filters on the picture in response to the gesture.
  • 3. The mobile client device of claim 1, wherein the at least one attribute comprises geolocation information and wherein the instruction to designate the plurality of photo filters comprises instructions to cause the mobile client device to: designate the plurality of photo filters based on the geolocation information.
  • 4. The mobile client device of claim 1, wherein the at least one attribute comprises temperature information and wherein the instruction to designate the plurality of photo filters comprises instructions to cause the mobile client device to: designate the plurality of photo filters based on the temperature information.
  • 5. The mobile client device of claim 1, wherein the at least one attribute comprises time information and wherein the instruction to designate the plurality of photo filters comprises instructions to cause the mobile client device to: designate the plurality of photo filters based on the time information.
  • 6. The mobile client device of claim 1, wherein the at least one attribute comprises physical environment information and wherein the instruction to designate the plurality of photo filters comprises instructions to cause the mobile client device to: designate the plurality of photo filters based on the physical environment information.
  • 7. The mobile client device of claim 1, wherein the user interface comprises a display and wherein the instruction to receive on the user interface the gesture to select one of the plurality of photo filters comprises instructions to cause the mobile client device to: receive the gesture on the display.
  • 8. The mobile client device of claim 7, wherein the gesture is a swipe across the display.
  • 9. A method for processing pictures captured with a mobile client device, the mobile client device having a user interface, a camera, and a photo controller, the method comprising: presenting a photo prompt on the user interface of the mobile client device; capturing a picture in response to an activation of the photo prompt; displaying the captured picture on the user interface; collecting at least one attribute of the mobile client device; designating, using the photo controller, a plurality of photo filters relevant to the at least one attribute of the mobile client device, wherein the plurality of photo filters are configured to be individually selectable for sequential presentation on the user interface of the mobile client device, wherein the plurality of photo filters are independently selectable in sequence by a user in response to a gesture on the displayed picture as presented by the user interface of the mobile client device, and wherein each of the plurality of photo filters is an overlay for presentation on top of the captured picture to augment the captured picture; receiving on the user interface the gesture to select one of the plurality of photo filters in sequence; applying the selected photo filter to the picture; and sending the picture with the selected photo filter to a server for routing to another mobile client device.
  • 10. The method of claim 9, wherein the receiving the gesture to select one of the plurality of photo filters comprises receiving an indication that a user has applied the gesture to the picture while the picture is presented on the user interface of the mobile client device.
  • 11. The method of claim 9, further comprising: displaying a first of the plurality of photo filters on the picture; and transitioning from displaying the first of the plurality of photo filters on the picture to a second of the plurality of photo filters on the picture in response to the gesture.
  • 12. The method of claim 9, wherein the at least one attribute comprises geolocation information and wherein the designating the plurality of photo filters comprises: designating the plurality of photo filters based on the geolocation information.
  • 13. The method of claim 9, wherein the at least one attribute comprises temperature information and wherein the designating the plurality of photo filters comprises: designating the plurality of photo filters based on the temperature information.
  • 14. The method of claim 9, wherein the at least one attribute comprises time information and wherein the designating the plurality of photo filters comprises: designating the plurality of photo filters based on the time information.
  • 15. The method of claim 9, wherein the at least one attribute comprises physical environment information and wherein designating the plurality of photo filters comprises: designating the plurality of photo filters based on the physical environment information.
  • 16. The method of claim 9, wherein the user interface comprises a display and wherein the receiving on the user interface the gesture to select one of the plurality of photo filters comprises: receiving the gesture on the display.
  • 17. The method of claim 16, wherein the gesture is a swipe across the display.
  • 18. A non-transitory machine-readable storage medium storing processor-executable instructions that, when executed by a processor of a mobile client device, cause the mobile client device to perform operations comprising: presenting a photo prompt on the user interface of the mobile client device; capturing a picture with the mobile client device in response to an activation of the photo prompt; displaying the captured picture on a user interface of the mobile client device; collecting at least one attribute of the mobile client device; designating, using a photo controller of the mobile client device, a plurality of photo filters relevant to the at least one attribute of the mobile client device, wherein the plurality of photo filters are configured to be individually selectable for sequential presentation on the user interface of the mobile client device, wherein the plurality of photo filters are independently selectable in sequence by a user in response to a gesture on the displayed picture as presented by the user interface of the mobile client device, and wherein each of the plurality of photo filters is an overlay for presentation on top of the captured picture to augment the captured picture; receiving on the user interface the gesture to select one of the plurality of photo filters in sequence; applying the selected photo filter to the picture; and sending the picture with the selected photo filter to a server for routing to another mobile client device.
  • 19. The non-transitory machine-readable storage medium of claim 18, wherein the instructions further cause the mobile client device to: displaying a first of the plurality of photo filters on the picture; and transitioning from displaying the first of the plurality of photo filters on the picture to a second of the plurality of photo filters on the picture in response to the gesture.
  • 20. The non-transitory machine-readable storage medium of claim 18, wherein the at least one attribute comprises geolocation information and wherein the designating the plurality of photo filters comprises: designating the plurality of photo filters based on the geolocation information.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/419,556 filed May 22, 2019, which is a continuation of U.S. patent application Ser. No. 15/829,544 filed Dec. 1, 2017, now issued as U.S. Pat. No. 10,348,960, which is a continuation of U.S. patent application Ser. No. 15/224,262 filed Jul. 29, 2016, now issued as U.S. Pat. No. 10,154,192, which is a continuation of U.S. patent application Ser. No. 14/977,380, filed on Dec. 1, 2015, now issued as U.S. Pat. No. 9,407,816, which is a continuation of U.S. patent application Ser. No. 14/325,270, filed on Jul. 7, 2014, now issued as U.S. Pat. No. 9,225,897, all of which are incorporated herein by reference in their entirety.

US Referenced Citations (206)
Number Name Date Kind
5999932 Paul Dec 1999 A
6154764 Nitta et al. Nov 2000 A
6167435 Druckenmiller et al. Dec 2000 A
6204840 Petelycky et al. Mar 2001 B1
6216141 Straub et al. Apr 2001 B1
6310694 Okimoto et al. Oct 2001 B1
6484196 Maurille Nov 2002 B1
6665531 Söderbacka et al. Dec 2003 B1
6724403 Santoro et al. Apr 2004 B1
6757713 Ogilvie et al. Jun 2004 B1
6898626 Ohashi May 2005 B2
7124164 Chemtob Oct 2006 B1
7149893 Leonard et al. Dec 2006 B1
7203380 Chiu et al. Apr 2007 B2
7243163 Friend et al. Jul 2007 B1
7356564 Hartselle et al. Apr 2008 B2
7519670 Hagale et al. Apr 2009 B2
8001204 Burtner et al. Aug 2011 B2
8098904 Ioffe et al. Jan 2012 B2
8112716 Kobayashi Feb 2012 B2
8276092 Narayanan et al. Sep 2012 B1
8279319 Date Oct 2012 B2
8312086 Velusamy et al. Nov 2012 B2
8312097 Siegel et al. Nov 2012 B1
8379130 Forutanpour et al. Feb 2013 B2
8405773 Hayashi et al. Mar 2013 B2
8418067 Cheng et al. Apr 2013 B2
8471914 Sakiyama et al. Jun 2013 B2
8560612 Kilmer et al. Oct 2013 B2
8687021 Bathiche et al. Apr 2014 B2
8744523 Fan et al. Jun 2014 B2
8775972 Spiegel Jul 2014 B2
8788680 Naik Jul 2014 B1
8797415 Arnold Aug 2014 B2
8856349 Jain et al. Oct 2014 B2
9143681 Ebsen et al. Sep 2015 B1
10348960 Sehn Jul 2019 B1
20020047868 Miyazawa Apr 2002 A1
20020122659 Mcgrath et al. Sep 2002 A1
20020144154 Tomkow Oct 2002 A1
20020163531 Ihara et al. Nov 2002 A1
20030016247 Lai et al. Jan 2003 A1
20030052925 Daimon et al. Mar 2003 A1
20030126215 Udell et al. Jul 2003 A1
20030164856 Prager et al. Sep 2003 A1
20040027371 Jaeger Feb 2004 A1
20040111467 Willis Jun 2004 A1
20040203959 Coombes Oct 2004 A1
20040239686 Koyama et al. Dec 2004 A1
20040243531 Dean Dec 2004 A1
20050078804 Yomoda Apr 2005 A1
20050097176 Schatz et al. May 2005 A1
20050104976 Currans May 2005 A1
20050114783 Szeto May 2005 A1
20050122405 Voss et al. Jun 2005 A1
20050193340 Amburgey et al. Sep 2005 A1
20050193345 Klassen et al. Sep 2005 A1
20050198128 Anderson et al. Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20060114338 Rothschild Jun 2006 A1
20060270419 Crowley et al. Nov 2006 A1
20070040931 Nishizawa Feb 2007 A1
20070073823 Cohen et al. Mar 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070192128 Celestini Aug 2007 A1
20070214216 Carrer et al. Sep 2007 A1
20070233801 Eren et al. Oct 2007 A1
20070243887 Bandhole et al. Oct 2007 A1
20070255456 Funayama Nov 2007 A1
20080025701 Ikeda Jan 2008 A1
20080033930 Warren Feb 2008 A1
20080055269 Lemay et al. Mar 2008 A1
20080104503 Beall et al. May 2008 A1
20080147730 Lee et al. Jun 2008 A1
20080207176 Brackbill et al. Aug 2008 A1
20080222545 Lemay et al. Sep 2008 A1
20080256446 Yamamoto Oct 2008 A1
20080266421 Takahata et al. Oct 2008 A1
20080270938 Carlson Oct 2008 A1
20080313346 Kujawa et al. Dec 2008 A1
20090006565 Velusamy et al. Jan 2009 A1
20090015703 Kim et al. Jan 2009 A1
20090024956 Kobayashi Jan 2009 A1
20090040324 Nonaka Feb 2009 A1
20090042588 Lottin et al. Feb 2009 A1
20090058822 Chaudhri Mar 2009 A1
20090079846 Chou Mar 2009 A1
20090132453 Hangartner et al. May 2009 A1
20090132665 Thomsen et al. May 2009 A1
20090160970 Fredlund et al. Jun 2009 A1
20090265647 Martin et al. Oct 2009 A1
20100082693 Hugg et al. Apr 2010 A1
20100131880 Lee et al. May 2010 A1
20100131895 Wohlert May 2010 A1
20100156933 Cameron et al. Jun 2010 A1
20100159944 Pascal et al. Jun 2010 A1
20100161831 Haas et al. Jun 2010 A1
20100185665 Horn et al. Jul 2010 A1
20100214436 Kim et al. Aug 2010 A1
20100223128 Dukellis et al. Sep 2010 A1
20100223343 Bosan et al. Sep 2010 A1
20100257196 Waters et al. Oct 2010 A1
20100281045 Dean Nov 2010 A1
20100306669 Della Dec 2010 A1
20110004071 Faiola et al. Jan 2011 A1
20110040783 Uemichi et al. Feb 2011 A1
20110040804 Peirce et al. Feb 2011 A1
20110050909 Ellenby et al. Mar 2011 A1
20110050915 Wang et al. Mar 2011 A1
20110102630 Rukes May 2011 A1
20110145564 Moshir et al. Jun 2011 A1
20110197194 D et al. Aug 2011 A1
20110202968 Nurmi et al. Aug 2011 A1
20110211534 Schmidt et al. Sep 2011 A1
20110213845 Logan et al. Sep 2011 A1
20110273575 Lee Nov 2011 A1
20110283188 Farrenkopf et al. Nov 2011 A1
20110320373 Lee et al. Dec 2011 A1
20120028659 Whitney et al. Feb 2012 A1
20120062805 Candelore Mar 2012 A1
20120081573 Park Apr 2012 A1
20120108293 Law et al. May 2012 A1
20120110096 Smarr et al. May 2012 A1
20120113143 Adhikari et al. May 2012 A1
20120113272 Hata May 2012 A1
20120131507 Sparandara et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120143760 Abulafia et al. Jun 2012 A1
20120150978 Monaco et al. Jun 2012 A1
20120166971 Sachson et al. Jun 2012 A1
20120169855 Oh Jul 2012 A1
20120173991 Roberts et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120184248 Speede Jul 2012 A1
20120200743 Blanchflower et al. Aug 2012 A1
20120210244 De Francisco et al. Aug 2012 A1
20120212632 Mate et al. Aug 2012 A1
20120220264 Kawabata Aug 2012 A1
20120233000 Fisher et al. Sep 2012 A1
20120236162 Imamura Sep 2012 A1
20120239761 Linner et al. Sep 2012 A1
20120250951 Chen Oct 2012 A1
20120268615 Choi et al. Oct 2012 A1
20120278387 Garcia et al. Nov 2012 A1
20120278692 Shi Nov 2012 A1
20120299954 Wada et al. Nov 2012 A1
20120304080 Wormald et al. Nov 2012 A1
20120307096 Ford et al. Dec 2012 A1
20120307112 Kunishige et al. Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20130050260 Reitan Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130059607 Herz et al. Mar 2013 A1
20130060690 Oskolkov et al. Mar 2013 A1
20130063369 Malhotra et al. Mar 2013 A1
20130067027 Song et al. Mar 2013 A1
20130071093 Hanks et al. Mar 2013 A1
20130083215 Wisniewski Apr 2013 A1
20130085790 Palmer et al. Apr 2013 A1
20130128059 Kristensson May 2013 A1
20130145286 Feng et al. Jun 2013 A1
20130169822 Zhu et al. Jul 2013 A1
20130173729 Starenky et al. Jul 2013 A1
20130182133 Tanabe Jul 2013 A1
20130185131 Sinha et al. Jul 2013 A1
20130194301 Robbins et al. Aug 2013 A1
20130198176 Kim Aug 2013 A1
20130222323 Mckenzie et al. Aug 2013 A1
20130227476 Frey Aug 2013 A1
20130232194 Knapp et al. Sep 2013 A1
20130263031 Oshiro et al. Oct 2013 A1
20130265450 Barnes, Jr., et al. Oct 2013 A1
20130290443 Collins et al. Oct 2013 A1
20130329060 Yim Dec 2013 A1
20130344896 Kirmse et al. Dec 2013 A1
20130346877 Borovoy et al. Dec 2013 A1
20140002578 Rosenberg Jan 2014 A1
20140011538 Mulcahy et al. Jan 2014 A1
20140032682 Prado et al. Jan 2014 A1
20140047045 Baldwin et al. Feb 2014 A1
20140047335 Lewis et al. Feb 2014 A1
20140049652 Moon et al. Feb 2014 A1
20140052485 Shidfar Feb 2014 A1
20140052633 Gandhi Feb 2014 A1
20140057660 Wager Feb 2014 A1
20140122658 Haeger et al. May 2014 A1
20140122787 Shalvi et al. May 2014 A1
20140129953 Spiegel May 2014 A1
20140143143 Fasoli et al. May 2014 A1
20140149519 Redfern et al. May 2014 A1
20140155102 Cooper et al. Jun 2014 A1
20140173457 Wang et al. Jun 2014 A1
20140176732 Cohen et al. Jun 2014 A1
20140189592 Benchenaa et al. Jul 2014 A1
20140204244 Choi et al. Jul 2014 A1
20140207679 Cho Jul 2014 A1
20140214471 Schreiner Jul 2014 A1
20140279436 Dorsey et al. Sep 2014 A1
20140280537 Pridmore et al. Sep 2014 A1
20140282096 Rubinstein et al. Sep 2014 A1
20140317302 Naik Oct 2014 A1
20140325383 Brown et al. Oct 2014 A1
20150046278 Pei et al. Feb 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150172534 Miyakawa et al. Jun 2015 A1
20150222814 Li et al. Aug 2015 A1
Foreign Referenced Citations (1)
Number Date Country
2012-093950 May 2021 JP
Non-Patent Literature Citations (1)
Entry
TechPP “Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place,” Retrieved from the Internet on May 18, 2017, 9 pages.
Continuations (5)
Number Date Country
Parent 16419556 May 2019 US
Child 16881214 US
Parent 15829544 Dec 2017 US
Child 16419556 US
Parent 15224262 Jul 2016 US
Child 15829544 US
Parent 14977380 Dec 2015 US
Child 15224262 US
Parent 14325270 Jul 2014 US
Child 14977380 US