The present invention, in some embodiments thereof, relates to image processing and, more particularly, but not exclusively, to systems and methods of selectively adjusting visual content on client terminals.
Emerging telecommunications services enable client terminals, such as handheld devices, for example cell phones and tablets, to exchange data containers of different types and/or to post these data containers on a content sharing platform. For example, a data container may include data representing a text segment, an image, an audio signal, and/or a video signal.
For example, Twitter™ offers a social networking and microblogging service, enabling its users to send and read messages, also referred to as tweets. Tweets are text-based posts of up to 140 characters displayed on the user's profile page.
Other examples include real-time visual media sharing platforms which allow a user to share an image he or she captured with his or her friends, for example in a social network such as Facebook™, and/or with other users of the real-time visual media sharing platform. Examples of such real-time visual media sharing platforms are Mobli™, Instagram™, and Twitter™.
Some of these real-time visual media sharing platforms offer functions that allow the user to edit the shared image, for example using color filters and/or the like.
According to some embodiments of the present invention, there is provided a method of selecting visual content editing functions. The method comprises storing a plurality of records each with suitability data of one of a plurality of visual content editing functions, receiving, from a client terminal, a request with visual content data pertaining to a visual content, selecting a group from the plurality of visual content editing functions according to the visual content data, and responding to the request of the client terminal by sending a response with a list which comprises at least one of a plurality of members of the group and a plurality of indications for the plurality of members.
Optionally, the method further comprises capturing the visual content using an image sensor installed in the client terminal.
Optionally, the visual content data comprises positional data pertaining to the client terminal, the selecting being performed according to the positional data.
More optionally, the selecting comprises using the positional data to classify the location of the client terminal, the selecting being performed according to the classification.
Optionally, the visual content data comprises a velocity of the client terminal, the selecting being performed according to the velocity.
Optionally, the method further comprises analyzing textual content of a plurality of content providing network sources to identify at least one current event, the selecting being performed according to the at least one current event.
More optionally, the analyzing comprises at least one of a semantic analysis and a statistical analysis of the textual content.
More optionally, the method further comprises acquiring at least one demographic characteristic of a user of the client terminal; wherein the selecting is performed according to the at least one demographic characteristic.
More optionally, the storing comprises automatically generating at least some of the plurality of visual content editing functions according to at least one of the visual content data, personal data pertaining to a user of the client terminal, and information acquired from textual content of a plurality of content providing network sources.
Optionally, the request comprises user identification, the selecting comprises acquiring personal data pertaining to a user of the client terminal according to the user identification, the selecting being performed according to the personal data.
More optionally, the personal data being extracted from a social network profile.
More optionally, the personal data comprises a log of previously selected visual content editing functions from the plurality of visual content editing functions.
More optionally, the request comprises user identification, the selecting comprises acquiring social network data pertaining to a friend of a user of the client terminal in a social network according to the user identification, the selecting being performed according to the social network data.
According to some embodiments of the present invention, there is provided a method of adjusting visual content. The method comprises selecting visual content on a client terminal, extracting visual content data pertaining to the visual content, forwarding a request which includes the visual content data to a network node via a network, receiving, in response to the request, a list of a plurality of visual content editing functions from the network node, presenting, on the client terminal, the plurality of visual content editing functions to a user, receiving a selection of at least one member of the list from the user, adjusting the visual content using the at least one member, and outputting the adjusted visual content.
Optionally, the selecting comprises locally capturing the visual content using an image sensor installed in the client terminal.
Optionally, the selecting comprises accessing a database via the network and selecting the visual content using a user interface on the client terminal.
Optionally, the extracting comprises image processing the visual content to perform at least one of identifying an object having a predefined feature in the visual content, classifying a scene depicted in the visual content, recognizing a facial feature in the visual content, and detecting a moving object in the visual content and the request comprises an outcome of the image processing.
More optionally, the method further comprises identifying positional data of the client terminal; the visual content data comprises the positional data.
Optionally, the request comprises user identification data; the method further comprises extracting personal data according to the user identification data; wherein the list is formed according to the personal data.
Optionally, the adjusting comprises acquiring the at least one member from the network node.
According to some embodiments of the present invention, there is provided a system of providing a plurality of visual content editing functions to a plurality of client terminals. The system comprises a network interface which receives a request having visual content data from a client terminal, a repository which stores a plurality of records each with suitability data of one of a plurality of visual content editing functions, and a selection module which selects a group of the plurality of visual content editing functions according to the visual content data and generates a list which comprises at least one of a plurality of members of the group and a plurality of indications for the plurality of members. The network interface sends the list as a response to the request.
Optionally, the system further comprises a plurality of client modules, each of which allows a user to create a visual content editing function and to update the repository with the created visual content editing function.
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
For example, hardware for performing selected tasks according to embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.
Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
In the drawings:
The present invention, in some embodiments thereof, relates to image processing and, more particularly, but not exclusively, to systems and methods of selectively adjusting visual content on client terminals.
According to some embodiments of the present invention, there are provided methods and systems which provide a remote client terminal with a plurality of visual content editing functions, such as image processing filters and/or overlays, for adjusting a certain visual content based on data extracted therefrom and/or related thereto, real time data, such as news events, and/or personal data related to the user of the client terminal. Optionally, the overlays include sound overlays which are designed to be added to the visual content. The system provides the user with a list of visual content editing functions which are adapted to his needs, preferences, and/or to a currently captured visual content. In such a manner, the user is not bound to choose from a fixed set of visual content editing functions and/or to review a huge collection of visual content editing functions. The selected list exposes the user to different visual content editing functions, which are adapted to his current needs, preferences, and/or to a currently captured visual content, such as an image or a video file (e.g., video files, multimedia files, and audio/video files). For example, the system allows a user to receive location based visual content editing functions which allow him to automatically add suitable graphics and/or text by a single click on the presented list. The system further allows a user to receive visual content editing functions which are frequently used by his friends (e.g., social network friends, followers, and/or followed users), frequently used by him, and/or suitable to the set of circumstances under which the respective visual content is taken.
Optionally, the system includes a repository with visual content editing functions which are generated in real time according to local and/or global news events and/or visual content editing functions which are generated to meet the needs of any of a plurality of subscribers.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
Reference is now made to
The system 100 allows users of the client terminals 102 to receive a list of visual content editing functions and to select one or more functions therefrom for editing a selected visual content, such as an image or a video file, for example before the visual content is sent and/or uploaded, for example as a visual message, such as a visual tweet, for instance a Mobli™ message, a Twitter™ message, and/or an Instagram™ message. As used herein, a client terminal means a mobile telephone, a smartphone, a tablet, a laptop, a camera having a network interface, and/or the like.
The system 100, which is optionally implemented using one or more network servers, includes a network interface 103, such as a network interface card and/or a port. The network interface facilitates the establishment of bidirectional communication between the system 100 and the client terminals 102. This communication allows the system 100 to receive requests for visual content editing functions and to respond with a list of selected visual content editing functions, for example as described below.
The system 100 further includes a repository 104 for hosting the visual content editing function records, optionally associated with the visual content editing functions, and a selection module 105 which receives the requests for visual content editing functions via the network interface 103 and generates a list of visual content editing functions according to each request, for example as described below. The list is optionally forwarded to the client terminals via the network interface 103.
Optionally, each client terminal 102 hosts a local module 106, such as an app, a widget, or an add-on, with a user interface, for example a graphical user interface (GUI), that presents the user with the option to select visual content for adaptation. For example, the client terminal allows the user to select a file from the memory of the client terminal, a social network profile, an image repository account, such as Flickr™, and/or the like. Additionally or alternatively, the user interface is adapted to allow the user to send an image or a video taken using the client terminal for processing by one of the plurality of visual content editing functions. For example, the client terminal is a cellular device having an image sensor and a user interface that allows a user to select a member of a list of visual content editing functions which are presented thereto, for example as exemplified below. Optionally, a plurality of user profiles, such as subscriber profiles, are stored and managed by the system 100, for example as described below. Optionally, the user profiles are subscriber profiles of a social network, which are managed by a subscriber database, for example managed by separate web servers 107 and/or in a local repository.
According to some embodiments of the present invention, users may generate visual content editing functions. These visual content editing functions are then uploaded to the repository 104 and a respective documenting record is formed. Optionally, the user defines the sharing of the visual content editing functions. For example, the user defines whether the visual content editing functions are for personal use only, for friends' use, and/or for public use. In such embodiments, the visual content editing functions may be tagged with their sharing rights. Optionally, the local module 106 includes a visual content editing function generation module. The generated functions may be customized overlays with selected graphics, color filters with selected colors, and/or any other filters which are planned by the user. For example, each one of
Reference is now made to
First, as shown at 201, visual content is selected for editing by the user. The selection is optionally performed using a user interface on the client terminal, for example a GUI that is presented on a touch screen of the client terminal. The visual content may be selected from an internal or an external library, or may be a currently captured image or video file.
Optionally, as shown at 202, visual content data is generated. Optionally, the visual content is locally processed to identify descriptive features that allow classifying depicted scenes and/or characters and/or identifying certain objects. The image processing may be performed using known image processing algorithms, such as a face portion recognition algorithm, a close-up detection algorithm, a motion detection algorithm, and/or the like. In such an embodiment, the visual content data includes the descriptive information.
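By way of a non-limiting illustration, the following Python sketch shows one way such descriptive features could be computed on the client terminal using the OpenCV library; the feature names and the close-up heuristic are assumptions of the sketch rather than part of any embodiment.

    # Illustrative sketch only: deriving descriptive features from a
    # captured image with OpenCV. The feature names and the close-up
    # threshold are assumptions for illustration.
    import cv2

    def extract_visual_content_data(image_path):
        image = cv2.imread(image_path)
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        height, width = gray.shape
        # Treat a face filling over a quarter of the frame as a close-up.
        close_up = any(w * h > 0.25 * width * height for (x, y, w, h) in faces)
        return {"face_count": len(faces), "close_up": close_up}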
Additionally or alternatively, the visual content data includes positional data pertaining to the location of the client terminal. The positional data, such as location, velocity, and/or acceleration, is optionally acquired from a global positioning system (GPS) unit of the client terminal, assisted GPS, or any other positioning system.
Additionally or alternatively, the visual content data includes information pertaining to the type of the client terminal.
Additionally or alternatively, the visual content data includes user identification (ID) pertaining to the user of the client terminal, for example as described below.
Now, as shown at 203, a request with the visual content data is sent to the system 100, for example as a hypertext transfer protocol (HTTP) message or any other web message.
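For illustration only, such a request could be packaged as follows; the endpoint URL and payload field names below are hypothetical and are not defined by this description.

    # Illustrative sketch only: sending the visual content data to the
    # system as an HTTP request. The URL and payload fields are
    # hypothetical.
    import requests

    def request_editing_functions(visual_content_data, user_id,
                                  url="https://example.com/editing-functions"):
        payload = dict(visual_content_data)
        payload["user_id"] = user_id
        response = requests.post(url, json=payload, timeout=10)
        response.raise_for_status()
        return response.json()  # the list of selected editing functions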
Optionally, the request includes visual content data pertaining to the selected visual content. The visual content data allows the system 100, for example the selection module 105, to select and/or adapt visual content editing functions to be sent, in response, to the client terminal. For example, at the system side, as shown at
Now, as shown at 304, based on the visual content data, the real time data, and/or the user data, a plurality of visual content editing functions are selected from the repository 104.
According to some embodiments of the present invention, for example as described above, the visual content data includes positional data. In these embodiments, the positional data is analyzed to select specific location related visual content editing functions.
In such embodiments, some or all of the related visual content editing function records in the repository 104 may be tagged, and optionally weighted, according to their relevancy to a certain geographical area (e.g., coordinates, a country, a city, or a street) and/or location type (e.g., a restaurant, a workplace, a home, a bar, an amusement park, a train station, a shop, a mall, or a sports venue).
Optionally, each visual content editing function record includes an array of weighted tags (e.g., scored and/or ranked), each given a value according to the affinity of the respective visual content editing function to a different location type and/or geographical area. In use, the user positional data is matched against the array of weighted tags of each of the visual content editing function records to identify one or more relevant visual content editing functions.
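A minimal sketch of such matching, assuming each record stores its weighted tags as a mapping from location type to weight, may look as follows; the record layout and the cutoff are illustrative assumptions.

    # Illustrative sketch only: matching positional data against the
    # weighted location tags of each record. The record layout is an
    # assumption.
    def location_score(record, location_type):
        # e.g. record["location_tags"] == {"cafe": 0.9, "bar": 0.4}
        return record.get("location_tags", {}).get(location_type, 0.0)

    def select_by_location(records, location_type, top_n=5):
        ranked = sorted(records,
                        key=lambda r: location_score(r, location_type),
                        reverse=True)
        return [r for r in ranked[:top_n]
                if location_score(r, location_type) > 0.0]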
In such a manner, for example, if the positional data indicates that the visual content was captured in Australia, a visual content editing function which provides an overlay of the Australian flag and/or a sound overlay of the Australian anthem and/or Australian popular music may be selected. In another example, if the positional data indicates that the visual content was captured in New York, a visual content editing function which provides an overlay of the Statue of Liberty and/or a sound overlay of the song "New York, New York" may be provided. In another example, if the positional data indicates that the visual content was captured in a café, a visual content editing function which provides an overlay related to a coffee supplier, the café itself, and/or café-related activities, and/or a sound overlay which is related to cafés, may be provided.
According to some embodiments of the present invention, visual content editing functions are selected according to a velocity that is measured for the client terminal. The velocity is optionally part of the positional data, for example as outlined above. Similarly to the above, the visual content editing function records may include one or more tags which are indicative of the relevancy of a respective visual content editing function to one or more velocities. In such a manner, for example, a visual content editing function which may be selected for high velocities is an overlay with a "road trip" logo, a smear filter for increasing or decreasing a smearing artifact, a suitable soundtrack, and/or the like.
According to some embodiments of the present invention, the real time data includes one or more real time events which occurred recently, for example in the last few hours, days, or weeks. Optionally, the real time events are acquired from news aggregators, such as Google™ News, and/or from different websites using web crawlers. Optionally, semantic and/or statistical analysis is used to extract keywords from the news events. Each visual content editing function record is optionally tagged with metadata that includes a plurality of characterizing words or sentences. The keywords may then be matched with the plurality of characterizing words or sentences of each record to determine the relevancy of the respective visual content editing function to the current events. Optionally, visual content editing functions may be uploaded by the operator in response to current events. In such an embodiment, the operator may manually add tag(s) to, or update tag(s) of, the visual content editing function records, for example with keywords and/or location relevancy data.
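By way of a non-limiting illustration, the keyword matching may be sketched as a simple overlap count between the extracted event keywords and the characterizing words of a record; a deployed system would use the semantic and/or statistical analysis described above.

    # Illustrative sketch only: scoring a record against keywords
    # extracted from current news events. Plain set overlap stands in
    # for the semantic/statistical analysis described above.
    def event_relevance(record, event_keywords):
        characterizing = set(record.get("keywords", []))
        return len(characterizing & set(event_keywords))

    # e.g. a record tagged with environmental terms scores 2 against
    # keywords extracted from an "Earth Day" news event.
    record = {"keywords": ["environment", "earth", "green"]}
    print(event_relevance(record, ["earth", "day", "environment"]))  # 2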
The matching between the current events and the visual content editing function records allows selecting visual content editing functions with a real time, contemporary meaning. For example, visual content editing functions which are related to the environment, for example overlays with images of the Greenpeace logo, images of wild animals, green color filters, and/or respective audio, may be selected when the current event is "Earth Day". In another example, an overlay with a specific slogan is selected. Optionally, a sound overlay in which the slogan is sounded is selected.
Optionally, the real time events are weighted according to the scope of coverage they receive in news websites. These events may also be weighted according to their geographical relevance to the user. For example, the real time events are weighted according to the scope of coverage they receive in news websites which cover local matters in proximity to the user. For example, the selection module 105 identifies the current location of the user, for example according to his positional data, and weights news events from news websites according to their relevancy to the user location. In such an embodiment, news events from the website sandiego6(dot)com, which provides San Diego news, are weighted higher than news events from general news websites if the user is located in San Diego.
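For illustration, such weighting may be sketched as follows; the event fields and the doubling factor for local coverage are assumptions of the sketch.

    # Illustrative sketch only: weighting news events by coverage
    # volume and by the geographic proximity of the covering outlet to
    # the user. The event fields and factors are assumptions.
    def weight_events(events, user_city):
        weighted = []
        for event in events:
            weight = event["coverage_count"]
            if event.get("outlet_city") == user_city:
                weight *= 2.0  # local coverage counts more
            weighted.append((weight, event))
        weighted.sort(key=lambda pair: pair[0], reverse=True)
        return weighted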
The matching between the positional data of the user, the current events, and the visual content editing functions allows selecting visual content editing functions which are related to current events in the neighborhood, the city, the country, and/or the area around the user. For example, a specific location event, such as an election for a local candidate in a U.S. state, may be matched, for a user that is located in the respective U.S. state, with a visual content editing function that is tagged with the candidate's name, such as an overlay with her name. In another example, if the location of the user is a stadium and the news indicates that a certain sports match occurs or is about to occur in the stadium, visual content editing functions which are related to the playing teams, for example overlays with images of the logo(s) and/or player(s) of the teams, color filters with the colors of the logos of the teams, and/or the like, may be selected for the user.
According to some embodiments of the present invention, visual content editing functions are selected according to personal information pertaining to the user. Optionally, the system 100 manages a plurality of user profiles, such as subscriber profiles, for example as described above. Each user profile includes personal information pertaining to the user, such as demographic data, a visual content editing function selection history, a positional data history, and/or preferences. In use, the selection module 105 uses the user ID to identify a respective user profile. Additionally or alternatively, the system 100 uses the user ID to extract personal data from his social network page, for example gender, birth date, hometown, education, marital status, and/or interests. The user data from the user profile and/or the social network page is used to select visual content editing functions, for example according to tags in the visual content editing function records, similarly to the described above. In such embodiments, the user's consent is optionally acquired.
Additionally or alternatively, the system 100 uses the user ID to extract personal data which defines favorite visual content editing functions. These favorite functions may be the most used visual content editing functions, selected and/or tagged visual content editing functions, visual content editing functions which were used to create most of the posted images, and/or the like.
In such embodiments, the visual content editing functions are tagged as suitable for users with certain demographic characteristic(s), such as an age range, or a certain combination of demographic characteristics, for example age, gender, and marital status. For example, for the age group of 12 to 17, a visual content editing function of an overlay with an image of Justin Bieber and/or a sound overlay with a song by Justin Bieber may be selected; for a combination of age and marital status, a visual content editing function of an overlay reading "I am single" or "I am a proud mother" may be selected; and based on the birth date, a visual content editing function of an overlay with the text "happy birthday", a sound overlay with the song "Happy Birthday", and/or a graphical effect of balloons may be selected.
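A minimal sketch of such demographic tagging, assuming a hypothetical record schema with an age range and an optional marital status tag, may look as follows.

    # Illustrative sketch only: testing whether a record's demographic
    # tags fit the user. The tag schema is an assumption.
    def demographic_match(record, user):
        tags = record.get("demographics", {})
        low, high = tags.get("age_range", (0, 200))
        age_ok = low <= user["age"] <= high
        status_ok = ("marital_status" not in tags
                     or tags["marital_status"] == user.get("marital_status"))
        return age_ok and status_ok

    teen_overlay = {"demographics": {"age_range": (12, 17)}}
    print(demographic_match(teen_overlay, {"age": 15}))  # True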
According to some embodiments of the present invention, visual content editing functions are selected according to the social activity of the user and/or his or her friends. For example, the selection module may access the social network profile and analyze the news feeds of the user and/or the user's friends to match visual content editing functions, for example based on a keyword analysis, similarly to the above description of news events.
For example, visual content editing functions may be selected according to the user's social activity, such as checking in at a certain place, such as a bar, attending a certain event, being invited to a certain event, and/or adding indicative content, such as a status line or shared content. Likewise, the preferences of the user's friends may be taken into account when selecting visual content editing functions. For example, if the user profile indicates that the user checked in at a specific restaurant, visual content editing function(s) which are related to restaurants in general and/or to the specific restaurant are selected. In another example, if the user profile indicates that the user has a friend with a birthday, visual content editing function(s) which are related to birthdays in general are selected.
In another embodiment, visual content editing functions are automatically generated based on the social network profile of the user and/or the selections of friends in his network. For example, a visual content editing function with an overlay that includes a copy of the status of the user may be generated. In another example, if a certain number of friends of the user chose a specific visual content editing function, this visual content editing function is suggested to the user. In another example, if the user profile indicates that the user has a friend with a birthday, visual content editing function(s) which are related to the specific friend who has a birthday are generated, for example an overlay with the friend's name, a function selected from the most frequent functions he uses, and/or the like. Friends' data may be acquired using a social connection graph.
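For illustration only, the automatic generation of an overlay function from the user's status may be sketched as follows; the record schema is hypothetical.

    # Illustrative sketch only: auto-generating a text-overlay function
    # from the user's social network status. The schema is hypothetical.
    def overlay_from_status(user_profile):
        status = user_profile.get("status", "").strip()
        if not status:
            return None
        return {
            "type": "text_overlay",
            "text": status,
            "keywords": status.lower().split(),
        }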
According to some embodiments of the present invention, visual content editing functions are selected according to previous user selections in the system 100. In such an embodiment, visual content editing functions which were previously used by the user to enhance visual content are recorded. This allows weighting visual content editing functions according to their usage prevalence, optionally based on additional criteria, such as the location of the user when using the function, the time of day at which the user uses the function, and/or the like. In such embodiments, visual content editing functions which have been selected by the user more than a certain number of times and/or during a certain time of the day and/or in a certain location are selected or receive a high score or rank.
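A minimal sketch of such usage-based weighting, assuming a hypothetical usage log of function selections with optional location and hour fields, may look as follows.

    # Illustrative sketch only: weighting a function by the user's own
    # usage log, optionally conditioned on location and hour of day.
    # The log format is an assumption.
    from collections import Counter

    def usage_score(function_id, usage_log, location=None, hour=None):
        counts = Counter(
            entry["function_id"] for entry in usage_log
            if (location is None or entry.get("location") == location)
            and (hour is None or entry.get("hour") == hour))
        return counts[function_id]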
For example, if, when the user visited a certain location, for instance Madison Square Garden, he chose a New York Knicks logo, a function with an overlay of the New York Knicks logo may be selected when the user is back at this location and/or at a similar location, for example at other sports venues.
As described above, a user may create customized visual content editing functions and upload them to the system 100. Optionally, visual content editing functions which are generated by the user and/or his friends, for example social network friends, are selected automatically and/or receive a high ranking in a selection process.
As described above, the visual content data may include descriptive features that allow classifying depicted scenes and/or characters and/or identifying certain objects. In such an embodiment, visual content editing functions which are tagged as suitable for the descriptive features may be selected. For example, if a car is identified in the visual content, a visual content editing function with an overlay that says "I just pimped my ride" may be selected. If a plate or a food portion is identified, a visual content editing function with an overlay and/or a sound overlay that includes promotional content of a food company is presented. In another example, if smiling faces are identified, promotional content for a toothbrush may be presented.
According to some embodiments of the present invention, the selection module combines some or all of the above methods for selecting visual content editing functions. Optionally, visual content editing functions are ranked or scored by some or all of the aforementioned selection methods. In such a manner, visual content editing functions which receive a cumulative rank or score above a certain threshold, and/or a number of visual content editing functions with the top ranks and/or scores, are selected for the list. In such a manner, visual content editing functions are scored according to data acquired from the user's social profile, positional data, and real time data.
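By way of a non-limiting illustration, the combination may be sketched as a weighted sum of the per-criterion scores with a threshold and a top-N cutoff; the weights and the threshold are assumptions of the sketch.

    # Illustrative sketch only: combining per-criterion scores (e.g.
    # location, current events, demographics, usage history) into a
    # cumulative score. Weights and threshold are assumptions.
    def combined_selection(records, scorers, weights,
                           threshold=1.0, top_n=10):
        scored = []
        for record in records:
            total = sum(weight * scorer(record)
                        for scorer, weight in zip(scorers, weights))
            if total >= threshold:
                scored.append((total, record))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [record for _, record in scored[:top_n]]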
Reference is now made, once again, to
As shown at 204, the client terminal receives the list of a plurality of visual content editing functions and presents it, as shown at 205, to the user.
The user can now select, as shown at 206, one or more members of the presented list. Optionally, the selected visual content editing functions are stored in the client terminal, for example managed by the local module 106. Additionally or alternatively, the selected visual content editing functions are received in the list. Additionally or alternatively, the visual content editing functions are stored in the repository 104. In such embodiments, after the user selects one or more members, a request for the selected visual content editing functions is forwarded to the system 100, which responds with the requested visual content editing functions.
Then, as shown at 207, the visual content is adjusted using the selected visual content editing function(s).
As shown at 208, the adjusted visual content may now be outputted, for example uploaded to a server, posted and/or shared with other subscribers of the system 100, for example as a visual tweet, such as a Mobli™ message, a Twitter™ message, and/or an Instagram™ message, uploaded to a social network webpage, and/or forwarded to one or more friends, for example as an electronic message such as a multimedia messaging service (MMS) message.
It is expected that during the life of a patent maturing from this application many relevant methods and/or devices will be developed, and the scope of the terms module, client terminal, and controller is intended to include all such new technologies a priori.
As used herein the term “about” refers to ±10%.
The terms "comprises", "comprising", "includes", "including", "having" and their conjugates mean "including but not limited to". These terms encompass the terms "consisting of" and "consisting essentially of".
The phrase “consisting essentially of” means that the composition or method may include additional ingredients and/or steps, but only if the additional ingredients and/or steps do not materially alter the basic and novel characteristics of the claimed composition or method.
As used herein, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.
The word “exemplary” is used herein to mean “serving as an example, instance or illustration”. Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
The word “optionally” is used herein to mean “is provided in some embodiments and not provided in other embodiments”. Any particular embodiment of the invention may include a plurality of “optional” features unless such features conflict.
Throughout this application, various embodiments of this invention may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
Whenever a numerical range is indicated herein, it is meant to include any cited numeral (fractional or integral) within the indicated range. The phrases "ranging/ranges between" a first indicated number and a second indicated number and "ranging/ranges from" a first indicated number "to" a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals therebetween.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting.
This application is a continuation of and claims the benefit of priority of U.S. patent application Ser. No. 16/538,397, filed on Aug. 12, 2019, which is a continuation of and claims the benefit of priority of U.S. patent application Ser. No. 15/974,409, filed on May 8, 2018, which is a continuation of and claims the benefit of priority of U.S. patent application Ser. No. 15/250,960, filed on Aug. 30, 2016, which is a continuation of and claims the benefit of priority of U.S. patent application Ser. No. 14/232,274, filed on Jan. 12, 2014, which is a National Phase of PCT Patent Application Serial No. PCT/IL2012/050242, filed on Jul. 10, 2012, which claims the benefit of priority of U.S. Provisional Patent Application Ser. No. 61/506,670, filed on Jul. 12, 2011. The contents of the above applications are all incorporated by reference as if fully set forth herein in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
666223 | Shedlock | Jan 1901 | A |
4555775 | Pike | Nov 1985 | A |
4581634 | Williams | Apr 1986 | A |
4975690 | Torres | Dec 1990 | A |
5072412 | Henderson, Jr. et al. | Dec 1991 | A |
5347627 | Hoffmann et al. | Sep 1994 | A |
5493692 | Theimer et al. | Feb 1996 | A |
5581670 | Bier et al. | Dec 1996 | A |
5617114 | Bier et al. | Apr 1997 | A |
5713073 | Warsta | Jan 1998 | A |
5754939 | Herz et al. | May 1998 | A |
5855008 | Goldhaber et al. | Dec 1998 | A |
5883639 | Walton et al. | Mar 1999 | A |
5999932 | Paul | Dec 1999 | A |
6012098 | Bayeh et al. | Jan 2000 | A |
6014090 | Rosen et al. | Jan 2000 | A |
6029141 | Bezos et al. | Feb 2000 | A |
6038295 | Mattes | Mar 2000 | A |
6049711 | Yehezkel et al. | Apr 2000 | A |
6154764 | Nitta et al. | Nov 2000 | A |
6167435 | Druckenmiller et al. | Dec 2000 | A |
6204840 | Petelycky et al. | Mar 2001 | B1 |
6205432 | Gabbard et al. | Mar 2001 | B1 |
6216141 | Straub et al. | Apr 2001 | B1 |
6285381 | Sawano et al. | Sep 2001 | B1 |
6285987 | Roth et al. | Sep 2001 | B1 |
6310694 | Okimoto et al. | Oct 2001 | B1 |
6317789 | Rakavy et al. | Nov 2001 | B1 |
6334149 | Davis, Jr. et al. | Dec 2001 | B1 |
6349203 | Asaoka et al. | Feb 2002 | B1 |
6353170 | Eyzaguirre et al. | Mar 2002 | B1 |
6446004 | Cao et al. | Sep 2002 | B1 |
6449657 | Stanbach et al. | Sep 2002 | B2 |
6456852 | Bar et al. | Sep 2002 | B2 |
6484196 | Maurille | Nov 2002 | B1 |
6487601 | Hubacher et al. | Nov 2002 | B1 |
6523008 | Avrunin | Feb 2003 | B1 |
6542749 | Tanaka et al. | Apr 2003 | B2 |
6549768 | Fraccaroli | Apr 2003 | B1 |
6618593 | Drutman et al. | Sep 2003 | B1 |
6622174 | Ukita et al. | Sep 2003 | B1 |
6631463 | Floyd et al. | Oct 2003 | B1 |
6636247 | Hamzy et al. | Oct 2003 | B1 |
6636855 | Holloway et al. | Oct 2003 | B2 |
6643684 | Malkin et al. | Nov 2003 | B1 |
6658095 | Yoakum et al. | Dec 2003 | B1 |
6665531 | Soderbacka et al. | Dec 2003 | B1 |
6668173 | Greene | Dec 2003 | B2 |
6684238 | Dutta | Jan 2004 | B1 |
6684257 | Camut et al. | Jan 2004 | B1 |
6698020 | Zigmond et al. | Feb 2004 | B1 |
6700506 | Winkler | Mar 2004 | B1 |
6720860 | Narayanaswami | Apr 2004 | B1 |
6724403 | Santoro et al. | Apr 2004 | B1 |
6757713 | Ogilvie et al. | Jun 2004 | B1 |
6832222 | Zimowski | Dec 2004 | B1 |
6834195 | Brandenberg et al. | Dec 2004 | B2 |
6836792 | Chen | Dec 2004 | B1 |
6898626 | Ohashi | May 2005 | B2 |
6959324 | Kubik et al. | Oct 2005 | B1 |
6970088 | Kovach | Nov 2005 | B2 |
6970907 | Ullmann et al. | Nov 2005 | B1 |
6980909 | Root et al. | Dec 2005 | B2 |
6981040 | Konig et al. | Dec 2005 | B1 |
7020494 | Spriestersbach et al. | Mar 2006 | B2 |
7027124 | Foote et al. | Apr 2006 | B2 |
7027663 | Edwards et al. | Apr 2006 | B2 |
7072963 | Anderson et al. | Jul 2006 | B2 |
7085571 | Kalhan et al. | Aug 2006 | B2 |
7110744 | Freeny, Jr. | Sep 2006 | B2 |
7124164 | Chemtob | Oct 2006 | B1 |
7149893 | Leonard et al. | Dec 2006 | B1 |
7173651 | Knowles | Feb 2007 | B1 |
7188143 | Szeto | Mar 2007 | B2 |
7203380 | Chiu et al. | Apr 2007 | B2 |
7206568 | Sudit | Apr 2007 | B2 |
7227937 | Yoakum et al. | Jun 2007 | B1 |
7237002 | Estrada et al. | Jun 2007 | B1 |
7240089 | Boudreau | Jul 2007 | B2 |
7269426 | Kokkonen et al. | Sep 2007 | B2 |
7280658 | Amini et al. | Oct 2007 | B2 |
7304677 | Keelan et al. | Dec 2007 | B2 |
7315823 | Brondrup | Jan 2008 | B2 |
7349768 | Bruce et al. | Mar 2008 | B2 |
7356564 | Hartselle et al. | Apr 2008 | B2 |
7391929 | Edwards et al. | Jun 2008 | B2 |
7394345 | Ehlinger et al. | Jul 2008 | B1 |
7411493 | Smith | Aug 2008 | B2 |
7423580 | Markhovsky et al. | Sep 2008 | B2 |
7454442 | Cobleigh et al. | Nov 2008 | B2 |
7508419 | Toyama et al. | Mar 2009 | B2 |
7512649 | Faybishenko et al. | Mar 2009 | B2 |
7519670 | Hagale et al. | Apr 2009 | B2 |
7535890 | Rojas | May 2009 | B2 |
7546554 | Chiu et al. | Jun 2009 | B2 |
7607096 | Oreizy et al. | Oct 2009 | B2 |
7639943 | Kalajan | Dec 2009 | B1 |
7650231 | Gadler | Jan 2010 | B2 |
7668537 | DeVries | Feb 2010 | B2 |
7770137 | Forbes et al. | Aug 2010 | B2 |
7778973 | Choi | Aug 2010 | B2 |
7779444 | Glad | Aug 2010 | B2 |
7787886 | Markhovsky et al. | Aug 2010 | B2 |
7796946 | Eisenbach | Sep 2010 | B2 |
7801954 | Cadiz et al. | Sep 2010 | B2 |
7814089 | Skrenta et al. | Oct 2010 | B1 |
7856360 | Kramer et al. | Dec 2010 | B2 |
8001204 | Burtner et al. | Aug 2011 | B2 |
8032586 | Challenger et al. | Oct 2011 | B2 |
8082255 | Carlson, Jr. et al. | Dec 2011 | B1 |
8090351 | Klein | Jan 2012 | B2 |
8098904 | Ioffe et al. | Jan 2012 | B2 |
8099109 | Altman et al. | Jan 2012 | B2 |
8112716 | Kobayashi | Feb 2012 | B2 |
8131597 | Hudetz | Mar 2012 | B2 |
8135166 | Rhoads | Mar 2012 | B2 |
8136028 | Loeb et al. | Mar 2012 | B1 |
8146001 | Reese | Mar 2012 | B1 |
8161115 | Yamamoto | Apr 2012 | B2 |
8161417 | Lee | Apr 2012 | B1 |
8195203 | Tseng | Jun 2012 | B1 |
8199747 | Rojas et al. | Jun 2012 | B2 |
8208943 | Petersen | Jun 2012 | B2 |
8214443 | Hamburg | Jul 2012 | B2 |
8234350 | Gu et al. | Jul 2012 | B1 |
8250473 | Haynes | Aug 2012 | B1 |
8266524 | Bailey | Sep 2012 | B2 |
8276092 | Narayanan et al. | Sep 2012 | B1 |
8279319 | Date | Oct 2012 | B2 |
8280406 | Ziskind et al. | Oct 2012 | B2 |
8285199 | Hsu et al. | Oct 2012 | B2 |
8287380 | Nguyen et al. | Oct 2012 | B2 |
8301159 | Hamynen et al. | Oct 2012 | B2 |
8306922 | Kunal et al. | Nov 2012 | B1 |
8312086 | Velusamy et al. | Nov 2012 | B2 |
8312097 | Siegel et al. | Nov 2012 | B1 |
8326315 | Phillips et al. | Dec 2012 | B2 |
8326327 | Hymel et al. | Dec 2012 | B2 |
8332475 | Rosen et al. | Dec 2012 | B2 |
8352546 | Dollard | Jan 2013 | B1 |
8379130 | Forutanpour et al. | Feb 2013 | B2 |
8385950 | Wagner et al. | Feb 2013 | B1 |
8402097 | Szeto | Mar 2013 | B2 |
8405773 | Hayashi et al. | Mar 2013 | B2 |
8418067 | Cheng et al. | Apr 2013 | B2 |
8423409 | Rao | Apr 2013 | B2 |
8471914 | Sakiyama et al. | Jun 2013 | B2 |
8472935 | Fujisaki | Jun 2013 | B1 |
8510383 | Hurley et al. | Aug 2013 | B2 |
8527345 | Rothschild et al. | Sep 2013 | B2 |
8554627 | Svendsen et al. | Oct 2013 | B2 |
8560612 | Kilmer et al. | Oct 2013 | B2 |
8594680 | Ledlie et al. | Nov 2013 | B2 |
8613089 | Holloway et al. | Dec 2013 | B1 |
8660358 | Bergboer et al. | Feb 2014 | B1 |
8660369 | Llano et al. | Feb 2014 | B2 |
8660793 | Ngo et al. | Feb 2014 | B2 |
8682350 | Altman et al. | Mar 2014 | B2 |
8718333 | Wolf et al. | May 2014 | B2 |
8724622 | Rojas | May 2014 | B2 |
8732168 | Johnson | May 2014 | B2 |
8744523 | Fan et al. | Jun 2014 | B2 |
8745132 | Obradovich | Jun 2014 | B2 |
8761800 | Kuwahara | Jun 2014 | B2 |
8768876 | Shim et al. | Jul 2014 | B2 |
8775972 | Spiegel | Jul 2014 | B2 |
8788680 | Naik | Jul 2014 | B1 |
8788943 | Borst | Jul 2014 | B2 |
8790187 | Walker et al. | Jul 2014 | B2 |
8797415 | Arnold | Aug 2014 | B2 |
8798646 | Wang et al. | Aug 2014 | B1 |
8856349 | Jain et al. | Oct 2014 | B2 |
8874677 | Rosen et al. | Oct 2014 | B2 |
8886227 | Schmidt et al. | Nov 2014 | B2 |
8909679 | Root et al. | Dec 2014 | B2 |
8909725 | Sehn | Dec 2014 | B1 |
8972357 | Shim et al. | Mar 2015 | B2 |
8995433 | Rojas | Mar 2015 | B2 |
9015285 | Ebsen et al. | Apr 2015 | B1 |
9020745 | Johnston et al. | Apr 2015 | B2 |
9040574 | Wang et al. | May 2015 | B2 |
9055416 | Rosen et al. | Jun 2015 | B2 |
9094137 | Sehn et al. | Jul 2015 | B1 |
9100806 | Rosen et al. | Aug 2015 | B2 |
9100807 | Rosen et al. | Aug 2015 | B2 |
9113301 | Spiegel et al. | Aug 2015 | B1 |
9119027 | Sharon et al. | Aug 2015 | B2 |
9123074 | Jacobs et al. | Sep 2015 | B2 |
9143382 | Bhogal et al. | Sep 2015 | B2 |
9143681 | Ebsen et al. | Sep 2015 | B1 |
9152477 | Campbell et al. | Oct 2015 | B1 |
9191776 | Root et al. | Nov 2015 | B2 |
9204252 | Root | Dec 2015 | B2 |
9225897 | Sehn et al. | Dec 2015 | B1 |
9258459 | Hartley | Feb 2016 | B2 |
9344606 | Hartley et al. | May 2016 | B2 |
9385983 | Sehn | Jul 2016 | B1 |
9396354 | Murphy et al. | Jul 2016 | B1 |
9407712 | Sehn | Aug 2016 | B1 |
9407816 | Sehn | Aug 2016 | B1 |
9430783 | Sehn | Aug 2016 | B1 |
9439041 | Parvizi et al. | Sep 2016 | B2 |
9443227 | Evans et al. | Sep 2016 | B2 |
9450907 | Pridmore et al. | Sep 2016 | B2 |
9459778 | Hogeg et al. | Oct 2016 | B2 |
9489661 | Evans et al. | Nov 2016 | B2 |
9491134 | Rosen et al. | Nov 2016 | B2 |
9507819 | Gross | Nov 2016 | B2 |
9532171 | Allen et al. | Dec 2016 | B2 |
9537811 | Allen et al. | Jan 2017 | B2 |
9628950 | Noeth et al. | Apr 2017 | B1 |
9710821 | Heath | Jul 2017 | B2 |
9854219 | Sehn | Dec 2017 | B2 |
10334307 | Hogeg et al. | Jun 2019 | B2 |
10440420 | Hogeg et al. | Oct 2019 | B2 |
10999623 | Hogeg et al. | May 2021 | B2 |
20020047868 | Miyazawa | Apr 2002 | A1 |
20020078456 | Hudson et al. | Jun 2002 | A1 |
20020087631 | Sharma | Jul 2002 | A1 |
20020097257 | Miller et al. | Jul 2002 | A1 |
20020122659 | Mcgrath et al. | Sep 2002 | A1 |
20020128047 | Gates | Sep 2002 | A1 |
20020144154 | Tomkow | Oct 2002 | A1 |
20030001846 | Davis et al. | Jan 2003 | A1 |
20030016247 | Lai et al. | Jan 2003 | A1 |
20030017823 | Mager et al. | Jan 2003 | A1 |
20030020623 | Cao et al. | Jan 2003 | A1 |
20030023874 | Prokupets et al. | Jan 2003 | A1 |
20030037124 | Yamaura et al. | Feb 2003 | A1 |
20030052925 | Daimon et al. | Mar 2003 | A1 |
20030101230 | Benschoter et al. | May 2003 | A1 |
20030110503 | Perkes | Jun 2003 | A1 |
20030126215 | Udell | Jul 2003 | A1 |
20030148773 | Spriestersbach et al. | Aug 2003 | A1 |
20030164856 | Prager et al. | Sep 2003 | A1 |
20030229607 | Zellweger et al. | Dec 2003 | A1 |
20040027371 | Jaeger | Feb 2004 | A1 |
20040064429 | Hirstius et al. | Apr 2004 | A1 |
20040078367 | Anderson et al. | Apr 2004 | A1 |
20040111467 | Willis | Jun 2004 | A1 |
20040158739 | Wakai et al. | Aug 2004 | A1 |
20040189465 | Capobianco et al. | Sep 2004 | A1 |
20040203959 | Coombes | Oct 2004 | A1 |
20040215625 | Svendsen et al. | Oct 2004 | A1 |
20040243531 | Dean | Dec 2004 | A1 |
20040243688 | Wugofski | Dec 2004 | A1 |
20050021444 | Bauer et al. | Jan 2005 | A1 |
20050022211 | Veselov et al. | Jan 2005 | A1 |
20050048989 | Jung | Mar 2005 | A1 |
20050078804 | Yomoda | Apr 2005 | A1 |
20050097176 | Schatz et al. | May 2005 | A1 |
20050102381 | Jiang et al. | May 2005 | A1 |
20050104976 | Currans | May 2005 | A1 |
20050114783 | Szeto | May 2005 | A1 |
20050119936 | Buchanan et al. | Jun 2005 | A1 |
20050122405 | Voss et al. | Jun 2005 | A1 |
20050193340 | Amburgey et al. | Sep 2005 | A1 |
20050193345 | Klassen et al. | Sep 2005 | A1 |
20050198128 | Anderson | Sep 2005 | A1 |
20050223066 | Buchheit et al. | Oct 2005 | A1 |
20050271300 | Pina | Dec 2005 | A1 |
20050288954 | McCarthy et al. | Dec 2005 | A1 |
20060026067 | Nicholas et al. | Feb 2006 | A1 |
20060107297 | Toyama et al. | May 2006 | A1 |
20060114338 | Rothschild | Jun 2006 | A1 |
20060119882 | Harris et al. | Jun 2006 | A1 |
20060242239 | Morishima et al. | Oct 2006 | A1 |
20060252438 | Ansamaa et al. | Nov 2006 | A1 |
20060265417 | Amato et al. | Nov 2006 | A1 |
20060270419 | Crowley et al. | Nov 2006 | A1 |
20060287878 | Wadhwa et al. | Dec 2006 | A1 |
20070004426 | Pfleging et al. | Jan 2007 | A1 |
20070038715 | Collins et al. | Feb 2007 | A1 |
20070040931 | Nishizawa | Feb 2007 | A1 |
20070073517 | Panje | Mar 2007 | A1 |
20070073823 | Cohen et al. | Mar 2007 | A1 |
20070075898 | Markhovsky et al. | Apr 2007 | A1 |
20070082707 | Flynt et al. | Apr 2007 | A1 |
20070136228 | Petersen | Jun 2007 | A1 |
20070192128 | Celestini | Aug 2007 | A1 |
20070198340 | Lucovsky et al. | Aug 2007 | A1 |
20070198495 | Buron et al. | Aug 2007 | A1 |
20070208751 | Cowan et al. | Sep 2007 | A1 |
20070210936 | Nicholson | Sep 2007 | A1 |
20070214180 | Crawford | Sep 2007 | A1 |
20070214216 | Carrer et al. | Sep 2007 | A1 |
20070233556 | Koningstein | Oct 2007 | A1 |
20070233801 | Eren et al. | Oct 2007 | A1 |
20070233859 | Zhao et al. | Oct 2007 | A1 |
20070243887 | Bandhole et al. | Oct 2007 | A1 |
20070244750 | Grannan et al. | Oct 2007 | A1 |
20070255456 | Funayama | Nov 2007 | A1 |
20070271521 | Harriger et al. | Nov 2007 | A1 |
20070281690 | Altman et al. | Dec 2007 | A1 |
20080022329 | Glad | Jan 2008 | A1 |
20080025701 | Ikeda | Jan 2008 | A1 |
20080032703 | Krumm et al. | Feb 2008 | A1 |
20080033930 | Warren | Feb 2008 | A1 |
20080043041 | Hedenstroem et al. | Feb 2008 | A2 |
20080049704 | Witteman et al. | Feb 2008 | A1 |
20080058006 | Ladouceur | Mar 2008 | A1 |
20080062141 | Chandhri | Mar 2008 | A1 |
20080076505 | Ngyen et al. | Mar 2008 | A1 |
20080092233 | Tian et al. | Apr 2008 | A1 |
20080094387 | Chen | Apr 2008 | A1 |
20080104503 | Beall et al. | May 2008 | A1 |
20080109844 | Baldeschweiler et al. | May 2008 | A1 |
20080120409 | Sun et al. | May 2008 | A1 |
20080147730 | Lee et al. | Jun 2008 | A1 |
20080148150 | Mall | Jun 2008 | A1 |
20080158230 | Sharma et al. | Jul 2008 | A1 |
20080168033 | Ott et al. | Jul 2008 | A1 |
20080168489 | Schraga | Jul 2008 | A1 |
20080189177 | Anderton et al. | Aug 2008 | A1 |
20080207176 | Brackbill et al. | Aug 2008 | A1 |
20080208692 | Garaventi et al. | Aug 2008 | A1 |
20080021421 | Rasanen et al. | Sep 2008 | A1 |
20080222545 | Lemay | Sep 2008 | A1 |
20080255976 | Altberg et al. | Oct 2008 | A1 |
20080256446 | Yamamoto | Oct 2008 | A1 |
20080256577 | Funaki et al. | Oct 2008 | A1 |
20080266421 | Takahata et al. | Oct 2008 | A1 |
20080270938 | Carlson | Oct 2008 | A1 |
20080288338 | Wiseman et al. | Nov 2008 | A1 |
20080306826 | Kramer et al. | Dec 2008 | A1 |
20080313329 | Wang et al. | Dec 2008 | A1 |
20080313346 | Kujawa et al. | Dec 2008 | A1 |
20080318616 | Chipalkatti et al. | Dec 2008 | A1 |
20090006191 | Arankalle et al. | Jan 2009 | A1 |
20090006565 | Velusamy et al. | Jan 2009 | A1 |
20090015703 | Kim et al. | Jan 2009 | A1 |
20090024956 | Kobayashi | Jan 2009 | A1 |
20090030774 | Rothschild et al. | Jan 2009 | A1 |
20090030999 | Gatzke et al. | Jan 2009 | A1 |
20090040324 | Nonaka | Feb 2009 | A1 |
20090042588 | Lottin et al. | Feb 2009 | A1 |
20090058822 | Chaudhri | Mar 2009 | A1 |
20090079846 | Chou | Mar 2009 | A1 |
20090008971 | Wood et al. | Apr 2009 | A1 |
20090089678 | Sacco et al. | Apr 2009 | A1 |
20090093261 | Ziskind | Apr 2009 | A1 |
20090011971 | Lo et al. | May 2009 | A1 |
20090132341 | Klinger | May 2009 | A1 |
20090132453 | Hangartner et al. | May 2009 | A1 |
20090132665 | Thomsen et al. | May 2009 | A1 |
20090148045 | Lee et al. | Jun 2009 | A1 |
20090153492 | Popp | Jun 2009 | A1 |
20090157450 | Athsani et al. | Jun 2009 | A1 |
20090157752 | Gonzalez | Jun 2009 | A1 |
20090160970 | Fredlund et al. | Jun 2009 | A1 |
20090163182 | Gatti et al. | Jun 2009 | A1 |
20090177299 | Bartel Marinus | Jul 2009 | A1 |
20090192900 | Collision | Jul 2009 | A1 |
20090199242 | Johnson et al. | Aug 2009 | A1 |
20090215469 | Fisher et al. | Aug 2009 | A1 |
20090232354 | Camp, Jr. et al. | Sep 2009 | A1 |
20090234815 | Boerries | Sep 2009 | A1 |
20090239552 | Churchill et al. | Sep 2009 | A1 |
20090249222 | Schmidt et al. | Oct 2009 | A1 |
20090249244 | Robinson et al. | Oct 2009 | A1 |
20090259949 | Verlaan et al. | Oct 2009 | A1 |
20090265647 | Martin et al. | Oct 2009 | A1 |
20090288022 | Almstrand | Nov 2009 | A1 |
20090291672 | Treves et al. | Nov 2009 | A1 |
20090292608 | Polachek | Nov 2009 | A1 |
20090319607 | Belz et al. | Dec 2009 | A1 |
20090327073 | Li | Dec 2009 | A1 |
20100062794 | Han | Mar 2010 | A1 |
20100080551 | Chen | Apr 2010 | A1 |
20100082427 | Burgener et al. | Apr 2010 | A1 |
20100082693 | Hugg et al. | Apr 2010 | A1 |
20100100568 | Papin et al. | Apr 2010 | A1 |
20100113065 | Narayan et al. | May 2010 | A1 |
20100120453 | Tamchina | May 2010 | A1 |
20100130233 | Lansing | May 2010 | A1 |
20100131880 | Lee et al. | May 2010 | A1 |
20100131895 | Wohlert | May 2010 | A1 |
20100153144 | Miller et al. | Jun 2010 | A1 |
20100159944 | Pascal et al. | Jun 2010 | A1 |
20100161658 | Hamynen et al. | Jun 2010 | A1 |
20100161831 | Haas et al. | Jun 2010 | A1 |
20100162149 | Sheleheda et al. | Jun 2010 | A1 |
20100183280 | Beauregard et al. | Jul 2010 | A1 |
20100185552 | Deluca et al. | Jul 2010 | A1 |
20100185665 | Horn et al. | Jul 2010 | A1 |
20100191631 | Weidmann | Jul 2010 | A1 |
20100197318 | Petersen et al. | Aug 2010 | A1 |
20100197319 | Petersen et al. | Aug 2010 | A1 |
20100198683 | Aarabi | Aug 2010 | A1 |
20100198694 | Muthukrishnan | Aug 2010 | A1 |
20100198826 | Petersen et al. | Aug 2010 | A1 |
20100198828 | Petersen et al. | Aug 2010 | A1 |
20100198862 | Jennings et al. | Aug 2010 | A1 |
20100198870 | Petersen et al. | Aug 2010 | A1 |
20100198917 | Petersen et al. | Aug 2010 | A1 |
20100201482 | Robertson et al. | Aug 2010 | A1 |
20100201536 | Robertson et al. | Aug 2010 | A1 |
20100214436 | Kim et al. | Aug 2010 | A1 |
20100223128 | Dukellis et al. | Sep 2010 | A1 |
20100223343 | Bosan et al. | Sep 2010 | A1 |
20100250109 | Johnston et al. | Sep 2010 | A1 |
20100257196 | Waters et al. | Oct 2010 | A1 |
20100259386 | Holley et al. | Oct 2010 | A1 |
20100273509 | Sweeney et al. | Oct 2010 | A1 |
20100281045 | Dean | Nov 2010 | A1 |
20100306669 | Della Pasqua | Dec 2010 | A1 |
20110004071 | Faiola et al. | Jan 2011 | A1 |
20110010205 | Richards | Jan 2011 | A1 |
20110012929 | Grosz | Jan 2011 | A1 |
20110026898 | Lussier | Feb 2011 | A1 |
20110029512 | Folgner et al. | Feb 2011 | A1 |
20110040783 | Uemichi et al. | Feb 2011 | A1 |
20110040804 | Peirce et al. | Feb 2011 | A1 |
20110050909 | Ellenby et al. | Mar 2011 | A1 |
20110050915 | Wang et al. | Mar 2011 | A1 |
20110064388 | Brown et al. | Mar 2011 | A1 |
20110066743 | Hurley et al. | Mar 2011 | A1 |
20110083101 | Sharon et al. | Apr 2011 | A1 |
20110102630 | Rukes | May 2011 | A1 |
20110119133 | Igelman et al. | May 2011 | A1 |
20110137881 | Cheng et al. | Jun 2011 | A1 |
20110145564 | Moshir et al. | Jun 2011 | A1 |
20110159890 | Fortescue et al. | Jun 2011 | A1 |
20110164163 | Bilbrey et al. | Jul 2011 | A1 |
20110197194 | D'Angelo et al. | Aug 2011 | A1 |
20110202598 | Evans et al. | Aug 2011 | A1 |
20110202968 | Nurmi | Aug 2011 | A1 |
20110211534 | Schmidt et al. | Sep 2011 | A1 |
20110213845 | Logan et al. | Sep 2011 | A1 |
20110215966 | Kim et al. | Sep 2011 | A1 |
20110225048 | Nair | Sep 2011 | A1 |
20110231288 | Crisan | Sep 2011 | A1 |
20110238763 | Shin et al. | Sep 2011 | A1 |
20110255736 | Thompson et al. | Oct 2011 | A1 |
20110273575 | Lee | Nov 2011 | A1 |
20110282799 | Huston | Nov 2011 | A1 |
20110283188 | Farrenkopf | Nov 2011 | A1 |
20110314419 | Dunn et al. | Dec 2011 | A1 |
20110320373 | Lee et al. | Dec 2011 | A1 |
20120028659 | Whitney et al. | Feb 2012 | A1 |
20120033718 | Kauffman et al. | Feb 2012 | A1 |
20120036015 | Sheikh | Feb 2012 | A1 |
20120036443 | Ohmori et al. | Feb 2012 | A1 |
20120054797 | Skog et al. | Mar 2012 | A1 |
20120059722 | Rao | Mar 2012 | A1 |
20120062805 | Candelore | Mar 2012 | A1 |
20120066573 | Berger | Mar 2012 | A1 |
20120084731 | Filman et al. | Apr 2012 | A1 |
20120084835 | Thomas et al. | Apr 2012 | A1 |
20120099800 | Llano et al. | Apr 2012 | A1 |
20120108293 | Law | May 2012 | A1 |
20120110096 | Smarr et al. | May 2012 | A1 |
20120113143 | Adhikari et al. | May 2012 | A1 |
20120113272 | Hata | May 2012 | A1 |
20120123830 | Svendsen et al. | May 2012 | A1 |
20120123871 | Svendsen et al. | May 2012 | A1 |
20120123875 | Svendsen et al. | May 2012 | A1 |
20120124126 | Alcazar et al. | May 2012 | A1 |
20120124176 | Curtis et al. | May 2012 | A1 |
20120124458 | Cruzada | May 2012 | A1 |
20120131507 | Sparandara et al. | May 2012 | A1 |
20120131512 | Takeuchi et al. | May 2012 | A1 |
20120001651 | Lalancette et al. | Jun 2012 | A1 |
20120143760 | Abulafia et al. | Jun 2012 | A1 |
20120150978 | Monaco | Jun 2012 | A1 |
20120166971 | Sachson et al. | Jun 2012 | A1 |
20120169855 | Oh | Jul 2012 | A1 |
20120172062 | Altman et al. | Jul 2012 | A1 |
20120173991 | Roberts et al. | Jul 2012 | A1 |
20120176401 | Hayward et al. | Jul 2012 | A1 |
20120184248 | Speede | Jul 2012 | A1 |
20120197724 | Kendall | Aug 2012 | A1 |
20120200743 | Blanchflower et al. | Aug 2012 | A1 |
20120209924 | Evans et al. | Aug 2012 | A1 |
20120210244 | De Francisco et al. | Aug 2012 | A1 |
20120212632 | Mate et al. | Aug 2012 | A1 |
20120220264 | Kawabata | Aug 2012 | A1 |
20120226748 | Bosworth et al. | Sep 2012 | A1 |
20120233000 | Fisher et al. | Sep 2012 | A1 |
20120236162 | Imamura | Sep 2012 | A1 |
20120239761 | Linner et al. | Sep 2012 | A1 |
20120250951 | Chen | Oct 2012 | A1 |
20120252418 | Kandekar et al. | Oct 2012 | A1 |
20120254325 | Majeti et al. | Oct 2012 | A1 |
20120278387 | Garcia | Nov 2012 | A1 |
20120278692 | Shi | Nov 2012 | A1 |
20120290637 | Perantatos et al. | Nov 2012 | A1 |
20120299954 | Wada et al. | Nov 2012 | A1 |
20120304052 | Tanaka et al. | Nov 2012 | A1 |
20120304080 | Wormald et al. | Nov 2012 | A1 |
20120307096 | Ford et al. | Dec 2012 | A1 |
20120307112 | Kunishige et al. | Dec 2012 | A1 |
20120311434 | Skrenta | Dec 2012 | A1 |
20120319904 | Lee et al. | Dec 2012 | A1 |
20120323933 | He et al. | Dec 2012 | A1 |
20120324018 | Metcalf et al. | Dec 2012 | A1 |
20130006759 | Srivastava et al. | Jan 2013 | A1 |
20130024757 | Doll et al. | Jan 2013 | A1 |
20130036364 | Johnson | Feb 2013 | A1 |
20130045753 | Obermeyer et al. | Feb 2013 | A1 |
20130050260 | Reitan | Feb 2013 | A1 |
20130055083 | Fino | Feb 2013 | A1 |
20130057587 | Leonard et al. | Mar 2013 | A1 |
20130059607 | Herz et al. | Mar 2013 | A1 |
20130060690 | Oskolkov et al. | Mar 2013 | A1 |
20130063369 | Malhotra et al. | Mar 2013 | A1 |
20130067027 | Song et al. | Mar 2013 | A1 |
20130071093 | Hanks et al. | Mar 2013 | A1 |
20130080254 | Thramann | Mar 2013 | A1 |
20130085790 | Palmer et al. | Apr 2013 | A1 |
20130086072 | Peng et al. | Apr 2013 | A1 |
20130090171 | Holton et al. | Apr 2013 | A1 |
20130095857 | Garcia et al. | Apr 2013 | A1 |
20130102328 | Kalofonos | Apr 2013 | A1 |
20130104053 | Thornton et al. | Apr 2013 | A1 |
20130110885 | Brundrett, III | May 2013 | A1 |
20130111514 | Slavin et al. | May 2013 | A1 |
20130128059 | Kristensson | May 2013 | A1 |
20130129252 | Lauper | May 2013 | A1 |
20130132477 | Bosworth et al. | May 2013 | A1 |
20130145286 | Feng et al. | Jun 2013 | A1 |
20130159110 | Rajaram et al. | Jun 2013 | A1 |
20130159919 | Leydon | Jun 2013 | A1 |
20130167087 | Tighe | Jun 2013 | A1 |
20130169822 | Zhu et al. | Jul 2013 | A1 |
20130173729 | Starenky et al. | Jul 2013 | A1 |
20130182133 | Tanabe | Jul 2013 | A1 |
20130185131 | Sinha et al. | Jul 2013 | A1 |
20130191198 | Carlson et al. | Jul 2013 | A1 |
20130194301 | Robbins et al. | Aug 2013 | A1 |
20130198176 | Kim | Aug 2013 | A1 |
20130218965 | Abrol et al. | Aug 2013 | A1 |
20130218968 | McEvilly et al. | Aug 2013 | A1 |
20130222323 | Mckenzie | Aug 2013 | A1 |
20130227476 | Frey | Aug 2013 | A1 |
20130232194 | Knapp et al. | Sep 2013 | A1 |
20130262568 | Raffel | Oct 2013 | A1 |
20130263031 | Oshiro et al. | Oct 2013 | A1 |
20130265450 | Barnes, Jr. | Oct 2013 | A1 |
20130267253 | Case et al. | Oct 2013 | A1 |
20130275505 | Gauglitz et al. | Oct 2013 | A1 |
20130290443 | Collins et al. | Oct 2013 | A1 |
20130304646 | De Geer | Nov 2013 | A1 |
20130311255 | Cummins et al. | Nov 2013 | A1 |
20130325964 | Berberat | Dec 2013 | A1 |
20130332379 | Hayes | Dec 2013 | A1 |
20130344896 | Kirmse et al. | Dec 2013 | A1 |
20130346869 | Asver et al. | Dec 2013 | A1 |
20130346877 | Borovoy et al. | Dec 2013 | A1 |
20140006129 | Heath | Jan 2014 | A1 |
20140011538 | Mulcahy et al. | Jan 2014 | A1 |
20140019264 | Wachman et al. | Jan 2014 | A1 |
20140032682 | Prado et al. | Jan 2014 | A1 |
20140043204 | Basnayake et al. | Feb 2014 | A1 |
20140045530 | Gordon et al. | Feb 2014 | A1 |
20140047016 | Rao | Feb 2014 | A1 |
20140047045 | Baldwin et al. | Feb 2014 | A1 |
20140047335 | Lewis et al. | Feb 2014 | A1 |
20140049652 | Moon et al. | Feb 2014 | A1 |
20140052485 | Shidfar | Feb 2014 | A1 |
20140052633 | Gandhi | Feb 2014 | A1 |
20140057660 | Wager | Feb 2014 | A1 |
20140082651 | Sharifi | Mar 2014 | A1 |
20140092130 | Anderson et al. | Apr 2014 | A1 |
20140096029 | Schultz | Apr 2014 | A1 |
20140114565 | Aziz et al. | Apr 2014 | A1 |
20140122658 | Haeger et al. | May 2014 | A1 |
20140122787 | Shalvi et al. | May 2014 | A1 |
20140129953 | Spiegel | May 2014 | A1 |
20140143143 | Fasoli et al. | May 2014 | A1 |
20140149519 | Redfern et al. | May 2014 | A1 |
20140155102 | Cooper et al. | Jun 2014 | A1 |
20140173424 | Hogeg et al. | Jun 2014 | A1 |
20140173457 | Wang et al. | Jun 2014 | A1 |
20140189592 | Benchenaa et al. | Jul 2014 | A1 |
20140201227 | Hamilton-Dick | Jul 2014 | A1 |
20140207679 | Cho | Jul 2014 | A1 |
20140214471 | Schreiner, III | Jul 2014 | A1 |
20140222564 | Kranendonk et al. | Aug 2014 | A1 |
20140258405 | Perkin | Sep 2014 | A1 |
20140265359 | Cheng et al. | Sep 2014 | A1 |
20140266703 | Dalley, Jr. et al. | Sep 2014 | A1 |
20140279061 | Elimeliah et al. | Sep 2014 | A1 |
20140279436 | Dorsey et al. | Sep 2014 | A1 |
20140279540 | Jackson | Sep 2014 | A1 |
20140280537 | Pridmore et al. | Sep 2014 | A1 |
20140282096 | Rubinstein et al. | Sep 2014 | A1 |
20140287779 | O'Keefe et al. | Sep 2014 | A1 |
20140289833 | Briceno | Sep 2014 | A1 |
20140306986 | Gottesman et al. | Oct 2014 | A1 |
20140317302 | Naik | Oct 2014 | A1 |
20140324627 | Haver et al. | Oct 2014 | A1 |
20140324629 | Jacobs | Oct 2014 | A1 |
20140325383 | Brown et al. | Oct 2014 | A1 |
20140372369 | Babanov | Dec 2014 | A1 |
20150020086 | Chen et al. | Jan 2015 | A1 |
20150046278 | Pei et al. | Feb 2015 | A1 |
20150071619 | Brough | Mar 2015 | A1 |
20150087263 | Branscomb et al. | Mar 2015 | A1 |
20150088622 | Ganschow et al. | Mar 2015 | A1 |
20150095020 | Leydon | Apr 2015 | A1 |
20150096042 | Mizrachi | Apr 2015 | A1 |
20150116529 | Wu et al. | Apr 2015 | A1 |
20150169827 | Laborde | Jun 2015 | A1 |
20150172534 | Miyakawa et al. | Jun 2015 | A1 |
20150178260 | Brunson | Jun 2015 | A1 |
20150222814 | Li et al. | Aug 2015 | A1 |
20150261917 | Smith | Sep 2015 | A1 |
20150312184 | Langholz et al. | Oct 2015 | A1 |
20150350136 | Flynn, III et al. | Dec 2015 | A1 |
20150365795 | Allen et al. | Dec 2015 | A1 |
20150378502 | Hu et al. | Dec 2015 | A1 |
20160006927 | Sehn | Jan 2016 | A1 |
20160014063 | Hogeg et al. | Jan 2016 | A1 |
20160085773 | Chang et al. | Mar 2016 | A1 |
20160085863 | Allen et al. | Mar 2016 | A1 |
20160099901 | Allen et al. | Apr 2016 | A1 |
20160180887 | Sehn | Jun 2016 | A1 |
20160182422 | Sehn et al. | Jun 2016 | A1 |
20160182875 | Sehn | Jun 2016 | A1 |
20160239248 | Sehn | Aug 2016 | A1 |
20160277419 | Allen et al. | Sep 2016 | A1 |
20160321708 | Sehn | Nov 2016 | A1 |
20160373805 | Hogeg et al. | Dec 2016 | A1 |
20170006094 | Abou Mahmoud et al. | Jan 2017 | A1 |
20170053365 | Koch | Feb 2017 | A1 |
20170061308 | Chen et al. | Mar 2017 | A1 |
20170287006 | Azmoodeh et al. | Oct 2017 | A1 |
20180113587 | Allen | Apr 2018 | A1 |
20180255345 | Hogeg et al. | Sep 2018 | A1 |
20190364328 | Hogeg et al. | Nov 2019 | A1 |
Number | Date | Country |
---|---|---|
2887596 | Jul 2015 | CA |
2051480 | Apr 2009 | EP |
2151797 | Feb 2010 | EP |
2315167 | Apr 2011 | EP |
2732383 | Apr 2018 | EP |
3288275 | Dec 2021 | EP |
2399928 | Sep 2004 | GB |
19990073076 | Oct 1999 | KR |
20010078417 | Aug 2001 | KR |
WO-1996024213 | Aug 1996 | WO |
WO-1999063453 | Dec 1999 | WO |
WO-2000058882 | Oct 2000 | WO |
WO-2001029642 | Apr 2001 | WO |
WO-2001050703 | Jul 2001 | WO |
WO-2006118755 | Nov 2006 | WO |
WO-2007092668 | Aug 2007 | WO |
WO-2008043143 | Apr 2008 | WO |
WO-2009043020 | Apr 2009 | WO |
WO-2011040821 | Apr 2011 | WO |
WO-2011119407 | Sep 2011 | WO |
WO-2013008238 | Jan 2013 | WO |
WO-2013045753 | Apr 2013 | WO |
WO-2014006129 | Jan 2014 | WO |
WO-2014068573 | May 2014 | WO |
WO-2014115136 | Jul 2014 | WO |
WO-2014194262 | Dec 2014 | WO |
WO-2015192026 | Dec 2015 | WO |
WO-2016044424 | Mar 2016 | WO |
WO-2016054562 | Apr 2016 | WO |
WO-2016065131 | Apr 2016 | WO |
WO-2016100318 | Jun 2016 | WO |
WO-2016100342 | Jun 2016 | WO |
WO-2016149594 | Sep 2016 | WO |
WO-2016179166 | Nov 2016 | WO |
Entry |
---|
addictivetips.com, “Top 10 Android Apps For Photo Editing, Styling And Sharing”, [Online] Retrieved from the Internet: <URL: http://www.addictivetips.com/mobile/top-10-android-apps-for-photo-editing-styling-and-sharing/>, (Jun. 20, 2011), 1-12. |
“A Whole New Story”, Snap, Inc., [Online] Retrieved from the Internet: <URL: https://www.snap.com/en-US/news/>, (2017), 13 pgs. |
“Adding photos to your listing”, eBay, [Online] Retrieved from the Internet: <URL: http://pages.ebay.com/help/sell/pictures.html>, (accessed May 24, 2017), 4 pgs. |
“U.S. Appl. No. 14/232,274, Final Office Action dated Mar. 25, 2016”, 18 pgs. |
“U.S. Appl. No. 14/232,274, Non Final Office Action dated Sep. 14, 2015”, 16 pgs. |
“U.S. Appl. No. 14/232,274, Notice of Allowance dated Jun. 13, 2016”, 7 pgs. |
“U.S. Appl. No. 14/232,274, Preliminary Amendment filed Jan. 12, 2014”, 8 pgs. |
“U.S. Appl. No. 14/232,274, Response filed May 4, 2016 to Final Office Action dated Mar. 25, 2016”, 13 pgs. |
“U.S. Appl. No. 14/232,274, Response filed Dec. 10, 2015 to Non Final Office Action dated Sep. 14, 2015”, 11 pgs. |
“U.S. Appl. No. 15/250,960, Corrected Notice of Allowability dated Apr. 25, 2019”, 2 pgs. |
“U.S. Appl. No. 15/250,960, Non Final Office Action dated Jul. 5, 2018”, 16 pgs. |
“U.S. Appl. No. 15/250,960, Non Final Office Action dated Oct. 30, 2017”, 15 pgs. |
“U.S. Appl. No. 15/250,960, Notice of Allowance dated Jan. 17, 2019”, 7 pgs. |
“U.S. Appl. No. 15/250,960, Notice of Allowance dated Mar. 14, 2018”, 10 pgs. |
“U.S. Appl. No. 15/250,960, Response filed Jan. 30, 2018 to Non Final Office Action dated Oct. 30, 2017”, 16 pgs. |
“U.S. Appl. No. 15/250,960, Response filed Oct. 4, 2018 to Non Final Office Action dated Jul. 5, 2018”, 14 pgs. |
“U.S. Appl. No. 15/974,409, Corrected Notice of Allowability dated Jul. 22, 2019”, 2 pgs. |
“U.S. Appl. No. 15/974,409, Non Final Office Action dated Jun. 22, 2018”, 13 pgs. |
“U.S. Appl. No. 15/974,409, Non Final Office Action dated Oct. 12, 2018”, 12 pgs. |
“U.S. Appl. No. 15/974,409, Notice of Allowance dated May 17, 2019”, 7 pgs. |
“U.S. Appl. No. 15/974,409, PTO Response to Rule 312 Communication dated Aug. 23, 2019”, 2 pgs. |
“U.S. Appl. No. 15/974,409, Response filed Jan. 14, 2019 to Non Final Office Action dated Oct. 12, 2018”, 14 pgs. |
“U.S. Appl. No. 15/974,409, Response filed Sep. 21, 2018 to Non Final Office Action dated Jun. 22, 2018”, 12 pgs. |
“U.S. Appl. No. 16/538,397, Corrected Notice of Allowability dated Feb. 3, 2021”, 3 pgs. |
“U.S. Appl. No. 16/538,397, Examiner Interview Summary dated Dec. 15, 2020”, 3 pgs. |
“U.S. Appl. No. 16/538,397, Non Final Office Action dated Sep. 4, 2020”, 18 pgs. |
“U.S. Appl. No. 16/538,397, Notice of Allowance dated Jan. 6, 2021”, 7 pgs. |
“U.S. Appl. No. 16/538,397, Response filed Dec. 2, 2020 to Non Final Office Action dated Sep. 4, 2020”, 10 pgs. |
“BlogStomp”, StompSoftware, [Online] Retrieved from the Internet: <URL: http://stompsoftware.com/blogstomp>, (accessed May 24, 2017), 12 pgs. |
“Brazilian Application Serial No. 112014000615-6, Office Action dated Oct. 29, 2019”, w/o English Translation, 5 pgs. |
“Brazilian Application Serial No. 112014000615-6, Response filed Feb. 7, 2020 to Office Action dated Oct. 29, 2019”, w/ Concise Statement of Relevance, 86 pgs. |
“Cup Magic Starbucks Holiday Red Cups come to life with AR app”, Blast Radius, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20160711202454/http://www.blastradius.com/work/cup-magic>, (2016), 7 pgs. |
“Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place”, TechPP, [Online] Retrieved from the Internet: <URL: http://techpp.com/2013/02/15/instaplace-app-review>, (2013), 13 pgs. |
“Dear All Photo Apps: Mobli Just Won Filters”, [Online] Retrieved from the Internet: <URL: http://techcrunch.com/2011/09/08/mobli-filters/>, (Sep. 8, 2011), 8 pgs. |
“European Application Serial No. 12811671.2, Communication Pursuant to Article 94(3) EPC dated Jan. 2, 2017”, 8 pgs. |
“European Application Serial No. 12811671.2, Extended European Search Report dated Feb. 4, 2015”, 8 pgs. |
“European Application Serial No. 12811671.2, Response filed May 12, 2017 to Communication Pursuant to Article 94(3) EPC dated Jan. 2, 2017”, 11 pgs. |
“European Application Serial No. 12811671.2, Response filed Aug. 26, 2015 to Extended European Search Report dated Feb. 4, 2015”, 15 pgs. |
“European Application Serial No. 17196636.9, Communication Pursuant to Article 94(3) EPC dated May 31, 2019”, 8 pgs. |
“European Application Serial No. 17196636.9, Extended European Search Report dated Nov. 7, 2017”, 7 pgs. |
“European Application Serial No. 17196636.9, Response filed Dec. 9, 2019 to Communication Pursuant to Article 94(3) EPC dated May 31, 2019”, 14 pgs. |
“European Application Serial No. 17196636.9, Summons to Attend Oral Proceedings dated Dec. 1, 2020”, 11 pgs. |
“European Application Serial No. 17196636.9, Response filed Mar. 26, 2021 to Summons to Attend Oral Proceedings dated Dec. 1, 2020”, 23 pgs. |
“Instagram: Beautiful Photo Editing and Sharing App for iPhone”, MakeUseOf, Retrieved From the Internet, (Oct. 15, 2010). |
“InstaPlace Photo App Tell The Whole Story”, [Online] Retrieved from the Internet: <URL: youtu.be/uF_gFkg1hBM>, (Nov. 8, 2013), 113 pgs., 1:02 min. |
“International Application Serial No. PCT/IL2012/050242, International Preliminary Report on Patentability dated Jan. 23, 2014”, 7 pgs. |
“International Application Serial No. PCT/IL2012/050242, International Search Report dated Nov. 9, 2012”, 2 pgs. |
“International Application Serial No. PCT/IL2012/050242, Written Opinion dated Nov. 9, 2012”, 5 pgs. |
“International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015”, 2 pgs. |
“Introducing Snapchat Stories”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20131026084921/https://www.youtube.com/watch?v=88Cu3yN-LIM>, (Oct. 3, 2013), 92 pgs.; 00:47 min. |
“Israel Application Serial No. 230366, Office Action dated Mar. 16, 2017”, 2 pgs. |
“Israel Application Serial No. 230366, Office Action dated Sep. 7, 2015”, 2 pgs. |
“Israel Application Serial No. 255797, Office Action dated Jan. 20, 2020”, w/ English Translation, 7 pgs. |
“Israel Application Serial No. 255797, Office Action dated Oct. 21, 2018”, w/ English translation, 8 pgs. |
“Israel Application Serial No. 255797, Response Filed Feb. 12, 2019 to Office Action dated Oct. 21, 2018”, w/English Claims, 25 pgs. |
“Israel Application Serial No. 255797, Response filed May 20, 2020 to Office Action dated Jan. 20, 2020”, w/ English Claims, 15 pgs. |
“Israeli Application Serial No. 230366, Response filed Jul. 16, 2017 to Office Action dated Mar. 16, 2017”, 37 pgs. |
“Israeli Application Serial No. 255797, Notification Prior to Examination dated Nov. 23, 2017”, with English Translation, 6 pgs. |
“Macy's Believe-o-Magic”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20190422101854/https://www.youtube.com/watch?v=xvzRXy3J0Z0&feature=youtu.be>, (Nov. 7, 2011), 102 pgs.; 00:51 min. |
“Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 Believe Campaign”, Business Wire, [Online] Retrieved from the Internet: <URL: https://www.businesswire.com/news/home/20111102006759/en/Macys-Introduces-Augmented-Reality-Experience-Stores-Country>, (Nov. 2, 2011), 6 pgs. |
“Mexican Application Serial No. MX/a/2014/000392, Office Action dated Dec. 12, 2014”, 3 pgs. |
“Mexican Application Serial No. MX/a/2014/000392, Response filed Apr. 16, 2015 to Office Action dated Dec. 12, 2014”, 15 pgs. |
“Starbucks Cup Magic”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=RWwQXi9RG0w>, (Nov. 8, 2011), 87 pgs.; 00:47 min. |
“Starbucks Cup Magic for Valentine's Day”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=8nvqOzjq10w>, (Feb. 6, 2012), 88 pgs.; 00:45 min. |
“Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season”, Business Wire, [Online] Retrieved from the Internet: <URL: http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return>, (Nov. 15, 2011), 5 pgs. |
Carthy, Roi, “Dear All Photo Apps: Mobli Just Won Filters”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2011/09/08/mobli-filters>, (Sep. 8, 2011), 10 pgs. |
Janthong, Isaranu, “Instaplace ready on Android Google Play store”, Android App Review Thailand, [Online] Retrieved from the Internet: <URL: http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html>, (Jan. 23, 2013), 9 pgs. |
McMahon, Ken, et al., “Paint Shop Pro 9”, [Online] Retrieved from the Internet: <URL: http://academic.safaribooksonline.com/book/photo-and-graphic-manipulation/9780240519814>, (Mar. 22, 2005), 1-4. |
Khan, Sameed, “Top 10 Android Apps For Photo Editing, Styling And Sharing”, URL: https://www.addictivetips.com/mobile/top-10-android-apps-for-photo-editing-styling-and-sharing/, (Jun. 7, 2011), 12 pgs. |
MacLeod, Duncan, “Macys Believe-o-Magic App”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app>, (Nov. 14, 2011), 10 pgs. |
MacLeod, Duncan, “Starbucks Cup Magic Lets Merry”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/starbucks-cup-magic>, (Nov. 12, 2011), 8 pgs. |
Notopoulos, Katie, “A Guide to the New Snapchat Filters and Big Fonts”, [Online] Retrieved from the Internet: <URL: https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term=.bkQ9qVZWe#.nv58YXpkV>, (Dec. 22, 2013), 13 pgs. |
Panzarino, Matthew, “Snapchat Adds Filters, A Replay Function and for Whatever Reason, Time, Temperature and Speed Overlays”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/>, (Dec. 20, 2013), 12 pgs. |
Tripathi, Rohit, “Watermark Images in PHP and Save File on Server”, [Online] Retrieved from the Internet: <URL: http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-server>, (Dec. 28, 2012), 4 pgs. |
“Israel Application Serial No. 282379, Office Action dated Oct. 28, 2021”, w/ English Translation, 7 pgs. |
“Israel Application Serial No. 282379, Response filed Feb. 23, 2022 to Office Action dated Oct. 28, 2021”, w/ English claims, 13 pgs. |
“European Application Serial No. 21218404.8, Extended European Search Report dated Mar. 14, 2022”, 11 pgs. |
Number | Date | Country
---|---|---|
20210227284 A1 | Jul 2021 | US |
Number | Date | Country
---|---|---|
61506670 | Jul 2011 | US |
Relation | Number | Date | Country
---|---|---|---|
Parent | 16538397 | Aug 2019 | US |
Child | 17224973 | | US |
Parent | 15974409 | May 2018 | US |
Child | 16538397 | | US |
Parent | 15250960 | Aug 2016 | US |
Child | 15974409 | | US |
Parent | 14232274 | | US |
Child | 15250960 | | US |