The present disclosure relates generally to analysis of user behavior, and more specifically to systems and methods for generating contextual scenarios respective of user device activity.
Mobile launchers are used for navigating through content existing on mobile devices as well as content available over the World Wide Web. The mobile launchers usually configure user interfaces and/or provide content upon receiving requests from a user of the mobile device. Such requests are usually received as a query that a user provides via the interfaces.
As content providers are usually motivated by financial incentives, the content may not be user-oriented. As a result, users may not receive content that efficiently meets their requirements. Specifically, the content may be arranged or otherwise presented so as to emphasize content that provides a financial benefit for content providers rather than emphasizing content that best matches the user's needs at a given time. As a result, users would be required to wade through significant amounts of content that are irrelevant to their present needs before finding relevant content. This would waste users' time and computing resources, and may result in user frustration.
Further, even when content is intended to be user-oriented, existing mobile launchers face several drawbacks with respect to user convenience. For example, existing mobile launchers typically require interaction with the user to obtain desired content. The interaction with the user device may be undesirable when, e.g., the user is driving or otherwise engaged in activity that makes interaction difficult or unsafe. Additionally, due to the increasing number of sources of content, a user may not be aware of which content source will provide appropriate content based on the user's current intent.
It would therefore be advantageous to provide a solution that would overcome the deficiencies of the prior art.
A summary of several example embodiments of the disclosure follows. This summary is provided for the convenience of the reader to provide a basic understanding of such embodiments and does not wholly define the breadth of the disclosure. This summary is not an extensive overview of all contemplated embodiments, and is intended to neither identify key or critical elements of all embodiments nor to delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more embodiments in a simplified form as a prelude to the more detailed description that is presented later. For convenience, the term “some embodiments” may be used herein to refer to a single embodiment or multiple embodiments of the disclosure.
The disclosed embodiments include a method for executing actions respective of contextual scenarios. The method comprises determining at least one variable based in part on at least one signal captured by at least one sensor of the user device; generating at least one insight based on the at least one variable; generating a context for the user device based on the at least one insight; determining, based on the context, a user intent, wherein the user intent includes at least one action; and causing execution of the at least one action on the user device.
The disclosed embodiments also include a user device configured to execute actions respective of contextual scenarios related to the user device. The user device comprises at least one sensor; a processing unit; and a memory, the memory containing instructions that, when executed by the processing unit, configure the user device to: determine at least one variable based in part on at least one signal captured by the at least one sensor; generate at least one insight based on the at least one variable; generate a context for the user device based on the at least one insight; determine, based on the context, a user intent, wherein the user intent includes at least one action; and execute the at least one action.
The disclosed embodiments also include a system for executing actions respective of contextual scenarios related to a user device. The system comprises a processing unit; and a memory, the memory containing instructions that, when executed by the processing unit, configure the system to: determine at least one variable based in part on at least one signal captured by at least one sensor of the user device; generate at least one insight based on the at least one variable; generate a context for the user device based on the at least one insight; determine, based on the context, a user intent, wherein the user intent includes at least one action; and cause execution of the at least one action on the user device.
The subject matter disclosed herein is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other objects, features, and advantages of the disclosed embodiments will be apparent from the following detailed description taken in conjunction with the accompanying drawings.
It is important to note that the embodiments disclosed herein are only examples of the many advantageous uses of the innovative teachings herein. In general, statements made in the specification of the present application do not necessarily limit any of the various claimed embodiments. Moreover, some statements may apply to some inventive features but not to others. In general, unless otherwise indicated, singular elements may be in plural and vice versa with no loss of generality. In the drawings, like numerals refer to like parts through several views.
The various disclosed embodiments include a method and system for executing actions on a user device respective of context. Insights are generated based on signals collected by the user device. The generated insights are analyzed and aggregated. Contextual scenarios are generated respective of the analysis and aggregation. Based on the contextual scenarios, actions are executed.
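By way of a non-limiting illustration, this flow can be sketched as a short pipeline. The following Python sketch is illustrative only; the function names, rules, and action labels are assumptions made for the example and do not represent the actual implementation of the disclosed system.

```python
# Minimal sketch of the flow described above; stage names, rules, and labels
# are illustrative assumptions, not the actual CES implementation.

def generate_insights(signals: dict) -> set:
    """Each insight is a (classification, conclusion) pair drawn from a signal."""
    insights = set()
    if signals.get("bluetooth") == "car":
        insights.add(("connection", "driving"))
    if signals.get("hour") in range(7, 10):
        insights.add(("temporal", "morning_commute_window"))
    return insights

def derive_context(insights: set) -> str:
    """Aggregate the insights into a single contextual scenario label."""
    if {("connection", "driving"), ("temporal", "morning_commute_window")} <= insights:
        return "morning_drive"
    return "unknown"

def actions_for(context: str) -> list:
    """Actions that would fulfill the intent predicted for the context."""
    return {"morning_drive": ["show_traffic_update"]}.get(context, [])

print(actions_for(derive_context(generate_insights({"bluetooth": "car", "hour": 8}))))
# ['show_traffic_update']
```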
The user device 110 may allow access to a plurality of applications 113 (hereinafter referred to individually as an application (app) 113 and collectively as applications 113, merely for simplicity purposes). An application 113 may include a native application and/or a virtual application. A native application (or app) is installed and executed on the user device 110. A virtual application (app) is executed on a server, and only relevant content is rendered and sent to the user device 110. That is, the virtual application typically runs within a browser embedded in another program, thereby permitting users to utilize virtual versions of the application without any download to the user device 110 directly.
The user device 110 has an agent 115 installed therein. The agent 115 may be implemented as an application program having instructions that reside in a memory of the respective user device 110. The agent 115 is communicatively connected to the CES 130 over the network 120. The user device 110 further includes a plurality of sensors 111-1 through 111-N (hereinafter referred to collectively as sensors 111 and individually as a sensor 111, merely for simplicity purposes) implemented thereon. The sensors 111 are configured to collect signals related to a current state of a user of the user device 110. The sensors 111 may include, but are not limited to, a global positioning system (GPS), a temperature sensor, an accelerometer, a gyroscope, a microphone, a speech recognizer, a camera, a web camera, a touch screen, and so on. The signals may include, but are not limited to, raw data related to a user of the user device and/or to the user device 110.
In an embodiment, the agent 115 may be configured to determine one or more variables based on the sensor signals and to send the determined variables to the CES 130. Such variables may include environmental variables that indicate, for example, temporal information (e.g., a time of day, a day, a week, a month, a year), location information, motion information, weather information, sounds, images, user interactions with the user device 110, connections to external devices, and so on. In an embodiment, any of the variables may be received from a resource external to the user device 110, e.g., from the variables database 150. Optionally, the received variables may further include personal variables such as, e.g., a user profile, demographic information related to the user, user preferences, and so on.
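As a non-limiting illustration of this step, the sketch below shows how raw sensor signals might be reduced to environmental variables and merged with personal variables obtained from an external source. The signal keys, thresholds, and variable names are assumptions made solely for this example.

```python
# A sketch of reducing raw sensor signals to environmental variables.
# The signal format and variable names are illustrative assumptions.
from datetime import datetime

def to_environmental_variables(raw_signals: dict) -> dict:
    variables = {}
    if "gps" in raw_signals:                       # (latitude, longitude) fix
        variables["location"] = raw_signals["gps"]
    if "accelerometer" in raw_signals:             # acceleration magnitude in m/s^2
        variables["in_motion"] = raw_signals["accelerometer"] > 1.5
    if "clock" in raw_signals:                     # datetime of the sample
        ts = raw_signals["clock"]
        variables["time_of_day"] = ts.hour
        variables["day_of_week"] = ts.strftime("%A")
    if "bluetooth" in raw_signals:                 # name of a paired device
        variables["connected_device"] = raw_signals["bluetooth"]
    return variables

def merge_personal_variables(env: dict, profile: dict) -> dict:
    """Personal variables (e.g., a user profile) may come from an external resource."""
    return {**env, **{f"profile_{k}": v for k, v in profile.items()}}

env = to_environmental_variables({"gps": (40.71, -74.00),
                                  "accelerometer": 0.3,
                                  "clock": datetime(2015, 3, 2, 13, 0),
                                  "bluetooth": "car_audio"})
print(merge_personal_variables(env, {"home_city": "New York"}))
```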
Respective of the variables, the CES 130 is configured to generate one or more insights related to a user of the user device 110. The insights may include, but are not limited to, classifications of variables as well as conclusions based on the variables. The insights may include environmental insights and/or personal insights. Environmental insights are generally generated based on variables over which users typically have limited or no control. Examples of such variables include, but are not limited to, a time of day, location, motion information, weather information, sounds, images, and more. Personal insights may include, but are not limited to, user profile information, demographic information related to the user, actions executed by the user on the user device 110, and so on.
As a non-limiting example, an environmental insight related to a variable indicating a time of 1:00 P.M. on a Monday may classify the variable as a temporal variable and may include a conclusion that the user is at work. As another non-limiting example, an environmental insight related to a variable indicating a Bluetooth connection to a computing system in a car may classify the variable as a connection and include a conclusion that the user is driving or is about to drive.
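The two examples above can be expressed as simple classification rules, as in the following non-limiting sketch; the rule thresholds, labels, and the dictionary format of an insight are assumptions made for illustration.

```python
# Illustrative insight generation matching the two examples above. The rule
# thresholds and return format are assumptions for demonstration only.

def temporal_insight(day_of_week: str, hour: int) -> dict:
    working_day = day_of_week in {"Monday", "Tuesday", "Wednesday", "Thursday", "Friday"}
    return {"classification": "temporal",
            "conclusion": "user_at_work" if working_day and 9 <= hour < 18 else "user_off_work"}

def connection_insight(connected_device: str) -> dict:
    return {"classification": "connection",
            "conclusion": "user_driving" if connected_device == "car_audio" else "unknown"}

print(temporal_insight("Monday", 13))    # {'classification': 'temporal', 'conclusion': 'user_at_work'}
print(connection_insight("car_audio"))   # {'classification': 'connection', 'conclusion': 'user_driving'}
```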
Based on the insights, the CES 130 is configured to determine a contextual scenario, i.e., a context, related to the user device 110. The determination of the contextual scenario is performed by the CES 130 using a contextual database 140 having stored therein a plurality of contexts associated with a plurality of respective insights. The context represents a current state of the user as demonstrated via the insights. In an embodiment, based on the contextual scenario, the CES 130 is configured to predict an intent of the user.
The intent may represent the type of content, the content, and/or actions that may be of interest to the user at the current time. For example, if the time is 8 A.M. and it is determined that the user device 110 is communicatively connected to a car via Bluetooth, the user intent may be related to a “traffic update.” In an embodiment, the sensor signals may be further monitored to determine any changes that may in turn change the intent. In a further embodiment, the changed signals may be analyzed to determine a current (updated) intent of the user.
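A minimal sketch of intent resolution, and of re-evaluating the intent when the monitored signals change, follows. The intent labels, the time window, and the simulated signal samples are assumptions made for illustration.

```python
# A sketch of intent resolution and re-evaluation when signals change.
# The intent labels and the simple polling loop are illustrative assumptions.

def resolve_intent(hour: int, connected_to_car: bool) -> str:
    if connected_to_car and 7 <= hour <= 9:
        return "traffic update"
    if connected_to_car:
        return "navigation"
    return "general browsing"

# Re-evaluate whenever the monitored signals change.
previous = None
for hour, in_car in [(8, True), (8, True), (10, False)]:   # simulated signal samples
    intent = resolve_intent(hour, in_car)
    if intent != previous:
        print(f"intent changed to: {intent}")
        previous = intent
# intent changed to: traffic update
# intent changed to: general browsing
```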
The CES 130 is further configured to perform a plurality of actions on the user device 110 via the agent 115 based on the contextual scenarios. The actions may include, but are not limited to, displaying and/or launching application programs, displaying and/or providing certain content from the web and/or from the user device 110, and so on.
In an embodiment, performing of an action may include selecting resources (not shown) that would serve the user intent. The resources may be, but are not limited to, web search engines, servers of content providers, vertical comparison engines, servers of content publishers, mobile applications, widgets installed on the user device 110, and so on. In certain configurations, the resources may include “cloud-based” applications, that is, applications executed by servers in a cloud-computing infrastructure, such as, but not limited to, a private-cloud, a public-cloud, or any combination thereof. Selecting resources based on user intent is described further in the above-referenced U.S. patent application Ser. No. 13/712,563, assigned to the common assignee, which is hereby incorporated by reference.
The actions to be performed may be determined so as to fulfill the determined user intent(s). For example, if the user intent is determined to be “soccer game update,” the actions performed may be to launch a sports news application program and/or to display a video clip showing highlights of the most recent soccer game. As another example, if the user intent is determined to be “weather forecast,” the actions performed may be to launch a weather application and/or to display an article from the Internet regarding significant meteorological events.
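The mapping from a determined intent to the actions that fulfill it can be illustrated with a simple lookup table, as in the following non-limiting sketch; the action encoding and the intent labels merely mirror the examples above and are otherwise assumptions.

```python
# Illustrative mapping from a determined intent to the actions that fulfill it.
# The (kind, target) action encoding is an assumption made for this sketch.

ACTIONS_BY_INTENT = {
    "soccer game update": [("launch_app", "sports_news"),
                           ("display_content", "latest_match_highlights")],
    "weather forecast":   [("launch_app", "weather"),
                           ("display_content", "severe_weather_article")],
}

def actions_for_intent(intent: str) -> list:
    return ACTIONS_BY_INTENT.get(intent, [])

for kind, target in actions_for_intent("weather forecast"):
    print(kind, target)
# launch_app weather
# display_content severe_weather_article
```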
The CES 130 typically includes a processing unit (PU) 138 and a memory (mem) 139 containing instructions to be executed by the processing unit 138. The processing unit 138 may comprise, or be a component of, a larger processing unit implemented with one or more processors. The one or more processors may be implemented with any combination of general-purpose microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gated logic, discrete hardware components, dedicated hardware finite state machines, or any other suitable entities that can perform calculations or other manipulations of information.
The memory 139 may also include machine-readable media for storing software. Software shall be construed broadly to mean any type of instructions, whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise. Instructions may include code (e.g., in source code format, binary code format, executable code format, or any other suitable format of code). The instructions, when executed by the processing unit 138, cause the processing unit 138 to perform the various functions described herein.
In some implementations, the agent 115 can operate and be implemented as a stand-alone program or, alternatively, can communicate and be integrated with other programs or applications executed in the user device 110.
The operation of the CES 130 starts when one or more of a plurality of sensors 111-1 through 111-N collects a plurality of signals 201-1 through 201-M (hereinafter referred to individually as a signal 201 and collectively as signals 201, merely for simplicity purposes). The signals 201 are received by the CES 130. Based on the collected signals 201, the plurality of insighters 131-1 through 131-O are configured to generate a plurality of insights 202-1 through 202-P (hereinafter referred to individually as an insight 202 or collectively as insights 202, merely for simplicity purposes). Each insight 202 relates to one of the collected signals 201.
The insight aggregator 132 is configured to differentiate between the plurality of insights 202 generated by the insighters 131. The differentiation may include, but is not limited to, identifying common behavior patterns as opposed to frequent uses, thereby increasing the efficiency of the insights generation.
According to the disclosed embodiments, a common behavior pattern may be identified when, for example, a particular signal is received at approximately regular intervals. For example, a common behavior pattern may be identified when a GPS signal indicates that a user is at a particular location between 8 A.M. and 10 A.M. every business day (Monday through Friday). Such a GPS signal may not be identified as a common behavior pattern when the signal is determined at sporadic intervals. For example, a user occupying the same location on a Monday morning one week, a Friday afternoon the next week, and a Saturday evening on a third week may not be identified as a common behavior pattern. As another example, a common behavior pattern may be identified when an accelerometer signal indicates that a user is moving at 10 miles per hour every Saturday morning. As yet another example, a common behavior pattern may be identified when a user calls a particular contact in the evening on the first day of each month.
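The distinction between a regular recurrence and sporadic occurrences can be illustrated as follows; the minimum-occurrence threshold and the data layout are assumptions made for this non-limiting example.

```python
# A sketch of distinguishing a common behavior pattern (roughly regular
# recurrence) from sporadic occurrences. The threshold of 3 recent business
# days is an arbitrary illustrative choice.
from datetime import datetime

def is_weekday_morning_pattern(visits: list, min_occurrences: int = 3) -> bool:
    """True if the same place was visited between 8 and 10 A.M. on enough business days."""
    hits = [v for v in visits
            if v.weekday() < 5 and 8 <= v.hour < 10]     # Monday..Friday, 8-10 A.M.
    return len(hits) >= min_occurrences

regular = [datetime(2015, 3, d, 8, 30) for d in (2, 3, 4, 5, 6)]   # Mon-Fri, same window
sporadic = [datetime(2015, 3, 2, 9, 0),                            # Monday morning
            datetime(2015, 3, 13, 15, 0),                          # Friday afternoon
            datetime(2015, 3, 21, 20, 0)]                          # Saturday evening
print(is_weekday_morning_pattern(regular))    # True
print(is_weekday_morning_pattern(sporadic))   # False
```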
The differentiated insights 202 are sent to a contextual scenario engine (CSE) 133 and/or to a prediction engine 134. The CSE 133 is configured to generate one or more contextual scenarios 203 associated with the insights 202 using the contextual database 140 and to execute actions 204 on the user device 110 using the agent 115 respective thereof. Generating contextual scenarios and executing actions respective thereof are described further herein below.
The prediction engine 134 is configured to predict future behavior of the user device 110 based on the insights 202. Based on the predicted future behavior, the prediction engine 134 may be configured to generate a prediction model. The prediction model may be utilized to determine actions indicating user intents that may be taken by the user in response to particular contextual scenarios. Further, the prediction model may include a probability that a particular contextual scenario will result in a particular action. For example, if user interactions used to generate the prediction model indicate that a user launched an application for seeking drivers 3 out of the last 4 Saturday nights, an action of launching the driver application may be associated with a contextual scenario for Saturday nights and with a 75% probability that the user intends to launch the application on a given Saturday night.
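One simple way to realize such a prediction model is a frequency count over past interactions, as in the following non-limiting sketch; the class name, data layout, and scenario labels are assumptions made for illustration.

```python
# Illustrative frequency-based prediction model: the probability that a
# contextual scenario leads to a given action is estimated from past
# interactions. The data layout is an assumption for this sketch.
from collections import defaultdict

class PredictionModel:
    def __init__(self):
        self._counts = defaultdict(lambda: defaultdict(int))   # scenario -> action -> count
        self._totals = defaultdict(int)                        # scenario -> observations

    def observe(self, scenario: str, action=None):
        """Record one occurrence of the scenario and the action taken (or None)."""
        self._totals[scenario] += 1
        if action is not None:
            self._counts[scenario][action] += 1

    def probability(self, scenario: str, action: str) -> float:
        total = self._totals[scenario]
        return self._counts[scenario][action] / total if total else 0.0

model = PredictionModel()
for action in ["driver_app", "driver_app", None, "driver_app"]:   # last 4 Saturday nights
    model.observe("saturday_night", action)
print(model.probability("saturday_night", "driver_app"))          # 0.75
```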
The prediction engine 134 may include an application program predictor (app. predictor) 135 for identifying application programs that are likely to be launched on the user device. The prediction engine 134 may further include an actions predictor 136 for identifying actions that a user is likely to perform on the user device 110. The prediction engine 134 may further include a contact predictor 137 used to identify data related to persons that a user of the user device 110 is likely to contact.
In S310, environmental and/or personal variables are determined. In an embodiment, the variables may be determined based on sensor signals, retrieved from a database, and so on.
In S320, insights are generated respective of the variables. The insight generation may include, but is not limited to, classifying the variables, generating a conclusion based on the variables, and so on. Generating conclusions about the variables may further be based on past user interactions. Such generated conclusions may indicate, for example, whether a particular variable is related to a user behavior pattern.
In optional S330, a weighted factor is generated respective of each insight. Each weighted factor indicates the level of confidence in each insight, i.e., the likelihood that the insight is accurate for the user's current intent. The weighted factors may be adapted over time. To this end, the weighted factors may be based on, for example, previous user interactions with the user device. Specifically, the weighted factors may indicate a probability, based on previous user interactions, that a particular insight is associated with the determined variables.
In S340, a context is generated respective of the insights and their respective weighted factors. Generating contexts may be performed, for example, by matching the insights to insights stored in a contextual database (e.g., the contextual database 140). The generated context may be based on the contextual scenarios associated with each matching insight. In an embodiment, the matching may further include matching textual descriptions of the insights.
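As a non-limiting illustration of S330 and S340, the sketch below scores stored insight-to-context associations by the weighted factors of the matching insights; the database layout and the scoring rule are assumptions made for this example.

```python
# A sketch of S330/S340: weighting insights by confidence and matching them
# against stored insight-to-context associations. The database layout and the
# scoring rule (sum of weights of matching insights) are assumptions.

CONTEXT_DB = {
    "weekly_grocery_shopping": {"location_is_supermarket", "saturday_morning"},
    "morning_commute":         {"connected_to_car", "weekday_morning"},
}

def generate_context(weighted_insights: dict) -> str:
    """weighted_insights maps an insight label to its confidence weight in [0, 1]."""
    def score(required: set) -> float:
        return sum(weighted_insights.get(label, 0.0) for label in required)
    return max(CONTEXT_DB, key=lambda ctx: score(CONTEXT_DB[ctx]))

print(generate_context({"location_is_supermarket": 0.9, "saturday_morning": 0.8}))
# weekly_grocery_shopping
```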
In S350, based on the generated context, a user intent and one or more corresponding actions are determined. The user intent and actions may be determined based on a prediction model. In an embodiment, the prediction model may have been generated as described further herein above.
In S360, the determined actions are executed on the user device. It should be noted that the actions are performed without user intervention, i.e., the user may not be required to enter any queries, activate any function on the user device, and so on. In an embodiment, the actions are triggered by the agent independently or under the control of the CES 130.
In S370, it is checked whether additional variables have been determined and, if so, execution continues with S310; otherwise, execution terminates. The checks for additional variables may be performed, e.g., continuously, at regular intervals, and/or upon determination that one or more signals have changed.
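The overall loop of S310 through S370 can be summarized in the following non-limiting sketch, where each helper is a simplified, hypothetical stand-in for the corresponding step.

```python
# A compact sketch of the S310-S370 loop: determine variables, build insights
# and a context, pick actions, execute, and repeat while new variables arrive.
# All helper behavior below is simplified and hypothetical.

def determine_variables(sample: dict) -> dict:                        # S310
    return {"hour": sample["hour"], "in_car": sample.get("bluetooth") == "car"}

def generate_insights(variables: dict) -> set:                        # S320
    insights = set()
    if variables["in_car"]:
        insights.add("driving")
    if 7 <= variables["hour"] <= 9:
        insights.add("morning")
    return insights

def generate_context(insights: set) -> str:                           # S340
    return "morning_drive" if {"driving", "morning"} <= insights else "other"

def determine_actions(context: str) -> list:                          # S350
    return {"morning_drive": ["show_traffic_update"]}.get(context, [])

samples = [{"hour": 8, "bluetooth": "car"}, {"hour": 12}]              # incoming variable sources
for sample in samples:                                                 # S370: loop while new variables exist
    actions = determine_actions(generate_context(generate_insights(determine_variables(sample))))
    for action in actions:                                             # S360: execute without user intervention
        print("executing:", action)
# executing: show_traffic_update
```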
As a non-limiting example, a GPS signal is used to determine environmental variables indicating that the user is at the address of a supermarket on a Saturday morning. Based on the variables, an insight is generated classifying the variable as a location and concluding that the variables are in accordance with the user's typical behavior pattern. The insight is further matched to insights stored in a contextual database to identify contextual scenarios associated therewith, and a context is generated based on the contextual scenarios. The context indicates that the user is performing his weekly grocery shopping. A prediction model is applied to the context to determine that there is a 90% probability that the user launches a grocery shopping application when at this location on a Saturday morning. Accordingly, the grocery shopping application is launched.
The various embodiments disclosed herein can be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software is preferably implemented as an application program tangibly embodied on a program storage unit or computer readable medium consisting of parts, or of certain devices and/or a combination of devices. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such a computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium is any computer readable medium except for a transitory propagating signal.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the disclosed embodiments and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosed embodiments, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
This application claims the benefit of U.S. Provisional Application No. 62/086,728 filed on Dec. 3, 2014. This application is also a continuation-in-part of U.S. patent application Ser. No. 14/850,200 filed on Sep. 10, 2015 which is a continuation of U.S. patent application Ser. No. 13/712,563 filed on Dec. 12, 2012, now U.S. Pat. No. 9,141,702. The Ser. No. 13/712,563 application is a continuation-in-part of: (a) U.S. patent application Ser. No. 13/156,999 filed on Jun. 9, 2011, which claims the benefit of U.S. provisional patent application No. 61/468,095 filed Mar. 28, 2011 and U.S. provisional application No. 61/354,022, filed Jun. 11, 2010; and (b) U.S. patent application Ser. No. 13/296,619 filed on Nov. 15, 2011, now pending. This application is also a continuation-in-part of U.S. patent application Ser. No. 14/583,310 filed on Dec. 26, 2014, now pending. The Ser. No. 14/583,310 application claims the benefit of U.S. Provisional Patent Application No. 61/920,784 filed on Dec. 26, 2013. The Ser. No. 14/583,310 application is also a continuation-in-part of the Ser. No. 13/712,563 application. The contents of the above-referenced applications are hereby incorporated by reference.