Discovering enterprise content based on implicit and explicit signals

Information

  • Patent Grant
  • Patent Number
    10,394,827
  • Date Filed
    Monday, March 3, 2014
  • Date Issued
    Tuesday, August 27, 2019
Abstract
Recommending relevant content to a user based on personalized implicit and explicit activity signals aggregated for various content items is provided. A user is provided with situational awareness of content they may use by aggregating and displaying content that has been acted on by people the user works with most closely. Relationships between people and activities around content may be represented in a work graph which may be surfaced to the user. Content and relationship information pertaining to content may be surfaced to the user via a user interface. The user may query the content according to a variety of queries such as “popular with my colleagues,” “viewed by me” (i.e., the querying user), “worked on by me,” “most viewed,” and the like.
Description
BACKGROUND

In an enterprise, content items are oftentimes scattered across a variety of workloads and storage systems (e.g., email, social feeds, intranet sites, network file systems, etc.). Individuals in the enterprise may spend time and effort searching for content or asking another individual to share content. Searching for content may require a user to either browse through folder structures in individual workloads or conduct a search using an individual's name or search terms that match the content for which he/she is searching. For example, a user may be presented with a list view of content items from a single source. Additionally, sometimes an individual may not be aware that certain pieces of content that may be relevant to his/her work have already been created, causing a duplicated effort.


It is with respect to these and other considerations that the present invention has been made.


SUMMARY

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.


Embodiments of the present invention solve the above and other problems by recommending relevant content to a user based on personalized implicit (e.g., reading) and explicit (e.g., sharing) activity signals aggregated for various content items. A user is provided with situational awareness of various content items by aggregating and displaying content that has been acted on by people the user works with most closely. Relationships between people and activities around content may be represented in a work graph which may be surfaced to the user. Content and relationship information pertaining to content may be surfaced to the user via a user interface component, referred to herein as a landing page. The user may query the content on the landing page according to a variety of queries such as “popular with my colleagues,” “viewed by me” (i.e., the querying user), “worked on by me,” “most viewed,” and the like.


According to one embodiment, an indication to display an aggregated view of content items relevant to a user may be received, and a determination may be made as to which content items from one or more repositories to display according to a relevance ranking associated with the content items. A user interface may be generated for displaying the content items, wherein the content items may be displayed in an order according to the relevance ranking.


The details of one or more embodiments are set forth in the accompanying drawings and description below. Other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that the following detailed description is explanatory only and is not restrictive of the invention as claimed.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various embodiments of the present invention. In the drawings:



FIG. 1 is a block diagram of one embodiment of a system for providing an aggregated view of top ranking content items based on relevance to a user;



FIG. 2A is an illustration of an example landing page UI comprising a grid of aggregated content items;



FIG. 2B is an illustration of a search query in an example landing page UI comprising a grid of aggregated content items;



FIG. 2C is an illustration of results from a search query displayed as an aggregated grouping of content items;



FIG. 3 is a flow chart of a method for providing an aggregated view of top ranking content items based on relevance to a user;



FIG. 4 is a block diagram illustrating example physical components of a computing device with which embodiments of the invention may be practiced;



FIGS. 5A and 5B are simplified block diagrams of a mobile computing device with which embodiments of the present invention may be practiced; and



FIG. 6 is a simplified block diagram of a distributed computing system in which embodiments of the present invention may be practiced.





DETAILED DESCRIPTION

The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar elements. While embodiments of the invention may be described, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the elements illustrated in the drawings, and the methods described herein may be modified by substituting, reordering, or adding stages to the disclosed methods. Accordingly, the following detailed description does not limit the invention, but instead, the proper scope of the invention is defined by the appended claims.


As briefly described above, embodiments of the present invention are directed to recommending relevant content to a user based on personalized implicit (e.g., reading) and explicit (e.g., sharing) activity signals aggregated for various content items. Content may be aggregated from multiple content sources and may be surfaced to a user on a user interface (UI) display (sometimes referred to herein as a “landing page”). Navigation of surfaced content may be enabled via one or more predefined queries or “boards” that algorithmically aggregate content matching certain parameters. Content may be recommended to the user based on the user's recent activities, the user's interactions with other users, as well as activities of the other users.


Referring now to the drawings, in which like numerals represent like elements, various embodiments will be described. FIG. 1 is a block diagram illustrating a system architecture 100 for providing an aggregated view of top ranking content items based on relevance to a user. The system architecture 100 includes an aggregator 108 operable to collect organizational relationship data 105 for individuals and activity data 106 associated with individuals 102A-B (collectively 102) and content items 103 from a plurality of information sources 104A-N (collectively 104) and store the relationship data 105 and activity data 106 in a graph 114. The information sources 104 may include various types of workloads or information sources such as social networking services, enterprise social network services, online productivity software suites (which may include applications such as, but not limited to, a word processing application, a spreadsheet application, a slide presentation application, a notes taking application, a calendaring application, a video conferencing application, an instant messaging application, etc.), collaboration services, communication software, etc.


Activity data 106 may comprise various types of information such as, but not limited to, presence data, interaction data, data associated with communication with another person (e.g., emailing, messaging, conferencing, etc.), data associated with an individual's activity stream (e.g., authoring or modifying a document, liking, commenting, following, or sharing a document, following a person, commenting on a feed, etc.), trending data, and group membership (e.g., inclusion in a distribution list, being an attendee in a meeting invitation, etc.). Organizational relationship data 105 may comprise various types of information such as, but not limited to, data associated with a project structure or organizational structure (e.g., who an individual works with, works for, is a peer to, directs, manages, is managed by, etc.).


As mentioned above, the organizational relationship data 105 and activity data 106 may be stored in a graph 114. Activities and people relationships may be stored as edges 112A-B (collectively 112), while individuals 102 who act upon a content item 103 or interact with another individual 102, as well as the content items 103 that are acted upon, may be stored as nodes 110A-C (collectively 110). For example, a node 110 may include an individual 102 (nodes 110A and 110C), a group of individuals, a content item 103 such as a document (node 110B), an email or other communication type, a webpage, etc.
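For illustration only, the following minimal Python sketch shows one way the node-and-edge model described above might be represented in code. It is not the patented implementation; the class names (Node, Edge, WorkGraph) and their fields are assumptions introduced here.

```python
# Minimal sketch of the work graph described above: people and content items
# become nodes, while activities and relationships become edges. All names
# and fields here are illustrative assumptions, not the patent's design.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Node:
    node_id: str      # e.g., a user id or a document id
    kind: str         # "person", "document", "email", "webpage", ...
    title: str = ""   # display title for content items

@dataclass
class Edge:
    source: Node
    target: Node
    kind: str               # "like", "comment", "modify", "manager", "peer", ...
    category: str           # "activity" or "relationship"
    timestamp: float = 0.0  # when the activity occurred (0 for static relationships)
    weight: float = 0.0     # filled in later by an analytics step

@dataclass
class WorkGraph:
    nodes: dict = field(default_factory=dict)  # node_id -> Node
    edges: list = field(default_factory=list)  # Edge instances

    def add_edge(self, edge: Edge) -> None:
        # Register both endpoints so the graph stays self-consistent.
        self.nodes.setdefault(edge.source.node_id, edge.source)
        self.nodes.setdefault(edge.target.node_id, edge.target)
        self.edges.append(edge)
```

Under this sketch, the "like" scenario described below would be recorded as two Node entries (the person and the document) plus one Edge with kind="like".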


An edge 112 may include various types of actions (i.e., activity edge 112B) (e.g., like, comment, follow, share, authoring, modifying, communication, participation, etc.) and relationships (i.e., relationship edge 112A). Consider for example that an individual 102 “likes” a certain document (i.e., selects a “like” option associated with the document). The individual and the document (content item 103) may be stored as nodes 110 and the “like” selection may be stored as an edge 112.


A relationship edge 112A may include explicit relationships and/or implicit relationships. Explicit relationships may include relationships defined according to an organization structure and data (i.e., organizational relationship data 105). For example, an explicit relationship may include an individual's manager, peers, directs, etc. An explicit relationship may be stored as a relationship edge 112A such as a manager edge, peer edge, directs edge, etc. Implicit relationships may include relationships determined according to activity in one or more workloads (i.e., activity data 106 from one or more information sources 104). For example, an implicit relationship may include an individual 102 following another individual on an enterprise social network service (information source 104), being included on a distribution list with another individual, co-authoring a document with another individual, emailing (or otherwise communicating) with another individual, sharing group memberships, commenting on another individual's feed, etc.


Edges 112 may also include inferred edges that may be created between a first individual 102 and a content item 103 acted upon or a person interacted with by a second individual 102 with whom the first individual 102 shares a relationship edge 112A. An inferred edge may also be created between a first individual 102 and a second individual 102 when the second individual acts upon a content item 103 with which the first individual 102 shares an activity edge 112B. For example, a first individual 102 named Ann may share a relationship edge 112A with a second individual 102 named Bob. An inferred edge 112 may be created between Ann and a content item 103 that Bob modifies.
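As a rough sketch only, and building on the hypothetical WorkGraph/Edge types introduced above, inferred edges of the kind just described might be derived as follows; the function name and the "inferred:" prefix are assumptions, not part of the patent.

```python
# Derive inferred edges: if Ann shares a relationship edge with Bob, and Bob
# modifies a document, an inferred edge links Ann to that document.
def infer_edges(graph: WorkGraph, person_id: str) -> list:
    inferred = []
    # Individuals with whom this person shares a relationship edge.
    related = {
        e.target.node_id for e in graph.edges
        if e.category == "relationship" and e.source.node_id == person_id
    }
    # Content items (or people) those individuals acted upon.
    for e in graph.edges:
        if e.category == "activity" and e.source.node_id in related:
            inferred.append(Edge(
                source=graph.nodes[person_id],
                target=e.target,
                kind=f"inferred:{e.kind}",
                category="activity",
                timestamp=e.timestamp,
            ))
    return inferred
```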


The system 100 may comprise an analytics engine 115 operable to calculate and apply weights on edges 112 according to what activity is performed (e.g., a like, comment, share, follow, email, etc.) and the relationship between a first individual 102 and an individual(s) 102 performing the activity. Weights may also be based on how recently an activity was performed. A weight on a relationship edge 112A may be based on implicit or explicit signals generated through activity on the plurality of workloads, such as an amount and type of activity an individual 102 has with another person, a number of times an individual 102 interacts with a content item 103, the type of interaction, etc. For example, if an individual 102 communicates via email with a first information worker (IW) daily, and is frequently an attendee of meetings that the first IW is also an attendee of, the weight of a relationship edge 112A between the individual 102 and the first IW may be higher than the weight of a relationship edge 112A between the individual 102 and a second IW whom the individual 102 emails less frequently and with whom the individual 102 shares a common “like” of a document on a social network site. A weight on an activity edge 112B may also be based on a type of activity. For example, an “edit” or “share” operation may be considered to be more important than a “like” operation, and thus may have a higher weighting than the “like” operation. An individual's relationship edges 112A and activity edges 112B may be ranked according to their calculated weights.
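The description above does not give a concrete weighting formula, so the sketch below simply combines the three stated factors (activity type, relationship strength, recency) into one number. The base values, half-life, and function signature are invented for illustration and are not the analytics engine's actual method.

```python
import math
import time

# Invented base weights: an "edit" or "share" counts for more than a "like",
# mirroring the example in the text. Values are illustrative only.
ACTIVITY_BASE = {"edit": 1.0, "share": 0.9, "comment": 0.5, "follow": 0.4, "like": 0.2}

def edge_weight(edge: Edge, relationship_strength: float = 1.0,
                half_life_days: float = 14.0) -> float:
    """Weight = activity-type base * relationship strength * recency decay."""
    base = ACTIVITY_BASE.get(edge.kind.split(":")[-1], 0.1)
    age_days = max(0.0, (time.time() - edge.timestamp) / 86400.0)
    recency = math.exp(-math.log(2) * age_days / half_life_days)  # half-life decay
    return base * relationship_strength * recency
```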


According to embodiments, an aggregated view of top ranking content items 103 based on relevance to a user 122 may be presented to the user 122, wherein the user 122 is an individual 102 represented in the graph 114. The aggregated content items (aggregated content 116) may be displayed as a grid in a first board referred to herein as a landing page. The content items 103 may be stored across a variety of different repositories and workloads (i.e., information sources 104), and may be persisted and tracked in the graph 114 as described above. The aggregated content 116 may comprise a plurality of content items 103 recommended to the user 122 based on his/her activity, his/her interactions with other individuals 102, and those individuals' recent activity. The landing page and other boards will be described in further detail below with reference to FIGS. 2A-2C.


The aggregated view of content items 103 may be presented to the user 122 via a client application 120 on a computing device 118. The computing device 118 may be one of a variety of suitable computing devices described below with reference to FIGS. 4 through 6. For example, the computing device 118 may include a tablet computing device, a desktop computer, a mobile communication device, a laptop computer, a laptop/tablet hybrid computing device, a gaming device, or other type of computing device for executing applications 120 for performing a variety of tasks.


The application 120 illustrated in association with computing device 118 is illustrative of any application having sufficient computer executable instructions for enabling embodiments of the present invention as described herein. The application 120 may include a thick client application, which may be stored locally on the computing device 118, or may include a thin client application (i.e., web application) that may reside on a remote server and be accessible over a network, such as the Internet or an intranet. A thin client application may be hosted in a browser-controlled environment or coded in a browser-supported language and reliant on a common web browser to render the application executable on a computing device 118.


Referring now to FIG. 2A, an example landing page 202A is illustrated that may be displayed on any suitable computing device 118 described above. The landing page 202A may comprise a plurality of content items 103A-F (collectively 103) displayed in a grid. The content items 103 may be organized and ordered according to a relevance ranking. According to an embodiment, the content items 103 may be displayed as selectable objects that may comprise one or more of a visual representation of the content item 103 (e.g., a thumbnail image or other salient image that is extracted from the content item 103), the title of the content item 103, activity insights (e.g., number of views, a number of likes, a number of followers, a number of comments, etc.), a summary or brief description of the content item 103, etc. Other information may also be provided, such as an individual 102 with whom the user 122 shares a relationship edge 112A and who has acted on the content item 103, the action taken, and how recently the action took place. For example, as illustrated in FIG. 2A, the first content item 103A shows that an individual 102, Liz Andrews, modified the content item 103A “about an hour ago.”
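A hypothetical view model for one tile of the grid, with fields corresponding to the pieces of information listed above (thumbnail, title, activity insights, most recent action by a related individual), might look like the following; none of these field names come from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# One tile of the landing-page grid, as described above. Field names are
# assumptions chosen to mirror the listed pieces of information.
@dataclass
class ContentTile:
    title: str
    thumbnail_url: Optional[str]
    summary: str
    views: int = 0
    likes: int = 0
    comments: int = 0
    last_actor: Optional[str] = None        # e.g., "Liz Andrews"
    last_action: Optional[str] = None       # e.g., "modified"
    last_action_when: Optional[str] = None  # e.g., "about an hour ago"
    tags: Tuple[str, ...] = ()              # e.g., ("shared with me", "trending around me")
```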


Additionally, one or more selectable tags 206 may be automatically suggested and displayed with a content item 103. A tag 206 may provide personalized information that may be useful to the user 122. For example, tags 206 may provide information such as whether a content item 103 has been presented to the user 122, shared with the user 122 (e.g., via email, via a file hosting service, etc.), trending around the user 122, trending around other individuals 102, worked on by the user 122, viewed by the user 122, followed by the user 122, contributed to by the user 122, modified by the user 122, viewed by, worked on, commented on, followed by, or modified by an individual 102 with whom the user 122 has an implicit or explicit relationship, etc. As mentioned, tags 206 may be selectable. Selection of a tag 206 may initiate a search query for additional content items 103 matching the selected tag 206.


According to embodiments, a user 122 may pivot between boards 202A-B (collectively 202) or navigate to a predefined or to a user-defined query via selection of a navigation control 204. As illustrated in FIG. 2A, a title or header of a board 202 may be a selectable navigation control 204. When selected, the user 122 may select from a predefined query or may enter a search query for content items 103 meeting certain criteria.


Referring now to FIG. 2B, the example landing page 202A of FIG. 2A is shown, and as illustrated, the user 122 may enter a search query 208. In the example illustrated in FIG. 2B, the user 122 selectively enters a search query 208 for content that has been recommended to her. According to embodiments, the user 122 may conduct various types of search queries. For example, the user 122 may search for content items 103 that the user 122 has had some form of interaction with (e.g., content items 103 that have been recommended to the user 122, content items 103 that the user 122 has worked on previously, content items 103 that have been presented to the user, content that a particular colleague has worked on, etc.). The user 122 may also search for content items 103 associated with a certain topic (i.e., an exploratory search) or for content items 103 the user 122 has seen elsewhere (e.g., search for an email previously viewed in an email inbox application).


As described above, to navigate or pivot to another board 202 or query, the user 122 may select from a predefined query or may enter a search query for content items 103 meeting certain criteria. Predefined queries may comprise, but are not limited to, a “popular with my colleagues” query, a “viewed by me” query, a “worked on by me” query, and a “most viewed” query. Content items 103 matching criteria of a predefined query may be pre-aggregated, such that when a user selects a predefined query 208, the pre-aggregated content items 103 may be retrieved from the graph 114 and displayed in a new board 202.
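Building on the earlier hypothetical WorkGraph sketch, the following sketch shows one way a few of the predefined boards could be pre-aggregated from a user's activity edges. The real criteria (and the “popular with my colleagues” board) would involve the richer relationship and weighting signals described above; everything here is assumed for illustration.

```python
def preaggregate_boards(graph: WorkGraph, user_id: str) -> dict:
    """Pre-aggregate content for a few predefined queries (illustrative only)."""
    boards = {"viewed by me": [], "worked on by me": [], "most viewed": []}
    view_counts: dict = {}
    for e in graph.edges:
        if e.category != "activity":
            continue
        if e.source.node_id == user_id and e.kind == "view":
            boards["viewed by me"].append(e.target)
        if e.source.node_id == user_id and e.kind in ("edit", "author", "comment"):
            boards["worked on by me"].append(e.target)
        if e.kind == "view":
            view_counts[e.target.node_id] = view_counts.get(e.target.node_id, 0) + 1
    # "Most viewed" ranks content by total view count across all individuals.
    boards["most viewed"] = [graph.nodes[nid] for nid, _ in
                             sorted(view_counts.items(), key=lambda kv: -kv[1])]
    return boards
```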


According to an embodiment, a user 122 may be able to enter a search term or a text string which may be processed via natural language processing, and an aggregation may be dynamically created based on the natural language processing of the query 208. Content items 103 matching the query 208 may be aggregated from the graph 114 and displayed in a board 202.
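The text only states that the free-text query is processed via natural language processing; the sketch below substitutes trivial keyword matching for that step so the dynamic-aggregation idea is concrete. It layers on the earlier WorkGraph sketch and is an assumption, not the patented processing.

```python
def dynamic_board(graph: WorkGraph, query_text: str) -> list:
    """Stand-in for NLP: match query keywords against content item titles."""
    keywords = {w.lower() for w in query_text.split() if len(w) > 2}
    matches = []
    for node in graph.nodes.values():
        if node.kind == "person":
            continue  # only content items belong on a board
        if keywords & {w.lower() for w in node.title.split()}:
            matches.append(node)
    return matches
```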


According to embodiments, a query 208 may be further personalized to a user 122 based on analytics developed about the user 122. A profile may be developed for a user 122 comprising topic affinities, people affinities, etc. For example, a determination may be made that a particular user 122 searches for content of certain topics, for example, Ergonomics, and/or views, shares, and comments on many content items 103 about Ergonomics. Accordingly, content items 103 that are associated with Ergonomics may be ranked higher for the user 122 than content items of another topic. As can be appreciated, two users 122 could enter an almost identical search; however, because one user has a profile that orients them toward, in this example, Ergonomics, content items 103 that are associated with Ergonomics may appear in that user's board 202 whereas other content may be presented to the other user. Additionally, different users 122 may have different permissions, and thus each user 122 may be provided with different aggregated content 116 according to his/her permissions.
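A minimal sketch of this kind of personalization, assuming a profile is simply a mapping from topics to affinity scores and that affinity multiplies into the ranking; the numbers and function name are invented for illustration.

```python
def personalized_score(base_score: float, item_topics: set,
                       topic_affinities: dict) -> float:
    """Boost an item's relevance score by the user's affinity for its topics."""
    boost = sum(topic_affinities.get(topic, 0.0) for topic in item_topics)
    return base_score * (1.0 + boost)

# Two users issue nearly the same query; the one with an Ergonomics-leaning
# profile sees Ergonomics items ranked higher.
ergonomics_fan = {"ergonomics": 0.8}
print(personalized_score(0.5, {"ergonomics"}, ergonomics_fan))  # 0.9
print(personalized_score(0.5, {"budgeting"}, ergonomics_fan))   # 0.5
```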


Referring now to FIG. 2C, results from the search query 208 are shown displayed in a “recommended to me” board 202B. According to one embodiment, all results may be presented. According to another embodiment, a top n results may be displayed, wherein the top n results may be content items 103 with the highest ranking edges 112 as determined according to such factors as the type of activity performed (e.g., viewing, following, commenting on, liking, etc.), how close a relationship the user 122 has with individual(s) 102 performing the activity, how recently the activity was performed, etc. The n may be a predetermined number or a selectable number. According to embodiments, the ranking may be according to the query 208 in context. For example, if a “worked on by me” query is selected, content items 103 that have been worked on by the user 122 may be aggregated and ranked in chronological order, while if a “popular with my colleagues” query is selected, content items 103 may be ranked according to a predictive relevance score.
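A sketch of the query-dependent ordering just described: chronological for “worked on by me”, score-based for “popular with my colleagues”. The item fields ("last_worked_on", "relevance_score") and the default value of n are assumptions, not part of the specification.

```python
def rank_results(items: list, query_name: str, top_n: int = 12) -> list:
    """items: dicts assumed to carry 'last_worked_on' and 'relevance_score' fields."""
    if query_name == "worked on by me":
        ordered = sorted(items, key=lambda it: it["last_worked_on"], reverse=True)
    else:  # e.g., "popular with my colleagues"
        ordered = sorted(items, key=lambda it: it["relevance_score"], reverse=True)
    return ordered[:top_n]  # return only the top n results
```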


Referring still to FIG. 2C, content items 103 matching the query 208 of “recommended to me” are shown displayed in the “recommended to me” board 202B. As described above, the content items 103 may be ranked and displayed in order of relevance to the user 122. The user 122 may select the navigation control 204 (i.e., title or header) and select or enter another search query 208, may navigate back to the landing page 202A, or may select a content item 103 for additional information or to act on the content item.



FIG. 3 is a flow chart showing one embodiment of a method 300 for providing an aggregated view of top ranking content items based on relevance to a user. The method 300 starts at OPERATION 305 and proceeds to OPERATION 310, where activity data 106 and organizational relationship data 105 for one or more individuals 102 may be retrieved from one or more of a plurality of workloads or information sources 104. As described above, activity data 106 may comprise various types of information such as, but not limited to, presence data, data associated with authoring or modification of a document, trending data, feedback data (e.g., like, comment, follow, share, etc.), data associated with whom an individual 102 interacts and communicates, etc. Organizational relationship data 105 may comprise data associated with organizational structure (e.g., who an individual works with, works for, is a peer to, directs, manages, is managed by, etc.). The one or more workloads or information sources 104 may include information sources such as social networking services, enterprise social network services, online productivity software suites, collaboration services, communication software, etc. According to an embodiment, OPERATION 305 may include a set-up process where each individual 102 may indicate which information sources 104 he/she uses from which activity data 106 and organizational relationship data 105 may be received. Each individual 102 may be required to enter authentication information for the various information sources 104.


The method 300 may proceed to OPERATION 315, where the activity data 106 and organizational relationship data 105 may be stored in a graph 114 as a collection of nodes 110 and edges 112 as described above. Relationships may be established between an individual 102 and content items 103 (e.g., documents, emails, webpages, etc.) upon which an activity was performed by the individual 102 or by other people with whom the individual 102 is associated implicitly and/or explicitly.


At OPERATION 320, weights for the edges 112 may be calculated and ranked according to their relevance to an individual 102. Weights may be calculated according to such factors as what activity is performed (e.g., a like, comment, share, follow, email, etc.) and the relationship between a first individual 102 and an individual(s) 102 performing the activity. Weights may also be based on how recently an activity was performed. A weight on a relationship edge 112A may be based on implicit or explicit signals generated through activity on the plurality of workloads, such as an amount and type of activity an individual 102 has with another person, a number of times an individual 102 interacts with a content item 103, the type of interaction, etc. Additionally, content items 103 may be aggregated into one or more queries 208 as determined by implicit and explicit signals. For example, content items 103 may be aggregated into one or more of a “popular with my colleagues” query, a “viewed by me” query, a “worked on by me” query, or a “most viewed” query.


The method 300 may proceed to OPERATION 325, where an indication to display a view of content items 103 to a user 122 is received, wherein the user 122 is an individual 102 represented in the graph 114. For example, the user 122 may select to view an aggregated collection of content items 103 stored in one or more folders, document libraries, or other repositories, etc. According to one embodiment, the user 122 may select to view content items 103 determined to be relevant to him/her. According to another embodiment, the user 122 may select to view content items determined to be relevant to another individual 102.


At OPERATION 330, the graph 114 may be queried for relationship data 105 and activity data 106 associated with the user 122 (or associated with a selected individual 102), and content items 103 relevant to the user 122 (or the selected individual 102) may be provided. At OPERATION 335, the content items 103 may be consolidated to top ranking content items 103 based on relevance to the user 122 (or individual) according to their calculated edge weights. The number of content items 103 may be a predetermined number, may be a number selected by the user 122, or may be a variable number based on a threshold of weights.
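As a sketch only, the consolidation at OPERATION 335 might support both policies mentioned above (a fixed count or a weight threshold); the function below assumes scored items arrive as (content, weight) pairs, which is an illustrative choice rather than the claimed method.

```python
from typing import List, Optional, Tuple

def consolidate(scored_items: List[Tuple[object, float]],
                top_n: Optional[int] = None,
                weight_threshold: Optional[float] = None) -> list:
    """Keep the highest-weighted content items by count, by threshold, or both."""
    ordered = sorted(scored_items, key=lambda pair: pair[1], reverse=True)
    if weight_threshold is not None:
        ordered = [(item, w) for item, w in ordered if w >= weight_threshold]
    if top_n is not None:
        ordered = ordered[:top_n]
    return [item for item, _ in ordered]
```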


The method 300 may proceed to OPERATION 340, where an aggregated and consolidated view of relevant content items 103 may be generated and displayed in a landing page 202A. As described above, the landing page 202A may comprise a grid of content items 103 ordered according to their relevance ranking.


The method 300 may end at OPERATION 395, or may proceed to OPERATION 345, where an indication of a selection of a search query may be received. As described above, the user 122 may select a tag 206, or may select a navigation control 204 and either select a predefined query 208 or enter a search term or a text string.


At DECISION OPERATION 350, a determination may be made as to whether the user 122 selected a predefined query 208 or entered a search term or text string. If a determination is made that a predefined query 208 is selected, the method 300 may return to OPERATION 340, where a consolidated view of aggregated content 116 may be generated and displayed in a board 202. As was described with respect to OPERATION 320, the content items 103 may be aggregated into one or more queries (e.g., a “popular with my colleagues” query, a “viewed by me” query, a “worked on by me” query, or a “most viewed” query, etc.) as determined by implicit and explicit signals.


If a determination is made at DECISION OPERATION 350 that a search term or text string is received, the method 300 may proceed to OPERATION 355 where the search input may be processed, and a search for content items 103 matching the search criteria may be performed. According to an embodiment, processing the search input may comprise natural language processing. Content items 103 matching the query 208 may be aggregated from the graph 114.


The method 300 may then return to OPERATION 335, where the matching content items 103 may be consolidated based on relevance to the user 122 (or individual) according to their calculated edge weights. An aggregated and consolidated view of relevant content items 103 matching parameters of the query 208 may be generated and displayed in a board 202. The method may end at OPERATION 395.
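Tying the branch at DECISION OPERATION 350 together, a sketch of the query handling might look like the following. It reuses the hypothetical helpers from the earlier sketches (preaggregate_boards, dynamic_board, consolidate) and is an illustration under those assumptions, not the claimed method itself.

```python
def handle_query(graph: WorkGraph, user_id: str, query: str) -> list:
    """Serve a predefined board directly; otherwise aggregate for a free-text query."""
    predefined = preaggregate_boards(graph, user_id)
    if query.lower() in predefined:             # predefined query selected (OPERATION 340)
        return predefined[query.lower()]
    matches = dynamic_board(graph, query)       # free text: NLP stand-in (OPERATION 355)
    scored = [(node, 1.0) for node in matches]  # placeholder weights; real ones come from the graph
    return consolidate(scored, top_n=12)        # back to OPERATION 335
```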


While the invention has been described in the general context of program modules that execute in conjunction with an application program that runs on an operating system on a computer, those skilled in the art will recognize that the invention may also be implemented in combination with other program modules. Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types.


The embodiments and functionalities described herein may operate via a multitude of computing systems including, without limitation, desktop computer systems, wired and wireless computing systems, mobile computing systems (e.g., mobile telephones, netbooks, tablet or slate type computers, notebook computers, and laptop computers), hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, and mainframe computers.


In addition, the embodiments and functionalities described herein may operate over distributed systems (e.g., cloud-based computing systems), where application functionality, memory, data storage and retrieval and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet. User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which user interfaces and information of various types are projected. Interaction with the multitude of computing systems with which embodiments of the invention may be practiced includes keystroke entry, touch screen entry, voice or other audio entry, gesture entry where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures for controlling the functionality of the computing device, and the like.



FIGS. 4-6 and the associated descriptions provide a discussion of a variety of operating environments in which embodiments of the invention may be practiced. However, the devices and systems illustrated and discussed with respect to FIGS. 4-6 are for purposes of example and illustration and are not limiting of the vast number of computing device configurations that may be utilized for practicing embodiments of the invention described herein.



FIG. 4 is a block diagram illustrating physical components (i.e., hardware) of a computing device 400 with which embodiments of the invention may be practiced. The computing device components described below may be suitable for the client device 118 described above. In a basic configuration, the computing device 400 may include at least one processing unit 402 and a system memory 404. Depending on the configuration and type of computing device, the system memory 404 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories. The system memory 404 may include an operating system 405 and one or more program modules 406 suitable for running software applications 450 such as the aggregator 108, analytics engine 115, or client application 120. The operating system 405, for example, may be suitable for controlling the operation of the computing device 400. Furthermore, embodiments of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and is not limited to any particular application or system. This basic configuration is illustrated in FIG. 4 by those components within a dashed line 408. The computing device 400 may have additional features or functionality. For example, the computing device 400 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 4 by a removable storage device 409 and a non-removable storage device 410.


As stated above, a number of program modules and data files may be stored in the system memory 404. While executing on the processing unit 402, the program modules 406 may perform processes including, but not limited to, one or more of the stages of the method 300 illustrated in FIG. 3. Other program modules that may be used in accordance with embodiments of the present invention may include applications such as electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc.


Furthermore, embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 4 may be integrated onto a single integrated circuit. Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit. When operating via an SOC, the functionality described herein with respect to providing an aggregated view of top ranking content items 103 based on relevance to a user 122 may be operated via application-specific logic integrated with other components of the computing device 400 on the single integrated circuit (chip). Embodiments of the invention may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the invention may be practiced within a general purpose computer or in any other circuits or systems.


The computing device 400 may also have one or more input device(s) 412 such as a keyboard, a mouse, a pen, a sound input device, a touch input device, etc. Output device(s) 414, such as a display, speakers, a printer, etc., may also be included. The aforementioned devices are examples and others may be used. The computing device 400 may include one or more communication connections 416 allowing communications with other computing devices 418. Examples of suitable communication connections 416 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.


The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 404, the removable storage device 409, and the non-removable storage device 410 are all computer storage media examples (i.e., memory storage). Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 400. Any such computer storage media may be part of the computing device 400. Computer storage media does not include a carrier wave or other propagated or modulated data signal.


Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.



FIGS. 5A and 5B illustrate a mobile computing device 500, for example, a mobile telephone, a smart phone, a tablet personal computer, a laptop computer, and the like, with which embodiments of the invention may be practiced. With reference to FIG. 5A, one embodiment of a mobile computing device 500 for implementing the embodiments is illustrated. In a basic configuration, the mobile computing device 500 is a handheld computer having both input elements and output elements. The mobile computing device 500 typically includes a display 505 and one or more input buttons 510 that allow the user to enter information into the mobile computing device 500. The display 505 of the mobile computing device 500 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 515 allows further user input. The side input element 515 may be a rotary switch, a button, or any other type of manual input element. In alternative embodiments, mobile computing device 500 may incorporate more or fewer input elements. For example, the display 505 may not be a touch screen in some embodiments. In yet another alternative embodiment, the mobile computing device 500 is a portable phone system, such as a cellular phone. The mobile computing device 500 may also include an optional keypad 535. Optional keypad 535 may be a physical keypad or a “soft” keypad generated on the touch screen display. In various embodiments, the output elements include the display 505 for showing a graphical user interface (GUI), a visual indicator 520 (e.g., a light emitting diode), and/or an audio transducer 525 (e.g., a speaker). In some embodiments, the mobile computing device 500 incorporates a vibration transducer for providing the user with tactile feedback. In yet another embodiment, the mobile computing device 500 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., an HDMI port) for sending signals to or receiving signals from an external device.



FIG. 5B is a block diagram illustrating the architecture of one embodiment of a mobile computing device. That is, the mobile computing device 500 can incorporate a system (i.e., an architecture) 502 to implement some embodiments. In one embodiment, the system 502 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players). In some embodiments, the system 502 is integrated as a computing device, such as an integrated personal digital assistant (PDA) and wireless phone.


One or more application programs 550 may be loaded into the memory 562 and run on or in association with the operating system 564. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 502 also includes a non-volatile storage area 568 within the memory 562. The non-volatile storage area 568 may be used to store persistent information that should not be lost if the system 502 is powered down. The application programs 550 may use and store information in the non-volatile storage area 568, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on the system 502 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 568 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 562 and run on the mobile computing device 500.


The system 502 has a power supply 570, which may be implemented as one or more batteries. The power supply 570 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.


The system 502 may also include a radio 572 that performs the function of transmitting and receiving radio frequency communications. The radio 572 facilitates wireless connectivity between the system 502 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio 572 are conducted under control of the operating system 564. In other words, communications received by the radio 572 may be disseminated to the application programs 550 via the operating system 564, and vice versa.


The visual indicator 520 may be used to provide visual notifications and/or an audio interface 574 may be used for producing audible notifications via the audio transducer 525. In the illustrated embodiment, the visual indicator 520 is a light emitting diode (LED) and the audio transducer 525 is a speaker. These devices may be directly coupled to the power supply 570 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 560 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 574 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 525, the audio interface 574 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with embodiments of the present invention, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 502 may further include a video interface 576 that enables an operation of an on-board camera 530 to record still images, video stream, and the like.


A mobile computing device 500 implementing the system 502 may have additional features or functionality. For example, the mobile computing device 500 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 5B by the non-volatile storage area 568.


Data/information generated or captured by the mobile computing device 500 and stored via the system 502 may be stored locally on the mobile computing device 500, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 572 or via a wired connection between the mobile computing device 500 and a separate computing device associated with the mobile computing device 500, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed via the mobile computing device 500 via the radio 572 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.



FIG. 6 illustrates one embodiment of the architecture of a system for providing an aggregated view of top ranking content items 103 based on relevance to a user 122, as described above. Content developed, interacted with, or edited in association with the application 120 may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 622, a web portal 624, a mailbox service 626, an instant messaging store 628, or a social networking site 630. The application 120 may use any of these types of systems or the like for providing an aggregated view of top ranking content items 103 based on relevance to a user 122, as described herein. A server 615 may provide the application 120 to clients 118. As one example, the server 615 may be a web server providing the application 120 over the web. The server 615 may provide the application 120 over the web to clients 118 through a network 610. By way of example, the client computing device 118 may be implemented and embodied in a personal computer 605A, a tablet computing device 605B and/or a mobile computing device 605C (e.g., a smart phone), or other computing device. Any of these embodiments of the client computing device may obtain content from the store 616.


Embodiments of the present invention, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.


The description and illustration of one or more embodiments provided in this application are not intended to limit or restrict the scope of the invention as claimed in any way. The embodiments, examples, and details provided in this application are considered sufficient to convey possession and enable others to make and use the best mode of the claimed invention. The claimed invention should not be construed as being limited to any embodiment, example, or detail provided in this application. Regardless of whether shown and described in combination or separately, the various features (both structural and methodological) are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the spirit of the broader aspects of the general inventive concept embodied in this application that do not depart from the broader scope of the claimed invention.

Claims
  • 1. A method for providing an aggregated view of content items, the method comprising: receiving an indication to display an aggregated view of content items relevant to a user; determining one or more content items from one or more repositories to display according to a relevance ranking, based on an activity by the user or other users and a relationship associated with the user and the activity, the activity being weighted according to the relationship; generating a user interface for displaying the one or more content items; and displaying visual representations of the one or more content items in an order according to the relevance ranking, each of the visual representations providing activity insights relating to at least one action by the user or other users with a content item, each visual representation having an associated tag providing personalized information that specifies at least one activity performed on one or more content items and how the at least one activity relates to the user, wherein the tag is selectable for initiating a predefined search query relating to the at least one activity specified by the personalized information, wherein initiation of the predefined search query results in aggregation and displaying of additional content items relating to the activity and not already included in the displayed visual representations.
  • 2. The method of claim 1, wherein determining one or more content items from one or more repositories to display according to a relevance ranking comprises querying a graph for content items acted on by the user and content items acted on by individuals with whom the user shares a relationship.
  • 3. The method of claim 2, further comprising querying a graph for content items determined to be relevant to the user according to a match of a topic of the content item and a topic affinity associated with the user.
  • 4. The method of claim 2, wherein the content items acted on by the user and content items acted on by individuals with whom the user shares a relationship are weighted according to one or more of: a type of activity performed; a type of relationship the user has with an individual performing the activity; how recently the activity was performed; or a number of times the activity was performed.
  • 5. The method of claim 4, wherein the content items acted on by the user and content items acted on by individuals with whom the user shares a relationship are ranked in order of relevance to the user according to their weights.
  • 6. The method of claim 1, further comprising: receiving an indication of a search query; searching for one or more content items from one or more repositories matching the search query; generating a user interface for displaying the one or more content items matching the search query; and displaying the one or more content items matching the search query in the user interface.
  • 7. The method of claim 6, wherein receiving an indication of a search query comprises receiving an indication of a selection of a predefined query.
  • 8. The method of claim 7, wherein searching for one or more content items from one or more repositories matching the search query comprises searching for one or more content items pre-aggregated according to the predefined query.
  • 9. The method of claim 7, wherein a predefined query comprises one of: one or more content items determined to be popular with the user's colleagues; one or more content items viewed by the user; one or more content items worked on by the user; or one or more content items most viewed by the user and by individuals with whom the user has an explicit or implicit relationship.
  • 10. The method of claim 6, wherein receiving an indication of a search query comprises receiving an indication of input of a search term or text string.
  • 11. The method of claim 10, further comprising: performing a natural language processing of the search term or text string;requesting one or more content items from one or more repositories matching the search term or text string;aggregating content items from the graph matching the search term or text string;determining one or more aggregated content items to display according to a relevance ranking;generating a user interface for displaying the one or more aggregated content items; anddisplaying the one or more aggregated content items in the user interface.
  • 12. A system for providing an aggregated view of content items, the system comprising: one or more processors; anda memory storing a program and coupled to the one or more processors, the one or more processors operable under control of the program to:receive an indication to display an aggregated view of content items relevant to a user;query a graph for one or more content items from one or more repositories, the one or more content items acted on by the user or acted on by individuals with whom the user shares a relationship;determine which of the one or more content items to display according to a relevance ranking, based on an activity by the user or other users and a relationship associated with the user and the activity, the activity being weighted according to the relationship;generate a user interface for displaying the one or more content items; anddisplay visual representations of the one or more content items in an order according to the relevance ranking, each of the visual representations providing activity insights relating to at least one action by the user or other users with a content item, each visual representation having an associated tag providing personalized information that specifies at least one activity performed on one or more content items and how the at least one activity relates to the user, wherein the tag is selectable for initiating a predefined search query relating to the at least one activity specified by the personalized information, wherein initiation of the predefined search query results in aggregation and displaying of additional content items relating to the activity and not already included in the displayed visual representations.
  • 13. The system of claim 12, wherein weights are calculated according to one or more of:
    a type of activity performed;
    a type of relationship the user has with an individual performing the activity;
    how recently the activity was performed; or
    a number of times the activity was performed.
  • 14. The system of claim 12, wherein the one or more processors are further operable to:
    receive an indication of a search query;
    search for one or more content items from one or more repositories matching the search query;
    generate a user interface for displaying the one or more content items matching the search query; and
    display the one or more content items matching the search query in the user interface.
  • 15. The system of claim 14, wherein in receiving an indication of a search query, the one or more processors are further operable to:
    receive an indication of a selection of a predefined query; and
    search for one or more content items pre-aggregated according to the predefined query.
  • 16. The system of claim 14, wherein in receiving an indication of a search query, the one or more processors are further operable to:
    receive an indication of input of a search term or text string;
    perform a natural language processing of the search term or text string;
    request one or more content items matching the search term or text string;
    aggregate content items from the graph matching the search term or text string;
    determine one or more aggregated content items to display according to a relevance ranking;
    generate a user interface for displaying the one or more aggregated content items; and
    display the one or more aggregated content items in the user interface.
  • 17. One or more computer storage devices containing computer executable instructions which, when executed by a computer, perform a method for providing an aggregated view of content items, the method comprising:
    receiving an indication to display an aggregated view of content items relevant to a user;
    querying a graph for one or more content items from one or more repositories, the one or more content items acted on by the user, acted on by individuals with whom the user shares a relationship, or determined to be relevant to the user according to a match of a topic of the content item and a topic affinity associated with the user;
    determining which of the one or more content items to display according to a relevance ranking according to calculated weights, wherein the weights are calculated based on an activity by the user or other users and a relationship associated with the user and the activity, including one or more of:
        a type of activity performed,
        a type of relationship the user has with an individual performing the activity,
        how recently the activity was performed, or
        a number of times the activity was performed;
    generating a user interface for displaying the one or more content items; and
    displaying visual representations of the one or more content items in an order according to the relevance ranking, each of the visual representations providing activity insights relating to at least one action by the user or other users with a content item, each visual representation having an associated tag providing personalized information that specifies at least one activity performed on one or more content items and how the at least one activity relates to the user, wherein the tag is selectable for initiating a predefined search query relating to the at least one activity specified by the personalized information, wherein initiation of the predefined search query results in aggregation and displaying of additional content items relating to the activity and not already included in the displayed visual representations.
  • 18. The one or more computer storage devices of claim 17, wherein the method further comprises:
    receiving an indication of a search query;
    searching for one or more content items from one or more repositories matching the search query;
    generating a user interface for displaying the one or more content items matching the search query; and
    displaying the one or more content items matching the search query in the user interface.
  • 19. The one or more computer storage devices of claim 18, wherein receiving an indication of a search query comprises:
    receiving an indication of a selection of a predefined query; and
    searching for one or more content items pre-aggregated according to the predefined query.
  • 20. The one or more computer storage devices of claim 18, wherein receiving an indication of a search query comprises:
    receiving an indication of input of a search term or text string;
    performing a natural language processing of the search term or text string;
    requesting one or more content items matching the search term or text string;
    aggregating content items from the graph matching the search term or text string;
    determining one or more aggregated content items to display according to a relevance ranking;
    generating a user interface for displaying the one or more aggregated content items; and
    displaying the one or more aggregated content items in the user interface.
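The weighting recited in claims 13 and 17 combines four factors: the type of activity performed, the type of relationship the user has with the individual performing the activity, how recently the activity was performed, and the number of times it was performed. A minimal sketch of one possible scoring function follows (Python); the claims do not prescribe particular weights, a decay model, or data structures, so every name and constant below is an assumption made only for illustration.

    # Illustrative sketch only; ActivitySignal, score_signal, and the weight
    # tables are hypothetical and are not taken from the patent.
    from dataclasses import dataclass
    from datetime import datetime

    # Assumed base weights per activity type (implicit signals such as viewing
    # versus explicit signals such as sharing).
    ACTIVITY_WEIGHTS = {"viewed": 1.0, "modified": 3.0, "shared": 4.0}
    # Assumed multipliers per relationship between the acting individual and the user.
    RELATIONSHIP_WEIGHTS = {"self": 2.0, "manager": 1.5, "colleague": 1.2, "other": 0.5}

    @dataclass
    class ActivitySignal:
        activity_type: str      # e.g. "viewed", "modified", "shared"
        relationship: str       # actor's relationship to the querying user
        performed_at: datetime  # when the activity occurred
        count: int = 1          # number of times the activity was performed

    def score_signal(signal: ActivitySignal, now: datetime, half_life_days: float = 14.0) -> float:
        """Combine activity type, relationship, recency, and repetition into one weight."""
        base = ACTIVITY_WEIGHTS.get(signal.activity_type, 0.5)
        rel = RELATIONSHIP_WEIGHTS.get(signal.relationship, 0.5)
        age_days = (now - signal.performed_at).total_seconds() / 86400.0
        recency = 0.5 ** (age_days / half_life_days)  # exponential decay with the age of the signal
        return base * rel * recency * signal.count

    # Example: a colleague shared an item three days ago.
    # score_signal(ActivitySignal("shared", "colleague", datetime(2014, 2, 28)), now=datetime(2014, 3, 3))

An exponential half-life is only one way to express "how recently the activity was performed"; the claims leave the recency model, like the individual weights, unspecified.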
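Claims 14 through 16 (and the corresponding method claims 18 through 20) describe two query paths: selection of a predefined query that returns pre-aggregated content items, and entry of a free-text search term that is processed, matched against the graph, aggregated, and ranked before display. A minimal sketch of that branching follows; the graph methods (pre_aggregated, match), the predefined query identifiers, and the relevance field are illustrative stand-ins rather than an API defined by the patent.

    # Hypothetical sketch of the two query paths; the graph object and its
    # methods are assumptions made for illustration.
    PREDEFINED_QUERIES = {"viewed_by_me", "worked_on_by_me", "most_viewed"}  # hypothetical identifiers

    def normalize(text: str) -> list[str]:
        # Minimal stand-in for "a natural language processing of the search term".
        return [token for token in text.lower().split() if len(token) > 2]

    def handle_query(query: str, graph, user_id: str) -> list:
        if query in PREDEFINED_QUERIES:
            # Claims 15 and 19: return content items already pre-aggregated for this query.
            return graph.pre_aggregated(view=query, user=user_id)   # hypothetical call
        # Claims 16 and 20: process the free text, match against the graph, then rank.
        terms = normalize(query)
        matches = graph.match(terms=terms, user=user_id)            # hypothetical call
        return sorted(matches, key=lambda item: item.relevance, reverse=True)

In this sketch the predefined path skips ranking because its items are assumed to be stored already aggregated, while the free-text path ranks only after matches have been collected from the graph.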
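Claim 17 ties the elements together: query the graph for items acted on by the user, acted on by related individuals, or matching a topic affinity of the user; rank them by the calculated weights; and display each one as a visual representation carrying activity insights and a selectable tag whose selection launches a further predefined query. The sketch below shows one possible shape of that flow; Card, Tag, candidates_for, and all field names are hypothetical.

    # Hedged sketch of the aggregated-view method of claim 17; the graph API and
    # the Card/Tag structures are assumptions, not the patented implementation.
    from dataclasses import dataclass, field

    @dataclass
    class Tag:
        label: str   # personalized text describing the activity and its relation to the user
        query: str   # predefined query initiated when the tag is selected

    @dataclass
    class Card:
        item_id: str
        title: str
        relevance: float
        tag: Tag
        insights: list[str] = field(default_factory=list)  # activity insights shown on the card

    def build_landing_page(graph, user_id: str, limit: int = 20) -> list[Card]:
        # 1. Query the graph for candidates: items acted on by the user, acted on by
        #    individuals with whom the user shares a relationship, or matching one of
        #    the user's topic affinities.
        candidates = graph.candidates_for(user_id)          # hypothetical call
        # 2. Order the candidates by their calculated weights (see score_signal above).
        ranked = sorted(candidates, key=lambda c: c.weight, reverse=True)[:limit]
        # 3. Wrap each item in a visual representation with a selectable tag; selecting
        #    the tag would feed its query back through handle_query above to pull in
        #    additional items not already on the page.
        return [
            Card(
                item_id=c.item_id,
                title=c.title,
                relevance=c.weight,
                tag=Tag(label=c.activity_label, query=c.activity_query),
                insights=c.insights,
            )
            for c in ranked
        ]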
Related Publications (1)
Number Date Country
20150248410 A1 Sep 2015 US