Context-based file selection

Information

  • Patent Grant
  • Patent Number
    12,032,518
  • Date Filed
    Thursday, April 27, 2023
  • Date Issued
    Tuesday, July 9, 2024
  • CPC
    • G06F16/156
    • G06F16/14
  • Field of Search
    • US
    • None
  • International Classifications
    • G06F16/14
  • Disclaimer
    This patent is subject to a terminal disclaimer.
  • Term Extension
    0
Abstract
A method for context-based file selection that includes receiving a request pertaining to searching one or more files for a user; in response to receiving the request, identifying file request context information associated with the request, wherein the file request context information pertains at least in part to a topic; analyzing, based on the file request context information, contents of the one or more files; in response to analyzing the contents of the one or more files, extracting, from the contents of the one or more files, information pertaining to the file request context information; determining one or more suggested contents based on the extracted information; and providing, for display to the user, a display portion of a user interface for presentation pertaining to the one or more files and the one or more suggested contents, the display portion including a representation of at least one of the one or more files and at least one of the one or more suggested contents.
Description
TECHNICAL FIELD

This disclosure relates in general to accessing files using context-based file selection.


BACKGROUND

Computing devices can be employed by a user to create and manage documents such as text files, spreadsheets, pictures, videos, presentations, e-mail messages, web pages, etc. As computing devices and associated networks become more capable and complex and storage options become more extensive, finding and selecting a particular document can be difficult.


SUMMARY

This disclosure relates generally to systems, methods and apparatuses that may be used to more quickly access files by relying on context-based file selection.


An aspect of the disclosed embodiments is a method that includes: responsive to a request to access a file stored in a memory of a computing device, identifying, by a processor, a context in which the access to the file is being requested; identifying, by the processor, one or more computer files that at least partially match the context; generating, by the processor, for a display, a list of stored files selectable by a user, the list of stored files including the identified one or more computer files at least partially matching the context; and generating, by the processor, for the display, a list of user contacts selectable by the user, the list of user contacts including one or more user contacts identified as having communications with the user that include the identified one or more computer files at least partially matching the context, wherein each of the user contacts in the list of user contacts includes a link to the identified one or more computer files at least partially matching the context that are included in communications between a respective user contact and the user.


Another aspect of the disclosed embodiments is a computing device that includes a memory and a processor to execute instructions stored in the memory to: responsive to a request to access a file stored in a memory of a computing device, identify a context in which the access to the file is being requested; identify one or more computer files that at least partially match the context; generate, for a display, a list of stored files selectable by a user, the list of stored files including the identified one or more computer files at least partially matching the context; and generate, for the display, a list of user contacts selectable by the user, the list of user contacts including one or more user contacts identified as having communications with the user that include the identified one or more computer files at least partially matching the context, wherein each of the user contacts in the list of user contacts includes a link to the identified one or more computer files at least partially matching the context that are included in communications between a respective user contact and the user.


Variations in these and other aspects of this disclosure will be described in additional detail hereafter.





BRIEF DESCRIPTION OF THE DRAWINGS

The description herein makes reference to the accompanying drawings wherein like reference numerals refer to like parts throughout the several views, and wherein:



FIG. 1 is a block diagram of an exemplary computing device in which implementations of the teachings herein may be incorporated;



FIG. 2 is a flowchart of a process for creating a document according to an implementation of the teachings herein;



FIG. 3 is a diagram of user generated context according to an implementation of the teachings herein;



FIG. 4 is a diagram of a user generating a file access request according to an implementation of the teachings herein;



FIG. 5 is a diagram of a context-based file open dialog according to an implementation of the teachings herein;



FIG. 6 is a diagram of user generated context according to an implementation of the teachings herein;



FIG. 7 is a diagram of a user generated file access request according to an implementation of the teachings herein; and



FIG. 8 is a diagram of a context-based file open dialog according to an implementation of the teachings herein.





DETAILED DESCRIPTION

As computing devices and networks such as the Internet to which they are attached become larger and more capable, locating information can become a more difficult task. For example, trying to find a single computer file among thousands of files can be a time-consuming task. On many computing devices, the user is presented with an “open file” dialog box when the user desires to access a file. The open file dialog lists file names from either the last location accessed or a default location. In order to find a particular file, the user must navigate through a potentially large and complex file system to locate the file.


The file system to be navigated to locate a file can be enormous and contain many files, making quick access to a given file a very difficult and time-consuming task. Using a search function can assist in finding a particular file but can require that a separate application be run and the search results entered into the file open dialog. Further, the ability of a search engine to find a particular file can be a function of the quality of the search string formed by a user. To form a high quality search string, relevant information regarding the file must be remembered to make searching efficient.


One addition to a file open dialog to make file locating easier is the “recent” feature that organizes file names of recently accessed files in a separate list. Organizing file names in a “recent” list can make searching for a particular file easier if the file has been recently accessed. However, this list cannot help locate the file if the file has not been accessed recently, since “recent” lists are limited in the number of entries to make the list manageable. Another problem is that some recent lists include all files accessed recently, not just files relevant to the application or task currently being undertaken by the user, thereby lowering the likelihood that a desired file is listed in a recent list.


In contrast, and in accordance with the teachings herein, candidate files may be found by using the context associated with the request to open a file to help find the correct file. Aspects of disclosed implementations assist in accessing files on a computing device by taking into account the context within which a computer file is being accessed. Context can be defined as information regarding past and current activities occurring on a computer system. More precisely, context can be defined as information regarding past and current activities as related to current user activity. For example, the current activity can be running a particular application on a computing device and the context can include information regarding what files are accessed by the application and what other applications might be executing at the same time. Context may be stored in a context file that contains the information regarding past and current use of a computing device and how it is related to other activities. The terms “context” and “context file” will be used interchangeably herein.


Context information can be created by a software program that observes actions being performed on a computer and makes decisions regarding which actions can be included in the context file. A context software program can use the context information from the context file to suggest files to access or other actions based on current activities. For example, context information regarding a user's current and past interaction with a computing device can be used to search the file system, filter file names found and present to the user a list of file names belonging to files that have been determined to be relevant to the current activity occurring on the computing device. The presentation is responsive to a request to open a file or a request to attach a file to an email, etc. That is, for example, the current context may be compared to the stored context to find files in response to a request to access a file, such as to open a file or attach a file within an already open application. Herein, a user request for a file may be interchangeably referred to as a request to access a file, to search for a file, to select a file or to open a file.
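
For illustration only, the following Python sketch shows one way such a background observer might append activity records to a context file. The ContextStore class, the event schema and the file name context.jsonl are assumptions made for this sketch, not details from the disclosure.

```python
import json
import time
from pathlib import Path

class ContextStore:
    """Hypothetical append-only store for observed activity (illustrative only)."""

    def __init__(self, path="context.jsonl"):
        self.path = Path(path)

    def record(self, kind, **details):
        # Append one observed activity: a file access, an application
        # launch, a visited web site, and so on.
        event = {"time": time.time(), "kind": kind, **details}
        with self.path.open("a") as f:
            f.write(json.dumps(event) + "\n")

store = ContextStore()
store.record("file_open", file="report.xls", app="spreadsheet")
store.record("app_start", app="email_client")
```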


Context information regarding a user's interaction with a computing device can be acquired in real time, as a user is interacting with the computing device. For example, one implementation may observe the text a user is entering into a document by typing and form context information while the document is being prepared for use in selecting candidate files for presentation to the user.


The presentation to the user may be a dialog with file identifiers including file names and/or thumbnails for files determined to be relevant to the context in which the file is requested. This can provide a high likelihood that at least one of the file names presented is a file name desired by the user at that particular time. Further, selecting a file name from the file names presented can be incorporated into the context associated with the file and thereby make the next file selection task produce an improved list of file names that can be even more relevant to the user.


Accessing a file on a computing device can require that the accessing device know more than simply the file name of the file to be opened. Accessing a file can require, for example, information regarding the location of the file and possibly passwords or other security information such as encryption keys. “File name” will be used herein as a shorthand descriptor for the information required in identifying, locating and accessing files with a computing device.


Information in addition to file names can be presented to a user as a result of analyzing the context associated with a user or computing device. For example, when composing an e-mail message, a context application can analyze the contents of the e-mail as it is being written and suggest possible recipients in addition to suggesting possible attachments to the e-mail. In another example, if dates and times are mentioned when composing an e-mail or other document, a context analysis application can analyze the contents of the document and extract information from a calendar program related to the dates and times. This information can be used to create a new appointment to be entered into the calendar or can be displayed for a user to prevent double booking appointments, for example.



FIG. 1 is a block diagram of an exemplary computing device 100 in which implementations of the teachings herein may be incorporated. Computing device 100 can be in the form of a computing system including multiple computing devices, or in the form of a single computing device. For example, computing device 100 may be a stationary computing device, such as a personal computer (PC), a server, a workstation, a minicomputer, or a mainframe computer; or a mobile computing device, such as a mobile telephone, a personal digital assistant (PDA), a laptop, or a tablet PC. Although shown as a single unit, any one or more elements of computing device 100 may be integrated into any number of separate physical units connected for interaction with each other either physically or wirelessly. This connection may be over a network, such as the Internet, a wired or wireless local area network (LAN), a cellular telephone network or any combination of these. In some implementations, certain components of computing device 100 may be omitted. In others, additional components may be included.


A CPU 102 in computing device 100 can be a conventional central processing unit. Alternatively, CPU 102 can be any other type of device, or multiple devices, capable of manipulating or processing information now-existing or hereafter developed. Although the disclosed implementations can be practiced with a single processor as shown, e.g., CPU 102, advantages in speed and efficiency may be achieved using more than one processor.


A memory 104 in computing device 100 can be a read only memory (ROM) device or a random access memory (RAM) device in an implementation. Any other suitable non-transitory type of storage device can be used as memory 104. Memory 104 can include code and data 106 that is accessed by CPU 102 using a bus 108. Memory 104 can further include an operating system 110 and application programs 112, the application programs 112 including at least one program that permits CPU 102 to perform methods described herein. For example, application programs 112 can include applications 1 through N, which further include an application for context processing that performs methods described here. Computing device 100 can also include a secondary storage 114 that can, for example, be a memory card used with computing device 100 when it is mobile. Images described herein may contain a significant amount of information, so they can be stored in whole or in part in secondary storage 114 and loaded into memory 104 as needed for processing.


Computing device 100 can also include one or more output devices, such as a display 116. Display 116 may be, in one example, a touch sensitive display that combines a display with a touch sensitive element that is operable to sense touch inputs. Display 116 can be coupled to CPU 102 via bus 108. Other input devices that permit a user to program or otherwise use computing device 100 can be provided in addition to or as an alternative to display 116, such as a keyboard or mouse. Display 116 can be implemented in various ways, including by a liquid crystal display (LCD), a cathode-ray tube (CRT) display or a light emitting diode (LED) display, such as an OLED display.


Computing device 100 can also include or be in communication with an image-sensing device 118, for example a camera, or any other image-sensing device 118 now existing or hereafter developed that can sense and capture an image such as the image of a user operating computing device 100. Image-sensing device 118 can be positioned such that it is directed toward or away from the user operating computing device 100. In an example, the position and optical axis of image-sensing device 118 can be configured such that the field of vision is directed away from display 116 and is able to produce an image visible on display 116.


Computing device 100 can also include or be in communication with a sound-sensing device 120, for example a microphone or any other sound-sensing device now existing or hereafter developed that can sense sounds near computing device 100. Sound-sensing device 120 can be positioned such that it is directed toward a user operating computing device 100 and can be configured to receive sounds, for example, speech or other utterances, made by the user while the user operates computing device 100.


Although FIG. 1 depicts CPU 102 and memory 104 of computing device 100 as being integrated into a single unit, other configurations can be utilized. The operations of CPU 102 can be distributed across multiple machines (each machine having one or more processors) that can be coupled directly or across a local area or other network. Memory 104 can be distributed across multiple machines such as a network-based memory or memory in multiple machines performing the operations of computing device 100. Although depicted here as a single bus, bus 108 of computing device 100 can be composed of multiple buses. Further, secondary storage 114 can be directly coupled to the other components of computing device 100 or can be accessed via a network and can comprise a single integrated unit such as a memory card or multiple units such as multiple memory cards. Computing device 100 can thus be implemented in a wide variety of configurations.


Computing device 100 can be used to locate and open computer files stored anywhere on a computing device or a network to which the computing device is connected. Computing device 100 can also be used to create documents such as e-mail messages or other documents and store them locally and/or transmit them to other computing devices via a network. Aspects of disclosed implementations can observe one or more user's activities on a computing device, extract information from the observations according to rules and store the extracted information in memory. Other aspects can use the information in the memory to suggest file names, e-mail recipients or other information to a user when the user's current activities on the computing device suggest that information from the context file might be useful to the user.



FIG. 2 is a flowchart of a process 200 for using context to assist a user in accessing a document according to an implementation of the teachings herein. Process, or method of operation, 200 can be implemented, for example, as a software program that is executed by computing devices such as computing device 100. The software program can include machine-readable instructions that are stored in a memory such as memory 104 that, when executed by a processor such as CPU 102, cause the computing device to perform process 200. Process 200 can also be implemented using hardware. As explained above, some computing devices may have multiple memories and multiple processors, and the steps of process 200 may in such cases be distributed using different processors and memories. Use of the terms “processor” and “memory” in the singular encompasses computing devices that have only one processor or one memory as well as devices having multiple processors or memories that may each be used in the performance of some but not necessarily all of the recited steps.


For simplicity of explanation, process 200 is depicted and described as a series of steps. However, steps in accordance with this disclosure can occur in various orders and/or concurrently. Additionally, steps in accordance with this disclosure may occur with other steps not presented and described herein. Furthermore, not all illustrated steps may be required to implement a method in accordance with the disclosed subject matter.


Process 200 may be implemented as an application (e.g., a mobile application) executing locally on a computing device of a user or may be executed on a remote computing device in communication with the computing device of a user to transmit commands from the computing device and receive information from the remote computing device, such as through a web browser, for example.


At step 202, responsive to a request to select a file stored in memory, such as a request to open a file or a request to attach a file to an email, process 200 can identify a context associated with the user at the time of making the request. The request can take the form of a user using an input device such as a keyboard, mouse or touch screen to indicate to a software program executing on a computing device that the user would like to select a file to open, for example. Aspects of disclosed implementations can also use gesture-sensitive devices or voice recognition to determine that a user would like to select a file.


Identifying the context may be performed by a software program or application that runs in the background as other computing tasks are performed on the computing device. The context identifying portion of process 200 can observe and record activities performed on the computing device in response to input from a user, for example running an application. The context forming application can observe and record any and all activities occurring on the computing device including user input, file access, program execution, network access including internet browsing and changes in the computing device environment including re-configuring hardware and software or other settings.


The context can be associated with a particular user of a computing device or can be associated with the computing device such that it is associated with any and all users of the computing device. On computing devices that require that users log on, the user associated with a context can be identified by identifying the user currently logged into the computing device. On computing systems that do not require an explicit login, a user can be identified by requesting that the user provide their identity to an application running on the computing device. Alternatively, computing devices equipped with video or audio input devices as described above can use automatic facial recognition software or automatic voice recognition software to identify a user. In other implementations, process 200 can update the context without requiring identification of a user, assuming that the context will apply to any user of a computing device and assigning a dummy user ID to the acquired context.


The context identified at step 202 can capture the current state of the computing device including files currently or recently being accessed, software programs currently or recently being executed, storage devices currently or recently being accessed or web sites currently or recently being visited, for example. In the case of files being accessed, the context can include information related to file names, file owners, file types, file size, creation dates, modification dates, last access date, whether the file is shared with other users, the location where the file is stored and information regarding the contents of the file. This information, particularly information regarding the file contents, can be automatically determined by content analysis software. In other implementations, a user can be prompted to enter some or all of the information by process 200.
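
Purely as an illustrative sketch, the per-file attributes enumerated above could be collected in a record such as the following; the field names are assumptions for this example, not a schema from the disclosure (Python 3.9+ for the built-in generic types).

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FileContext:
    # Field names mirror the attributes listed in the text; the schema
    # itself is an assumption for this sketch.
    name: str
    owner: str
    file_type: str
    size_bytes: int
    created: datetime
    modified: datetime
    last_accessed: datetime
    shared: bool
    location: str
    content_keywords: list[str] = field(default_factory=list)
```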


In the case of software programs being executed, context information can include an identifier for the computing device currently being used by the user, the current time and date, information regarding the networks with which the computing device is in communication, the physical location of the computing device and the user's identity. For example, a first context can include identifying a particular user as interacting with a first computing device, which can be a laptop computer located at a business location connected to a business network and the Internet, at 10:00 AM on a Monday morning. A second context can include identifying a particular user as interacting with a cell phone connected to a cellular telephone network and the Internet located on a highway at 5:30 PM on a weekday. A third context can include a particular user interacting with a tablet computer connected to the Internet at home at 9:00 PM on a weeknight.


These three examples show how even small amounts of context information can assist in locating files. For example, in the first example shown above, the context can establish that the user is at work and can be intending to open files related to work. The second example can establish that the user is using a mobile phone and likely driving in a car. Files desired by a user in this situation can relate to maps, appointments outside of work and shopping, for example. In the third example a user can be using a tablet computer at home and the desired files can relate to entertainment such as movies or TV shows, for example.


Context can also be identified by which software programs are currently executing on a computing device. This can include software programs such as operating systems, internet access applications such as web browsers, word processors, picture processing and archiving applications, e-mail applications, calendar applications, and social media applications. The context can include information regarding the data being handled by the applications including documents, images, e-mails, appointments, web sites visited and data received from social media software.


Context can also include information determined to be universally relevant to context regardless of the state of a particular software program. For example, although the calendar application may not currently be open on a particular computing device, the context for a user on that computing device at a particular time can include an upcoming scheduled meeting from the calendar database.


Context can also include information from an application currently being accessed by a user, updated in real time. For example, a user can be entering text into a file. While the user is entering text, the text may be analyzed to gather clues regarding what types of files might be requested by a user. In another example, a user can be surfing the Internet searching a particular topic. The search strings and contents of web pages may be analyzed to anticipate what files might be requested by the user. Context analysis software can be running in the background on a computing device on an ongoing basis, observing operational behavior and continuously updating the context based on the observations.
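
A minimal sketch, assuming a simple word-frequency heuristic, of how text being typed might be mined for context clues in real time; the stop-word list and function name are illustrative only.

```python
import re
from collections import Counter

# A tiny stop-word list for the sketch; a real observer would use a
# fuller list and richer analysis.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "on", "for",
             "are", "our", "from", "here"}

def extract_keywords(text, top_n=5):
    """Pull candidate context keywords from text the user is entering."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(top_n)]

# Could be called on each pause in typing to refresh the live context.
print(extract_keywords("Here are the pictures from our vacation to Paris"))
```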


Context can also include information on past behavior. Context analysis software can track which files were opened by a user when running a particular program at a particular time. If a user is running the same programs on the same computing device at a similar time of day and opens a particular file, it can be assumed from the context that the user may desire to open other files that were also opened along with the particular file, for example.
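
As a hedged illustration of tracking co-opened files, the following sketch counts which files are opened together in a session and suggests companions of a just-opened file; the data structures and example file names are assumptions, not the patented algorithm.

```python
from collections import defaultdict

# Counts of files opened in the same session, keyed by file name.
co_opened = defaultdict(lambda: defaultdict(int))

def record_session(files):
    """Record that all files in one session were opened together."""
    for a in files:
        for b in files:
            if a != b:
                co_opened[a][b] += 1

def suggest_companions(file, limit=3):
    """Suggest files most often opened alongside the given file."""
    ranked = sorted(co_opened[file].items(), key=lambda kv: -kv[1])
    return [name for name, _ in ranked[:limit]]

record_session(["status.xls", "notes.doc", "budget.xls"])
record_session(["status.xls", "budget.xls"])
print(suggest_companions("status.xls"))  # ['budget.xls', 'notes.doc']
```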


It is worth noting that the information used in forming a context for a user or computing system is not designed to be accessed by any other computing device or entity that may be permanently or intermittently associated with the computing device. The context information is not shared or transmitted to any system for any purpose other than providing context for a specific user that is used to search for files, which may be located on storage associated with the computing device or on a network, which may include the Internet. The information that aspects of disclosed implementations expose to systems outside of the computing device can be limited to information regarding which file or files to inspect to determine if they are relevant to a current file access. Thus, context information is not directly accessible by any other computing device.


Context information can be divided into different categories and made active for different periods of time. For example, it can be observed that a particular user or computing device accesses files associated with health and fitness during the entire year, while files associated with a particular sport are accessed mainly during the season the sport is active. File names for files associated with health and fitness can therefore be included in the list of relevant file names at any time of the year. File names for files associated with a particular sport can be included in the list of relevant file names only during and near the time the sport is in season.
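
A minimal sketch of the time-windowed relevance described above, assuming a hand-written table of active months per category; the month ranges are invented for the example.

```python
from datetime import date

# Invented relevance windows: a category is "active" only during the
# listed months, mirroring the in-season example in the text.
ACTIVE_MONTHS = {
    "health_fitness": set(range(1, 13)),  # relevant all year
    "baseball": {4, 5, 6, 7, 8, 9, 10},   # roughly April through October
}

def category_active(category, today=None):
    today = today or date.today()
    return today.month in ACTIVE_MONTHS.get(category, set())

print(category_active("baseball", date(2024, 1, 15)))        # False: off-season
print(category_active("health_fitness", date(2024, 1, 15)))  # True: year-round
```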


The algorithm used to determine context can be heuristic, meaning that it can be based on a rule-driven algorithm with a list of rules that can be edited or modified either automatically based on experience or manually by editing existing rules or entering new rules. In this way the behavior of the algorithm that forms the context can evolve as it observes activities on the computing device or can be guided towards a particular behavior by editing the rules. Continuing the example started above, if a user decides that they are no longer a fan of a particular sport, and the context forming algorithm continues to find file names of files associated with that particular sport, the file containing the heuristic rules can be made editable to permit the user to delete references or rules related to that sport.
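
For illustration, a user-editable rules file might look like the following; the JSON structure, keys and weights are assumptions for this sketch rather than the format used by the disclosed implementations.

```python
import json

# An invented structure for an editable rules file.
RULES_JSON = """
[
  {"if_keyword": "baseball", "boost_category": "baseball", "weight": 2.0},
  {"if_weekday": "Mon", "boost_category": "status_reports", "weight": 1.5}
]
"""

rules = json.loads(RULES_JSON)

# Deleting the sport-related rule, as in the example above, amounts to
# removing its entry and writing the file back out.
rules = [r for r in rules if r.get("boost_category") != "baseball"]
print(rules)
```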


Context information can be used to suggest actions to a user in addition to file names. For example, possible e-mail recipients, calendar events or website addresses could be suggested to a user based on context information. Context can be analyzed by an automatic context analysis algorithm in real time with the results of the analysis combined with previously stored context information to suggest files, users, locations or events for a user to attach, contact, visit, view or attend. Context can be used to refer to information or objects residing on any computing device associated with a user including mobile or networked devices. For example, context for a user at a computing device can include information on photographs residing on a user's mobile phone, assuming the user has indicated that the contents of their mobile phone can be included in the context for that user.



FIG. 3 is an example showing an e-mail message 300 being created by a user. Message 300 includes an e-mail address 302 of a sender, an e-mail address 304 of an intended recipient and a message body 306. Aspects of disclosed implementations can analyze this e-mail as it is being written to add information to the context file associated with a user of the computing device. For example, based on message 300, the context file can be updated to include the receiver's e-mail address, a previous vacation and a reference to a family member. Context developed from this information can include other previously stored information such as information regarding the receiver including network or system identification, file ownership or sharing and joint participation in past or future meetings from a related calendar program. Other e-mail messages to or from the intended recipient can be examined to determine if context information from those e-mails could also be used.


Context can also include information regarding the vacation mentioned in body 306 of message 300. By examining a calendar related to the user or computing device, it can be determined, for example, that the user scheduled a vacation to this location for certain dates in the recent past. By examining the user's communications with travel websites it could be determined which airlines, rental cars and hotels were used on which dates, thereby establishing an itinerary for the user. In addition, the context algorithm can observe that the word “pictures” was used in the e-mail message and by combining the itinerary information extracted above with date and geographic information associated with picture files determine which picture files accessible to the computing device are associated with this vacation, for example.



FIG. 6 is a diagram showing a document 600 being created by a user using a word processing application. Document 600 includes a title 602 and a date range 604. This information can be automatically extracted from document 600 as it is being typed and then be analyzed by process 200 for inclusion in the context. As in the example above, this information can be combined with other information available to the context algorithm to create a richer, more inclusive context. The dates can be correlated with a calendar and travel information as described above to establish that the user has taken a trip and determine, for example, whether the trip was a vacation or a business trip, whether a conference was included and whether the user delivered a paper at the conference. The context can be prepared to identify files related to this trip, whether they are vacation photos, new business contacts or a copy of slides presented at a conference, e.g., when the user seeks to access another file, such as an image file.


Returning to FIG. 2, process 200 identifies computer files associated with the context at step 204. This can occur responsive to a request to access a file or files from an application based on a user action or initiated by the application itself. For example, when opening a spreadsheet application, one of the first things the application does is present a dialog box to the user requesting a spreadsheet file to open. Applications can have a single default location to which the file open dialog defaults, or use the last location accessed. If the desired file is not at the default location, a user would have to remember where the desired file was stored and navigate to that location to open the file.


According to the teachings herein, however, the dialog box may include a list of file names associated with a context at the time the request to open a file occurs. The context may include, for example, whether a particular spreadsheet is opened during the last week of one or more months. The context forming algorithm can have rules that recognize that particular reports are due at the end of the month and status reporting spreadsheets are normally worked on at this computing device during the last week of the month. As a result, a file name list can be formed that includes status reporting spreadsheets that have been opened during the last week of a month.


More generally, process 200 can use the current context information to form search strings. These search strings can be applied to a searchable file system that includes files accessible to the user, producing a list of files relevant to each search string. Readily available searching technology can provide scores along with the file names and locating information so that the list of files can be sorted according to a “relevance” measure that measures how closely each file matches the search criteria. In this way, the files that are determined to be most relevant to the current context can be presented to the user.
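
A minimal sketch of ranking candidates by such a relevance measure, using simple term overlap as a stand-in for whatever scoring a real search engine would provide; the file names and keywords are invented.

```python
# Simple term-overlap scoring as a stand-in for a search engine's
# relevance measure.
def relevance(file_keywords, context_terms):
    return len(set(file_keywords) & set(context_terms))

candidates = {
    "paris_photos.zip": ["paris", "vacation", "pictures"],
    "q3_budget.xls": ["budget", "status", "quarterly"],
}
context_terms = ["vacation", "pictures", "paris"]

# Sort candidate files so the best context match is presented first.
ranked = sorted(candidates, key=lambda f: -relevance(candidates[f], context_terms))
print(ranked)  # ['paris_photos.zip', 'q3_budget.xls']
```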


File names to be included in a list to be presented to a user can be acquired by forming file access templates and returning all file names that match the criteria in the template. In the example above, the search may include all file names that start with the string “monthly status” and end in the file type extension “.xls” and have been opened during the last week of a month. File names can be identified by matching information contained in the file header such as name, creation date, creator name, or last modified date.
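
The template from this example might be expressed as in the following sketch; the pattern string and the seven-day window are assumptions drawn from the prose above.

```python
import calendar
import fnmatch
from datetime import date

def in_last_week_of_month(d):
    """True if the date falls in the final seven days of its month."""
    days_in_month = calendar.monthrange(d.year, d.month)[1]
    return d.day > days_in_month - 7

def matches_template(name, last_opened):
    """Apply the example template: name pattern plus access-date rule."""
    return (fnmatch.fnmatch(name.lower(), "monthly status*.xls")
            and in_last_week_of_month(last_opened))

print(matches_template("Monthly Status July.xls", date(2024, 7, 28)))  # True
print(matches_template("Monthly Status July.xls", date(2024, 7, 10)))  # False
```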


File names can also be identified by searching the contents of files and forming matches with rules in the context. Searching for text files by contents can be as simple as: “search for all files containing the word ‘baseball’” or can be more complex, such as: “search for all files containing references to seasons of the year.” In the latter case, a heuristic search strategy might develop a list of words related to seasons, along with their synonyms, to assist in searching.
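
A sketch of the season search with synonyms, assuming a small hand-built synonym table; the word lists are illustrative, not exhaustive.

```python
# A hand-built synonym table for the "seasons of the year" search.
SEASON_TERMS = {
    "spring": {"spring", "springtime"},
    "summer": {"summer", "summertime"},
    "autumn": {"autumn", "fall"},
    "winter": {"winter", "wintertime"},
}

def mentions_a_season(text):
    """True if the text contains any season word or synonym."""
    words = set(text.lower().split())
    return any(words & synonyms for synonyms in SEASON_TERMS.values())

print(mentions_a_season("We went hiking last fall"))  # True
print(mentions_a_season("Quarterly budget review"))   # False
```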


Files containing data that is not normally searchable can be searched by including keywords. For example, photographs can be identified by having a user add keywords to identify the photograph. Contents of files such as photographs can also be automatically processed to identify keywords to be associated with the image file. Software is currently available that analyzes photographs according to the contents and automatically identifies people (names) and places (for example, the Eiffel Tower). This type of information can be used to identify a photograph for the algorithm that selects files based on the context.


Other information can be used to identify files like photographs. For example, some smart phones include GPS units that can identify the latitude and longitude at which a photograph was taken. Combining this information with information from a geographic information system (GIS) can indicate in which country or state a photograph was taken, thereby providing further identification information for the photograph. Combining this information with the dates and further information regarding time off work can indicate that the photographs were taken on vacation to a particular location, for example.



FIG. 4 shows a user selecting an “add attachment” icon 402 related to e-mail message 300. Since, as described in relation to FIG. 3, the context algorithm has analyzed message body 306 and possibly combined it with other information from the system, process 200 is prepared to select file names associated with e-mail message 300 to present to the user in response to the request to add a file as an attachment. In this case, the attachments can be files, particularly pictures, associated with a vacation to a particular location at a particular time.



FIG. 7 shows a user selecting an “insert image” or “add image” icon 702 associated with document 600 from FIG. 6. Since, as described in relation to FIG. 6, the context application has analyzed the contents 602, 604 of document 600, the context application can identify files containing images associated with the trip mentioned in the document when the user selects the insert image icon 702. Following the example begun above, if the trip were a vacation, they could be vacation photos. If the trip were a conference, the method of operation can identify file names associated with the conference, for example copies of slides presented at a meeting.


Returning again to FIG. 2, the information selected by the context application at step 204 can be displayed for a user at step 206. As discussed above, this can include file names for files determined by the context to be associated with the application currently accepting commands from a user, or other information included in the context. This other information can include suggested recipients for an e-mail message, suggested users with whom to share a document or photograph, data from social media sites, website addresses, favorite files and websites, e-mail messages or calendar entries, for example.



FIG. 5 is an example of a dialog 500 associated with e-mail message 300 of FIG. 3. In response to a user selecting the “add attachment” icon 402 in FIG. 4, process 200 may display dialog 500 including lists 502, 504, 506, 508 and 510 and a time period 512. Dialog 500 includes a list 502 of links to photographs (“PHOTOS”) and social media sites (“APP1”, “APP2”) selected by the context to apply to the current e-mail message. The links in list 502 at least partially match search strings formed by process 200 using the current context in response to the user requesting a file by selecting icon 402. Dialog 500 also includes a list 504 of suggested recipients for message 300 based on the analyzed contents and a list 506 of other context and calendar events related to message 300. List 504, for example, includes links to e-mail addresses (“USER1”, “USER2” and “USER3”) from a user's e-mail address book that process 200 has identified as having sent or received e-mails containing data that at least partially match the search strings formed by process 200. Similarly, list 506 may include events from a calendar program that at least partially match the search strings formed by process 200.


Dialog 500 can also include a list 508 with thumbnail images of “favorites” previously selected by the context application based on their frequency of access to be presented in dialog 500 regardless of the contents of related documents. Desirably, however, the files of list 508 at least partially match the search strings formed by process 200. List 508 may include reduced views of the contents of files to help a user quickly determine which file to open. List 508 may also include a list of file names to permit a user to select a file that has not had the contents displayed. This permits process 200 to display more files than if a thumbnail of the contents were displayed for all files identified. Match scores for the files with respect to the search strings can determine which files to show with a thumbnail of the contents and which files to display with only the file name.
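
As an illustrative sketch of that choice, files whose match score clears a threshold could be flagged for thumbnail display while the rest are listed by name only; the threshold and the 0-to-1 score scale are assumptions for this sketch.

```python
# Files scoring above the threshold are shown with a thumbnail; the
# rest are listed by name only.
THUMBNAIL_THRESHOLD = 0.7

def display_modes(scored_files):
    return [
        (name, "thumbnail" if score >= THUMBNAIL_THRESHOLD else "name_only")
        for name, score in sorted(scored_files, key=lambda kv: -kv[1])
    ]

print(display_modes([("paris1.jpg", 0.9), ("notes.txt", 0.4)]))
# [('paris1.jpg', 'thumbnail'), ('notes.txt', 'name_only')]
```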


The context application can also suggest and display a list 510 of related messages and other files including documents or events received, opened and/or scheduled during a particular time period 512, for example the current day. The items in list 510 at least in part match the search strings. Similarly to list 508, list 510 may include thumbnails of file contents (“Just shared with you”) and lists of items that can include e-mail messages or contacts (“USER4”), files or events, for example. In this example, list 510 includes files that have been recently accessed by the user (“TODAY”).



FIG. 8 is an example of a dialog 800 associated with document 600 of FIG. 6. Dialog 800 was formed in response to the selection by the user of icon 702 of FIG. 7 while editing document 600. Process 200 has determined from the context that the files the user desires to open include image files; hence the heading “Photos” shown for dialog 800. Items displayed in dialog 800 can include a list 802 of links to files including applications and website addresses including a social media site (“APP1”), photo web sites such as a picture archiving application (“GALLERY”) and e-mail messages, all of which include image files related to the search strings formed by process 200.


Also displayed is a list 804 of e-mail addresses (“USER1”, “USER2”, “USER3”, “USER4”, “USER5” and “USER6”) from a user's e-mail address book that process 200 has identified as having sent or received e-mails that include data that at least partially matches the search strings formed by process 200. These addresses may be links to a list of documents associated with each address that are found by process 200. A list 806 of files, including image files that process 200 has identified as being related to the context formed from document 600, is included in dialog 800. List 806 may include thumbnails residing on storage devices accessible by the user that at least in part match the search string formed by process 200 in response to the user requesting a file. Also included in dialog 800 are lists 808, 810 of additional file names of files that have been identified by process 200 as being related to the context formed from document 600, arranged in groups according to dates upon which they were acquired. List 808 includes a list of items shared with the user in the past week (“PAST WEEK”), and list 810 includes a list of items shared with the user during a specific past time period (“MARCH 2013”) that at least in part match the search strings. Lists 808, 810 can include thumbnails of files.


Returning to FIG. 2, one of the particular advantages resulting from the teachings herein is speed. Providing a selected set of file names based on context can permit a user to locate and open a file more quickly than using a standard file open dialog that requires that one navigate to a location and filter results. Process 200 can identify and list files regarded as relevant to the user and the task identified by the context, thereby reducing the number of irrelevant data items presented to a user.


In operation, in the event that a dialog presented by process 200 does not include any files that are deemed relevant by the user, the user can select an option, such as through a button or other display item, to return to a standard or default file open dialog and permit the user to identify, locate and open a file using that file open dialog. In FIG. 5, for example, a search box is included by which a user may enter a command such as “default” to generate the file open dialog.


Items in addition to files can be presented to the user along with the generated list of stored files. For example, links to social media sites associated with a user can be searched for content that matches the search strings compiled by process 200. E-mail address books can be searched for e-mail addresses that have provided content that matches the search string, and calendars can be searched for appointments or meetings associated with the search strings compiled by process 200. In addition, recently visited web sites or storage locations can be listed along with recently shared files, e-mails, appointments or web sites from other users. Following presentation of the lists of files, addresses and websites to a user, process 200 permits the user to select a file, address or website to navigate to.


The list of stored files generated at step 206 can be displayed to a user to permit the user to select a file to open or include as an attachment to an e-mail or calendar event, for example. Presenting a list of stored files generated in this fashion can permit a user to find a desired file more quickly with less work than using a standard file open dialog. In addition, providers of software and services such as search or social media are increasingly gathering more and more information regarding online behavior. This information may be used by process 200 to identify patterns of behavior that can be used to find files appropriate to a user's request to open files.


In situations in which the systems discussed here collect personal information about users, or may make use of personal information, the users may be provided with an opportunity to control whether programs or features collect user information (e.g., information about a user's social network, social actions or activities, profession, a user's preferences or a user's current location), or to control whether and/or how to receive content from the content server that may be more relevant to the user. In addition, certain data may be treated in one or more ways before it is stored or used so that personally identifiable information is removed. For example, a user's identity may be treated so that no personally identifiable information can be determined for the user, or a user's geographic location may be generalized where location information is obtained (such as to a city, ZIP code, or state level), so that a particular location of a user cannot be determined. Thus, the user may have control over how information is collected about the user and is used by a content server.


Using the teachings herein, the most appropriate suggestions for accessing a file, such as opening a file or attaching a file within an existing application process, may be presented to a user at the time the request is made. The teachings may be implemented in a 100% heuristic-based algorithm to improve searches over time. By using the teachings herein, users can spend less time picking files and more time with task-related activity such as performing operations and the like using the files.


The words “example” or “exemplary” are used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the words “example” or “exemplary” is intended to present concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to mean any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form. Moreover, use of the term “an implementation” or “one implementation” throughout is not intended to mean the same embodiment or implementation unless described as such.


Implementations of computing device 100 (and the algorithms, methods, instructions, etc., stored thereon and/or executed thereby) can be realized in hardware, software, or any combination thereof. The hardware can include, for example, computers, intellectual property (IP) cores, application-specific integrated circuits (ASICs), programmable logic arrays, optical processors, programmable logic controllers, microcode, microcontrollers, servers, microprocessors, digital signal processors or any other suitable circuit. In the claims, the term “processor” should be understood as encompassing any of the foregoing hardware, either singly or in combination.


Further, in one aspect, for example, computing device 100 can be implemented using a general purpose computer or general purpose processor with a computer program that, when executed, carries out any of the respective methods, algorithms and/or instructions described herein. In addition or alternatively, for example, a special purpose computer/processor can be utilized which can contain other hardware for carrying out any of the methods, algorithms, or instructions described herein.


All or a portion of implementations of the present invention can take the form of a computer program product accessible from, for example, a tangible computer-usable or computer-readable medium. A computer-usable or computer-readable medium can be any device that can, for example, tangibly contain, store, communicate, or transport the program for use by or in connection with any processor. The medium can be, for example, an electronic, magnetic, optical, electromagnetic, or a semiconductor device. Other suitable mediums are also available.


The above-described embodiments, implementations and aspects have been described in order to allow easy understanding of the present invention and do not limit the present invention. On the contrary, the invention is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structure as is permitted under the law.

Claims
  • 1. A method comprising: receiving a request pertaining to searching one or more files for a user; in response to receiving the request and in view of a set of rules for identifying file request context information associated with the request, identifying the file request context information associated with the request, wherein the file request context information pertains at least in part to a topic; analyzing, based on the file request context information, contents of the one or more files; in response to analyzing the contents of the one or more files, extracting, from the contents of the one or more files, information pertaining to the file request context information; determining one or more suggested contents based on the extracted information; and providing, for display to the user, a display portion of a user interface for presentation pertaining to the one or more files and the one or more suggested contents, the display portion including a representation of at least one of the one or more files and at least one of the one or more suggested contents.
  • 2. The method of claim 1, wherein the file request context information includes one or more of a file name, a file owner, a file type, a file size, a file creation date, a file modification date, a file access date, a status of a file as being shared, a file location, or a file content.
  • 3. The method of claim 1, wherein the request omits a file identifier identifying the one or more files.
  • 4. The method of claim 1, wherein receiving the request includes receiving the request from a user device, and wherein the file request context information includes information indicating the user device.
  • 5. The method of claim 1, wherein providing, for display to the user, the display portion of the user interface comprises transmitting information representing the display portion to a user device.
  • 6. The method of claim 1, further comprising: identifying one or more candidate files based on the file request context information, and wherein the display portion includes a second representation of at least one of the one or more candidate files.
  • 7. The method of claim 6, wherein identifying the one or more candidate files comprises: forming a search string using the file request context information; and searching using the search string.
  • 8. The method of claim 6, further comprising: identifying one or more candidate user contacts based on the file request context information, each of the one or more candidate user contacts having sent communications to or received communications from the user, wherein the communications include at least one of the one or more candidate files, wherein the display portion includes a third representation of at least one of the one or more candidate user contacts, and wherein the third representation of at least one of the one or more candidate user contacts comprises a link to the one or more candidate files.
  • 9. A system comprising: a memory device storing instructions; and a processing device coupled to the memory device, the processing device to execute the instructions to perform operations comprising: receiving a request pertaining to searching one or more files for a user; in response to receiving the request and in view of a set of rules for identifying file request context information associated with the request, identifying the file request context information associated with the request, wherein the file request context information pertains at least in part to a topic; analyzing, based on the file request context information, contents of the one or more files; in response to analyzing the contents of the one or more files, extracting, from the contents of the one or more files, information pertaining to the file request context information; determining one or more suggested contents based on the extracted information; and providing, for display to the user, a display portion of a user interface for presentation pertaining to the one or more files and the one or more suggested contents, the display portion including a representation of at least one of the one or more files and at least one of the one or more suggested contents.
  • 10. The system of claim 9, wherein the file request context information includes one or more of a file name, a file owner, a file type, a file size, a file creation date, a file modification date, a file access date, a status of a file as being shared, a file location, or a file content.
  • 11. The system of claim 9, wherein the request omits a file identifier identifying the one or more files.
  • 12. The system of claim 9, wherein receiving the request includes receiving the request from a user device, and wherein the file request context information includes information indicating the user device.
  • 13. The system of claim 9, wherein providing, for display to the user, the display portion of the user interface comprises transmitting information representing the display portion to a user device.
  • 14. The system of claim 9, the operations further comprising: identifying one or more candidate files based on the file request context information, and wherein the display portion includes a second representation of at least one of the one or more candidate files.
  • 15. The system of claim 14, wherein to identify the one or more candidate files, the operations further comprise: forming a search string using the file request context information; and searching using the search string.
  • 16. The system of claim 14, the operations further comprising: identifying one or more candidate user contacts based on the file request context information, each of the one or more candidate user contacts having sent communications to or received communications from the user, wherein the communications include at least one of the one or more candidate files, wherein the display portion includes a third representation of at least one of the one or more candidate user contacts, and wherein the third representation of at least one of the one or more candidate user contacts comprises a link to the one or more candidate files.
  • 17. A non-transitory computer-readable storage medium comprising executable instructions that, when executed by a processing device, cause the processing device to perform operations comprising: receiving a request pertaining to searching one or more files for a user; in response to receiving the request and in view of a set of rules for identifying file request context information associated with the request, identifying the file request context information associated with the request, wherein the file request context information pertains at least in part to a topic; analyzing, based on the file request context information, contents of the one or more files; in response to analyzing the contents of the one or more files, extracting, from the contents of the one or more files, information pertaining to the file request context information; determining one or more suggested contents based on the extracted information; and providing, for display to the user, a display portion of a user interface for presentation pertaining to the one or more files and the one or more suggested contents, the display portion including a representation of at least one of the one or more files and at least one of the one or more suggested contents.
  • 18. The non-transitory computer-readable storage medium of claim 17, wherein the file request context information includes one or more of a file name, a file owner, a file type, a file size, a file creation date, a file modification date, a file access date, a status of a file as being shared, a file location, or a file content.
  • 19. The non-transitory computer-readable storage medium of claim 17, wherein providing, for display to the user, the display portion of the user interface comprises transmitting information representing the display portion to a user device.
  • 20. The non-transitory computer-readable storage medium of claim 17, the operations further comprising: identifying one or more candidate files based on the file request context information, and wherein the display portion includes a second representation of at least one of the one or more candidate files; and identifying one or more candidate user contacts based on the file request context information, each of the one or more candidate user contacts having sent communications to or received communications from the user, wherein the communications include at least one of the one or more candidate files, wherein the display portion includes a third representation of at least one of the one or more candidate user contacts, and wherein the third representation of at least one of the one or more candidate user contacts comprises a link to the one or more candidate files.
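Independent claims 9 and 17 above recite a concrete pipeline: receive a search request, identify context information in view of a set of rules, analyze and extract from file contents, determine suggested contents, and assemble a display portion. The following Python sketch is a minimal illustration of that flow under simplifying assumptions; every name in it (Rule, identify_context, build_display_portion, and so on) is hypothetical, and the rules are reduced to keyword matchers rather than whatever rule engine an actual implementation would use.

```python
# Minimal sketch of the claim 9 / claim 17 pipeline. All names are
# hypothetical illustrations, not the patented implementation.
from dataclasses import dataclass


@dataclass
class Rule:
    """One rule for identifying context: evidence keywords -> a topic."""
    topic: str
    keywords: tuple

    def matches(self, request_text: str) -> bool:
        return any(k in request_text.lower() for k in self.keywords)


def identify_context(request_text: str, rules: list) -> dict:
    """Identify file request context information in view of a set of rules."""
    return {"topics": [r.topic for r in rules if r.matches(request_text)]}


def extract_information(files: dict, context: dict) -> dict:
    """Analyze file contents; extract passages pertaining to the context."""
    extracted = {}
    for name, content in files.items():
        hits = [line for line in content.splitlines()
                if any(t in line.lower() for t in context["topics"])]
        if hits:
            extracted[name] = hits
    return extracted


def suggest_contents(extracted: dict, limit: int = 3) -> list:
    """Determine suggested contents from the extracted information."""
    return [hit for hits in extracted.values() for hit in hits][:limit]


def build_display_portion(files: dict, suggestions: list) -> dict:
    """Assemble the display portion: file representations plus suggestions."""
    return {"files": sorted(files), "suggested_contents": suggestions}


rules = [Rule("budget", ("budget", "forecast"))]
files = {"q3.txt": "Budget forecast for Q3\nHeadcount plan",
         "trip.txt": "Flight itinerary and hotel"}
context = identify_context("find the budget doc", rules)
suggestions = suggest_contents(extract_information(files, context))
print(build_display_portion(files, suggestions))
```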
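Claims 10 and 18 enumerate the kinds of file request context information involved. Collected into one record, they might look like the hypothetical schema below; the claims recite the concepts, not field names or types, so everything here is an assumption for illustration.

```python
# Hypothetical record gathering the context fields enumerated in
# claims 10 and 18; field names and types are illustrative only.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class FileRequestContext:
    file_name: Optional[str] = None
    file_owner: Optional[str] = None
    file_type: Optional[str] = None
    file_size: Optional[int] = None            # e.g., bytes
    creation_date: Optional[date] = None
    modification_date: Optional[date] = None
    access_date: Optional[date] = None
    is_shared: Optional[bool] = None           # status of the file as shared
    file_location: Optional[str] = None
    file_content: Optional[str] = None


# Per claim 11, the request itself may omit a file identifier; the
# context can still carry other signals, such as type and owner.
ctx = FileRequestContext(file_type="spreadsheet", file_owner="alice@example.com")
print(ctx)
```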
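Claim 15 refines candidate-file identification into two steps: form a search string from the context information, then search with it. A sketch under assumptions follows; the query grammar (the type: and owner: prefixes) and the naive in-memory scan are stand-ins, not the claimed format or a real index.

```python
# Sketch of claim 15: form a search string from the context, then search.
# The query grammar and the naive scan are illustrative assumptions.
def form_search_string(context: dict) -> str:
    parts = []
    if context.get("topics"):
        parts.append(" OR ".join(context["topics"]))
    if context.get("file_type"):
        parts.append(f"type:{context['file_type']}")
    if context.get("file_owner"):
        parts.append(f"owner:{context['file_owner']}")
    return " ".join(parts)


def search(files: dict, query: str) -> list:
    """Naive full-text scan standing in for a real index lookup."""
    terms = [t for t in query.split() if ":" not in t and t != "OR"]
    return [name for name, content in files.items()
            if any(term.lower() in content.lower() for term in terms)]


context = {"topics": ["budget"], "file_type": "spreadsheet"}
query = form_search_string(context)                   # 'budget type:spreadsheet'
print(search({"q3.txt": "Budget forecast"}, query))   # ['q3.txt']
```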
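Claims 8, 16, and 20 add a third representation: user contacts whose communications with the user carried at least one of the candidate files, each rendered with a link to those files. The sketch below assumes a simple message model and a made-up /files/ link scheme; both are illustrations, not the claimed data structures.

```python
# Sketch of claims 8/16/20: surface contacts whose communications with
# the user include candidate files, linked back to those files. The
# Message model and the /files/ link scheme are assumptions.
from dataclasses import dataclass


@dataclass
class Message:
    contact: str        # the other party to the communication
    attachments: tuple  # file names carried by this message


def candidate_contacts(messages: list, candidate_files: set) -> dict:
    """Map each contact to the candidate files exchanged with the user."""
    linked = {}
    for msg in messages:
        shared = candidate_files.intersection(msg.attachments)
        if shared:
            linked.setdefault(msg.contact, set()).update(shared)
    return linked


def contact_representations(linked: dict) -> list:
    """Third representation: one entry per contact, with links to files."""
    return [{"contact": contact,
             "links": [f"/files/{name}" for name in sorted(names)]}
            for contact, names in sorted(linked.items())]


inbox = [Message("alice@example.com", ("q3.txt",)),
         Message("bob@example.com", ("notes.txt",))]
print(contact_representations(candidate_contacts(inbox, {"q3.txt"})))
# -> [{'contact': 'alice@example.com', 'links': ['/files/q3.txt']}]
```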
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of co-pending U.S. patent application Ser. No. 17/666,434, filed on Feb. 7, 2022, entitled “CONTEXT-BASED FILE SELECTION,” which is a continuation of U.S. patent application Ser. No. 15/807,891, filed on Nov. 9, 2017, entitled “CONTEXT-BASED FILE SELECTION,” now issued as U.S. Pat. No. 11,243,912, which is a continuation of U.S. patent application Ser. No. 14/010,850, filed on Aug. 27, 2013, entitled “CONTEXT-BASED FILE SELECTION,” now issued as U.S. Pat. No. 9,842,113, all of which are herein incorporated by reference in their entirety.

US Referenced Citations (434)
Number Name Date Kind
4694406 Shibui et al. Sep 1987 A
4853878 Brown Aug 1989 A
5038138 Akiyama et al. Aug 1991 A
5175813 Golding et al. Dec 1992 A
5280367 Zuniga Jan 1994 A
5317306 Abraham et al. May 1994 A
5361361 Hickman et al. Nov 1994 A
5394523 Harris Feb 1995 A
5395423 Suwa et al. Mar 1995 A
5398310 Tchao et al. Mar 1995 A
5448695 Douglas et al. Sep 1995 A
5493692 Theimer et al. Feb 1996 A
5506951 Ishikawa Apr 1996 A
5526480 Gibson Jun 1996 A
5544049 Henderson et al. Aug 1996 A
5563996 Tchao Oct 1996 A
5600778 Swanson et al. Feb 1997 A
5610828 Kodosky et al. Mar 1997 A
5613163 Marron et al. Mar 1997 A
5623613 Rowe et al. Apr 1997 A
5721849 Amro Feb 1998 A
5732399 Katiyar et al. Mar 1998 A
5737553 Bartok Apr 1998 A
5790127 Anderson et al. Aug 1998 A
5812123 Rowe et al. Sep 1998 A
5821928 Melkus et al. Oct 1998 A
5826015 Schmidt Oct 1998 A
5826241 Stein et al. Oct 1998 A
5845300 Comer et al. Dec 1998 A
5859640 De Judicibus Jan 1999 A
5870770 Wolfe Feb 1999 A
5877763 Berry et al. Mar 1999 A
5883626 Glaser et al. Mar 1999 A
5894311 Jackson Apr 1999 A
5903267 Fisher May 1999 A
5905863 Knowles et al. May 1999 A
5905991 Reynolds May 1999 A
5948058 Kudoh et al. Sep 1999 A
5966121 Hubbell et al. Oct 1999 A
5999159 Isomura Dec 1999 A
5999179 Kekic et al. Dec 1999 A
6005575 Colleran et al. Dec 1999 A
6008803 Rowe et al. Dec 1999 A
6018341 Berry et al. Jan 2000 A
6026388 Liddy et al. Feb 2000 A
6034688 Greenwood et al. Mar 2000 A
6052121 Webster et al. Apr 2000 A
6078306 Lewis Jun 2000 A
6085205 Peairs et al. Jul 2000 A
6088696 Moon et al. Jul 2000 A
6154740 Shah Nov 2000 A
6184881 Medl Feb 2001 B1
6246996 Stein et al. Jun 2001 B1
6252597 Lokuge Jun 2001 B1
6272490 Yamakita Aug 2001 B1
6272537 Kekic et al. Aug 2001 B1
6289361 Uchida Sep 2001 B1
6295542 Corbin Sep 2001 B1
6300967 Wagner et al. Oct 2001 B1
6301573 McIlwaine et al. Oct 2001 B1
6340979 Beaton et al. Jan 2002 B1
6348935 Malacinski et al. Feb 2002 B1
6377965 Hachamovitch et al. Apr 2002 B1
6380947 Stead Apr 2002 B1
6388682 Kurtzberg et al. May 2002 B1
6396513 Helfman et al. May 2002 B1
6421678 Smiga et al. Jul 2002 B2
6424995 Shuman Jul 2002 B1
6442440 Miller Aug 2002 B1
6463078 Engstrom et al. Oct 2002 B1
6499026 Rivette et al. Dec 2002 B1
6529744 Birkler et al. Mar 2003 B1
6546393 Khan Apr 2003 B1
6549218 Gershony et al. Apr 2003 B1
6563518 Gipalo May 2003 B1
6564213 Ortega et al. May 2003 B1
6582474 LaMarca et al. Jun 2003 B2
6628996 Sezaki et al. Sep 2003 B1
6631398 Klein Oct 2003 B1
6640230 Alexander et al. Oct 2003 B1
6647383 August et al. Nov 2003 B1
6654038 Gajewska et al. Nov 2003 B1
6700591 Sharpe Mar 2004 B1
6701346 Klein Mar 2004 B1
6738787 Stead May 2004 B2
6751604 Barney et al. Jun 2004 B2
6782393 Balabanovic et al. Aug 2004 B1
6789251 Johnson Sep 2004 B1
6820075 Shanahan et al. Nov 2004 B2
6865714 Liu et al. Mar 2005 B1
6889337 Yee May 2005 B1
6907447 Cooperman et al. Jun 2005 B1
6980977 Hoshi et al. Dec 2005 B2
7003506 Fisk et al. Feb 2006 B1
7003737 Chiu et al. Feb 2006 B2
7031963 Bae Apr 2006 B1
7051277 Kephart et al. May 2006 B2
7073129 Robarts et al. Jul 2006 B1
7103835 Yankovich et al. Sep 2006 B1
7107268 Zawadzki et al. Sep 2006 B1
7117445 Berger Oct 2006 B2
7120646 Streepy, Jr. Oct 2006 B2
7127476 Narahara Oct 2006 B2
7127674 Carroll et al. Oct 2006 B1
7137074 Newton et al. Nov 2006 B1
7139800 Bellotti et al. Nov 2006 B2
7146422 Marlatt et al. Dec 2006 B1
7209246 Suda et al. Apr 2007 B2
7243125 Newman et al. Jul 2007 B2
7295995 York et al. Nov 2007 B1
7320105 Sinyak et al. Jan 2008 B1
7353252 Yang et al. Apr 2008 B1
7353397 Herbach Apr 2008 B1
7370274 Stuple et al. May 2008 B1
7380218 Rundell May 2008 B2
7386789 Chao et al. Jun 2008 B2
7392249 Harris et al. Jun 2008 B1
7395507 Robarts et al. Jul 2008 B2
7401131 Robertson et al. Jul 2008 B2
7406659 Klein et al. Jul 2008 B2
7421664 Wattenberg et al. Sep 2008 B2
7421690 Forstall et al. Sep 2008 B2
7428579 Libbey, IV et al. Sep 2008 B2
7441194 Vronay et al. Oct 2008 B2
7451389 Huynh et al. Nov 2008 B2
7454716 Venolia Nov 2008 B2
7480715 Barker et al. Jan 2009 B1
7487145 Gibbs et al. Feb 2009 B1
7487458 Jalon et al. Feb 2009 B2
7499919 Meyerzon et al. Mar 2009 B2
7499940 Gibbs Mar 2009 B1
7505974 Gropper Mar 2009 B2
7512901 Vong et al. Mar 2009 B2
7523126 Rivette et al. Apr 2009 B2
7526559 Phillips Apr 2009 B1
7533064 Boesch May 2009 B1
7636714 Lamping et al. Dec 2009 B1
7647312 Dai Jan 2010 B2
7664786 Oh et al. Feb 2010 B2
7685144 Katragadda Mar 2010 B1
7685516 Fischer Mar 2010 B2
7716236 Sidhu et al. May 2010 B2
7734627 Tong Jun 2010 B1
7756935 Gaucas Jul 2010 B2
7761788 McKnight et al. Jul 2010 B1
7769579 Zhao et al. Aug 2010 B2
7774328 Hogue et al. Aug 2010 B2
7779355 Erol et al. Aug 2010 B1
7783965 Dowd et al. Aug 2010 B1
7818678 Massand Oct 2010 B2
7831834 Hickman et al. Nov 2010 B2
7836044 Kamvar et al. Nov 2010 B2
7836391 Tong Nov 2010 B2
7844906 Berger Nov 2010 B2
7904387 Geering Mar 2011 B2
7908566 Wilcox et al. Mar 2011 B2
7917848 Harmon et al. Mar 2011 B2
7917867 Wattenberg et al. Mar 2011 B2
7921176 Madnani Apr 2011 B2
8020003 Fischer Sep 2011 B2
8020112 Ozzie et al. Sep 2011 B2
8027974 Gibbs Sep 2011 B2
8051088 Tibbetts et al. Nov 2011 B1
8086960 Gopalakrishna et al. Dec 2011 B1
8091020 Kuppusamy et al. Jan 2012 B2
8117535 Beyer et al. Feb 2012 B2
8150928 Fang Apr 2012 B2
8185448 Myslinski May 2012 B1
8199899 Rogers et al. Jun 2012 B2
8224802 Hogue Jul 2012 B2
8229795 Myslinski Jul 2012 B1
8239751 Rochelle et al. Aug 2012 B1
8260785 Hogue et al. Sep 2012 B2
8261192 Djabarov Sep 2012 B2
8281247 Daniell et al. Oct 2012 B2
8310510 Asahina Nov 2012 B2
8346620 King et al. Jan 2013 B2
8346877 Turner Jan 2013 B2
8359550 Meyer et al. Jan 2013 B2
8370275 Bhattacharya et al. Feb 2013 B2
8386914 Baluja et al. Feb 2013 B2
8453066 Ozzie et al. May 2013 B2
8458046 Myslinski Jun 2013 B2
8572388 Boemker et al. Oct 2013 B2
8595174 Gao et al. Nov 2013 B2
8621222 Das Dec 2013 B1
8667394 Spencer Mar 2014 B1
8726179 Yerkes et al. May 2014 B2
8782516 Dozier Jul 2014 B1
8790127 Wilkolaski Jul 2014 B1
8799765 Macinnis et al. Aug 2014 B1
8856640 Barr et al. Oct 2014 B1
8856645 Vandervort et al. Oct 2014 B2
8904284 Grant et al. Dec 2014 B2
9143438 Khan et al. Sep 2015 B2
9143468 Cohen et al. Sep 2015 B1
20010025287 Okabe et al. Sep 2001 A1
20010044741 Jacobs et al. Nov 2001 A1
20020004793 Keith, Jr. Jan 2002 A1
20020010725 Mo Jan 2002 A1
20020019827 Shiman et al. Feb 2002 A1
20020029337 Sudia et al. Mar 2002 A1
20020035714 Kikuchi et al. Mar 2002 A1
20020051015 Matoba May 2002 A1
20020069223 Goodisman et al. Jun 2002 A1
20020070977 Morcos et al. Jun 2002 A1
20020073112 Kariya Jun 2002 A1
20020073157 Newman et al. Jun 2002 A1
20020080187 Lawton Jun 2002 A1
20020084991 Harrison et al. Jul 2002 A1
20020099775 Gupta et al. Jul 2002 A1
20020103914 Dutta et al. Aug 2002 A1
20020120702 Schiavone et al. Aug 2002 A1
20020120858 Porter et al. Aug 2002 A1
20020128047 Gates Sep 2002 A1
20020129100 Dutta et al. Sep 2002 A1
20020138834 Gerba et al. Sep 2002 A1
20020152255 Smith, Jr. et al. Oct 2002 A1
20020161839 Colasurdo et al. Oct 2002 A1
20020174183 Saeidi Nov 2002 A1
20020186252 Himmel et al. Dec 2002 A1
20020187815 Deeds et al. Dec 2002 A1
20020188689 Michael Dec 2002 A1
20020194280 Altavilla et al. Dec 2002 A1
20030014482 Toyota et al. Jan 2003 A1
20030046263 Castellanos et al. Mar 2003 A1
20030058286 Dando Mar 2003 A1
20030069877 Grefenstette et al. Apr 2003 A1
20030101065 Rohall et al. May 2003 A1
20030120719 Yepishin et al. Jun 2003 A1
20030120762 Yepishin et al. Jun 2003 A1
20030146941 Bailey et al. Aug 2003 A1
20030154212 Schirmer et al. Aug 2003 A1
20030156130 James et al. Aug 2003 A1
20030158855 Farnham et al. Aug 2003 A1
20030163537 Rohall et al. Aug 2003 A1
20030167310 Moody et al. Sep 2003 A1
20030172353 Cragun Sep 2003 A1
20030191816 Landress et al. Oct 2003 A1
20030195963 Song et al. Oct 2003 A1
20030200192 Bell et al. Oct 2003 A1
20030226152 Billmaier et al. Dec 2003 A1
20030234822 Spisak Dec 2003 A1
20040046776 Phillips et al. Mar 2004 A1
20040058673 Irlam et al. Mar 2004 A1
20040061716 Cheung et al. Apr 2004 A1
20040062213 Koss Apr 2004 A1
20040068544 Malik et al. Apr 2004 A1
20040073616 Fellenstein et al. Apr 2004 A1
20040119761 Grossman et al. Jun 2004 A1
20040122846 Chess et al. Jun 2004 A1
20040139465 Matthews, III et al. Jul 2004 A1
20040140901 Marsh Jul 2004 A1
20040145607 Alderson Jul 2004 A1
20040153973 Horwitz Aug 2004 A1
20040164991 Rose Aug 2004 A1
20040177319 Horn Sep 2004 A1
20040243926 Trenbeath et al. Dec 2004 A1
20040260756 Forstall et al. Dec 2004 A1
20040268265 Berger Dec 2004 A1
20050004989 Satterfield et al. Jan 2005 A1
20050024487 Chen Feb 2005 A1
20050028081 Arcuri et al. Feb 2005 A1
20050034060 Kotler et al. Feb 2005 A1
20050039191 Hewson et al. Feb 2005 A1
20050044132 Campbell et al. Feb 2005 A1
20050044369 Anantharaman Feb 2005 A1
20050055416 Heikes et al. Mar 2005 A1
20050066037 Song et al. Mar 2005 A1
20050108345 Suzuki May 2005 A1
20050108351 Naick et al. May 2005 A1
20050114753 Kumar et al. May 2005 A1
20050120308 Gibson et al. Jun 2005 A1
20050144162 Liang Jun 2005 A1
20050144569 Wilcox et al. Jun 2005 A1
20050144570 Loverin et al. Jun 2005 A1
20050144571 Loverin et al. Jun 2005 A1
20050144572 Wattenberg et al. Jun 2005 A1
20050144573 Moody et al. Jun 2005 A1
20050149858 Stern et al. Jul 2005 A1
20050160065 Seeman Jul 2005 A1
20050160158 Firebaugh et al. Jul 2005 A1
20050183001 Carter et al. Aug 2005 A1
20050183006 Rivers-Moore et al. Aug 2005 A1
20050198589 Heikes et al. Sep 2005 A1
20050210256 Meier et al. Sep 2005 A1
20050223058 Buchheit et al. Oct 2005 A1
20050246420 Little, II Nov 2005 A1
20050246653 Gibson et al. Nov 2005 A1
20060005142 Karstens Jan 2006 A1
20060010865 Walker Jan 2006 A1
20060020548 Flather Jan 2006 A1
20060035632 Sorvari et al. Feb 2006 A1
20060041836 Gordon et al. Feb 2006 A1
20060047682 Black et al. Mar 2006 A1
20060080303 Sargent et al. Apr 2006 A1
20060106778 Baldwin May 2006 A1
20060123091 Ho Jun 2006 A1
20060136552 Krane et al. Jun 2006 A1
20060150087 Cronenberger et al. Jul 2006 A1
20060190435 Heidloff et al. Aug 2006 A1
20060200523 Tokuda et al. Sep 2006 A1
20060213993 Tomita Sep 2006 A1
20060248070 Dejean et al. Nov 2006 A1
20060248573 Pannu et al. Nov 2006 A1
20060271381 Pui Nov 2006 A1
20070005581 Arrouye et al. Jan 2007 A1
20070005697 Yuan et al. Jan 2007 A1
20070033200 Gillespie Feb 2007 A1
20070143317 Hogue et al. Jun 2007 A1
20070150800 Betz et al. Jun 2007 A1
20070156761 Smith, III Jul 2007 A1
20070162907 Herlocker et al. Jul 2007 A1
20070168355 Dozier et al. Jul 2007 A1
20070192423 Karlson Aug 2007 A1
20070198343 Collison et al. Aug 2007 A1
20070198952 Pittenger Aug 2007 A1
20070220259 Pavlicic Sep 2007 A1
20070233786 Rothley Oct 2007 A1
20070280205 Howell et al. Dec 2007 A1
20070291297 Harmon et al. Dec 2007 A1
20080022107 Pickles et al. Jan 2008 A1
20080028284 Chen Jan 2008 A1
20080034213 Boemker et al. Feb 2008 A1
20080059539 Chin et al. Mar 2008 A1
20080077571 Harris et al. Mar 2008 A1
20080082907 Sorotokin et al. Apr 2008 A1
20080114838 Taylor May 2008 A1
20080120319 Drews et al. May 2008 A1
20080172608 Patrawala et al. Jul 2008 A1
20080208969 Van Riel Aug 2008 A1
20080239413 Vuong et al. Oct 2008 A1
20080270935 Wattenberg et al. Oct 2008 A1
20080320397 Do et al. Dec 2008 A1
20090006936 Parker et al. Jan 2009 A1
20090013244 Cudich et al. Jan 2009 A1
20090030872 Brezina et al. Jan 2009 A1
20090044143 Karstens Feb 2009 A1
20090044146 Patel et al. Feb 2009 A1
20090083245 Ayotte et al. Mar 2009 A1
20090094178 Aoki Apr 2009 A1
20090100009 Karp Apr 2009 A1
20090132273 Boesch May 2009 A1
20090132560 Vignet May 2009 A1
20090192845 Gudipaty et al. Jul 2009 A1
20090198670 Shiffer et al. Aug 2009 A1
20090204818 Shin et al. Aug 2009 A1
20090282144 Sherrets et al. Nov 2009 A1
20090287780 Gawor et al. Nov 2009 A1
20090292673 Carroll Nov 2009 A1
20100070372 Watfa et al. Mar 2010 A1
20100070448 Omoigui Mar 2010 A1
20100070881 Hanson et al. Mar 2010 A1
20100076946 Barker et al. Mar 2010 A1
20100100743 Ali et al. Apr 2010 A1
20100121888 Cutting et al. May 2010 A1
20100131523 Yu et al. May 2010 A1
20100180200 Donneau-Golencer et al. Jul 2010 A1
20100191744 Meyerzon et al. Jul 2010 A1
20100198821 Loritz et al. Aug 2010 A1
20100223541 Clee et al. Sep 2010 A1
20100251086 Haumont et al. Sep 2010 A1
20100268700 Wissner et al. Oct 2010 A1
20100269035 Meyer et al. Oct 2010 A1
20100274628 Kunz et al. Oct 2010 A1
20100275109 Morrill Oct 2010 A1
20100281353 Rubin Nov 2010 A1
20100306265 Jones, III Dec 2010 A1
20110016106 Xia Jan 2011 A1
20110023022 Harper et al. Jan 2011 A1
20110043652 King et al. Feb 2011 A1
20110060584 Ferrucci et al. Mar 2011 A1
20110072338 Caldwell Mar 2011 A1
20110082876 Lu et al. Apr 2011 A1
20110087973 Martin et al. Apr 2011 A1
20110099510 Wilcox et al. Apr 2011 A1
20110126093 Ozzie et al. May 2011 A1
20110137751 Stein et al. Jun 2011 A1
20110166939 Junkin et al. Jul 2011 A1
20110173210 Ahn et al. Jul 2011 A1
20110179378 Wheeler et al. Jul 2011 A1
20110191276 Cafarella et al. Aug 2011 A1
20110209064 Jorgensen et al. Aug 2011 A1
20110209075 Wan Aug 2011 A1
20110209159 Baratz et al. Aug 2011 A1
20110219291 Lisa Sep 2011 A1
20110225192 Imig et al. Sep 2011 A1
20110225482 Chan et al. Sep 2011 A1
20110225490 Meunier Sep 2011 A1
20110246361 Geering Oct 2011 A1
20110252312 Lemonik et al. Oct 2011 A1
20110276538 Knapp et al. Nov 2011 A1
20110296291 Melkinov et al. Dec 2011 A1
20110306028 Galimore Dec 2011 A1
20120078826 Ferrucci et al. Mar 2012 A1
20120084644 Robert et al. Apr 2012 A1
20120095979 Aftab et al. Apr 2012 A1
20120116812 Boone et al. May 2012 A1
20120124053 Ritchford et al. May 2012 A1
20120166924 Larson et al. Jun 2012 A1
20120185473 Ponting et al. Jul 2012 A1
20120191777 Iwasaki et al. Jul 2012 A1
20120203734 Spivack et al. Aug 2012 A1
20120226646 Donoho et al. Sep 2012 A1
20120253896 Killoran, Jr. et al. Oct 2012 A1
20120253916 Ayloo Oct 2012 A1
20120254730 Sunderland et al. Oct 2012 A1
20120254770 Ophir Oct 2012 A1
20120284602 Seed et al. Nov 2012 A1
20120290979 Devecka Nov 2012 A1
20120304046 Neill et al. Nov 2012 A1
20120317046 Myslinski Dec 2012 A1
20130007198 Gupta et al. Jan 2013 A1
20130013456 Boesch Jan 2013 A1
20130024452 Defusco et al. Jan 2013 A1
20130036344 Ahmed et al. Feb 2013 A1
20130041685 Yegnanarayanan Feb 2013 A1
20130041764 Donovan et al. Feb 2013 A1
20130054354 Kunz et al. Feb 2013 A1
20130132566 Olsen et al. May 2013 A1
20130165086 Doulton Jun 2013 A1
20130212062 Levy et al. Aug 2013 A1
20130246346 Khosrowshahi et al. Sep 2013 A1
20130268830 Khosrowshahi et al. Oct 2013 A1
20140006977 Adams Jan 2014 A1
20140013197 McAfee et al. Jan 2014 A1
20140032913 Tenenboym et al. Jan 2014 A1
20140040249 Ploesser et al. Feb 2014 A1
20140053228 Mahadevan et al. Feb 2014 A1
20140143684 Oh et al. May 2014 A1
20140172628 Argue et al. Jun 2014 A1
20140244638 Yerkes et al. Aug 2014 A1
20150012805 Bleiweiss et al. Jan 2015 A1
20150304369 Sandholm et al. Oct 2015 A1
Foreign Referenced Citations (14)
Number Date Country
1194703 Sep 1998 CN
1285557 Feb 2001 CN
1077417 Feb 2001 EP
1232434 Aug 2002 EP
H08286871 Nov 1996 JP
H09326822 Dec 1997 JP
2001325296 Nov 2001 JP
2003271526 Sep 2003 JP
20020050785 Jun 2002 KR
9724684 Jul 1997 WO
0123995 Apr 2001 WO
2011049399 Apr 2011 WO
2012057726 May 2012 WO
2014072767 May 2014 WO
Non-Patent Literature Citations (33)
Entry
Ashman H., “Electronic Document Addressing: Dealing with Change,” ACM Computing Surveys, Sep. 2000, vol. 32, No. 3, 12 Pages.
Bohman P., “Introduction to Web Accessibility,” WebAIM, Oct. 2003, pp. 1-6, [Retrieved on Apr. 17, 2004], Retrieved from URL: http://www.webaim.org/intro/?templatetype=3.
“Cascading Style Sheets Level 2 Revision 1 (CSS 2.1) Specification,” W3C, Apr. 15, 2011, 487 pages.
Cipriani J., “How to Send Money via Gmail,” CNET, May 21, 2013, Retrieved from URL: https://www.cnet.com/culture/how-to-send-money-via-gmail/.
Cornwell J., “A Preview of Gmail's New Look,” The Official Gmail Blog, Jun. 30, 2011, 3 Pages, Retrieved from URL: http:/web.archive.org/web/20110703043327/ http:/gmailblog.blogspot.com/2011/06/preview-of-gmails-new-look.html.
ETSI: “Electronic Signatures and Infrastructures (ESI); PDF Advanced Electronic Signature Profiles; Part 4: PAdES Long Term - PAdES-LTV Profile,” ETSI TS 102 778-4, V1.1.1, Technical Specification, Jul. 2009, 19 Pages.
Fox P., “Creating Dynamic Client-side Maps Mashups with Google Spreadsheets,” Maps API Blog, Mar. 30, 2007, 3 Pages, [Retrieved on Dec. 5, 2011] Retrieved from URL: http://googlemapsapi.globspot.com/2007/03/creating-dynamic-client-side-maps.html.
Francik E., “Computer-& Screen-Based Interfaces: Universal Design Filter,” Human Factors Engineering, Pacific Bell, Version 2, Jun. 6, 1996, pp. 1-27.
Geek rant.org: “How to Embed a Word Document in Another Word Document,” Geek Rant dot org (Online), Sep. 14, 2005, 6 Pages, [Retrieved on Dec. 5, 2011] Retrieved from URL: http://www.geekrant.org/2005/09/14/word-embed-document/.
Venolia G., et al., “Understanding Sequence and Reply Relationships within Email Conversations: A Mixed-Model Visualization,” Paper: Integrating Tools and Tasks, Ft. Lauderdale, Florida, Apr. 5-10, 2003, vol. 5, No. 1, pp. 361-368.
Griesser A., “A Generic Editor Full Text,” Conference on Object Oriented Programming Languages and Applications Archive, ACM Press New York, NY, USA, 1997, pp. 50-55.
Herrick D.R., “Google This! Using Google Apps for Collaboration and Productivity,” Proceedings of the ACM SIGUCCS Fall Conference on User Services, SIGUCCS '09, Oct. 11-14, 2009, pp. 55-63.
http://www.google.com/wallet/send-money/. https://web.archive.org/web/20130521132817/ http://www.google.com/wallet/sent-money/, May 21, 2013.
International Search Report and Written Opinion for International Application No. PCT/US2011/037862, dated Oct. 31, 2011, 11 pages.
Jacobs I., et al., “User Agent Accessibility Guidelines 1.0, W3C Recommendation Dec. 17, 2002,” World Wide Web Consortium, 2002, 115 pages. [Retrieved on Apr. 9, 2004] Retrieved from URL: https://www.w3.org/TR/UAAG10/.
Kappe F., “Hyper-G: A Distributed Hypermedia System,” Proceedings of the International Networking Conference, 1993, 11 Pages, [Retrieved on Oct. 20, 2011] Retrieved from URL: http://ftp.iicm.tugraz.at/pub/papers/inet93.pdf.
Kircher M., “Lazy Acquisition,” Proceedings of the 6th European Conference on Pattern Languages of Programs, Jul. 2011, 11 Pages, XP002587616.
McFarland D.S., “CSS, the Missing Manual,” O'Reilly, Aug. 2009, pp. 7-101, 134-138, 428-429.
Microsoft: “How to Embed and Automate Office Documents with Visual Basic,” Microsoft Support, Mar. 27, 2007, 6 Pages, [Retrieved on Dec. 5, 2011] Retrieved from URL: http://support.microsoft.com/kb/242243.
Microsoft: “OLE Concepts and Requirements Overview,” Oct. 27, 1999, 3 Pages, [Retrieved on Dec. 2, 2011] Retrieved from URL: http:/support.microsoft.com/kb/86008.
Oracle Corporation: “Oracle Provider for OLE DB,” Developer's Guide, Dec. 2003, 10g Release 1 (10.1), 90 Pages.
Parker P., “Google Testing Unique AdWords Format Designed for Gmail,” Waybackmachine, Search Engine Land, Aug. 18, 2011, 12 Pages, Retrieved from URL: http://web.archive.org/web/20111028150326/ http://searchengineland.com/google-testing-unique-adwords-format-designed-for-gmail.
Dillon-Scott P., “Gmail's New Ads are About to Invade Your Inbox,” The Sociable, May 27, 2012, Retrieved from URL: http://sociable.co/web/now-gmai-is-sending-ads-to-you-kind-of/.
Pinkas D., et al., “CMS Advanced Electronic Signatures (CAdES),” Request for Comments 5126, Network Working Group, Feb. 2008, 142 Pages.
Rohall S.L., et al., “Email Visualizations to Aid Communications,” IEEE Symposium on Information Visualization, Oct. 22-23, 2001, 5 Pages.
Treviranus J., et al., “Authoring Tool Accessibility Guidelines 1.0,” W3C Recommendation, World Wide Web Consortium, Feb. 3, 2000, pp. 1-22, [Retrieved on Apr. 28, 2004] Retrieved from URL: http://www.w3.org/TR/ATAG10/.
Tyson H., “Microsoft Word 2010 Bible,” John Wiley & Sons, Jun. 21, 2010, 4 Pages.
“Supplementary Notes for MFC Programming Module 23 and Module 27: Interfaces, COM, COM+ and OLE,” Jan. 6, 2006, 4 Pages, [Retrieved on May 12, 2011] Retrieved from URL: http://www.tenouk.com/visualcplusmfc/mfcsupp/ole.html.
Vincent J.G., “CSI 3140 WWW Structures, Techniques and Standards, Cascading Style Sheets,” Power Point Slides, Published Feb. 16, 2010, 95 Pages.
W3: “Web Content Accessibility Guidelines 2.0,” W3C Working Draft, World Wide Web Consortium, Mar. 11, 2004, pp. 1-56, [Retrieved on Apr. 9, 2004] Retrieved from URL: http://www.w3.org/TR/WCAG20/.
Waybackmachine: “How to Use Outlook Express,” UCLA, Jan. 11, 2008, 3 Pages, Retrieved from URL: http://web.archive.org/web/20080111060000/ http://www.bol.ucla.edu/software/win/oe/.
Wikipedia: “Backus-Naur form,” Wikipedia, the free encyclopedia, Jul. 14, 2013, 6 Pages, [Retrieved on Sep. 3, 2013] Retrieved from URL: https://en.wikipedia.org/wiki/Backus%E2%80%93Naur_Form.
Wikipedia: “Regular Expression,” Wikipedia the free Encyclopedia, Sep. 2, 2013, 21 Pages, [Retrieved on Sep. 3, 2013] Retrieved from URL: https://en.wikipedia.org/wiki/Regular_expression.
Related Publications (1)
Number Date Country
20230259491 A1 Aug 2023 US
Continuations (3)
Number Date Country
Parent 17666434 Feb 2022 US
Child 18140496 US
Parent 15807891 Nov 2017 US
Child 17666434 US
Parent 14010850 Aug 2013 US
Child 15807891 US