Simultaneous editing/accessing of content by collaborator invitation through a web-based or mobile application to a cloud-based collaboration platform

Information

  • Patent Grant
  • Patent Number
    9,519,886
  • Date Filed
    Monday, September 30, 2013
  • Date Issued
    Tuesday, December 13, 2016
Abstract
Techniques are disclosed for a web or mobile interface enabling users and collaborators to simultaneously comment on or edit content in real time or near real time managed by a cloud-based collaboration platform. In one embodiment, the data to be accessed concurrently is presented or depicted at the multiple physical devices to the collaborators for viewing and accessing the data in real time or near real time. Each of the collaborators is able to view, re-edit, or re-modify in a concurrent fashion, at the collaborator's physical device, edits or modifications made to the data in real time or near real time as a result of any of the other collaborators accessing the data at their respective physical devices. In some instances, additional collaborators are specifiable for the data created for concurrent real time access in addition to those originally associated with the folder.
Description
COPYRIGHT NOTICE

A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings that form a part of this document: Copyright 2013, Box, Inc., All Rights Reserved.


BACKGROUND

With the advancement in digital and online technologies, people now manage an abundance of information and are in constant communication with others regardless of time and location. Cloud-based collaboration platforms have been developed to support such lifestyles. It can be useful for such platforms to offer features that enable users to create, view, edit, annotate, store, share and otherwise manage information in a user-friendly and efficient manner.





BRIEF DESCRIPTION OF DRAWINGS

The present embodiments are illustrated by way of example and are not intended to be limited by the figures of the accompanying drawings. In the drawings:



FIG. 1 contains a diagram illustrating an example development and communication environment where users interact with a cloud service, collaboration and/or cloud storage platform.



FIG. 2 contains a diagram illustrating an example web-based or online collaboration platform.



FIG. 3 contains a diagram illustrating example graphical user interfaces (GUIs) for creating a note.



FIGS. 4A and 4B contain diagrams illustrating an example GUI for editing a note.



FIG. 5 contains a diagram illustrating an example GUI for saving a note.



FIG. 6 contains a diagram illustrating an example GUI for commenting on a note.



FIGS. 7A, 7B, 7C and 7D contain diagrams illustrating an example GUI for annotating a note.



FIGS. 8A and 8B contain diagrams illustrating an example GUI for incorporating multimedia into a note.



FIG. 9 contains a diagram illustrating an example GUI supporting multi-user collaboration on a note.



FIG. 10 contains a diagram illustrating example GUIs for quickly and conveniently creating a note from the web application to the cloud-based collaboration platform and sharing it with others/collaborators.



FIG. 11 contains a diagram illustrating example GUIs for viewing a note.



FIG. 12 contains a block diagram illustrating example components in the host server 110 of the web-based collaboration platform.



FIG. 13 contains a flowchart illustrating example operations performed by a GUI module in working with an annotation.



FIG. 14 contains a flowchart illustrating example operations performed by a GUI module in incorporating multimedia into a note.



FIG. 15 is a block diagram illustrating an example machine representing the computer systemization of the development and communication environment.





The same reference numbers and any acronyms identify elements or acts with the same or similar structure or functionality throughout the drawings and specification for ease of understanding and convenience.


DETAILED DESCRIPTION

Techniques are disclosed for a web or mobile interface enabling users and collaborators to simultaneously create, view, edit, annotate, store, share and otherwise manage content in real time or near real time on a cloud-based collaboration platform.


The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to one or an embodiment in the present disclosure can be, but are not necessarily, references to the same embodiment; such references mean at least one of the embodiments.


Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which can be exhibited by some embodiments and not by others. Similarly, various requirements are described which can be requirements for some embodiments but not other embodiments.


The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed below, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms can be highlighted, for example using italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same thing can be said in more than one way.


Consequently, alternative language and synonyms can be used for any one or more of the terms discussed herein, nor is any special significance to be placed upon whether or not a term is elaborated or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification including examples of any terms discussed herein is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.


Without intent to limit the scope of the disclosure, examples of instruments, apparatus, methods and their related results according to the embodiments of the present disclosure are given below. Note that titles or subtitles can be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.



FIG. 1 illustrates an example system 100 having a host server 110 of a cloud-based service/platform, collaboration workspace and/or cloud storage service with capabilities that enable users and collaborators to simultaneously create, view, edit, annotate, store, share and otherwise manage content in real time or near real time on the cloud-based collaboration platform.


The client devices 102 can be any system and/or device, and/or any combination of devices/systems that is able to establish a communication or a connection, including wired, wireless, cellular connections with another device, a server and/or other systems such as the host server 110. The client devices 102 typically include a display and/or other output functionalities to present information and data exchanged among the client devices 102 and/or the host server 110.


For example, the client devices 102 can include mobile, handheld or portable devices or non-portable devices and can be any of, but not limited to, a server desktop, a desktop computer, a computer cluster, or portable devices including a notebook, a laptop computer, a handheld computer, a palmtop computer, a mobile phone, a cell phone, a PDA, a smart phone (e.g., a BlackBerry device such as BlackBerry Z10/Q10, an iPhone, Nexus 4, etc.), a Treo, a handheld tablet (e.g., an iPad, iPad Mini, a Galaxy Note, Galaxy Note II, Xoom Tablet, Microsoft Surface, Blackberry PlayBook, Nexus 7, 10 etc.), a phablet (e.g., HTC Droid DNA, etc.), a tablet PC, a thin-client, a hand held console, a hand held gaming device or console (e.g., XBOX live, Nintendo DS, Sony PlayStation Portable, etc.), a mobile-enabled powered watch (e.g., iOS, Android or other platform based), Google Glass, a Chromebook and/or any other portable, mobile, hand held devices, etc. running on any platform or any operating system (e.g., Mac-based OS (OS X, iOS, etc.), Windows-based OS (Windows Mobile, Windows 7, Windows 8, etc.), Android, Blackberry OS, Embedded Linux platforms, Palm OS, Symbian platform, Google Chrome OS, and the like). In one embodiment, the client devices 102 and the host server 110 are coupled via a network 106. In some embodiments, the client devices 102 and the host server 110 may be directly connected to one another.


The input mechanism on client devices 102 can include touch screen keypad (including single touch, multi-touch, gesture sensing in 2D or 3D, etc.), a physical keypad, a mouse, a pointer, a track pad, motion detector (e.g., including 1-axis, 2-axis, 3-axis accelerometer, etc.), a light sensor, capacitance sensor, resistance sensor, temperature sensor, proximity sensor, a piezoelectric device, device orientation detector (e.g., electronic compass, tilt sensor, rotation sensor, gyroscope, accelerometer), or a combination of the above.


Signals received or detected indicating user activity at client devices 102 through one or more of the above input mechanism, or others, can be used by various users or collaborators (e.g., collaborators 108) for accessing, through the network 106, a web-based collaboration environment or online collaboration platform (e.g., hosted by the host server 110). The collaboration environment or platform can have one or more collective settings 105 for an enterprise or an organization where the users belong, and can provide a user interface 104 for the users to access such platform under the settings 105.


In general, the network 106, over which the client devices 102 and the host server 110 communicate may be a cellular network, a telephonic network, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet, or any combination or variation thereof. For example, the Internet can provide file transfer, remote log in, email, news, RSS, cloud-based services, instant messaging, visual voicemail, push mail, VoIP, and other services through any known or convenient protocol, such as, but not limited to, the TCP/IP protocol, Open Systems Interconnection (OSI), FTP, UPnP, iSCSI, NFS, ISDN, PDH, RS-232, SDH, SONET, etc.


The network 106 can be any collection of distinct networks operating wholly or partially in conjunction to provide connectivity to the client devices 102 and the host server 110 and may appear as one or more networks to the serviced systems and devices. In one embodiment, communications to and from the client devices 102 can be achieved by, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. In one embodiment, communications can be achieved by a secure communications protocol, such as secure sockets layer (SSL), or transport layer security (TLS).


In addition, communications can be achieved via one or more networks, such as, but are not limited to, one or more of WiMax, a Local Area Network (LAN), Wireless Local Area Network (WLAN), a Personal area network (PAN), a Campus area network (CAN), a Metropolitan area network (MAN), a Wide area network (WAN), a Wireless wide area network (WWAN), or any broadband network, and further enabled with technologies such as, by way of example, Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Bluetooth, WiFi, Fixed Wireless Data, 2G, 2.5G, 3G (e.g., WCDMA/UMTS based 3G networks), 4G, IMT-Advanced, pre-4G, LTE Advanced, mobile WiMax, WiMax 2, WirelessMAN-Advanced networks, enhanced data rates for GSM evolution (EDGE), General packet radio service (GPRS), enhanced GPRS, iBurst, UMTS, HSDPA, HSUPA, HSPA, HSPA+, UMTS-TDD, 1×RTT, EV-DO, messaging protocols such as, TCP/IP, SMS, MMS, extensible messaging and presence protocol (XMPP), real time messaging protocol (RTMP), instant messaging and presence protocol (IMPP), instant messaging, USSD, IRC, or any other wireless data networks, broadband networks, or messaging protocols.



FIG. 2 depicts an example web-based or online collaboration platform deployed in an enterprise or other organizational setting 250 for organizing workspaces 205, 225 and 245 and work items 215, 235 and 255.


The collaboration platform or environment hosts workspaces with work items that one or more users can access (e.g., view, edit, update, revise, comment, download, preview, tag, or otherwise manipulate, etc.). A work item can generally include any type of digital or electronic content that can be viewed or accessed via an electronic device (e.g., client devices 202). For example, the work items 215 and 235 include general digital content, such as .pdf files, .doc files, slides (e.g., PowerPoint slides), images, audio files, multimedia content, web pages, blogs, etc. On the other hand, the work items 255 comprise “notes” or documents of a proprietary format, which support advanced and unique capabilities of data management and promote collaboration. A workspace can generally refer to any grouping of a set of digital content managed by the collaboration platform. For example, the workspaces A 205 and B 225 include general digital content while the workspace 245, referred to as a “notebook”, includes notes only. The grouping can be created, identified, or specified by a user or through other means. This user may be a creator user or administrative user, for example.


In general, a workspace can be associated with a set of users or collaborators (e.g., collaborators 108) who have access to the content included therein. The levels of access (e.g., based on permissions or rules) of each user or collaborator to access the content in a given workspace may be the same or may vary among the users. Each user may have their own set of access rights to every piece of content in the workspace, or each user may have different access rights to different pieces of content. Access rights may be specified by a user associated with a workspace and/or a user who created/uploaded a particular piece of content to the workspace, or any other designated user or collaborator.
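The per-user, per-item access rights described above can be sketched as a simple access-control mapping. This is an illustrative assumption, not the platform's actual implementation; the class and method names are hypothetical.

```python
# Minimal sketch of per-workspace, per-item access rights: each work
# item carries its own mapping of user -> set of rights, so two users
# can hold different rights to the same content.
class Workspace:
    def __init__(self):
        # item_id -> {user_id -> set of rights, e.g. {"view", "edit"}}
        self.acl = {}

    def grant(self, item_id, user_id, *rights):
        self.acl.setdefault(item_id, {}).setdefault(user_id, set()).update(rights)

    def can(self, user_id, right, item_id):
        return right in self.acl.get(item_id, {}).get(user_id, set())

ws = Workspace()
ws.grant("doc-1", "alice", "view", "edit")
ws.grant("doc-1", "bob", "view")
```

Under this sketch, a check such as `ws.can("bob", "edit", "doc-1")` would fail while `ws.can("alice", "edit", "doc-1")` succeeds, mirroring the varying access levels described above.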


In general, the collaboration platform allows multiple users or collaborators to access or collaborate on work items such that each user can remotely see edits, revisions, comments, or annotations being made to specific work items through their own user devices. For example, a user can upload a document to a workspace for other users to access (e.g., for viewing, editing, commenting, signing-off, or otherwise manipulating). The user can login to the online platform and upload the document (or any other type of work item) to an existing workspace or to a new workspace. The document can be shared with existing users or collaborators in a workspace.


The web-based platform for collaborating on projects or jointly working on documents can be used by individual users and shared among collaborators. In addition, the collaboration platform can be deployed in an organized setting including but not limited to, a company (e.g., an enterprise setting), a department in a company, an academic institution, a department in an academic institution, a class or course setting, or any other types of organizations or organized setting.


When deployed in an organizational setting, multiple workspaces (e.g., workspaces A, B . . . N) can be created to support different projects or a variety of work flows. Each workspace can have its own associated work items. For example, workspace A 205 can be associated with work items 215, workspace B 225 can be associated with work items 235, and workspace 245 can be associated with work items 255. The work items 215, 235, and 255 can be unique to each workspace but need not be. For example, a particular work item or a note can be associated with only one workspace or it can be associated with multiple workspaces.


In general, each workspace has a set of users or collaborators associated with it. For example, workspace A 205 is associated with multiple users or collaborators 206. In some instances, workspaces deployed in an enterprise can be department specific. For example, workspace B can be associated with department 210 and some users shown as example user A 208, and workspace N 245 can be associated with departments 212 and 216 and users shown as example user B 214.


In the case of a note in a notebook, collaborators of the notebook can have simultaneous read/write access to the note. Specifically, in a concurrent fashion, each of the collaborators is able to make changes to the note or even edit the changes made by other collaborators. In addition, a list of collaborators can be specified at the note level, so that different notes within the same notebook can be associated with different sets of collaborators.
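The note-level collaborator lists described above can be sketched as follows; the class and field names are illustrative assumptions, not the platform's API.

```python
# Sketch: collaborator lists are kept at both the notebook level and
# the individual-note level, so two notes in the same notebook can be
# associated with different sets of collaborators.
class Notebook:
    def __init__(self, collaborators):
        self.collaborators = set(collaborators)
        self.notes = {}  # note_id -> per-note collaborator set

    def add_note(self, note_id, extra_collaborators=()):
        # a note starts with the notebook's collaborators, plus any
        # additional users specified at the note level
        self.notes[note_id] = self.collaborators | set(extra_collaborators)

nb = Notebook({"alice", "bob"})
nb.add_note("meeting-notes", extra_collaborators={"carol"})
nb.add_note("todo-list")
```

Here "meeting-notes" is shared with three users while "todo-list" in the same notebook is shared with only two, matching the note-level specification described above.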


In each workspace A, B . . . N, when an action is performed on a work item by a given user or any other activity is detected in the workspace, other users in the same workspace can be notified (e.g., in real time or in near real time, or not in real time). Activities which trigger real time notifications can include, by way of example but not limitation, adding, deleting, or modifying collaborators in the workspace, uploading, downloading, adding, deleting a work item in the workspace, creating a discussion topic in the workspace.
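The notification behavior described above, in which an action by one user triggers messages to the other members of the same workspace, can be sketched minimally; the function and message format are assumptions made for illustration.

```python
# Sketch: when a user performs an action on a work item, every other
# member of the same workspace receives a notification describing it.
def notify_workspace(members, actor, event):
    """Return (recipient, message) pairs for everyone except the actor."""
    return [(m, f"{actor} {event}") for m in members if m != actor]

msgs = notify_workspace({"alice", "bob", "carol"}, "alice",
                        "uploaded report.pdf")
```

Whether such notifications are delivered in real time, near real time, or batched is a delivery-layer choice independent of this fan-out step.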


In some embodiments, items or content downloaded or edited can cause notifications to be generated. Such notifications can be sent to relevant users to notify them of actions surrounding a download, an edit, a change, a modification, a new file, a conflicting version, an upload of an edited or modified file.


In one embodiment, in a user interface to the web-based collaboration platform where notifications are presented, users can, via the same interface, create action items (e.g., tasks) and delegate the action items to other users including collaborators pertaining to a work item 215, for example. The collaborators 206 can be in the same workspace A 205 and can invite a new collaborator to join the workspace, for example. Similarly, in the same user interface where discussion topics can be created in a workspace (e.g., workspace A, B or N, etc.), actionable events on work items can be created and/or delegated/assigned to other users such as collaborators of a given workspace 206 or other users. Through the same user interface, task status and updates from multiple users or collaborators can be indicated and reflected. In some instances, the users can perform the tasks (e.g., review or approve or reject, etc.) via the same user interface.



FIG. 12 contains a block diagram illustrating example components in the web-based collaboration platform hosted by the host server 110. The collaboration platform can include, for example, a network interface 1202, a concurrent access module 1204, a graphical user interface (GUI) module 1206 and a collaborator management module 1208. The GUI module 1206 may further include an elements manager 1210, a layout manager 1212, a text manager 1214, and a graphics manager 1216. More or fewer components/modules/engines can be included in the host server 110 and in each illustrated component.


The network interface 1202 can be a networking module that enables the host server 110 to mediate data in a network with an entity that is external to the host server 110, through any known and/or convenient communications protocol supported by the host and the external entity. The network interface 1202 can communicate with one or more of a network adaptor card, a wireless network interface card (e.g., SMS interface, WiFi interface, interfaces for various generations of mobile communication standards including but not limited to 1G, 2G, 3G, 3.5G, 4G, LTE, etc.,), Bluetooth, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, bridge router, a hub, a digital media receiver, and/or a repeater.


The collaborator management module 1208 manages information regarding the users of the collaboration platform. It may maintain each user's basic information. It may also organize the users by folder or by note, to keep track of the list of collaborators for each workspace or each work item, for example. The concurrent access module 1204 manages the communication among the users of the collaboration platform. For example, it may keep track of which user is editing which note at any given time and allow the collaborators of one note to edit the note at the same time. In one implementation, it automatically resolves any conflicts between the edits of different users. In another implementation, it enables an administrator or one or more of the collaborators to accept or reject the edits of different users and allows the collaborators to further edit the note afterwards.
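The two conflict-handling modes described above can be sketched as follows. This is an assumption-laden illustration: the automatic mode is shown here as a naive last-writer-wins policy, which is only one of many possible resolution strategies, and all names are hypothetical.

```python
# Sketch of the two conflict-handling modes: automatic resolution
# (here, naive last-writer-wins by timestamp) versus manual
# accept/reject of edits by a moderator.
def resolve_automatically(edits):
    # edits: list of (timestamp, user, new_text); keep the latest write
    return max(edits, key=lambda e: e[0])[2]

def resolve_manually(edits, accepted_users):
    # keep only edits from users a moderator accepted, then apply
    # last-writer-wins among those
    kept = [e for e in edits if e[1] in accepted_users]
    return resolve_automatically(kept) if kept else None

edits = [(1, "alice", "draft v1"),
         (3, "bob", "draft v3"),
         (2, "carol", "draft v2")]
```

With these edits, automatic resolution keeps bob's latest write, while a moderator who rejects bob's edits falls back to carol's.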


The GUI module 1206 handles all aspects of GUIs, including the creation and removal of GUIs. Specifically, the elements manager 1210 manages graphical elements in a GUI, such as buttons, text fields, drop-down lists, and dialogs; the text manager 1214 handles text elements in a GUI; the graphics manager 1216 handles graphics elements in a GUI; and the layout manager 1212 controls the positioning of various elements in a GUI.


As used herein, a “module,” “a manager,” an “interface,” and so on can be implemented by a general purpose, dedicated or shared processor, or, typically, firmware or software modules embodied in a computer-readable (storage) medium for execution by a processor. Depending upon implementation-specific or other considerations, the implementation can be centralized or distributed.



FIGS. 3-10 illustrate example GUIs for users or collaborators to work with notes. FIG. 3 contains a diagram illustrating an example GUI for creating a note. In one embodiment, with a GUI for a notebook, a user requests the creation of a note. In response, the GUI module 1206 presents a new GUI for a note. In the new GUI, there can be four graphical elements: one for entering the list of collaborators, such as a text field 301, one for entering the title, such as a text field 304, one for providing the content, such as a multimedia field together with a scroll bar 302, and one offering a list of editing and formatting options, such as an inline toolbar 303. There can also be an additional graphical element for confirming the creation of the note, such as a submit button. In particular, the graphical element for entering the list of collaborators allows a user to specify the list of collaborators at the note level and during the creation of a note. The specification can be done manually or with the assistance of an existing directory through one or more of a text field, a drop-down list, a browsing dialog, and other graphical elements. In addition, the graphical element for providing the content permits the inclusion of different types of media into the content, including text, graphics, images, videos, and is amenable to various manipulations through the graphical element offering the list of editing and formatting options.


In one embodiment, the GUI for the note is displayed independent of the GUI for the notebook, such as in a separate tab. In another embodiment, once the note is created, as a result of pressing the submit button, for example, the GUI module 1206 stores the note and notifies each specified collaborator. For example, each collaborator may receive an email message with a hyperlink to the stored note.



FIGS. 4A and 4B contain diagrams illustrating an example GUI for editing a note. In one embodiment, users can format the content of a note in the GUI for the note. The formatting may include changing the font size and style of any text or using different colors for the text, and can be done using one or more graphical elements, such as menus and buttons 401 and 402 within an inline toolbar. In another embodiment, users are also allowed to organize the content in various ways, in terms of alignment, indentation, listing, and so on.



FIG. 5 contains a diagram illustrating an example GUI concerning the saving of a note. In one embodiment, a user requests saving a note in the GUI for the note using a graphical element, such as a save button. In response to the user request or at certain predetermined intervals, the GUI module 1206 performs the save action, indicating the progress along the way. In one embodiment, the progress is shown through various icons 501 and 502 on one side of the graphical element for entering the content, in terms of the amount of content saved. When the saving is complete, the progress may be represented by a checkmark icon 503, for example. In another embodiment, the progress might be shown through icons of changing colors, degrees of brightness, and so on. It can also be shown through discrete signals, such as a ring at the beginning of a save action and two rings at the end. These features inform a user of the timing and status of saving of a note.
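The chunked saving with progress reporting described above can be sketched minimally; the chunk size and callback mechanism are illustrative assumptions, not the platform's implementation.

```python
# Sketch: save note content in chunks, reporting the fraction saved
# after each chunk; the report callback would drive the progress
# icons (501, 502) and the final checkmark (503) described above.
def save_note(content, chunk_size, report):
    """Save `content` in chunks, reporting fraction saved after each."""
    saved = 0
    while saved < len(content):
        saved = min(saved + chunk_size, len(content))
        report(saved / len(content))
    return saved == len(content)

progress = []
done = save_note("x" * 10, 4, progress.append)
```

Calling this on ten characters with a chunk size of four reports progress at 40%, 80%, and 100%, at which point the GUI could swap the progress icon for the checkmark.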



FIG. 6 contains a diagram illustrating an example GUI for commenting on a note. In one embodiment, the user requests the creation of a comment in the GUI for the note using a graphical element, such as a create button. In response, the GUI module 1206 presents certain graphical elements, such as a text box for entering the comment and a submit button, near or overlaying the graphical element for entering the content. In one embodiment, once a comment is created, as a result of pressing the submit button, for example, the GUI module 1206 saves and displays the comment. As one example of displaying a comment and related information in 601, information about the author and the date of the comment are shown in addition to the text of the comment. As another example, a list of comments is presented by author, by date, etc. In other embodiments, the GUI module 1206 allows a user to modify a comment created by the user or respond to another user's comment. For example, a user may request an update of his or her comment using various graphical elements, such as by right-clicking on the display of the comment and choosing an update option from a pop-up menu. In response, the GUI module 1206 may present one or more graphical elements for the user to revise the comment.



FIGS. 7A, 7B, 7C and 7D contain diagrams illustrating an example GUI for working with annotations of a note, which are similar to comments but can be tied to specific portions of the note. In one embodiment as illustrated in FIG. 7A, when a user selects a portion of the note in the GUI for the note, the GUI module 1206 allows the user to enter an annotation using a graphical element near the display of the selected portion and overlaying the display of some other portion, such as a button 701 or 702. In one embodiment, an annotation can be represented as a hyperlink, by text, and so on. When the user chooses a representation as a hyperlink, by pressing the button 702, for example, the GUI module 1206 can allow the user to enter and submit a URL or any other linkable address using one or more graphical elements. As illustrated in FIG. 7B, when the user chooses to represent the annotation by text, by pressing the button 701, for example, the GUI module 1206 can allow the user to enter and submit the text using one or more graphical elements near the display of the selected portion and overlaying the display of some other portion, such as a text field and a submit button 703.


In one embodiment as illustrated in FIG. 7C, once an annotation is created, as a result of pressing a submit button, for example, the GUI module 1206 saves the annotation and displays the portion of the note that is associated with the annotation in a distinct way, such as in highlight 704, in bold, and so on, to inform the users of the existence of annotations for portions of the note. In another embodiment as illustrated in FIG. 7D, the GUI module 1206 displays the saved annotation upon user request. For example, when the user hovers over a highlighted portion of the note, information about the author and the date of an associated annotation are shown in addition to the annotation text overlaying the display of a portion of the note. In yet another embodiment, the GUI module 1206 shows all the annotations next to the graphical element for entering the note and moves the focus of the note to a portion of the note when an associated annotation is selected.


According to other embodiments, the GUI module 1206 allows a user to modify an annotation created by the user or respond to another user's annotation. For example, a user may request an update of his or her annotation using various graphical elements, such as by right-clicking on the display of a portion of a note that is associated with an annotation and choosing an update option from a pop-up menu. In response, the GUI module 1206 may present one or more graphical elements for the user to revise the annotation. FIG. 13 contains a flowchart illustrating example operations performed by the GUI module 1206 in working with an annotation of a note.
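The range-anchored annotations described above can be sketched as a small data structure; the field names and the character-range anchoring scheme are assumptions made for illustration.

```python
# Sketch: an annotation is anchored to a character range of the note,
# so the GUI can highlight that span and show the annotation text,
# author, and date on hover.
from dataclasses import dataclass

@dataclass
class Annotation:
    start: int   # index of first annotated character
    end: int     # index one past the last annotated character
    author: str
    date: str
    text: str    # annotation body, or a URL for link annotations

def annotated_span(note_text, ann):
    """Return the portion of the note the annotation is tied to."""
    return note_text[ann.start:ann.end]

note = "Ship the release on Friday."
ann = Annotation(20, 26, "alice", "2013-09-30", "Confirm the date")
```

Anchoring by range (rather than to the note as a whole, as with comments) is what lets the GUI highlight the exact portion and move the focus to it when the annotation is selected.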



FIGS. 8A and 8B contain diagrams illustrating an example GUI for incorporating multimedia into a note. In one embodiment, the user requests the insertion of multimedia, such as an image or a video, into the note in the GUI for the note using a graphical element, such as an insert button. In response, the GUI module 1206 presents a browsing dialog, overlaying a portion of the note, for example, for the user to select one or more pieces of multimedia data. As illustrated in FIG. 8A, the GUI module 1206 may populate the browsing dialog 801 with a list of multimedia data from a local database. As illustrated in FIG. 8B, it may also allow a user to enter a query using a graphical element, such as a text field 802, run a search using any known technique against an external database, such as a remote server accessible through the internet, and populate the browsing dialog 803 with the list of search hits. The pieces of multimedia may be displayed in a browsing dialog by location, by search score, by type, or by other relevant criteria. The selected one or more pieces of multimedia are then inserted into a designated location within the note, such as the user's current focus position. FIG. 14 contains a flowchart illustrating example operations performed by the GUI module 1206 in incorporating multimedia into a note.
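The final step above, inserting the selected media at the user's current focus position, can be sketched as follows; the inline marker format is a hypothetical representation, not the note format the platform actually uses.

```python
# Sketch: insert a reference to the selected media at the user's
# current focus position within the note's text.
def insert_media(note_text, position, media_ref):
    marker = f"[media:{media_ref}]"  # hypothetical inline marker
    return note_text[:position] + marker + note_text[position:]

result = insert_media("before after", 7, "cat.png")
```

In a real implementation the marker would be resolved back into a rendered image or video when the note is displayed.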



FIG. 9 contains a diagram illustrating an example GUI which facilitates multi-user collaboration on a note. As the GUI module 1206 allows more than one user to work on the same work item or specifically the same note at the same time, in one embodiment, it indicates which user is working on which portion of the note in the GUI for the note. Each user can be represented by several graphical elements, including a string and a thumbnail coupled with a color. According to one example, a thumbnail 901, which is a photo of a user enclosed in a frame of the color associated with the user, is shown together with a string 902, which is the name of the user, next to display of the portion of the note being edited by the user, and a cursor 903 of that color is shown at the current position of the user's editing.
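
The per-collaborator presence state behind the thumbnail 901, string 902, and cursor 903 can be sketched as follows. Grouping collaborators by the line they are editing also supports the claimed behavior of presenting markers next to each other when two or more collaborators edit the same line or paragraph. The type and function names are hypothetical.

```typescript
// Hypothetical sketch: each collaborator's presence carries the name string,
// the color used for both the thumbnail frame and the cursor, the thumbnail
// photo, and the line currently being edited. Collaborators on the same line
// are grouped so their markers can render side by side in the margin.
interface Presence {
  name: string;     // string 902: the user's name
  color: string;    // frame color of thumbnail 901 and color of cursor 903
  photoUrl: string; // photo shown inside the frame
  line: number;     // line or paragraph of the current editing position
}

function groupMarkersByLine(presences: Presence[]): Map<number, Presence[]> {
  const byLine = new Map<number, Presence[]>();
  for (const p of presences) {
    const group = byLine.get(p.line) ?? [];
    group.push(p);
    byLine.set(p.line, group);
  }
  return byLine;
}
```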


The editing may include any update to the note, including the removal of content and the change of format, as well as the addition or removal of an annotation. In one embodiment, the string is not shown or the cursor is not blinking unless the user is actively editing the content, which can be measured by the number of keystrokes per second or other means. In another embodiment, according to the user's instruction, the photo is repeatedly flashed as a warning to other users against editing the same portion of the note. In addition, the display can be refreshed at different rates, such as every five minutes, every time a user performs a keystroke, etc.
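
The keystrokes-per-second measure of active editing can be sketched as counting keystrokes inside a recent time window. The window size and threshold below are illustrative values, not figures from the disclosure.

```typescript
// Hypothetical sketch of the keystrokes-per-second measure: a collaborator
// counts as "actively editing" when enough keystrokes landed within the most
// recent window. Window size and threshold are assumed, not disclosed values.
function isActivelyEditing(
  keystrokeTimesMs: number[],
  nowMs: number,
  windowMs: number = 1000,
  minKeystrokes: number = 2,
): boolean {
  const recent = keystrokeTimesMs.filter(t => t <= nowMs && nowMs - t <= windowMs);
  return recent.length >= minKeystrokes;
}
```

The same predicate could gate the string display, the cursor blinking, or the repeated flashing of the photo described above.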



FIG. 10 contains a diagram illustrating example GUIs for quickly and conveniently creating a note from the web application and sharing it with others/collaborators. As discussed above with FIG. 3, with the GUI 1001 for a notebook, a user requests the creation of a note. In response, the GUI module 1206 presents a new GUI 1002 for a note. However, in one embodiment, the note is meant to be a “quick” one and thus the GUI 1002 is displayed in an easily-accessible manner, such as being overlaid on the interface 1001 from the web application to the cloud-based platform rather than in a separate tab. In one embodiment, the GUI 1002 does not support any manipulation of the note, including the organization of paragraphs in the note, the addition of comments on the note, and so on. Therefore, it does not include a graphical element offering the list of editing and formatting options, for example. This may be a useful feature when a user wants to quickly share a task list with another user, for example. Once the note is created, however, it can be reopened, formatted, enriched, and otherwise manipulated with the regular interface for a note.



FIG. 11 contains a diagram illustrating example GUIs for viewing a note. In one embodiment, in working with a GUI 1102 for a notebook which contains one or more notes, a user requests the viewing of one of the notes using a graphical element, such as a menu obtained from right-clicking the listing of the note. In response, the GUI module 1206 displays a GUI 1101 for the note using the “lightbox” technique. Specifically, the GUI 1101, which includes a modal dialog, is displayed as the foreground while the GUI 1102 is displayed as the dark background, and user interaction with the GUI 1101 is required before control is returned to the GUI 1102. In one embodiment, the GUI 1101 essentially contains a snapshot of the GUI illustrated in FIG. 9, allowing users to visualize the note and its current state of collaboration. In another embodiment, the GUI module 1206 shows the specified note in the GUI 1101 initially but allows a user to navigate through the notebook to view other notes via certain graphical elements, such as forward and backward navigation keys 1103.
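
The lightbox behavior, including the forward/backward navigation keys 1103, can be sketched as a small modal state: input is routed to the foreground note view until the modal is dismissed, and navigation steps through the notebook's notes with wraparound. The class and method names below are hypothetical.

```typescript
// Hypothetical sketch of the "lightbox" technique: the note GUI is a modal
// foreground over the darkened notebook GUI, and the notebook regains
// control only after the modal is dismissed. Navigation wraps around the
// notebook's notes, as forward/backward keys 1103 suggest.
class NoteLightbox {
  private open = false;
  constructor(private noteCount: number, private index: number) {}
  show(): void { this.open = true; }
  dismiss(): void { this.open = false; }
  // The background (notebook) GUI regains control only once the modal closes.
  backgroundHasControl(): boolean { return !this.open; }
  // Forward (+1) or backward (-1) navigation, wrapping around the notebook.
  navigate(direction: 1 | -1): number {
    this.index = (this.index + direction + this.noteCount) % this.noteCount;
    return this.index;
  }
}
```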



FIG. 15 shows a diagrammatic representation 1500 of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, can be executed.


In alternative embodiments, the machine operates as a standalone device or can be connected (e.g., networked) to other machines. In a networked deployment, the machine can operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.


The machine can be a server computer, a client computer, a personal computer (PC), a user device, a tablet, a phablet, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a thin-client device, a cellular telephone, an iPhone, an iPad, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, a console, a hand-held console, a (hand-held) gaming device, a music player, any portable, mobile, hand-held device, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.


While the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the terms “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The terms “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the presently disclosed technique and innovation.


In general, the routines executed to implement the embodiments of the disclosure can be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processing units or processors in the computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.


Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.


Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include, but are not limited to, recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks, (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.


The network interface device enables the machine 1500 to mediate data in a network with an entity that is external to the host server, through any known and/or convenient communications protocol supported by the host and the external entity. The network interface device can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater.


The network interface device can include a firewall which can, in some embodiments, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall can additionally manage and/or have access to an access control list which details permissions including for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.


Other network security functions performed by or included in the functions of the firewall can include, for example, but are not limited to, intrusion prevention, intrusion detection, next-generation firewall, and personal firewall functions, without deviating from the novel art of this disclosure.


Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof, mean any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number can also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.


The above detailed description of embodiments of the disclosure is not intended to be exhaustive or to limit the teachings to the precise form disclosed above. While specific embodiments of, and examples for, the disclosure are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative embodiments can perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks can be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or subcombinations. Each of these processes or blocks can be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks can instead be performed in parallel, or can be performed at different times. Further, any specific numbers noted herein are only examples: alternative implementations can employ differing values or ranges.


The teachings of the disclosure provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments.


Any patents and applications and other references noted above, including any that can be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the disclosure can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the disclosure.


These and other changes can be made to the disclosure in light of the above Detailed Description. While the above description describes certain embodiments of the disclosure, and describes the best mode contemplated, no matter how detailed the above appears in text, the teachings can be practiced in many ways. Details of the system can vary considerably in its implementation details, while still being encompassed by the subject matter disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosure with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the disclosure to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the disclosure encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the disclosure under the claims.


While certain aspects of the disclosure are presented below in certain claim forms, the inventors contemplate the various aspects of the disclosure in any number of claim forms. For example, while only one aspect of the disclosure is recited as a means-plus-function claim under 35 U.S.C. §112, ¶6, other aspects can likewise be embodied as a means-plus-function claim, or in other forms, such as being embodied in a computer-readable medium. (Any claim intended to be treated under 35 U.S.C. §112, ¶6 begins with the words “means for”.) Accordingly, the applicant reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the disclosure.

Claims
  • 1. A method comprising: providing, in a user interface, an element for requesting a creation of a shared document in a folder shared among collaborators in a cloud-based collaboration platform, wherein the shared document is concurrently accessible by the collaborators in real time via their respective physical devices; in response to a detection of an action to create the shared document for a concurrent access, generating the shared document, presenting the shared document to each of the collaborators for viewing or accessing in real time, wherein the presentation of the shared document includes a content portion and a margin portion adjacent to the content portion; wherein each of the collaborators is able to view, edit, and modify the content portion of the shared document concurrently using the collaborator's respective physical device, and wherein edits or modifications made to the content portion of the shared document by a particular collaborator are accessible to other collaborators at their respective physical devices in real time, marking a current position of editing the content portion of the shared document by each of the collaborators, wherein the current position marking identifies each of the collaborators with a photo enclosed in a frame of a color associated with each of the collaborators; wherein the markings for each of the collaborators are presented in the margin portion of the shared document and aligned with their respective current positions of editing the content portion of the shared document, so as not to obstruct the presentation of the content portion of the shared document to other collaborators; and wherein if two or more collaborators are editing a same line or paragraph in the content portion of the shared document, their respective markings are presented next to each other in the margin portion; identifying an active editing in the content portion of the shared document by the particular collaborator at the particular collaborator's current position, and responsive to identifying the active editing by the particular collaborator, modifying the particular collaborator's current position marking such that the marking identifies the particular collaborator with a string including the particular collaborator's name in addition to the photo enclosed in the frame of the color associated with the particular collaborator and such that the photo is repeatedly flashed while the particular collaborator is actively editing.
  • 2. The method of claim 1, wherein the collaborators are specified via another user interface element in a list associated with the shared document.
  • 3. The method of claim 1, wherein one or more of the physical devices include a mobile device.
  • 4. The method of claim 1, further comprising disabling a write access to the shared document by one or more of the collaborators.
  • 5. The method of claim 4, wherein the write access is disabled for the one or more of the collaborators by an administrator of the shared document.
  • 6. The method of claim 1, wherein identifying the active editing in the content portion of the shared document by the particular collaborator is based on a number of keystrokes detected at the particular collaborator's physical device in a predetermined time period.
  • 7. The method of claim 6, wherein the identified active editing is presented to the other collaborators after the number of keystrokes detected at the particular collaborator's physical device in a predetermined time period.
  • 8. The method of claim 1, wherein additional collaborators are specifiable for the shared document created for the concurrent real time access in addition to those originally associated with the shared document.
  • 9. The method of claim 1, wherein the shared document is presented to the collaborators via a second user interface that is overlaid on the user interface.
  • 10. The method of claim 9, wherein the second user interface is overlaid on the user interface using a lightbox technique.
  • 11. The method of claim 9, wherein the shared document comprises a note and the second interface comprises a quick interface.
  • 12. The method of claim 11, wherein the quick interface disallows a manipulation by the collaborators.
  • 13. The method of claim 11, wherein the quick interface provides the collaborators with a snapshot of the note in its current state of collaboration.
  • 14. The method of claim 1, wherein the user interface provides a visibility into a notebook in the cloud-based collaboration platform and the shared document comprises a note associated with one or more notes of the notebook.
  • 15. The method of claim 14, wherein the user interface includes a graphical element that facilitates a collaborator navigation through the one or more notes of the notebook.
  • 16. The method of claim 15, further comprising: receiving a request initiated by a first collaborator of the collaborators via the graphical element, wherein the request indicates a particular note of the one or more notes of the notebook; andresponsive to receiving the request, presenting the particular note to the first collaborator.
  • 17. A system, comprising: one or more processors; and one or more memory units, the one or more memory units having instructions stored thereon, which when executed by the one or more processors, cause the system to: provide, in a first generated graphical user interface (GUI), a graphical element for requesting a creation of a shared document in a folder shared among collaborators in a cloud-based collaboration platform, wherein the shared document is concurrently accessible by the collaborators in real time via their respective physical devices; in response to a detection of an action to create the shared document for a concurrent access, generate the shared document, and present the shared document to each of the collaborators for viewing or accessing in real time, wherein the presentation of the shared document includes a content portion and a margin portion adjacent to the content portion; wherein each of the collaborators is able to view, edit, or modify the content portion of the shared document concurrently via the collaborator's respective physical device, and any edits or modifications made to the content portion of the shared document by a particular collaborator are accessible to other collaborators via their respective physical devices; mark a current position of editing the content portion of the shared document by each of the collaborators, wherein the current position marking identifies each of the collaborators with a photo enclosed in a frame of a color associated with each of the collaborators; and wherein the markings for each of the collaborators are presented in the margin portion of the shared document and aligned with the respective current position of editing the content portion of the shared document, so as not to obstruct the presentation of the content portion of the shared document to other collaborators; and wherein if two or more collaborators are editing a same line or paragraph in the content portion of the shared document, their respective markings are presented next to each other in the margin portion; identify an active editing in the content portion of the shared document by the particular collaborator at the particular collaborator's current position, and responsive to identifying the active editing by the particular collaborator, modify the particular collaborator's current position marking such that the marking identifies the particular collaborator with a string including the particular collaborator's name in addition to the photo enclosed in the frame of the color associated with the particular collaborator, wherein the photo associated with the particular collaborator is repeatedly flashed while the particular collaborator is actively editing.
  • 18. The system of claim 17, wherein the one or more memory units have further instructions stored thereon, which when executed by the one or more processors, cause the system to further automatically resolve any conflicting edits concurrently made by the collaborators.
  • 19. The system of claim 17, wherein the collaborators are specified via another graphical element in a list associated with the shared document.
  • 20. The system of claim 19, wherein the list is generated in a second GUI separate from the first GUI.
  • 21. The system of claim 20, wherein the one or more memory units have further instructions stored thereon, which when executed by the one or more processors, cause the system to further require an interaction with the second GUI prior to returning a control to the first GUI.
  • 22. A non-transitory machine-readable medium having instructions stored thereon, which when executed by a processor of a system, direct the system to: provide a user interface including an element for requesting a creation of a shared document in a folder shared among collaborators in a cloud-based collaboration platform, wherein the shared document is concurrently accessible to the collaborators in real time via their respective physical devices; detect an action to create the shared document for a concurrent access; in response to the detection, generate the shared document, and present the shared document to each of the collaborators for accessing in real time, wherein the presentation of the shared document includes a content portion and a margin portion adjacent to the content portion; wherein each of the collaborators is able to view, edit, and modify the shared document concurrently using the collaborator's respective physical device, and wherein edits or modifications made to the content portion of the shared document by a particular collaborator are accessible to other collaborators at their respective physical devices in real time, and mark a current position of editing the content portion of the shared document by each of the collaborators, wherein the current position marking identifies each of the collaborators with a photo enclosed in a frame of a color associated with each of the collaborators; wherein the markings for each of the collaborators are presented in the margin portion of the shared document and aligned with their respective current positions of editing the content portion of the shared document, so as not to obstruct the presentation of the content portion of the shared document to other collaborators; and wherein if two or more collaborators are editing a same line or paragraph in the content portion of the shared document, their respective markings are presented next to each other in the margin portion; identify an active editing in the content portion of the shared document by the particular collaborator at the particular collaborator's current position, and responsive to identifying the active editing by the particular collaborator, modify the particular collaborator's current position marking such that the marking identifies the particular collaborator with a string including the collaborator's name in addition to the photo enclosed in the frame of the color associated with the particular collaborator and such that the photo is repeatedly flashed while the particular collaborator is actively editing.
  • 23. The non-transitory machine-readable medium of claim 22, wherein the instructions, when executed by the processor, further cause the system to automatically resolve concurrent edits or modifications made to the content portion of the shared document.
  • 24. The non-transitory machine-readable medium of claim 22, wherein the collaborators are specified via another user interface element in a list associated with the shared document.
  • 25. The non-transitory machine-readable medium of claim 22, wherein one or more of the physical devices include a mobile device.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation application of U.S. patent application Ser. No. 14/027,149, entitled “SIMULTANEOUS EDITING/ACCESSING OF CONTENT BY COLLABORATOR INVITATION THROUGH A WEB-BASED OR MOBILE APPLICATION TO A CLOUD-BASED COLLABORATION PLATFORM”, filed Sep. 13, 2013, which is hereby incorporated by reference in its entirety.

20080091763 Devonshire et al. Apr 2008 A1
20080091790 Beck Apr 2008 A1
20080104277 Tian May 2008 A1
20080114720 Smith et al. May 2008 A1
20080133674 Knauerhase et al. Jun 2008 A1
20080140732 Wilson et al. Jun 2008 A1
20080147790 Malaney et al. Jun 2008 A1
20080147810 Kumar et al. Jun 2008 A1
20080151817 Fitchett et al. Jun 2008 A1
20080154873 Redlich et al. Jun 2008 A1
20080182628 Lee et al. Jul 2008 A1
20080183467 Yuan et al. Jul 2008 A1
20080184130 Tien et al. Jul 2008 A1
20080194239 Hagan Aug 2008 A1
20080215883 Fok et al. Sep 2008 A1
20080222654 Xu et al. Sep 2008 A1
20080243855 Prahlad et al. Oct 2008 A1
20080250333 Reeves et al. Oct 2008 A1
20080250348 Alimpich et al. Oct 2008 A1
20080263099 Brady-Kalnay et al. Oct 2008 A1
20080271095 Shafton Oct 2008 A1
20080276158 Lim et al. Nov 2008 A1
20090015864 Hasegawa Jan 2009 A1
20090019093 Brodersen et al. Jan 2009 A1
20090019426 Baeumer et al. Jan 2009 A1
20090030710 Levine Jan 2009 A1
20090044128 Baumgarten et al. Feb 2009 A1
20090049131 Lyle et al. Feb 2009 A1
20090094546 Anzelde et al. Apr 2009 A1
20090106642 Albornoz et al. Apr 2009 A1
20090111509 Mednieks Apr 2009 A1
20090119322 Mills et al. May 2009 A1
20090125469 McDonald et al. May 2009 A1
20090132651 Roger et al. May 2009 A1
20090138808 Moromisato et al. May 2009 A1
20090150417 Ghods et al. Jun 2009 A1
20090150627 Benhase et al. Jun 2009 A1
20090158142 Arthursson et al. Jun 2009 A1
20090164438 Delacruz Jun 2009 A1
20090171983 Samji et al. Jul 2009 A1
20090193107 Srinivasan et al. Jul 2009 A1
20090193345 Wensley et al. Jul 2009 A1
20090198772 Kim et al. Aug 2009 A1
20090210459 Nair et al. Aug 2009 A1
20090214115 Kimura et al. Aug 2009 A1
20090235167 Boyer et al. Sep 2009 A1
20090235181 Saliba et al. Sep 2009 A1
20090235189 Aybes et al. Sep 2009 A1
20090249224 Davis et al. Oct 2009 A1
20090254589 Nair et al. Oct 2009 A1
20090260060 Smith et al. Oct 2009 A1
20090271708 Peters et al. Oct 2009 A1
20090276771 Nickolov et al. Nov 2009 A1
20090282212 Peterson Nov 2009 A1
20090300356 Crandell Dec 2009 A1
20090300527 Malcolm et al. Dec 2009 A1
20090319910 Escapa Dec 2009 A1
20090327358 Lukiyanov et al. Dec 2009 A1
20090327961 De Vorchik et al. Dec 2009 A1
20100011292 Marinkovich Jan 2010 A1
20100011447 Jothimani Jan 2010 A1
20100017262 Iyer et al. Jan 2010 A1
20100036929 Scherpa et al. Feb 2010 A1
20100040217 Aberg Feb 2010 A1
20100042720 Stienhans et al. Feb 2010 A1
20100057560 Skudlark et al. Mar 2010 A1
20100057785 Khosravy et al. Mar 2010 A1
20100076946 Barker et al. Mar 2010 A1
20100082634 Leban Apr 2010 A1
20100083136 Komine et al. Apr 2010 A1
20100088150 Mazhar et al. Apr 2010 A1
20100092126 Kaliszek et al. Apr 2010 A1
20100093310 Gbadegesin et al. Apr 2010 A1
20100107225 Spencer et al. Apr 2010 A1
20100131868 Chawla et al. May 2010 A1
20100151431 Miller Jun 2010 A1
20100153835 Xiong et al. Jun 2010 A1
20100162365 Del Real Jun 2010 A1
20100162374 Nair Jun 2010 A1
20100169269 Chandrasekaran Jul 2010 A1
20100179940 Gilder et al. Jul 2010 A1
20100185463 Noland et al. Jul 2010 A1
20100185932 Coffman et al. Jul 2010 A1
20100191689 Cortes et al. Jul 2010 A1
20100198783 Wang et al. Aug 2010 A1
20100198871 Stiegler et al. Aug 2010 A1
20100198944 Ho et al. Aug 2010 A1
20100205537 Knighton et al. Aug 2010 A1
20100223378 Wei Sep 2010 A1
20100229085 Nelson et al. Sep 2010 A1
20100235526 Carter et al. Sep 2010 A1
20100235539 Carter et al. Sep 2010 A1
20100241611 Zuber Sep 2010 A1
20100241972 Spataro et al. Sep 2010 A1
20100250120 Waupotitsch et al. Sep 2010 A1
20100251340 Martin et al. Sep 2010 A1
20100257457 De Goes Oct 2010 A1
20100262582 Garcia-Ascanio et al. Oct 2010 A1
20100267588 Nelson et al. Oct 2010 A1
20100274765 Murphy et al. Oct 2010 A1
20100274772 Samuels Oct 2010 A1
20100281118 Donahue et al. Nov 2010 A1
20100290623 Banks et al. Nov 2010 A1
20100306379 Ferris Dec 2010 A1
20100318893 Matthews et al. Dec 2010 A1
20100322252 Suganthi et al. Dec 2010 A1
20100325155 Skinner et al. Dec 2010 A1
20100325527 Estrada et al. Dec 2010 A1
20100325559 Westerinen et al. Dec 2010 A1
20100325655 Perez Dec 2010 A1
20100332401 Prahlad et al. Dec 2010 A1
20100332962 Hammer et al. Dec 2010 A1
20100333116 Prahlad et al. Dec 2010 A1
20110001763 Murakami Jan 2011 A1
20110016409 Grosz et al. Jan 2011 A1
20110022559 Andersen et al. Jan 2011 A1
20110022812 van der Linden et al. Jan 2011 A1
20110029883 Lussier et al. Feb 2011 A1
20110040812 Phillips Feb 2011 A1
20110041083 Gabai et al. Feb 2011 A1
20110047413 McGill et al. Feb 2011 A1
20110047484 Mount et al. Feb 2011 A1
20110052155 Desmarais et al. Mar 2011 A1
20110054968 Galaviz Mar 2011 A1
20110055299 Phillips Mar 2011 A1
20110055721 Jain et al. Mar 2011 A1
20110061045 Phillips Mar 2011 A1
20110061046 Phillips Mar 2011 A1
20110065082 Gal et al. Mar 2011 A1
20110066951 Ward-Karet et al. Mar 2011 A1
20110083167 Carpenter et al. Apr 2011 A1
20110093567 Jeon et al. Apr 2011 A1
20110099006 Sundararaman et al. Apr 2011 A1
20110113320 Neff et al. May 2011 A1
20110119313 Sung et al. May 2011 A1
20110134204 Rodriguez Jun 2011 A1
20110137991 Russell Jun 2011 A1
20110142410 Ishii Jun 2011 A1
20110145744 Haynes et al. Jun 2011 A1
20110161289 Pei et al. Jun 2011 A1
20110167125 Achlioptas Jul 2011 A1
20110167353 Grosz et al. Jul 2011 A1
20110167435 Fang Jul 2011 A1
20110185292 Chawla et al. Jul 2011 A1
20110202424 Chun et al. Aug 2011 A1
20110202599 Yuan et al. Aug 2011 A1
20110208958 Stuedi et al. Aug 2011 A1
20110209052 Parker et al. Aug 2011 A1
20110209064 Jorgensen et al. Aug 2011 A1
20110213765 Cui et al. Sep 2011 A1
20110219419 Reisman Sep 2011 A1
20110225417 Maharajh et al. Sep 2011 A1
20110238458 Purcell et al. Sep 2011 A1
20110238621 Agrawal Sep 2011 A1
20110238759 Spataro et al. Sep 2011 A1
20110239135 Spataro et al. Sep 2011 A1
20110246294 Robb et al. Oct 2011 A1
20110246950 Luna et al. Oct 2011 A1
20110249024 Arrasvuori et al. Oct 2011 A1
20110252071 Cidon Oct 2011 A1
20110252320 Arrasvuori et al. Oct 2011 A1
20110252339 Lemonik Oct 2011 A1
20110258461 Bates Oct 2011 A1
20110258561 Ladouceur et al. Oct 2011 A1
20110282710 Akkiraju et al. Nov 2011 A1
20110289433 Whalin et al. Nov 2011 A1
20110296022 Ferris et al. Dec 2011 A1
20110313803 Friend et al. Dec 2011 A1
20110320197 Conejero et al. Dec 2011 A1
20120036370 Lim et al. Feb 2012 A1
20120036423 Haynes et al. Feb 2012 A1
20120064879 Panei Mar 2012 A1
20120072436 Pierre et al. Mar 2012 A1
20120079095 Evans et al. Mar 2012 A1
20120089659 Halevi et al. Apr 2012 A1
20120110005 Kuo et al. May 2012 A1
20120110436 Adler, III et al. May 2012 A1
20120110443 Lemonik et al. May 2012 A1
20120117626 Yates et al. May 2012 A1
20120124306 Abercrombie et al. May 2012 A1
20120124547 Halbedel May 2012 A1
20120130900 Tang et al. May 2012 A1
20120134491 Liu May 2012 A1
20120136936 Quintuna May 2012 A1
20120144283 Hill et al. Jun 2012 A1
20120150888 Hyatt et al. Jun 2012 A1
20120151551 Readshaw et al. Jun 2012 A1
20120158908 Luna et al. Jun 2012 A1
20120159178 Lin et al. Jun 2012 A1
20120159310 Chang et al. Jun 2012 A1
20120173625 Berger Jul 2012 A1
20120179981 Whalin et al. Jul 2012 A1
20120185355 Kilroy Jul 2012 A1
20120185913 Martinez et al. Jul 2012 A1
20120192055 Antebi et al. Jul 2012 A1
20120192086 Ghods et al. Jul 2012 A1
20120192099 Carbonera et al. Jul 2012 A1
20120203908 Beaty et al. Aug 2012 A1
20120204032 Wilkins et al. Aug 2012 A1
20120214444 McBride et al. Aug 2012 A1
20120218885 Abel et al. Aug 2012 A1
20120221789 Felter Aug 2012 A1
20120221937 Patterson et al. Aug 2012 A1
20120226767 Luna et al. Sep 2012 A1
20120233155 Gallmeier et al. Sep 2012 A1
20120233205 McDermott Sep 2012 A1
20120233543 Vagell et al. Sep 2012 A1
20120240061 Hillenius et al. Sep 2012 A1
20120257249 Natarajan Oct 2012 A1
20120263166 Cho et al. Oct 2012 A1
20120266203 Elhadad et al. Oct 2012 A1
20120284638 Cutler et al. Nov 2012 A1
20120284664 Zhao Nov 2012 A1
20120291011 Quine Nov 2012 A1
20120309540 Holme et al. Dec 2012 A1
20120311157 Erickson et al. Dec 2012 A1
20120317239 Mulder et al. Dec 2012 A1
20120317487 Lieb et al. Dec 2012 A1
20120328259 Seibert, Jr. et al. Dec 2012 A1
20120331177 Jensen Dec 2012 A1
20120331441 Adamson Dec 2012 A1
20130007245 Malik et al. Jan 2013 A1
20130007471 Grab et al. Jan 2013 A1
20130007894 Dang et al. Jan 2013 A1
20130013560 Goldberg et al. Jan 2013 A1
20130014023 Lee et al. Jan 2013 A1
20130024418 Sitrick et al. Jan 2013 A1
20130031208 Linton et al. Jan 2013 A1
20130042106 Persaud et al. Feb 2013 A1
20130047093 Reuschel Feb 2013 A1
20130055127 Saito et al. Feb 2013 A1
20130067232 Cheung et al. Mar 2013 A1
20130073403 Tuchman et al. Mar 2013 A1
20130080913 Rodrig et al. Mar 2013 A1
20130080919 Kiang et al. Mar 2013 A1
20130080966 Kikin-Gil et al. Mar 2013 A1
20130091440 Kotler et al. Apr 2013 A1
20130097481 Kotler et al. Apr 2013 A1
20130117337 Dunham May 2013 A1
20130117376 Filman May 2013 A1
20130124638 Barreto et al. May 2013 A1
20130138608 Smith May 2013 A1
20130138615 Gupta et al. May 2013 A1
20130151940 Bailor et al. Jun 2013 A1
20130155071 Chan Jun 2013 A1
20130159411 Bowen Jun 2013 A1
20130163289 Kim et al. Jun 2013 A1
20130167253 Seleznev et al. Jun 2013 A1
20130169742 Wu Jul 2013 A1
20130185347 Romano Jul 2013 A1
20130185558 Seibert et al. Jul 2013 A1
20130191339 Haden et al. Jul 2013 A1
20130198600 Lockhart et al. Aug 2013 A1
20130212486 Joshi et al. Aug 2013 A1
20130218978 Weinstein et al. Aug 2013 A1
20130239049 Perrodin et al. Sep 2013 A1
20130246932 Zaveri et al. Sep 2013 A1
20130262210 Savage et al. Oct 2013 A1
20130262862 Hartley Oct 2013 A1
20130268480 Dorman Oct 2013 A1
20130268491 Chung et al. Oct 2013 A1
20130275398 Dorman et al. Oct 2013 A1
20130275429 York et al. Oct 2013 A1
20130275509 Micucci et al. Oct 2013 A1
20130305039 Gauda Nov 2013 A1
20130326344 Masselle et al. Dec 2013 A1
20140013112 Cidon et al. Jan 2014 A1
20140019497 Cidon et al. Jan 2014 A1
20140019498 Cidon et al. Jan 2014 A1
20140019882 Chew Jan 2014 A1
20140026025 Smith Jan 2014 A1
20140032489 Hebbar Jan 2014 A1
20140032616 Nack Jan 2014 A1
20140033277 Xiao et al. Jan 2014 A1
20140033291 Liu Jan 2014 A1
20140052939 Tseng et al. Feb 2014 A1
20140068589 Barak Mar 2014 A1
20140150023 Gudorf et al. May 2014 A1
20140156373 Roberts et al. Jun 2014 A1
20140172595 Beddow et al. Jun 2014 A1
20140280463 Hunter et al. Sep 2014 A1
20140310345 Megiddo Oct 2014 A1
Foreign Referenced Citations (41)
Number Date Country
2724521 Nov 2009 CA
101997924 Mar 2011 CN
102264063 Nov 2011 CN
0921661 Jun 1999 EP
1349088 Oct 2003 EP
1528746 May 2005 EP
2372574 Oct 2011 EP
2610776 Jul 2013 EP
2453924 Apr 2009 GB
2471282 Dec 2010 GB
09-101937 Apr 1997 JP
11-025059 Jan 1999 JP
2003273912 Sep 2003 JP
2004310272 Nov 2004 JP
09-269925 Oct 2007 JP
2008250944 Oct 2008 JP
20020017444 Mar 2002 KR
20040028036 Apr 2004 KR
20050017674 Feb 2005 KR
20060070306 Jun 2006 KR
20060114871 Nov 2006 KR
20070043353 Apr 2007 KR
20070100477 Oct 2007 KR
20100118836 Nov 2010 KR
20110074096 Jun 2011 KR
20110076831 Jul 2011 KR
WO-0007104 Feb 2000 WO
WO-0219128 Mar 2002 WO
WO-2004097681 Nov 2004 WO
WO-2006028850 Mar 2006 WO
WO-2007024438 Mar 2007 WO
WO-2007035637 Mar 2007 WO
WO-2007113573 Oct 2007 WO
WO-2008011142 Jan 2008 WO
WO-2008076520 Jun 2008 WO
WO-2011109416 Sep 2011 WO
WO-2012167272 Dec 2012 WO
WO-2013009328 Jan 2013 WO
WO-2013013217 Jan 2013 WO
WO-2013041763 Mar 2013 WO
WO-2013166520 Nov 2013 WO
Non-Patent Literature Citations (100)
Entry
“Comparison of Lightbox-type modules” by Matt V. (https://web.archive.org/web/20130510120527/http://drupal.org/node/266126; dated May 10, 2013; last accessed Jun. 23, 2015).
“Conceptboard”, One-Step Solution for Online Collaboration, retrieved from websites http://conceptboard.com and https://www.youtube.com/user/ConceptboardApp?feature=watch, printed on Jun. 13, 2013, 9 pages.
“How-to Geek, How to Sync Specific Folders With Dropbox,” downloaded from the internet http://www.howtogeek.com, Apr. 23, 2013, 5 pages.
“Microsoft Office SharePoint 2007 User Guide,” Feb. 16, 2010, pp. 1-48.
“Understanding Metadata,” National Information Standards Organization, NISO Press, 2004, 20 pages.
Cisco, “FTP Load Balancing on ACE in Routed Mode Configuration Example,” DocWiki, Jun. 2011, 7 pages.
Conner, “Google Apps: The Missing Manual,” published by O'Reilly Media, May 27, 2008, 24 pages.
Exam Report for EP13158415.3, Applicant: Box, Inc. Mailed Jun. 4, 2013, 8 pages.
Exam Report for GB1300188.8, Applicant: Box, Inc. Mailed May 31, 2013, 8 pages.
Exam Report for GB1306011.6, Applicant: Box, Inc. Mailed Apr. 18, 2013, 8 pages.
Exam Report for GB1310666.1, Applicant: Box, Inc. Mailed Aug. 30, 2013, 10 pages.
Exam Report for GB1313559.5, Applicant: Box, Inc., Mailed Aug. 22, 2013, 19 pages.
Google Docs, http://web.archive.org/web/20100413105758/http://en.wikipedia.org/wiki/Google_docs, Apr. 13, 2010, 6 pages.
International Search Report and Written Opinion for PCT/US2008/012973 dated Apr. 30, 2009, pp. 1-11.
International Search Report and Written Opinion for PCT/US2011/039126 mailed on Oct. 6, 2011, pp. 1-13.
International Search Report and Written Opinion for PCT/US2011/041308 Mailed Jul. 2, 2012, pp. 1-16.
International Search Report and Written Opinion for PCT/US2011/047530, Applicant: Box, Inc., Mailed Mar. 22, 2013, pp. 1-10.
International Search Report and Written Opinion for PCT/US2011/056472 mailed on Jun. 22, 2012, pp. 1-12.
International Search Report and Written Opinion for PCT/US2011/057938, Applicant: Box, Inc., Mailed Mar. 29, 2013, 10 pages.
International Search Report and Written Opinion for PCT/US2011/060875 Mailed Oct. 30, 2012, pp. 1-10.
International Search Report and Written Opinion for PCT/US2012/056955, Applicant: Box, Inc., Mailed Mar. 27, 2013, pp. 1-11.
International Search Report and Written Opinion for PCT/US2012/063041, Applicant: Box, Inc., Mailed Mar. 29, 2013, 12 pages.
International Search Report and Written Opinion for PCT/US2012/065617, Applicant: Box, Inc., Mailed Mar. 29, 2013, 9 pages.
International Search Report and Written Opinion for PCT/US2012/067126, Applicant: Box, Inc., Mailed Mar. 29, 2013, 10 pages.
International Search Report and Written Opinion for PCT/US2012/070366, Applicant: Box, Inc., Mailed Apr. 24, 2013, 10 pages.
International Search Report and Written Opinion for PCT/US2013/020267, Applicant: Box, Inc., Mailed May 7, 2013, 10 pages.
International Search Report and Written Opinion for PCT/US2013/023889, Applicant: Box, Inc., Mailed Jun. 24, 2013, 13 pages.
International Search Report and Written Opinion for PCT/US2013/029520, Applicant: Box, Inc., Mailed Jun. 26, 2013, 10 pages.
International Search Report and Written Opinion for PCT/US2013/034662, Applicant: Box, Inc., Mailed May 31, 2013, 10 pages.
International Search Report and Written Opinion for PCT/US2013/035404, Applicant: Box, Inc., Mailed Jun. 26, 2013, 13 pages.
International Search Report and Written Opinion for PCT/US2013/039782, Applicant: Box, Inc., Mailed Aug. 28, 2013, 15 pages.
Internet Forums, http://web.archive.org/web/20100528195550/http://en.wikipedia.org/wiki/Internet_forums, Wikipedia, May 30, 2010, pp. 1-20.
Langfeld L. et al., “Microsoft SharePoint 2003 Unleashed,” Chapters 11 and 15, Jun. 2004, pp. 403-404, 557-561, 578-581.
Lars, “35 Very Useful Online Tools for Improving your project Management and Team Collaboration,” Apr. 31, 2010, tripwiremagazine.com, pp. 1-32.
Palmer, “Load Balancing FTP Servers,” BlogNav, Oct. 2008, 2 pages.
Parr, “Google Docs Improves Commenting, Adds E-mail Notifications,” Apr. 16, 2011, mashable.com, pp. 1-6.
Partial International Search Report for PCT/US2011/041308 dated Feb. 27, 2012, pp. 1-2.
Supplementary European Search Report European Application No. EP 08 85 8563 dated Jun. 20, 2011 pp. 1-5.
Wayback, “Wayback machine,” Wayback, Jun. 1, 2011, 1 page.
Wiki, http://web.archive.org/web/20100213004936/http://en.wikipedia.org/wiki/Wiki, Feb. 13, 2010, pp. 1-16.
Yahoo! Groups, http://web.archive.org/web/20090320101529/http://en.wikipedia.org/wiki/Yahoo!_Groups, Wikipedia, Mar. 20, 2009, pp. 1-6.
“PaperPort Professional 14,” PC Mag. Com review, published Feb. 2012, Ziff Davis, Inc., 8 pages.
“PaperPort,” Wikipedia article (old revision), published May 19, 2012, Wikipedia Foundation, 2 pages.
“Quickoffice Enhances Android Mobile Office Application for Improved Productivity on Latest Smartphone and Tablet Devices,” QuickOffice Press Release, Nov. 21, 2011, QuickOffice Inc., 2 pages.
“QuickOffice,” Wikipedia Article (old revision), published May 9, 2012, Wikipedia Foundation, 2 pages.
Exam Report for EP13168784.0, Applicant: Box, Inc. Mailed Nov. 21, 2013, 7 pages.
Exam Report for EP13185269.1, Applicant: Box, Inc. Mailed Jan. 28, 7 pages.
Exam Report for GB1309209.3, Applicant: Box, Inc. Mailed Oct. 30, 2013, 11 pages.
Exam Report for GB1311417.8, Applicant: Box, Inc. Mailed Dec. 20, 2013, 5 pages.
Exam Report for GB1312095.1, Applicant: Box, Inc. Mailed Dec. 12, 2013, 7 pages.
Exam Report for GB1312874.9, Applicant: Box, Inc. Mailed Dec. 20, 2013, 11 pages.
Exam Report for GB1316532.9, Applicant: Box, Inc. Mailed Oct. 31, 2013, 10 pages.
Exam Report for GB1316533.7, Applicant: Box, Inc. Mailed Oct. 8, 2013, 9 pages.
Exam Report for GB1316971.9, Applicant: Box, Inc. Mailed Nov. 26, 2013, 10 pages.
Exam Report for GB1317600.3, Applicant: Box, Inc. Mailed Nov. 21, 2013, 8 pages.
Exam Report for GB1318373.6, Applicant: Box, Inc. Mailed Dec. 17, 2013, 4 pages.
Exam Report for GB1320902.8, Applicant: Box, Inc. Mailed Dec. 20, 2013, 4 pages.
Gedymin, “Cloud computing with an emphasis on Google App Engine,” Master Final Project, Sep. 2011, 146 pages.
International Search Report and Written Opinion for PCT/US2013/034765, Applicant: Box, Inc., Mailed Jan. 20, 2014, 15 pages.
Patent Court Document of Approved Judgment for GB0602349.3 and GB0623571.7; Mar. 3, 2009, 17 pages.
Exam Report for GB1314771.5, Applicant: Box, Inc. Mailed Feb. 17, 2014, 7 pages.
Exam Report for GB1308842.2, Applicant: Box, Inc. Mailed Mar. 10, 2014, 4 pages.
Burns, “Developing Secure Mobile Applications for Android,” Oct. 2008, Version 1.0, 1-28 pages.
“Resolving sync conflicts; frequently asked questions,” Microsoft Tech Support, Jul. 16, 2012, retrieved from the Internet: http://web.archive.org/web, 2 pages.
“Troubleshoot sync problems,” Microsoft Tech Support: May 2, 2012, retrieved from the internet, http://web.archive.org/web, 3 pages.
“Tulsa TechFest 2012—Agenda,” retrieved from the website, http://web.archive.org, Oct. 2, 2012, 2 pages.
Cohen, “Debating the Definition of Cloud Computing Platforms,” retrieved from the internet, http://forbes.com, Feb. 3, 2014, 7 pages.
Delendik, “Evolving with Web Standards—The Story of PDF.JS,” retrieved from the internet, http://people.mozilla.org, Oct. 12, 2012, 36 pages.
Delendik, “My PDF.js talk slides from Tulsa TechFest,” retrieved from the internet, http://twitter.com, Oct. 12, 2012, 2 pages.
Duffy, “The Best File-Syncing Services,” pcmag.com, retrieved from the internet: http://www.pcmag.com, Sep. 28, 2012, 7 pages.
Exam Report for EP13177108.1, Applicant: Box, Inc. Mailed May 26, 2014, 6 pages.
Exam Report for GB1318792.7, Applicant: Box, Inc. Mailed May 22, 2014, 2 pages.
Exam Report for GB1410569.6 Applicant: Box, Inc. Mailed Jul. 11, 2014, 9 pages.
Extended Search Report for EP131832800, Applicant: Box, Inc. Mailed Aug. 25, 2014, 7 pages.
Extended Search Report for EP141509422, Applicant: Box, Inc. Mailed Aug. 26, 2014, 12 pages.
Partial Search Report for EP131832800, Applicant: Box, Inc. Mailed May 8, 2014, 5 pages.
Pyle et al. “How to enable Event logging for Offline Files (Client Side Caching) in Windows Vista,” Feb. 18, 2009, retrieved from the internet: http://blogs.technet.com, 3 pages.
Search Report for EP 13189144.2 Applicant: Box, Inc. Mailed Sep. 1, 2014, 9 pages.
Search Report for EP141509422, Applicant: Box, Inc. Mailed May 8, 2014, 7 pages.
Sommerer, “Presentable Document Format: Improved On-demand PDF to HTML Conversion,” retrieved from the internet, http://research.microsoft.com, Nov. 2004, 8 pages.
Tulloch et al., “Windows Vista Resource Kit,” Apr. 8, 2007, Microsoft Press, XP055113067, 6 pages.
“Average Conversion Time for a D60 RAW file?” http://www.dpreview.com, Jul. 22, 2002, 4 pages.
Comes, “MediaXchange User's Manual,” Version 1.15.15, Feb. 1, 2009, pp. 1-90.
Search Report for EP 11729851.3, Applicant: Box, Inc. Mailed Feb. 7, 2014, 9 pages.
Exam Report for GB1312874.9 Applicant: Box, Inc. Mailed Sep. 26, 2014, 2 pages.
Exam Report for GB1415126.0 Applicant: Box, Inc. Mailed Oct. 2, 2014, 8 pages.
Exam Report for GB1415314.2 Applicant: Box, Inc. Mailed Oct. 7, 2014, 6 pages.
Exam Report for GB1309209.3 Applicant: Box, Inc. Mailed Oct. 7, 2014, 3 pages.
Exam Report for GB1315232.7 Applicant: Box, Inc. Mailed Oct. 9, 2014, 5 pages.
Exam Report for GB1318789.3 Applicant: Box, Inc. Mailed Oct. 30, 2014, 6 pages.
Microsoft Windows XP Professional Product Documentation: How Inheritance Affects File and Folder Permissions, Apr. 11, 2014, 2 pages.
Exam Report for GB1317393.5 Applicant: Box, Inc. Mailed Nov. 7, 2014, 6 pages.
Exam Report for GB1311417.8 Applicant: Box, Inc. Mailed Nov. 7, 2014, 2 pages.
Exam Report for GB1311421.0 Applicant: Box, Inc. Mailed Nov. 7, 2014, 4 pages.
Exam Report for GB1316682.2 Applicant: Box, Inc. Mailed Nov. 19, 2014, 6 pages.
Exam Report for GB1312095.1 Applicant: Box, Inc. Mailed Nov. 19, 2014, 5 pages.
Exam Report for GB1313559.5 Applicant: Box, Inc. Mailed Nov. 4, 2014, 2 pages.
User's Guide for Smart Board Software for Windows, published Dec. 2004, 90 pages.
Zambonini et al., “Automated Measuring of Interaction with User Interfaces,” Published as WO2007113573 Oct. 2007, 19 pages.
Exam Report for GB1415314.2; Applicant: Box, Inc., Mailed Aug. 14, 2015, 2 pages.
Related Publications (1)
Number Date Country
20150082196 A1 Mar 2015 US
Continuations (1)
Number Date Country
Parent 14027149 Sep 2013 US
Child 14042473 US