Mobile device, methods and user interfaces thereof in a mobile device platform featuring multifunctional access and engagement in a collaborative environment provided by a cloud-based platform

Information

  • Patent Grant
  • Patent Number
    8,892,679
  • Date Filed
    Friday, September 13, 2013
  • Date Issued
    Tuesday, November 18, 2014
Abstract
Techniques are disclosed for implementing an intuitive interface which can facilitate collaboration among multiple users and collaborators as well as enable utilization of content in a shared space among multiple users in a more effective way. In one embodiment, a method comprises receiving updates regarding activities performed by a user and one or more collaborators on contents in a workspace. The method further comprises displaying, on an interactive user interface, lists of information based on the updates to facilitate interaction from the user with respect to the updates. An example of the lists of information can include an activity and, if one or more files are associated with the activity, thumbnails that represent previews of the one or more files. In some embodiments, the thumbnails can enable the user to interact with the files and/or the collaborators.
Description
BACKGROUND

With the advancements in digital technologies, data proliferation and the ever increasing mobility of user platforms have created enormous amounts of information traffic over mobile and computer networks. This is particularly relevant as electronic and digital content is increasingly used in social settings or shared environments rather than on traditional stand-alone personal computers and mobile devices. As a result, content is shared across multiple devices among multiple users.


However, content sharing and synchronization currently lacks an intuitive interface which facilitates collaboration among the multiple users and collaborators as well as enables utilization of content in a shared space among multiple users in a more effective way.





BRIEF DESCRIPTION OF DRAWINGS

The present embodiments are illustrated by way of example and are not intended to be limited by the figures of the accompanying drawings. In the drawings:



FIG. 1 depicts an example diagram of a system having a host server of a cloud service, collaboration and/or cloud storage accounts with capabilities that facilitate collaboration among users as well as enable utilization of content in the workspace in an intuitive, effective manner;



FIG. 2 depicts an example diagram of a web-based or online collaboration platform deployed in an enterprise or other organizational setting for organizing work items and workspaces;



FIG. 3 depicts an example diagram of a workspace in an online or web-based collaboration environment accessible by multiple collaborators through various devices;



FIG. 4 depicts a block diagram illustrating an example of components in a mobile device with an interactive mobile user interface utilizing one or more techniques disclosed herein that facilitate collaboration among users as well as enable utilization of content in the workspace in an intuitive, effective manner;



FIGS. 5A-5V respectively depict screenshots showing example user interfaces embodying one or more techniques disclosed herein for a mobile device of a small form factor;



FIGS. 6A-6Q respectively depict screenshots showing example user interfaces embodying one or more techniques disclosed herein for a mobile device of a large form factor;



FIG. 7 depicts a flowchart illustrating an example process for a mobile device in implementing the techniques disclosed herein for facilitating collaboration among users as well as enabling utilization of content in the workspace in an intuitive, effective manner; and



FIG. 8 depicts a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, can be executed.





The same reference numbers and any acronyms identify elements or acts with the same or similar structure or functionality throughout the drawings and specification for ease of understanding and convenience.


DETAILED DESCRIPTION

The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to one or an embodiment in the present disclosure can be, but not necessarily are, references to the same embodiment; and, such references mean at least one of the embodiments.


Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which can be exhibited by some embodiments and not by others. Similarly, various requirements are described which can be requirements for some embodiments but not other embodiments.


The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed below, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms can be highlighted, for example using italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same thing can be said in more than one way.


Consequently, alternative language and synonyms can be used for any one or more of the terms discussed herein, nor is any special significance to be placed upon whether or not a term is elaborated or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification including examples of any terms discussed herein is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.


Without intent to limit the scope of the disclosure, examples of instruments, apparatus, methods and their related results according to the embodiments of the present disclosure are given below. Note that titles or subtitles can be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.


Techniques are disclosed for implementing an intuitive interface which can facilitate collaboration among multiple users and collaborators as well as enable utilization of content in a shared space among multiple users in a more effective way. In one embodiment, a method comprises receiving updates regarding activities performed by a user and one or more collaborators on contents in a workspace. The method further comprises displaying, on an interactive user interface, lists of information based on the updates to facilitate interaction from the user with respect to the updates. An example of the lists of information can include an activity and, if one or more files are associated with the activity, thumbnails that represent previews of the one or more files. In some embodiments, the thumbnails can enable the user to interact with the files and/or the collaborators.


Among other advantages, embodiments disclosed herein provide the ability, for a user and each of his or her collaborators, to receive real-time updates about the collaborative workspace which is shared among them in a way that provides easy-to-understand information, enables intuitive utilization of files, and promotes social interactions with respect to the real-time updates, all of which can enhance the users' collaboration experience.



FIG. 1 illustrates an example diagram of a system 100 having a host server 110 of a cloud service/platform, collaboration and/or cloud storage service with capabilities that facilitate collaboration among users as well as enable utilization of content in the workspace in an intuitive, effective manner.


The client devices 102 can be any system and/or device, and/or any combination of devices/systems, that is able to establish a connection, including wired, wireless, or cellular connections, with another device, a server and/or other systems such as host server 110. Client devices 102 typically include a display and/or other output functionalities to present information and data exchanged between or among the devices 102 and/or the host server 110.


For example, the client devices 102 can include mobile, hand held or portable devices or non-portable devices and can be any of, but not limited to, a server desktop, a desktop computer, a computer cluster, or portable devices including a notebook, a laptop computer, a handheld computer, a palmtop computer, a mobile phone, a cell phone, a PDA, a smart phone (e.g., a BlackBerry device such as BlackBerry Z10/Q10, an iPhone, Nexus 4, etc.), a Treo, a handheld tablet (e.g., an iPad, iPad Mini, a Galaxy Note, Galaxy Note II, Xoom Tablet, Microsoft Surface, Blackberry PlayBook, Nexus 7, Nexus 10, etc.), a phablet (e.g., HTC Droid DNA, etc.), a tablet PC, a thin-client, a hand held console, a hand held gaming device or console (e.g., XBOX live, Nintendo DS, Sony PlayStation Portable, etc.), an iOS-powered watch, Google Glass, a Chromebook, and/or any other portable, mobile, hand held devices, etc., running on any platform or any operating system (e.g., Mac-based OS (OS X, iOS, etc.), Windows-based OS (Windows Mobile, Windows 7, Windows 8, etc.), Android, Blackberry OS, Embedded Linux platforms, Palm OS, Symbian platform, Google Chrome OS, and the like). In one embodiment, the client devices 102 and host server 110 are coupled via a network 106. In some embodiments, the devices 102 and host server 110 can be directly connected to one another.


The input mechanism on client devices 102 can include a touch screen keypad (including single touch, multi-touch, gesture sensing in 2D or 3D, etc.), a physical keypad, a mouse, a pointer, a track pad, a motion detector (e.g., including a 1-axis, 2-axis, or 3-axis accelerometer, etc.), a light sensor, a capacitance sensor, a resistance sensor, a temperature sensor, a proximity sensor, a piezoelectric device, a device orientation detector (e.g., electronic compass, tilt sensor, rotation sensor, gyroscope, accelerometer), or a combination of the above.


Signals received or detected indicating user activity at client devices 102 through one or more of the above input mechanisms, or others, can be used by various users or collaborators (e.g., collaborators 108) for accessing, through network 106, a web-based collaboration environment or online collaboration platform (e.g., hosted by the host server 110). The collaboration environment or platform can have one or more collective settings 105 for an enterprise or an organization to which the users belong, and can provide a user interface 104 (e.g., via a webpage accessible by the web browsers of devices 102) for the users to access such a platform under the settings 105. Additionally, client software can be provided (e.g., through downloading from the host server 110 via the network 106) to run on the client devices 102 to provide cloud-based platform access functionalities. The users and/or collaborators can access the collaboration platform via a client software user interface 107, which can be provided by the execution of the client software on the devices 102.


The collaboration platform or environment hosts workspaces with work items that one or more users can access (e.g., view, edit, update, revise, comment on, download, preview, tag, or otherwise manipulate). A work item can generally include any type of digital or electronic content that can be viewed or accessed via an electronic device (e.g., device 102). The digital content can include .pdf files, .doc files, slides (e.g., PowerPoint slides), images, audio files, multimedia content, web pages, blogs, etc. A workspace can generally refer to any grouping of a set of digital content in the collaboration platform. The grouping can be created, identified, or specified by a user or through other means. This user can be a creator user or an administrative user, for example. The host server 110 typically is equipped with or is coupled to a repository 130 for storing the work items and/or for hosting the workspace. The repository 130 can include, for example, one or more hard drives, a centralized or distributed data cluster, or other storage systems suitable for storing digital data.


In general, a workspace can be associated with a set of users or collaborators (e.g., collaborators 108) who have access to the content included therein. The levels of access (e.g., based on permissions or rules) of each user or collaborator to the content in a given workspace can be the same or can vary among the users. Each user can have their own set of access rights to every piece of content in the workspace, or each user can have different access rights to different pieces of content. Access rights can be specified by a user associated with a workspace and/or a user who created/uploaded a particular piece of content to the workspace, or any other designated user or collaborator.


In general, the collaboration platform allows multiple users or collaborators to access or collaborate on work items such that each user can see, remotely, edits, revisions, comments, or annotations being made to specific work items through their own user devices. For example, a user can upload a document to a workspace for other users to access (e.g., for viewing, editing, commenting, signing off, or otherwise manipulating). The user can log in to the online platform and upload the document (or any other type of work item) to an existing workspace or to a new workspace. The document can be shared with existing users or collaborators in a workspace.


In general, network 106, over which the client devices 102 and the host server 110 communicate, can be a cellular network, a telephonic network, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet, or any combination or variation thereof. For example, the Internet can provide file transfer, remote log in, email, news, RSS, cloud-based services, instant messaging, visual voicemail, push mail, VoIP, and other services through any known or convenient protocol, such as, but not limited to, the TCP/IP protocol, Open System Interconnections (OSI), FTP, UPnP, iSCSI, NFS, ISDN, PDH, RS-232, SDH, SONET, etc.


The network 106 can be any collection of distinct networks operating wholly or partially in conjunction to provide connectivity to the client devices 102 and the host server 110 and can appear as one or more networks to the serviced systems and devices. In one embodiment, communications to and from the client devices 102 can be achieved by an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. In one embodiment, communications can be achieved by a secure communications protocol, such as secure sockets layer (SSL) or transport layer security (TLS).


In addition, communications can be achieved via one or more networks, such as, but not limited to, one or more of WiMax, a Local Area Network (LAN), Wireless Local Area Network (WLAN), a Personal area network (PAN), a Campus area network (CAN), a Metropolitan area network (MAN), a Wide area network (WAN), a Wireless wide area network (WWAN), enabled with technologies such as, by way of example, Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Digital Advanced Mobile Phone Service (D-AMPS), Bluetooth, Wi-Fi, Fixed Wireless Data, 2G, 2.5G, 3G, 4G, IMT-Advanced, pre-4G, 3G LTE, 3GPP LTE, LTE Advanced, mobile WiMax, WiMax 2, WirelessMAN-Advanced networks, enhanced data rates for GSM evolution (EDGE), General packet radio service (GPRS), enhanced GPRS, iBurst, UMTS, HSDPA, HSUPA, HSPA, UMTS-TDD, 1xRTT, EV-DO, messaging protocols such as TCP/IP, SMS, MMS, extensible messaging and presence protocol (XMPP), real time messaging protocol (RTMP), instant messaging and presence protocol (IMPP), instant messaging, USSD, IRC, or any other wireless data networks or messaging protocols.


The embodiments disclosed herein recognize that, with the growing prevalence of the communication networks (e.g., the Internet) and smart portable devices (e.g., smart phones), there are many instances when it is desirable for a user (and the user's collaborators) to receive real-time updates about the collaborative workspace which is shared among them in a way that provides easy-to-understand information, enables intuitive utilization of files, and promotes social interactions with respect to the real-time updates. These functionalities can enhance the user's and the collaborators' collaboration experience of the cloud-based workspace.


In particular, in a large institutional setting (e.g., an international corporation) where potentially a high volume of files and data are shared among users and collaborators, it is desirable to provide a user-friendly interactive interface for accessing the content stored in the shared workspace in a manner that is as intuitive, straightforward, and easy as possible.


Accordingly, by taking into account the many visualization criteria discussed herein, the embodiments of the present disclosure can receive real-time or close-to-real-time updates about activities that took place in the workspace (performed either by a user or the user's collaborators) and present them in an easy-to-understand manner. Additionally, some embodiments provide intuitive ways to allow the user to interact with the updates (including interacting with collaborators as well as utilizing the contents in the workspace) to enhance productivity.


The advantages provided by the techniques disclosed herein are particularly beneficial when a large number of collaborators share the workspace, because a user and each of his or her collaborators can receive a large number of updates and can have a large variety of files in the shared workspace. By offering a unified, multi-functional managing user interface such as the disclosed embodiments, the time to process and interact with update information and its associated files can be reduced. Furthermore, the effort to manage downloads, uploads, and versions of files, and to find and learn different software for accessing different types of files, as well as the time spent switching among software, can be reduced.


In accordance with some embodiments, the devices 102 are configured to provide a multi-functional managing user interface to access the cloud-based collaboration platform. In particular, some embodiments of the devices 102 can receive updates regarding activities performed by the user and the one or more collaborators on contents in the workspace. Further, the device 102 can display, on the interactive user interface (e.g., interface 107), lists of information based on the updates to facilitate interaction from the user with respect to the updates. An example of a list of information can include an activity (e.g., upload, download, preview, comment, etc.) and, if one or more files (e.g., a JPEG photo, an AVI video, or a PDF document) are associated with the activity, thumbnails that represent previews of the one or more files. One or more embodiments provide that the thumbnails can enable the user to interact with the files, such as open, edit, play, highlight, comment, save, and so forth.


In some additional embodiments, the devices 102 are configured so that the thumbnails are to be displayed in the interactive user interface 107 based on one or more visualization criteria. For example, the client software that runs on the device 102 can detect a size of a screen on which the interactive user interface is displayed. In some embodiments, if the screen's size is what is generally regarded as a small form factor mobile device (e.g., less than or equal to 7.25 inches of screen size when measured diagonally), then a thumbnail is to occupy larger than 20% of a height of the screen. Moreover, if the screen's size is what is generally regarded as a large form factor mobile device (e.g., more than 7.25 inches of screen size when measured diagonally), then a thumbnail is to occupy larger than 10% but less than 20% of a height of the screen.


It is noted, however, that a person having ordinary skill in the art will understand that techniques disclosed herein can be selectively adopted by a client software such that the software can exclusively serve or run on a mobile device of a particular size (e.g., that belongs to a large form factor, or a small form factor).


In some embodiments, the lists of information can be sorted based on the identities of those who performed the activities. Additionally or alternatively, the lists of information can be further sorted based on the timestamps of the activities.


More implementation details on the multi-functional interface which can be provided on the mobile client devices 102 for accessing the workspace, the files and folders stored therein, and on the social interacting functions which can be facilitated by the interface between the user and the collaborators, are discussed in fuller detail below, and particularly with regard to FIG. 4.



FIG. 2 depicts an example diagram of a web-based or online collaboration platform deployed in an enterprise or other organizational setting 250 for organizing work items 215, 235, 255 and workspaces 205, 225, 245.


The web-based platform for collaborating on projects or jointly working on documents can be used by individual users and shared among collaborators. In addition, the collaboration platform can be deployed in an organized setting including, but not limited to, a company (e.g., an enterprise setting), a department in a company, an academic institution, a department in an academic institution, a class or course setting, or any other type of organization or organized setting.


When deployed in an organizational setting, multiple workspaces (e.g., workspaces A, B, . . . N) can be created to support different projects or a variety of work flows. Each workspace can have its own associated work items. For example, workspace A 205 can be associated with work items 215, workspace B 225 can be associated with work items 235, and workspace N can be associated with work items 255. The work items 215, 235, and 255 can be unique to each workspace but need not be. For example, a particular Word document can be associated with only one workspace (e.g., workspace A 205) or it can be associated with multiple workspaces (e.g., workspace A 205 and workspace B 225, etc.).


In general, each workspace has a set of users or collaborators associated with it. For example, workspace A 205 is associated with multiple users or collaborators 206. In some instances, workspaces deployed in an enterprise can be department specific. For example, workspace B 225 can be associated with department 210 and certain users, shown as example user A 208, and workspace N 245 can be associated with departments 212 and 216 and users shown as example user B 214.


Each user associated with a workspace can generally access the work items associated with the workspace. The level of access will depend on permissions associated with the specific workspace and/or with a specific work item. Permissions can be set for the workspace or set individually on a per-work-item basis. For example, the creator of a workspace (e.g., one of the users A 208 who creates workspace B) can set one permission setting applicable to all work items 235 for other associated users and/or users associated with the affiliated department 210, for example. Creator user A 208 can also set different permission settings for each work item, which can be the same for different users or can vary for different users.


In each workspace A, B . . . N, when an action is performed on a work item by a given user or any other activity is detected in the workspace, other users in the same workspace can be notified (e.g., in real time, in near real time, or not in real time). Activities which trigger real time notifications can include, by way of example but not limitation, adding, deleting, or modifying collaborators in the workspace; uploading, downloading, adding, or deleting a work item in the workspace; and creating a discussion topic in the workspace.


In some embodiments, items or content downloaded or edited can cause notifications to be generated. Such notifications can be sent to relevant users to notify them of actions surrounding a download, an edit, a change, a modification, a new file, a conflicting version, or an upload of an edited or modified file.


In one embodiment, in a user interface to the web-based collaboration platform where notifications are presented, users can, via the same interface, create action items (e.g., tasks) and delegate the action items to other users including collaborators pertaining to a work item 215, for example. The collaborators 206 can be in the same workspace A 205 or the user can include a newly invited collaborator. Similarly, in the same user interface where discussion topics can be created in a workspace (e.g., workspace A, B or N, etc.), actionable events on work items can be created and/or delegated/assigned to other users such as collaborators of a given workspace 206 or other users. Through the same user interface, task status and updates from multiple users or collaborators can be indicated and reflected. In some instances, the users can perform the tasks (e.g., review or approve or reject, etc.) via the same user interface.



FIG. 3 depicts an example diagram of a workspace 302 in an online or web-based collaboration environment accessible by multiple collaborators 322 through various devices.


Each of users 316, 318, and 320 can individually use multiple different devices to access and/or manipulate work items 324 in the workspace 302 with which they are associated. For example, users 316, 318, and 320 can be collaborators on a project to which work items 324 are relevant. Since the work items 324 are hosted by the collaboration environment (e.g., a cloud-based environment), each user can access the work items 324 anytime, and from any physical location, using any device (e.g., including devices they own or any shared/public/loaner device).


Work items to be edited or viewed can be accessed from the workspace 302. Users can also be notified of access, edit, modification, and/or upload related actions performed on work items 324 by other users, or of any other types of activities detected in the workspace 302. For example, if user 316 modifies a document, one or both of the other collaborators 318 and 320 can be notified of the modification in real time, near real time, or not in real time. The notifications can be sent through any or all of the devices associated with a given user, in various formats including one or more of email, SMS, or a pop-up window in a user interface that the user uses to access the collaboration platform. In the event of multiple notifications, each notification can be depicted preferentially (e.g., ordered in the user interface) based on user preferences and/or relevance to the user (e.g., implicit or explicit).


For example, a notification of a download, access, read, write, edit, or upload related activity can be presented in a feed stream among other notifications through a user interface on the user device according to relevancy to the user, determined based on the current or recent activity of the user in the web-based collaboration environment.


In one embodiment, the notification feed stream further enables users to create or generate actionable events (e.g., as tasks) which are or can be performed by other users 316 or collaborators 322 (e.g., including admin users or other users not in the same workspace), either in the same workspace 302 or in some other workspace. The actionable events such as tasks can also be assigned or delegated to other users via the same user interface.


For example, a given notification regarding a work item 324 can be associated with user interface features allowing a user 316 to assign a task related to the work item 324 (e.g., to another user 316, admin user 318, creator user 320 or another user). In one embodiment, a commenting user interface or a comment action associated with a notification can be used in conjunction with user interface features to enable task assignment, delegation, and/or management of the relevant work item or work items in the relevant workspaces, in the same user interface.



FIG. 4 depicts a block diagram illustrating an example of components in a mobile device (e.g., device 102, FIG. 1; device 202, FIG. 2; devices 304-314, FIG. 3) with an interactive mobile user interface (e.g., interface 107, FIG. 1) utilizing one or more techniques disclosed herein that facilitate collaboration among users (e.g., user 208, or collaborators 206, FIG. 2; user 316 and collaborators 322, FIG. 3) as well as enable utilization of content (e.g., work items 215, 235, 255, FIG. 2; item 324, FIG. 3) in the workspace (e.g., workspace 205, 225, 245, FIG. 2; workspace 302, FIG. 3) in an intuitive, effective manner.



FIGS. 5A-5V respectively depict screenshots showing example user interfaces embodying one or more techniques disclosed herein for a mobile device of a small form factor. FIGS. 6A-6Q respectively depict screenshots showing example user interfaces embodying one or more techniques disclosed herein for a mobile device of a large form factor. With reference to FIGS. 1-3, 5A-5V and 6A-6Q, the multi-functional managing interface within which one or more embodiments disclosed herein can be implemented is described below.


The mobile device 400 can include, for example, a bus 402 and a memory 404, among other components. The memory 404 may include a user interface module 410, an activity monitor 420, and a content access manager 430. The memory 404 can also include a communication module 425 that facilitates communication between the mobile device 400 and the host server 110 using any of the communication protocols supported by the mobile device 400 and the host server 110. The memory 404 may also include other device modules 427, such as a GPS module for determining and providing location information, a text input module for accepting and processing inputs provided using different input mechanisms of the mobile device, and the like, for handling various functions of the mobile device 400. Additional or fewer components/modules/engines can be included in the mobile device 400 and in each illustrated component.


The bus 402 is a subsystem for transferring data between the components of the mobile device 400. For example, the bus 402 facilitates the transfer of data between the memory 404 and other components of the mobile device such as the processor and/or the input/output components that utilize the data.


As used herein, a “module,” a “manager,” a “handler,” a “detector,” an “interface,” or an “engine” includes a general purpose, dedicated or shared processor and, typically, firmware or software modules that are executed by the processor. Depending upon implementation-specific or other considerations, the module, manager, handler, or engine can be centralized or its functionality distributed. The module, manager, handler, or engine can include general or special purpose hardware, firmware, or software embodied in a computer-readable (storage) medium for execution by the processor. As used herein, a computer-readable medium or computer-readable storage medium is intended to include all media that are statutory (e.g., in the United States, under 35 U.S.C. §101), and to specifically exclude all media that are non-statutory in nature to the extent that the exclusion is necessary for a claim that includes the computer-readable (storage) medium to be valid. Known statutory computer-readable mediums include hardware (e.g., registers, random access memory (RAM), non-volatile (NV) storage, to name a few), but may or may not be limited to hardware.


As previously described, overall, the mobile device 400 can provide a multi-functional managing interface (e.g., interface 107, such as generated by client software running on device 400) to its user(s) for accessing contents stored in the cloud-based workspace as well as contents stored locally on the device 400; in addition, the interface includes integrated functions for enhancing the user's file access experience and for facilitating social interactions with one or more collaborators.


In one embodiment, the user interface module 410 can include a user interface rendering engine 410a and a user input detector 410b. The user interface rendering engine 410a includes program code that accepts data in Extensible Markup Language (XML), JavaScript Object Notation (JSON), or other forms, together with formatting or style information (e.g., Cascading Style Sheets (CSS)), to display the formatted content on the screen of the mobile device. An example of the rendering engine 410a is the WebKit layout engine used in the Android platform. The rendering engine 410a may utilize C/C++ libraries such as SQLite and graphics libraries such as OpenGL ES to render user interface graphics. The user input detector 410b can be coupled to one or more suitable pieces of hardware, for example, an actuatable button, a keyboard, a touchscreen, a gesture capturing device, a camera, a mouse, a microphone, and so forth, to receive user inputs for selecting and performing actions on the contents, whether stored in the cloud-based workspace 302 or locally on the device 400.


As an optional embodiment, the user interface module 410 can include a device form factor detector module 410c for determining a form factor of the device 400 (e.g., a size of a screen on which the interactive user interface 107 can be displayed). As is described in fuller detail below, in some embodiments, the user interface rendering engine 410a can adjust the layout and arrangement of the multi-functional managing interface 107 based on the detected form factor. It is noted, however, that a person having ordinary skill in the art will understand that techniques disclosed herein can be selectively adopted by a client software such that the software can exclusively serve or run on a mobile device of a particular size (e.g., one that belongs to a large form factor or a small form factor). It is also noted that the phrases “small form factor” and “large form factor” are used in a relative sense, although the present disclosure recognizes that, in general, a device with a screen size that is less than or equal to 7.25 inches when measured diagonally can be referred to as a small(er) form factor device, and a device with a screen size that is greater than 7.25 inches when measured diagonally can be referred to as a large(r) form factor device.
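
For illustration only, the following is a minimal sketch of the kind of classification a form factor detector such as module 410c could perform, assuming the pixel dimensions and dots-per-inch values are obtained from the platform's display-metrics facility; only the 7.25-inch threshold comes from this disclosure, and all names are hypothetical.

```kotlin
import kotlin.math.sqrt

// Only the 7.25" diagonal threshold comes from the disclosure; the inputs
// (pixel dimensions and dots-per-inch) are assumed to come from the
// platform's display-metrics facility.
enum class FormFactor { SMALL, LARGE }

fun classifyFormFactor(widthPx: Int, heightPx: Int, xDpi: Float, yDpi: Float): FormFactor {
    val widthInches = widthPx / xDpi
    val heightInches = heightPx / yDpi
    val diagonalInches = sqrt(widthInches * widthInches + heightInches * heightInches)
    return if (diagonalInches <= 7.25f) FormFactor.SMALL else FormFactor.LARGE
}

fun main() {
    // Illustrative values: roughly a 4.7" phone and a 10" tablet.
    println(classifyFormFactor(720, 1280, 312f, 312f))    // SMALL
    println(classifyFormFactor(1600, 2560, 300f, 300f))   // LARGE
}
```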


An example screenshot of a login view 500 of such a multi-functional managing interface 107 displayed on a small form factor device is illustrated in FIG. 5A. An example screenshot of a login view 600 of such a multi-functional managing interface 107 displayed on a large form factor device is illustrated in FIG. 6A. After a user logs in, contents that are stored in the workspace 302 and/or contents that are stored locally on the device 400 (e.g., as shown in FIG. 5B as local folder 502; in FIG. 6B as local folder 602) can be displayed on the interactive user interface 107. Further, in some embodiments (e.g., those implemented on small form factor devices), a multi-functional side bar (e.g., side bar 506, FIG. 5C) can be activated upon the user selecting a button 504. In some other embodiments (e.g., those implemented on large form factor devices), the multi-functional side bar (e.g., side bar 606, FIG. 6B) can be displayed without any user activation.


The activity monitor 420 enables the devices 400 to receive updates regarding activities performed by the user and the one or more collaborators (e.g., collaborators 322) on contents in the workspace 302. The activity monitor 420 can include an activity tracking module 422 and an activity feed generator/aggregator module 424. The activity status tracking module 422 can track the status of an activity by periodically querying (e.g., via the optional activity information requestor module 422a) the host server or local storage areas of the mobile device 400 to obtain an update about an activity performed on one or more work items 324 in the workspace 302, such as a transfer of a file to (e.g., for upload) or from (e.g., for download) the host server 110. For example, if the activity is an upload/download of a work item to or from the host server of the cloud-based platform, then the activity status tracking module 422 tracks the information of the upload/download by determining (e.g., via an activity information processing module 422b) parameters of the activity. Examples of these parameters can include the activity type (e.g., upload, download, preview, comment, highlight, etc.), the identity of the performer of the activity, the time of the activity's performance, and the file(s) associated with the activity. In an alternative embodiment which does not require the activity information requestor module 422a, the activity monitor 420 can receive updates from the host server 110 passively (e.g., via a “push” notice) or via other suitable means (e.g., by a notification server, not depicted for simplicity).
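
As an illustration of the parameters just listed (activity type, performer identity, time, and associated files), the following sketch models an update record and a simple polling loop in the spirit of the activity status tracking module 422; the HostServerClient interface and its fetchUpdatesSince method are hypothetical placeholders, not an API defined by this disclosure.

```kotlin
import java.time.Instant

// Activity parameters as described for tracking module 422: type, performer,
// timestamp, and associated files. HostServerClient is a hypothetical placeholder.
enum class ActivityType { UPLOAD, DOWNLOAD, PREVIEW, COMMENT, HIGHLIGHT }

data class FileRef(val fileId: String, val name: String)

data class ActivityUpdate(
    val type: ActivityType,
    val performedBy: String,      // identity of the performer of the activity
    val performedAt: Instant,     // time of the activity's performance
    val files: List<FileRef>      // file(s) associated with the activity
)

interface HostServerClient {
    // Hypothetical call: return updates newer than the given point in time.
    fun fetchUpdatesSince(since: Instant): List<ActivityUpdate>
}

class ActivityStatusTracker(private val client: HostServerClient) {
    private var lastPoll: Instant = Instant.EPOCH

    // Periodic poll in the spirit of the optional requestor module 422a.
    fun poll(): List<ActivityUpdate> {
        val updates = client.fetchUpdatesSince(lastPoll)
        if (updates.isNotEmpty()) {
            lastPoll = updates.maxOf { it.performedAt }
        }
        return updates
    }
}
```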


The activity feed generator module 424 can aggregate and process the activity information derived from the activity tracking module 422 and generate lists of information for display on the multi-functional managing interface 107 (e.g., via the user interface module 410) in an intuitive manner. In particular, the device 400 can display, on the interactive user interface (e.g., interface 107), lists of information based on the updates to facilitate interaction from the user with respect to the updates. An example of a list of information can include an activity (e.g., upload, download, preview, comment, etc.) and, if one or more files (e.g., a JPEG photo, an AVI video, or a PDF document) are associated with the activity, thumbnails that represent previews of the one or more files.


Examples of a preview can include a miniature representation of a first page (or a first few pages) of a document file, a miniature and perhaps lower resolution version of a photo file, a cover photo of an album to which a music file belongs, a first frame (or even a short animation consisting of a few frames) of a video file, or a default icon that represents the type of the file (e.g., an executable software file). In this way, the preview in the updates can depict actual content of the files in the updates so as to enable the user to take a quick glance at the actual content of the files without the need to access the files themselves. This can reduce the time and effort for the user in reading and responding to the updates when collaborating with his or her collaborators on items shared in the workspace 302.
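
The preview choices described above can be summarized as a small dispatch on file type. The sketch below is illustrative only; the extension lists and names are assumptions rather than part of the disclosure.

```kotlin
// Maps a file name to the kind of preview described above. The extension
// lists are illustrative assumptions, not a definition from the disclosure.
enum class PreviewKind { FIRST_PAGE, LOW_RES_IMAGE, ALBUM_COVER, FIRST_FRAME, TYPE_ICON }

fun previewKindFor(fileName: String): PreviewKind {
    val ext = fileName.substringAfterLast('.', "").lowercase()
    return when (ext) {
        "pdf", "doc", "docx", "ppt", "pptx" -> PreviewKind.FIRST_PAGE    // first page(s) of a document
        "jpg", "jpeg", "png", "bmp", "gif"  -> PreviewKind.LOW_RES_IMAGE // reduced-resolution photo
        "mp3", "wma", "aac"                 -> PreviewKind.ALBUM_COVER   // cover of the album the track belongs to
        "avi", "vob", "mp4", "mov"          -> PreviewKind.FIRST_FRAME   // first frame(s) of a video
        else                                -> PreviewKind.TYPE_ICON     // default icon for the file type
    }
}
```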


Example screenshots of an update page which displays lists of information based on the received activity updates are shown respectively in FIG. 5D (for a small form factor device) and in FIG. 6C (for a large form factor device). The update page containing these lists can be activated by selecting an update button (e.g., button 508, FIG. 5C; button 608, FIG. 6B) on the side bar 506, 606.


As illustrated in the embodiment of FIG. 5D, the lists of information 512a-512c each display an activity and, if one or more files are associated with the activity, thumbnails 510a-510c that represent previews of the one or more files associated with the activity. Furthermore, in some implementations, the lists of information 512a-512c can be sorted based on the identities of those who performed the activities; additionally, the lists of information 512a-512c can be further sorted based on the timestamps of the activities. For example, list 512a is shown to indicate that a collaborator “Nick Rolph” downloaded 2 files (from the workspace 302) today, and the two files that were downloaded by Nick are shown in thumbnails 510a and 510b. A portrait or an icon of Nick Rolph can be shown next to his name on the list 512a, in some examples, to enable easier identification of the activity performer. According to some embodiments, the thumbnails can show a preview of the file when such a preview is available and/or appropriate. For example, thumbnail 510a displays a miniature version of the photographic file that is downloaded by Nick. In some embodiments, a scroll bar 514 (shown as hidden in FIG. 5D) can be activated (e.g., upon a touch on the screen or a scrolling swipe-up gesture) to show more lists of information (e.g., list 512c). Optionally, lists 512a-512c can include a contract/expand element 516a-516c, which can be operable (e.g., via a touchscreen) so as to hide/show the thumbnails 510a-510c. The scroll bar 514 can also be configured to show itself whenever the lists of information 512a-512c occupy more than one page.
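
A minimal sketch of the sorting just described (grouping by the identity of the performer, then ordering by timestamp) is shown below, reusing the ActivityUpdate type from the earlier sketch; the function name and return shape are illustrative assumptions.

```kotlin
// Groups updates into lists of information: one group per performer, newest
// activity first, and the most recently active performer on top. Reuses the
// ActivityUpdate type from the earlier sketch; the return shape is illustrative.
fun buildFeed(updates: List<ActivityUpdate>): List<Pair<String, List<ActivityUpdate>>> =
    updates
        .groupBy { it.performedBy }                                            // sort key 1: identity
        .mapValues { (_, acts) -> acts.sortedByDescending { it.performedAt } } // sort key 2: timestamp
        .toList()
        .sortedByDescending { (_, acts) -> acts.first().performedAt }          // freshest group first
```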


In some additional embodiments, the devices 400 are configured so that the thumbnails 510a-510c are to be displayed in the interactive user interface 107 based on one or more visualization criteria. For example, the client software that runs on the device 400 can detect a size of a screen on which the interactive user interface is displayed. In some embodiments, if the screen's size is what is generally regarded as a small form factor mobile device (e.g., less than or equal to 7.25 inches of screen size when measured diagonally), then a thumbnail is to occupy larger than 20% of a height of the screen; such example is generally illustrated in FIG. 5D. Moreover, if the screen's size is what is generally regarded as a large form factor mobile device (e.g., more than 7.25 inches of screen size when measured diagonally), then a thumbnail is to occupy larger than 10% but less than 20% of a height of the screen; such example is generally illustrated in FIG. 6C.
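
The visualization rule stated in this paragraph can be expressed as a simple computation. In the sketch below, the 7.25-inch boundary and the 10%/20% bands come from the text, while the concrete fractions chosen (25% and 15%) are assumptions for illustration.

```kotlin
// Small form factor (diagonal <= 7.25"): thumbnail taller than 20% of screen height.
// Large form factor (diagonal > 7.25"): thumbnail between 10% and 20% of screen height.
// The concrete fractions (25% and 15%) are assumptions chosen to satisfy those bands.
fun thumbnailHeightPx(screenHeightPx: Int, diagonalInches: Double): Int {
    val fraction = if (diagonalInches <= 7.25) 0.25 else 0.15
    return (screenHeightPx * fraction).toInt()
}
```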


One or more embodiments provide that the thumbnails can enable the user to interact with the files, such as to open, edit, play, highlight, comment on, or save them, and so forth. For example, a user can click on thumbnail 510a to access the picture, or on thumbnail 510c to access the PDF document. An example screenshot showing such integrated access is shown in FIG. 5E (for small form factor devices) and in FIG. 6D (for large form factor devices).


In some embodiments, a cloud status button (e.g., the status button 536 in FIG. 5D) can be provided in the interface 107 to allow the user to view the mobile device 400's current upload/download activities to/from the host server 110, as well as other suitable activities such as copying or moving contents between the host server 110 and the device 400. An example of such a status page is shown in the screenshot of FIG. 5K.


Furthermore, according to one or more embodiments, the content access manager 430, together with the communication module 425, can also enter an “always on” mode which enables the mobile device 400 to stay active (e.g., and not enter a stand-by mode) when there are uploading/downloading activities being performed on the mobile device 400, to increase the probability of successful operations, especially when the mobile device 400 is connected to a power outlet. In some embodiments, however, if the mobile device 400 runs on battery and the battery power becomes too low (e.g., <10%), or if the activity is draining too much battery power (e.g., 20%), then the device 400 can still go into the stand-by mode to conserve power.
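
A sketch of the “always on” decision described here follows; the 10% low-battery floor and the 20% drain figure come from the text, while the inputs and the idea of mapping the result onto a platform keep-screen-on flag (for example, Android's FLAG_KEEP_SCREEN_ON) are assumptions.

```kotlin
// The 10% low-battery floor and the 20% drain threshold come from the text;
// how the values are measured, and how the result is applied (e.g., via a
// platform keep-screen-on flag such as Android's FLAG_KEEP_SCREEN_ON), are assumptions.
fun shouldStayAwake(
    transferInProgress: Boolean,
    onExternalPower: Boolean,
    batteryPercent: Int,
    percentDrainedByActivity: Int
): Boolean {
    if (!transferInProgress) return false
    if (onExternalPower) return true                  // plugged in: stay active for reliability
    if (batteryPercent < 10) return false             // battery too low: allow stand-by to conserve power
    if (percentDrainedByActivity >= 20) return false  // activity draining too much: allow stand-by
    return true
}
```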


In accordance with one or more embodiments, the multi-functional managing interface 107 can provide integrated access to contents stored in the workspace 302 and/or on the mobile device 400 in a way that is intuitive, efficient and collaboration-promoting. The present embodiments provide, through the managing interface 107, several ways to access the contents. As discussed above, a user can gain access to the contents via interacting with the lists of information 512a-512c that contain activity updates. Another way is to select to access content from a file managing page of the interface 107, which is accessible via selecting a file managing button 518 on the side bar 506 (or button 618 on the side bar 606). Example screenshots of the file managing page are shown in FIG. 5L (for small form factor devices) and FIG. 6J (for large form factor devices), respectively.


The content access manager 430 which can respond to the content access requests can include a file type detector 432, a social interaction module 436, and optimized software modules 434 including, for example, a photo module 434a, a music module 434b, a video module 434c, and a document module 434d.


In some embodiments, upon receiving a request from the user to access a file in the contents, the file type detector 432 can detect a type of the file, and the content access manager 430 can determine whether the file's type is compatible with the mobile device 400. If the file's type is incompatible (e.g., if the file is not compiled for execution on the device 400, or if device 400 lacks proper tools to access such a file), then the content access manager 430 can display an alert to the user before downloading the file for access. Additionally, the manager 430 can prompt the user with options, such as canceling the access or downloading appropriate tool applications. If it is the case that the device 400 lacks proper tools to access such files, then the manager 430 can direct the user to a preferred application page (e.g., as shown in FIG. 5U and FIG. 6P, also respectively accessible via buttons 548, 648). In one or more embodiments, after a preferred application is downloaded and installed, it can be initiated within, or in conjunction with, the interface 107 (e.g., via an “open-in” function, described below) in accessing the file.
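
The compatibility check attributed to the file type detector 432 might look like the following sketch, where the registry of installed handlers and the decision names are hypothetical stand-ins for whatever the device actually consults.

```kotlin
// Registry of file types the device can handle; the set and the strings are
// illustrative stand-ins for whatever the device actually consults.
enum class AccessDecision { OPEN_DIRECTLY, WARN_BEFORE_DOWNLOAD }

class FileTypeCompatibilityCheck(private val installedHandlers: Set<String>) {
    fun decide(fileName: String): AccessDecision {
        val ext = fileName.substringAfterLast('.', "").lowercase()
        return if (ext in installedHandlers) AccessDecision.OPEN_DIRECTLY
        else AccessDecision.WARN_BEFORE_DOWNLOAD  // alert first; offer cancel or the preferred application page
    }
}

fun main() {
    val check = FileTypeCompatibilityCheck(installedHandlers = setOf("pdf", "jpg", "mp3"))
    println(check.decide("report.pdf"))   // OPEN_DIRECTLY
    println(check.decide("drawing.cad"))  // WARN_BEFORE_DOWNLOAD
}
```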


The social interaction module 436 can enable the user to interact with the collaborators during the integrated access to promote the collaboration experience. According to some embodiments, when the user interacts with the list of information 512a by selecting the thumbnail 510a, or when the user selects to access a file from the file managing page and thereby enters an access page (as shown in FIG. 5E and FIG. 6D) for the file, the mobile device 400 can provide multiple social functions (e.g., via module 436) as well as content access functions (e.g., via manager 430) to the user. In some embodiments, functions provided in the example access page can be hidden or shown by toggling a function button (e.g., button 617 in FIG. 6D).


More specifically, the example access page as shown in FIG. 5E includes an open-in button 518, a share button 520, a comment button 522, and a “more function” button 524. Some embodiments of the mobile device 400 also include a thumbnail browse button 526. Similarly, the function page as shown in FIG. 6E (which can be accessed via button 617 of FIG. 6D) includes an open-in button 618, a share button 620, a comment button 622, and a “more function” button 624. These functional buttons have similar functionalities regardless of the form factor of device 400; however, various screenshots are provided herein to illustrate the layout differences between a small form factor device (e.g., as shown in FIGS. 5A-5V) and a large form factor device (e.g., as shown in FIGS. 6A-6Q).


The open-in button 518, 618 can initiate access functionalities, within the multi-functional managing interface 107, including viewing, editing, or other suitable functions on the selected content using one or more third-party applications that are installed on the mobile device 400. An example screenshot that allows a user to access the selected content using a third-party application is shown in FIG. 6G. The user can also be prompted with an option to add/download an application that is not currently installed on the mobile device 400, such as the “add apps” option illustrated in the screenshot of FIG. 6G. Although such a functional page is not shown for a small form factor device, the open-in button 518 has similar functionalities as the open-in button 618.


The share button 520, 620 can initiate sharing functionalities, within the multi-functional managing interface 107, including attaching to email, emailing a link, short-messaging a link, copying a link to the clipboard, or other suitable functions, for the selected content. The links can be public links or semi-public links (e.g., available only for limited users within a certain domain) that are provided from the workspace 302. An example screenshot that allows a user to share the selected content either via email attachment or via sharing a link provided by a cloud-based platform provider is shown in FIG. 6H.


The comment button 522, 622 can bring the user to a comment page, such as the screenshots shown in FIGS. 5F and 6E. The comment page includes comments 522a, 622a recorded from other collaborators of the user on a particular content item (e.g., a file, a folder, an assigned task, an activity, etc.). A user can also be prompted on the comment page to enter (e.g., via comment line 522b, 622b) his or her own comment so that other collaborators may see it.


The more function button 524, 624 can bring the user to a “more function” page, such as the screenshots shown in FIG. 5G and FIG. 6F. The more function page includes a list of actions that are available to the user to perform on a file. The identity of the file, as well as other relevant information regarding it, can be displayed (e.g., on the top of FIG. 5G, or in the right-top corner of FIG. 6F) on the more function page. Examples of actions include “open-in,” “share,” “comment,” and “favorite,” as well as file managing functions such as save, copy to clipboard, copy to folder, move, rename, delete, and so forth. It is noted that the actions available on the more function page may include those functions having quick buttons on the access page, such as “comment” or “open-in.”


The thumbnail browse button 526, 626 can bring the user to a thumbnail browse page, such as the screenshot shown in FIG. 5H. The thumbnail browse page can allow the user to browse the contents accessible through the multi-functional managing interface 107 in a visual and more intuitive manner. In some embodiments, the thumbnail browsing can be selected to be limited to browsing of photos and/or videos. The thumbnails can include a preview of the first page or a few selected pages, or other suitable visual miniature representations of the contents being viewed, such as an album cover of a music file.


In addition, the optimized software modules 434 can enable, accelerate and/or enhance the user experience of various types of access with the multi-functional managing interface 107.


In some embodiments, if the file that the user attempts to access is pictorial (e.g., a BMP file, a JPG file, or the like), the photo module 434a can start displaying the file before the file is downloaded completely by displaying a lower resolution version of the file first, and then the photo module 434a can gradually transition from the lower resolution version of the file to a full resolution version of the file as the downloading of the file completes. In some embodiments, the photo module 434a can transition from the lower resolution version to the full resolution version when the downloading of the file is completed. Some embodiments provide that the transition can gradually take place without the user noticing. More specifically, it is recognized in one or more embodiments of the photo module 434a that there is a rate limit on how quickly human eyes can adapt and see the details of a photo even when a full resolution photo is presented. Therefore, the photo module 434a can accelerate the display of photographic or pictorial files by displaying the lower resolution version first. In a preferred embodiment, the lower resolution version of the file still includes enough detail that most humans cannot tell the difference between the lower and full resolution versions, especially when they are displayed in a thumbnail such as thumbnails 510a-510c.
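
The progressive display behavior of the photo module 434a can be sketched as below; the Image type, the downloader interface, and the callback shape are hypothetical placeholders, and only the low-resolution-first-then-full-resolution sequencing reflects the text.

```kotlin
// The Image type, downloader interface, and callbacks are hypothetical; only
// the "low resolution first, full resolution on completion" sequencing is from the text.
class Image(val widthPx: Int, val heightPx: Int)

interface PhotoDownloader {
    fun fetchLowRes(fileId: String): Image                      // small version, available quickly
    fun fetchFullRes(fileId: String, onDone: (Image) -> Unit)   // invokes onDone when the download completes
}

class PhotoModule(
    private val downloader: PhotoDownloader,
    private val display: (Image) -> Unit
) {
    fun open(fileId: String) {
        display(downloader.fetchLowRes(fileId))    // start displaying before the file is fully downloaded
        downloader.fetchFullRes(fileId) { full ->
            display(full)                          // transition to the full resolution version
        }
    }
}
```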


In some embodiments, if the file that the user attempts to access is music or audio (e.g., an MP3 file, a WMA file, or the like), the music module 434b can start fetching metadata (e.g., ID3 tags) of the file as soon as the file starts to load, and then the music module 434b can display information related to the file based on the fetched metadata. In this way, the user can access information before the file finishes loading, such as seeing the album's cover art (e.g., downloaded from an online database), or emailing/sharing the cover art with another person. Examples of a music playback access page enabled by the music module 434b are shown in FIG. 5T and FIG. 6O.
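
A sketch of the early-metadata behavior of the music module 434b follows. The tag reader interface and the idea of passing only the first bytes of the file are assumptions; the underlying point, that ID3v2 tags sit at the start of an MP3 file and can therefore be read before the download finishes, is general knowledge rather than something this disclosure specifies.

```kotlin
// The tag reader and the partial byte source are assumptions; ID3v2 tags sit
// at the start of an MP3 file, so a prefix of the download is enough to read them.
data class TrackInfo(val title: String?, val artist: String?, val album: String?)

interface TagReader {
    fun readTags(firstBytes: ByteArray): TrackInfo
}

// Show title/artist/album (and look up cover art) while the rest of the file still loads.
fun showEarlyTrackInfo(firstBytes: ByteArray, reader: TagReader, render: (TrackInfo) -> Unit) {
    render(reader.readTags(firstBytes))
}
```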


In some embodiments, if the file that the user attempts to access is video or audiovisual (e.g., an AVI file, a VOB file, or the like), the video module 434c can pre-fetch a first frame, a first few frames, or a selected few frames from different segments of the file so that a short preview can be shown before the user attempts to access it. Also, similar to the music module 434b, the video module 434c can start fetching metadata of the video file as soon as the file starts to load. An example of a video playback access page enabled by the video module 434c is shown in FIG. 5I.


In some embodiments, if the file that the user attempts to access is of a portable document format (PDF), the document module 434d can provide, via the user interface 107 (e.g., in a PDF access page), a brightness button to adjust the brightness for displaying the PDF file. In some embodiments, the mobile device 400 can memorize a brightness profile or history on a per-user or a per-file basis. Examples of the brightness button are illustrated in FIG. 5J and FIG. 6I as button 530 and button 630, respectively. The document module 434d can also provide a search function to search the content of the PDF file. The search function button is illustrated in FIG. 5J as button 532, and in FIG. 6I as button 632. In addition, the document module 434d can allow the user to bookmark pages of the PDF file for easier later access. The bookmark function button is illustrated in FIG. 5J as button 534, and in FIG. 6I as button 634.
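
The per-user, per-file brightness memory mentioned here might be kept in a structure like the following sketch; the in-memory map and the default value are illustrative stand-ins for whatever persistence the device actually uses.

```kotlin
// In-memory stand-in for whatever persistence the device uses to remember
// brightness on a per-user and per-file basis.
class BrightnessProfiles {
    private val byUserAndFile = mutableMapOf<Pair<String, String>, Float>()

    fun remember(userId: String, fileId: String, brightness: Float) {
        byUserAndFile[userId to fileId] = brightness.coerceIn(0f, 1f)
    }

    // Restore the last brightness chosen for this user and file, or a default.
    fun restore(userId: String, fileId: String, default: Float = 0.8f): Float =
        byUserAndFile[userId to fileId] ?: default
}
```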


In some embodiments, the document module 434d can enable a thumbnail preview of the PDF file as previously described (e.g., showing the first page or a first few pages as one or several thumbnails representing the PDF file in situations where the file is associated with, for example, an activity update). Some embodiments of the document module 434d can also provide note writing functions on the PDF file.


As an additional or alternative embodiment, the document module 434d can also recognize whether a file is source code (e.g., from a file extension such as “.c”, “.cxx”, “.java”, etc.), and, if the file is source code, the document module 434d can color the source code based on one or more syntax highlighting rules. Other examples of source code types can include JavaScript, Ruby, PHP, HTML, CSS, and so forth. The user can also customize the rules and/or install his or her own rules for syntax highlighting.
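
A sketch of the extension-to-rule-set lookup described in this paragraph is shown below; the rule names are illustrative and no particular highlighting library is implied, with user-installed rules taking precedence as the text suggests.

```kotlin
// Extension-to-rule-set lookup; rule names are illustrative and user-installed
// rules take precedence over the defaults.
val defaultHighlightRules = mapOf(
    "c" to "c-rules", "cxx" to "cpp-rules", "java" to "java-rules",
    "js" to "javascript-rules", "rb" to "ruby-rules", "php" to "php-rules",
    "html" to "html-rules", "css" to "css-rules"
)

fun highlightingRulesFor(fileName: String, userRules: Map<String, String> = emptyMap()): String? {
    val ext = fileName.substringAfterLast('.', "").lowercase()
    return userRules[ext] ?: defaultHighlightRules[ext]   // null means: not recognized as source code
}
```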


As previously mentioned, the multi-functional managing interface 107 can provide integrated access to contents stored in the workspace 302 and/or on the mobile device 400 in a way that is intuitive, efficient, and collaboration-promoting. Accordingly, the file managing page of the interface 107 (see FIGS. 5L and 6J), which is accessible via selecting a file managing button 518 on the side bar 506 (or button 618 on the side bar 606), can provide various functionalities that improve the user experience.


For example, the user can easily switch among different folders by clicking or tapping on the button which shows the current folder, such as illustrated in the screenshots of FIG. 5M and FIG. 6K. Additionally, the user can perform a selected number of primary actions simply by tapping or selecting each file, such as illustrated in the screenshot of FIG. 5N. More specifically, in some embodiments, after tapping on a file, besides sharing (which is described above) and deleting, the user can add the selected file into the user's “favorite list.” The “add to favorite” button is shown, as an example, in FIG. 5N as button 529. The user's favorite list can be accessed through the favorite page, which can be entered by button 528 on the side bar 506 (FIG. 5C), or by button 628 on the side bar 606 (FIG. 6B).


Further, the file managing page can include an “add new” button (e.g., button 540, FIG. 5L; button 640, FIG. 6J), which can provide the user with action options such as add new folder, add new note, add new photo & video, add new audio recording, or import from library. Example screenshots depicting the add new page which can be triggered by the buttons 540, 640 are shown in FIGS. 5O and 6K, respectively. Example screenshots of a “new note” page, which can be activated via the add new note option, are shown in FIGS. 5P and 6L. The notes can be either saved locally on the mobile device 400 or in the workspace 302. Example screenshots of a finished note are shown in FIGS. 5Q and 6M.


The file managing page can also include a “bulk action” button (e.g., button 542, FIG. 5L; button 642, FIG. 6J), which can provide the user with action options that can be performed on multiple selected files, such as copy, cut, paste, move, or delete. Example screenshots depicting the bulk action page which can be triggered by the buttons 542, 642 are shown in FIG. 5R (for small form factor devices) and FIG. 6N (for large form factor devices). In some embodiments, for small form factor devices, the action options for bulk action can be displayed only after at least one file is selected, such as shown in FIG. 5S.
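
A minimal Kotlin sketch of applying one bulk action to several selected files is shown below; it uses local java.nio file operations as stand-ins for the workspace operations, purely for illustration.

```kotlin
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths
import java.nio.file.StandardCopyOption

/**
 * Minimal sketch: apply one bulk action to every selected file. The real
 * implementation would go through the content access manager rather than
 * operating directly on local paths.
 */
enum class BulkAction { COPY, MOVE, DELETE }

fun applyBulkAction(selected: List<Path>, action: BulkAction, destinationDir: Path? = null) {
    if (action != BulkAction.DELETE) {
        requireNotNull(destinationDir) { "copy/move needs a destination folder" }
    }
    for (file in selected) {
        when (action) {
            BulkAction.COPY -> Files.copy(
                file, destinationDir!!.resolve(file.fileName),
                StandardCopyOption.REPLACE_EXISTING)
            BulkAction.MOVE -> Files.move(
                file, destinationDir!!.resolve(file.fileName),
                StandardCopyOption.REPLACE_EXISTING)
            BulkAction.DELETE -> Files.deleteIfExists(file)
        }
    }
}

fun main() {
    // Hypothetical usage: delete two selected files.
    applyBulkAction(listOf(Paths.get("a.txt"), Paths.get("b.txt")), BulkAction.DELETE)
}
```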


Further enhancing convenience and user experience, the user can select one or more files to be available for editing or viewing in an offline mode. Specifically, the content access manager 430 can coordinate with the communication module 425 to download the selected files onto the device 400 so as to prepare the selected files for the offline mode access. An offline access page, which can be entered via button 538 on the side bar 506 (FIG. 5C), or via button 638 on the side bar 606 (FIG. 6B), can display these offline accessible files for the user to access during an offline period, so that although the workspace 302 may be temporarily unreachable, the user can still enjoy the various functionalities provided by the multi-functional managing interface 107. In some embodiments, after the device 400 comes back online, it can automatically synchronize with the host server 110 and perform updates, including the aforementioned activity updates, to contents in the workspace 302.
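
The following Kotlin sketch illustrates, under stated assumptions, the offline flow described above: caching selected files locally and pushing changes back when connectivity returns. The WorkspaceClient interface and its methods are hypothetical stand-ins for the communication module 425.

```kotlin
import java.io.File

/**
 * Minimal sketch of the offline-access flow: selected files are downloaded to
 * local storage for offline use, and any changes made while offline are pushed
 * back when connectivity returns.
 */
interface WorkspaceClient {
    fun download(fileId: String): ByteArray
    fun upload(fileId: String, content: ByteArray)
}

class OfflineManager(private val client: WorkspaceClient, private val cacheDir: File) {
    private val pendingUploads = mutableSetOf<String>()

    /** Prepare a file for offline mode by caching it locally. */
    fun makeAvailableOffline(fileId: String) {
        File(cacheDir, fileId).writeBytes(client.download(fileId))
    }

    /** Record an edit made while the workspace is unreachable. */
    fun saveOfflineEdit(fileId: String, newContent: ByteArray) {
        File(cacheDir, fileId).writeBytes(newContent)
        pendingUploads += fileId
    }

    /** Called when the device comes back online: push local changes upstream. */
    fun synchronize() {
        for (fileId in pendingUploads) {
            client.upload(fileId, File(cacheDir, fileId).readBytes())
        }
        pendingUploads.clear()
    }
}
```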


In some embodiments, the preferred application page can include an option to allow the user to create his or her own categories. Examples of this function are shown in FIG. 5V and FIG. 6Q.


In some embodiments, the content access manager 430 can allow the user to compress (e.g., using a ZIP algorithm, a RAR algorithm, or other suitable file compression algorithms) a selected file or files, whether stored locally or in the workspace 302.
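
By way of illustration, a minimal Kotlin sketch of compressing selected files into a single ZIP archive using the standard java.util.zip classes follows; the file names are hypothetical.

```kotlin
import java.io.File
import java.io.FileOutputStream
import java.util.zip.ZipEntry
import java.util.zip.ZipOutputStream

/**
 * Minimal sketch: compress one or more selected files into a single ZIP
 * archive using the standard java.util.zip classes.
 */
fun zipFiles(selected: List<File>, archive: File) {
    ZipOutputStream(FileOutputStream(archive)).use { zip ->
        for (file in selected) {
            zip.putNextEntry(ZipEntry(file.name))
            file.inputStream().use { it.copyTo(zip) }
            zip.closeEntry()
        }
    }
}

fun main() {
    zipFiles(listOf(File("notes.txt"), File("photo.jpg")), File("bundle.zip"))
}
```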



FIG. 7 depicts a flowchart illustrating an example process 700 for a mobile device (e.g., devices 102, FIG. 1; mobile device 400, FIG. 4) in implementing the techniques disclosed herein for facilitating collaboration among users as well as enabling utilization of content in the workspace (e.g., workspace 302, FIG. 3) in an intuitive, effective manner. The process 700 is performed, for example, by a processor that is included on the mobile device 102, 400. Workspace 302 (e.g., workspaces A 205, B 225, or N 245, FIG. 2) is shared among a user of the client devices 102, 400 and one or more collaborators (e.g., collaborators 108, FIG. 1; collaborators 322, FIG. 3) of the user. The host server 110 is a server that hosts the cloud-based environment which includes workspace 302.


In accordance with some embodiments, the devices 102, 400 are configured to provide a multi-functional managing user interface (e.g., interface 107, FIG. 1) to access the workspace 302 and/or local files on device 102, 400. In particular, some embodiments of the devices 102, 400 can receive (710) updates regarding activities performed by the user and the one or more collaborators on contents in the workspace 302 (e.g., via an activity monitor 420, FIG. 4).


Optionally, the device 102, 400 can detect (720) a size of a screen on which the interface 107 is displayed (e.g., via the device form factor detector module 410c, FIG. 4).


Further, the device 102, 400 can display (730) (e.g., via a UI module 410, FIG. 4), on the interactive user interface 107, lists of information (e.g., lists 512a-512c, FIG. 5D) based on the updates to facilitate interaction (such as various aforementioned functionalities provided by the content access manager 430, FIG. 4) from the user with respect to the updates.


A list of information (e.g., list 512a) can include (732) an activity (e.g., upload, download, preview, comment, etc.) and, if one or more files (e.g., a JPEG photo, an AVI video, or a PDF document) are associated with the activity, thumbnails (e.g., 510a-510c, FIG. 5D) that represent previews of the one or more files. One or more embodiments provide that the thumbnails 510a-510c can enable (734) the user to interact with the files, such as open, edit, play, highlight, comment, save, and so forth.
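
For illustration only, the following Kotlin sketch models one entry of such a list of information as a simple data structure; the field names are assumptions and do not reflect the actual implementation.

```kotlin
/**
 * Minimal sketch of one entry in the displayed lists of information: the
 * activity, who performed it, and thumbnails for any files involved.
 */
enum class ActivityType { UPLOAD, DOWNLOAD, PREVIEW, COMMENT, EDIT }

data class Thumbnail(val fileId: String, val previewUri: String)

data class ActivityFeedEntry(
    val activity: ActivityType,
    val collaborator: String,
    val timestampMillis: Long,
    val thumbnails: List<Thumbnail>   // empty when no files are associated with the activity
)

fun describe(entry: ActivityFeedEntry): String =
    "${entry.collaborator} performed ${entry.activity} on ${entry.thumbnails.size} file(s)"

fun main() {
    val entry = ActivityFeedEntry(
        activity = ActivityType.UPLOAD,
        collaborator = "Collaborator A",
        timestampMillis = System.currentTimeMillis(),
        thumbnails = listOf(Thumbnail("file-1", "thumb://file-1"))
    )
    println(describe(entry))
}
```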


In some additional embodiments, the devices 102, 400 are configured (736) so that the thumbnails 510a-510c are to be displayed in the interactive user interface 107 based on one or more visualization criteria.


For example, in some embodiments, if the screen's size is what is generally regarded as a small form factor mobile device (e.g., less than or equal to 7.25 inches of screen size when measured diagonally), then a thumbnail (e.g., 510a) is to occupy larger than 20% of a height of the screen, such as shown in FIG. 5D. Moreover, if the screen's size is what is generally regarded as a large form factor mobile device (e.g., more than 7.25 inches of screen size when measured diagonally), then a thumbnail is to occupy larger than 10% but less than 20% of a height of the screen, such as shown in FIG. 6C.
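
A minimal Kotlin sketch of this visualization criterion follows; the exact fractions used (25% and 15%) are illustrative values within the stated ranges, not values prescribed by the embodiments.

```kotlin
/**
 * Minimal sketch: pick a thumbnail height as a fraction of the screen height
 * depending on whether the device is a small (<= 7.25 in diagonal) or large
 * form factor device.
 */
fun thumbnailHeightPx(screenDiagonalInches: Double, screenHeightPx: Int): Int {
    val fraction = if (screenDiagonalInches <= 7.25) 0.25 else 0.15
    return (screenHeightPx * fraction).toInt()
}

fun main() {
    println(thumbnailHeightPx(4.7, 1920))    // small form factor: > 20% of screen height
    println(thumbnailHeightPx(9.7, 2048))    // large form factor: between 10% and 20%
}
```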


Overall, among other advantages, embodiments disclosed herein provide the ability for a user and each of his or her collaborators to receive real-time updates about the collaborative workspace shared among them, in a way that provides easy-to-understand information, enables intuitive utilization of files, and promotes social interactions with respect to the real-time updates, all of which can enhance the users' collaboration experience.



FIG. 8 shows a diagrammatic representation 800 of a machine in the example form of a computer system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, can be executed.


In alternative embodiments, the machine operates as a standalone device or can be connected (e.g., networked) to other machines. In a networked deployment, the machine can operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.


The machine can be a server computer, a client computer, a personal computer (PC), a user device, a tablet, a phablet, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a thin-client device, a cellular telephone, an iPhone, an iPad, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, a console, a hand-held console, a (hand-held) gaming device, a music player, any portable, mobile, hand-held device, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.


While the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the terms “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The terms “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the presently disclosed technique and innovation.


In general, the routines executed to implement the embodiments of the disclosure can be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processing units or processors in the computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.


Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.


Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include, but are not limited to, recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disc Read-Only Memory (CD-ROMs), Digital Versatile Disks (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.


The network interface device enables the machine to mediate data in a network with an entity that is external to the host server, through any known and/or convenient communications protocol supported by the host and the external entity. The network interface device can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater.


The network interface device can include a firewall which can, in some embodiments, govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall can additionally manage and/or have access to an access control list which details permissions including, for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.


Other network security functions that can be performed by or included in the functions of the firewall include, for example, but are not limited to, intrusion prevention, intrusion detection, next-generation firewall, personal firewall, etc., without deviating from the novel art of this disclosure.


Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof, mean any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number can also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.


The above detailed description of embodiments of the disclosure is not intended to be exhaustive or to limit the teachings to the precise form disclosed above. While specific embodiments of, and examples for, the disclosure are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative embodiments can perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks can be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or subcombinations. Each of these processes or blocks can be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks can instead be performed in parallel, or can be performed at different times. Further, any specific numbers noted herein are only examples: alternative implementations can employ differing values or ranges.


The teachings of the disclosure provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments.


Any patents and applications and other references noted above, including any that can be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the disclosure can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the disclosure.


These and other changes can be made to the disclosure in light of the above Detailed Description. While the above description describes certain embodiments of the disclosure, and describes the best mode contemplated, no matter how detailed the above appears in text, the teachings can be practiced in many ways. Details of the system can vary considerably in its implementation details, while still being encompassed by the subject matter disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosure with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the disclosure to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the disclosure encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the disclosure under the claims.


While certain aspects of the disclosure are presented below in certain claim forms, the inventors contemplate the various aspects of the disclosure in any number of claim forms. For example, while only one aspect of the disclosure is recited as a means-plus-function claim under 35 U.S.C. §112, ¶6, other aspects can likewise be embodied as a means-plus-function claim, or in other forms, such as being embodied in a computer-readable medium. (Any claim intended to be treated under 35 U.S.C. §112, ¶6 begins with the words “means for”.) Accordingly, the applicant reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the disclosure.

Claims
  • 1. A method for providing access to a cloud-based workspace via an interactive user interface on a mobile device, the cloud-based workspace being shared among a user at the mobile device and collaborators of the user, the method comprising: receiving, in near real time, updates regarding activities performed by the collaborators of the user on files in the cloud-based workspace; wherein the updates include generated lists of information based on the updates; displaying, on the interactive user interface of the mobile device, a visual feed of the lists of information; wherein each list of information includes: a description of the activities performed by the collaborators of the user on the files in the cloud-based workspace; an identification of the collaborators who performed the activities on the files; and thumbnails that illustrate previews of the files associated with the activities, wherein the previews depict actual content of the files; and upon receiving a selection from the user of one of the displayed lists of information: prompting the user with an option to perform one or more actions on the one or more files associated with the selected list of information; and prompting the user with an option to communicate with the one or more collaborators associated with the selected list of information; displaying, on the interactive user interface of the mobile device, the files in the workspace; and upon receiving a request from the user of a file in the workspace for access, determining that the file's type is compatible before downloading the file for access.
  • 2. The method of claim 1, wherein the thumbnails enable the user at the mobile device to interact with the files.
  • 3. The method of claim 1, wherein the thumbnails are configured to be displayed in the interactive user interface of the mobile device based on a plurality of visualization criteria.
  • 4. The method of claim 3, further comprising: detecting a size of a screen on which the interactive user interface of the mobile device is displayed.
  • 5. The method of claim 4, wherein a thumbnail is to occupy larger than 20% of a height of the screen if the screen's size is less than or equal to 7.25 inches when measured diagonally.
  • 6. The method of claim 4, wherein a thumbnail is to occupy larger than 10% but less than 20% of a height of the screen if the screen's size is more than 7.25 inches when measured diagonally.
  • 7. The method of claim 1, wherein the list of information is sorted based on identities of those who performed the activities.
  • 8. The method of claim 7, wherein the list of information is further sorted based on timestamps of the activities.
  • 9. The method of claim 1, further comprising: if the file's type is incompatible, displaying an alert to the user before the downloading.
  • 10. The method of claim 1, further comprising: if the file's type is pictorial: starting displaying the file before downloading of the file completes by displaying a lower resolution version of the file first; and gradually transitioning from the lower resolution version of the file to a full resolution of the file as the downloading of the file completes.
  • 11. The method of claim 1, further comprising: if the file's type is audio: starting fetching metadata of the file as soon as loading of the file starts; and displaying information related to the file based on the metadata.
  • 12. The method of claim 1, further comprising: if the file's type is textual: identifying whether the file is a source code; and coloring the source code based on a plurality of syntax highlighting rules.
  • 13. The method of claim 1, further comprising: if the file's type is of a portable document format (PDF): providing, via the user interface of the mobile device, a selectable element to adjust a brightness for the PDF file.
  • 14. The method of claim 1, wherein the user at the mobile device is prompted with an option to select more than one file.
  • 15. The method of claim 1, wherein the one or more actions include obtaining a link to the workspace for public access to the selected file.
  • 16. The method of claim 1, wherein the one or more actions include preparing the selected file for an offline mode access.
  • 17. The method of claim 1, wherein the one or more actions include at least one of: (a) writing a note for the selected file, wherein the note is to be stored in the workspace; (b) uploading the file to the workspace, if the selected file is stored locally; (c) downloading the file from the workspace, if the selected file is stored in the workspace; and (d) sharing the selected file via email to the collaborators.
  • 18. The method of claim 1, further comprising: upon receiving a selection from the user at the mobile device of a file in the workspace, allowing the user at the mobile device to add the selection into a favorite list of the user.
  • 19. A mobile device having an interactive user interface for providing access to a cloud-based workspace, the cloud-based workspace being shared among a user at the mobile device and collaborators of the user, the mobile device comprising: a processor; and a memory unit having instructions stored thereon which when executed by the processor, causes the processor to generate the interactive user interface on the mobile device and to: receive, in near real time, updates regarding activities performed by the collaborators of the user on files in the cloud-based workspace; wherein the updates include generated lists of information based on the updates; display, on the interactive user interface, a visual feed of the lists of information, wherein each list of information includes: a description of the activities performed by the collaborators of the user on the files in the cloud-based workspace; an identification of the collaborators who performed the activities on the files; and thumbnails that illustrate previews of the files associated with the activities,
  • 20. The mobile device of claim 19, wherein the thumbnails enable the user at the mobile device to interact with the files.
  • 21. The mobile device of claim 19, wherein the thumbnails are configured to be displayed in the interactive user interface of the mobile device based on a plurality of visualization criteria.
  • 22. The mobile device of claim 21, wherein the processor is further configured to: detect a size of a screen on which the interactive user interface of the mobile device is displayed.
  • 23. The mobile device of claim 22, wherein a thumbnail is to occupy larger than 20% of a height of the screen if the screen's size is less than or equal to 7.25 inches when measured diagonally.
  • 24. The mobile device of claim 22, wherein a thumbnail is to occupy larger than 10% but less than 20% of a height of the screen if the screen's size is more than 7.25 inches when measured diagonally.
  • 25. A computer system having an interactive user interface for providing access to a cloud-based workspace being shared among a user at the computer system and collaborators of the user, the computer system comprising: a processor; and a memory unit having instructions stored thereon which when executed by the processor causes the processor to generate an interactive user interface on the computer system and to: receive, in near real time, updates regarding activities performed by the collaborators of the user on files in the cloud-based workspace; wherein the updates include generated lists of information based on the updates; display, on the interactive user interface of the mobile device, a visual feed of the lists of information; wherein each list of information includes: a description of the activities performed by the collaborators of the user on the files in the cloud-based workspace; an identification of the collaborators who performed the activities on the files; and thumbnails that illustrate previews of the files associated with the activities, wherein the previews depict actual content of the files; upon receiving a selection from the user of one of the displayed lists of information; prompt the user with an option to perform actions on the files associated with the selected list of information; and prompt the user with an option to communicate with the collaborators associated with the selected list of information; and display, on the interactive user interface, the files in the workspace; and upon receiving a request from the user of a file in the workspace for access, determining that the file's type is compatible before downloading the file for access; wherein the processor is further configured to: (a) if the file's type is pictorial: start to display the file before downloading of the file completes by displaying a lower resolution version of the file first; and gradually transition from the lower resolution version of the file to a full resolution of the file as the downloading of the file completes; (b) if the file's type is audio: start to fetch metadata of the file as soon as loading of the file starts; and display information related to the file based on the metadata; and (c) if the file's type is textual: identify whether the file is a source code; and color the source code based on a plurality of syntax highlighting rules.
  • 26. The computer system of claim 25, wherein the thumbnails enable the user to interact with the files.
  • 27. The computer system of claim 25, wherein the thumbnails are configured to be displayed in the interactive user interface based on a plurality of visualization criteria.