Ubiquitous remote access to services, application programs and data has become commonplace as a result of the growth and availability of broadband and wireless network access. As such, users are accessing application programs and data using an ever-growing variety of client devices (e.g., mobile devices, tablet computing devices, laptop/notebook/desktop computers, etc.). Data may be communicated to the devices from a remote server over a variety of networks including 3G and 4G mobile data networks, wireless networks such as WiFi and WiMax, wired networks, etc. Clients may connect to a server offering the services, application programs and data across many disparate network bandwidths and latencies.
In such an environment, applications may also be shared among remote users in a collaborative session. However, when collaborating, users may be limited solely to the functionalities provided by the shared application, thus limiting the collaborative session.
Disclosed herein are systems and methods for providing remote access to an application in a tiered remote access framework. In accordance with the present disclosure, a method of providing remote access to an application is disclosed. The method may include providing a tiered remote access framework comprising a server tier in which an application that is remotely accessed and a server remote access application execute on a server, and a client tier in which a client remote access application executes on a client device; providing a server SDK that is associated with the application in the server tier, the server SDK being adapted to communicate display information to the client tier; and providing a client SDK that is associated with a client application executing on the client device, the client SDK being adapted to receive the display information from the server tier. The client device connects to the server at an enumerated Uniform Resource Locator (URL) associated with the application program to initiate the reception of the display information.
In accordance with other aspects, there is disclosed a method of providing remote access to an application. The method may include providing a tiered remote access framework that includes a server tier in which an application that is remotely accessed and a server remote access application execute on a server; and a client tier in which a client remote access application executes on a client device. The method also includes enumerating a Uniform Resource Locator (URL) that is associated with the application in the server tier; receiving a connection from the client device at the enumerated URL; and communicating display information associated with the application to the client remote access application.
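The URL enumeration described above can be sketched as follows. This is a minimal, hypothetical illustration of a server-tier registry that enumerates a URL per application and resolves a client connection back to the application; the names `AppRegistry`, `enumerate_url`, and `resolve` are assumptions for the sketch and are not part of the disclosure.

```python
class AppRegistry:
    """Maps remotely accessible applications to enumerated URLs."""

    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")
        self._apps = {}

    def enumerate_url(self, app_name):
        # Enumerate a URL for the application and record it so that a
        # client connection at this URL can be routed back to the app.
        url = f"{self.base_url}/apps/{app_name}"
        self._apps[url] = app_name
        return url

    def resolve(self, url):
        # Called when a client connects at the enumerated URL: determine
        # which application's display information should be communicated.
        return self._apps.get(url)


registry = AppRegistry("https://server.example.com")
url = registry.enumerate_url("image-viewer")
print(url)                    # https://server.example.com/apps/image-viewer
print(registry.resolve(url))  # image-viewer
```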
Other systems, methods, features and/or advantages will be or may become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features and/or advantages be included within this description and be protected by the accompanying claims.
The components in the drawings are not necessarily to scale relative to each other. Like reference numerals designate corresponding parts throughout the several views.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. Methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present disclosure. While implementations will be described for remotely accessing applications, it will become evident to those skilled in the art that the implementations are not limited thereto, but are applicable for remotely accessing any type of data or service via a remote device.
Referring to
The server 102B is connected, for example, via the computer network 110 to a Local Area Network (LAN) 109 or may be directly connected to the computer network 110. For example, the LAN 109 is an internal computer network of an institution such as a hospital, a bank, a large business, or a government department. Typically, such institutions still use a mainframe computer 102A and a database 108 connected to the LAN 109. Numerous application programs 107A may be stored in memory 106A of the mainframe computer 102A and executed on a processor 104A. Similarly, numerous application programs 107B may be stored in memory 106B of the server 102B and executed on a processor 104B. The application programs 107A and 107B may be “services” offered for remote access. The mainframe computer 102A, the server 102B and the client computing devices 112A, 112B, 112C or 112N may be implemented using hardware such as that shown in the general purpose computing device of
A client remote access application 121A, 121B, 121C, 121N may be designed to provide user interaction for displaying data and/or imagery in a human comprehensible fashion and for determining user input data in dependence upon received user instructions for interacting with the application program. Such interaction may use, for example, a graphical display with touch-screen 114A, or a graphical display 114B/114N and a keyboard 116B/116C, of the client computing devices 112A, 112B, 112C, 112N, respectively. For example, the client remote access application is performed by executing executable commands on processor 118A, 118B, 118C, 118N, with the commands being stored in memory 120A, 120B, 120C, 120N of the client computing devices 112A, 112B, 112C, 112N, respectively.
Alternatively or additionally, a user interface program is executed on the server 102B (as one of application programs 107B) which is then accessed via a URL by a generic client application such as, for example, a web browser executed on the client computing device 112A, 112B. The user interface is implemented using, for example, Hypertext Markup Language (HTML) 5. In some implementations, the server 102B may participate in a collaborative session with the client computing devices 112A, 112B, 112C . . . 112N. For example, the aforementioned one of the application programs 107B may enable the server 102B to collaboratively interact with the application program 107A or another application program 107B and the client remote access applications 121A, 121B, 121C, 121N. As such, the server 102B and each of the participating client computing devices 112A, 112B, 112C . . . 112N may present a synchronized view of the display of the application program.
The operation of a server remote access application 111B with the client remote access application (any of 121A, 121B, 121C, 121N, or one of application programs 107B) is performed in cooperation with a state model 200, as illustrated in
Upon receipt of application data from an application program 107A or 107B, the server remote access application 111B updates the state model 200 in accordance with the screen or application data, generates presentation data in accordance with the updated state model 200, and provides the same to the client remote access application 121A, 121B, 121C, 121N on the client computing device. The state model 200 comprises an association of logical elements of the application program with corresponding states of the application program, with the logical elements being in a hierarchical order. For example, the logical elements may be a screen, a menu, a submenu, a button, etc. that make up the application program user interface. This enables the client device, for example, to natively display the logical elements. As such, a menu of the application program that is presented on a mobile phone will look like a native menu of the mobile phone. Similarly, the menu of the application program that is presented on a desktop computer will look like a native menu of the desktop computer operating system.
The state model 200 is determined such that each of the logical elements is associated with a corresponding state of the application program 107A or 107B. The state model 200 may be determined such that the logical elements are associated with user interactions. For example, the logical elements of the application program are determined such that the logical elements comprise transition elements with each transition element relating a change of the state model 200 to one of control data and application representation data associated therewith.
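The hierarchy of logical elements described above can be sketched in code. The following is a hypothetical illustration only; the class and field names (`LogicalElement`, `find_state`) are assumptions for the sketch and do not appear in the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class LogicalElement:
    """A screen, menu, submenu, button, etc. of the application UI."""
    kind: str                    # e.g. "screen", "menu", "button"
    name: str
    state: Optional[str] = None  # application state associated with it
    children: List["LogicalElement"] = field(default_factory=list)


def find_state(root, name):
    # Walk the hierarchy to find the state associated with a named element.
    if root.name == name:
        return root.state
    for child in root.children:
        found = find_state(child, name)
        if found is not None:
            return found
    return None


model = LogicalElement("screen", "main", "MainView", [
    LogicalElement("menu", "file", "FileMenu", [
        LogicalElement("button", "open", "OpenDialog"),
    ]),
])
print(find_state(model, "open"))   # OpenDialog
```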
In some implementations, two or more of the client computing devices 112A, 112B, 112C . . . 112N and/or the server 102B may collaboratively interact with the application program 107A or 107B. As such, by communicating state information between each of the client computing devices 112A, 112B, 112C . . . 112N and/or the server 102B and/or the mainframe computer 102A participating in a collaborative session, each of the participating client computing devices 112A, 112B, 112C . . . 112N may present a synchronized view of the display of the application program 107A or 107B.
In accordance with some implementations, the system 100 may provide for uncoupled or decoupled application extensions. Such extensions are provided as part of the server remote access application 111B (e.g., as a plug-in), the client remote access applications 121A, 121B, 121C, 121N (e.g., as part of a client software development kit (SDK)), one of the applications 107B (e.g., as part of a server SDK), or combinations thereof to provide features and functionalities that are not otherwise provided by the application programs 107A or 107B. These are described more fully with regard to
For example, an “interactive digital surface layer” may be provided as an application extension to enable participants to a collaborative session to make annotations on top of the application running in the session. Such a layer may function as an “acetate layer” in that it is transparently overlaid on top of the application running in the session. The interactive digital surface layer functions like a scribble tool to enable a user to draw lines, arrows, symbols, scribbles, etc. on top of an application to provide collaboration of both the application and the interactive digital surface layer. As will be described below with reference to
In yet another example, in the application tier, the application extension 310 may be a separate executable program that includes new features to enhance the applications 107A/107B. The application extension 310 may consume the state model 200 and produce its own document 314 (i.e., a state model of the application extension 310) that may include: (1) information from the state model 200 and information associated with the application extension 310, (2) only information associated with the application extension 310, or (3) a combination of some of the state model information and information associated with the extension state model 314. The extension state model 314 may be communicated to the server remote access application 111B, where the server remote access application 111B may compose an updated state model 200 to include the information in the extension state model 314. Alternatively or additionally, the client remote access application 121A, 121B, 121C, 121N may receive both the state model 200 and the extension state model 314, and the client remote access application may compose an updated state model 200 to include the information in the extension state model 314.
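The composition of an updated state model from an extension state model, as described above, can be illustrated with a short sketch. State models are represented as plain dictionaries here for simplicity, and `compose_state_model` is a hypothetical helper, not the disclosed API.

```python
import copy


def compose_state_model(state_model, extension_state_model):
    # Produce an updated state model that includes the extension's
    # information without mutating either input model.
    updated = copy.deepcopy(state_model)
    updated.setdefault("extensions", {}).update(
        copy.deepcopy(extension_state_model))
    return updated


state_model = {"ApplicationState": {"view": "axial"}}
extension = {"InteractiveSurface": {"annotations": [{"type": "arrow"}]}}

updated = compose_state_model(state_model, extension)
print("InteractiveSurface" in updated["extensions"])  # True
print("extensions" in state_model)                    # False (unchanged)
```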
In general, the interactive digital surface layer may be used for any purpose to provide information to a user and/or provide features and functionalities that are independent of the application being shared by users in a collaborative session. The interactive digital surface layer may provide such features concurrently with the execution of the application or as a stand-alone application. For example, the interactive digital surface layer may be used to enable users to annotate a display as the application is executing, to enable pointing operations by the user, and to provide notifications about status or user availability. The interactive digital surface layer may be enabled and disabled by a user control and may be controlled by a session leader or each participant within the collaborative session. Additional features may be provided within the framework discussed above. Details of the above-noted features will now be provided.
Under the collaboration node there are also one or more views defined. In the example of
The above information is displayed by the client remote access application as illustrated in axial views of
Below is an example section of a state model 200 in accordance with the tree of
Information regarding the application (107A or 107B) is maintained in the ApplicationState node in a first portion of the XML state model. Different states of the application program associated with the axial view and the coronal view are defined, as well as related triggers. For example, in the axial view a “field” is defined for receiving a name as user input data and displaying the same. The uncoupled collaboration states and application extension states (e.g., interactive digital surface layer) are maintained in a second portion of the XML document.
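Consistent with the description above, a hypothetical fragment of such an XML state model might be structured as follows. All element and attribute names here are illustrative assumptions, not taken from the disclosure.

```xml
<StateModel>
  <!-- First portion: application states and related triggers -->
  <ApplicationState>
    <View name="axial">
      <Field name="Name" input="user"/>
      <Trigger event="select" target="coronal"/>
    </View>
    <View name="coronal"/>
  </ApplicationState>
  <!-- Second portion: uncoupled collaboration and extension states -->
  <Collaboration>
    <InteractiveSurfaceLayer>
      <Annotation type="scribble" view="axial"/>
    </InteractiveSurfaceLayer>
  </Collaboration>
</StateModel>
```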
The state model 200 may thus contain session information about the application itself, the application extension information (e.g., interactive digital surface layer), information about views, and how to tie the annotations to specific views (e.g., scribble, arrow, circle tied to axial view).
If the user activates the draw function 508, then the interactive digital surface layer is operable to receive user input to collaboratively display annotations input by the users in the session. As noted above, the annotations may be color-coded to each of the users in the collaborative session. Referring to
In accordance with features of the interactive digital surface layer, users may see each other's mouse cursor. Each client may erase annotations made within the interactive digital surface layer. The interactive digital surface layer may be saved either separately from the application, or may be consumed by the original application and saved in a meaningful way. The interactive digital surface layer may also be bookmarked in case a collaboration session goes down, such that it can be recovered.
During a collaborative session, a user may wish to point to an area of the user interface 500 without interacting with the underlying application program 107A/107B. For example, a user may be making a presentation of a slide deck and may wish to “point” to an item on the slide being displayed in the user interface 500. The interactive digital surface layer may be used to provide such an indication to other users in the collaborative session.
To accommodate the above, the sending of mouse cursor position data may be separated from the sending of mouse input events to the application 107A/107B so that the position and event data can be triggered independently of one another. As such, a cursor position tool may be directed to send cursor information without input events that would otherwise cause an interaction when the user of the tablet device 112N does not desire such interaction with the application program 107A/107B. The above may be achieved by separating a single method that updates the interactive digital surface layer for cursor position into two methods, one of which performs cursor position updates, and one of which queues the input events. Alternatively or additionally, the mouse cursor may change characteristics when operating in such a mode. For example, where the mouse cursor is being used for indication purposes, the cursor may thicken, change color, change shape, blink, etc. to indicate to other users that the cursor is being used as an indicator.
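The separation into two methods described above can be sketched as follows. The class and method names are hypothetical assumptions; the sketch only shows position updates proceeding independently of input events.

```python
class SurfaceLayerInput:
    def __init__(self):
        self.cursor = (0, 0)
        self.event_queue = []
        self.send_events = True  # False => pointing-only mode

    def update_cursor_position(self, x, y):
        # Always update the shared cursor position so other participants
        # can see where this user is pointing.
        self.cursor = (x, y)

    def queue_input_event(self, event):
        # Queue the input event for the application only when the user
        # actually intends to interact with it.
        if self.send_events:
            self.event_queue.append(event)


inp = SurfaceLayerInput()
inp.send_events = False        # pointing mode: position without interaction
inp.update_cursor_position(120, 45)
inp.queue_input_event("mouse_down")
print(inp.cursor)              # (120, 45)
print(inp.event_queue)         # []
```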
While the above may be implemented for all types of client computers, a particular use case is where users of mobile devices having a touch-sensitive interface (e.g., tablet device 112N) wish to indicate to other users what they are currently viewing on the display. Typically, a touch of a tablet device represents an interaction with the application program 107A/107B. In accordance with the above, separating the mouse cursor position data (i.e., the touch location) from the sending of mouse input events (i.e., the actual touch) enables users of tablet devices 112N to make such an indication similar to client computers having a pointing device.
In another aspect that may be combined with the above or separately implemented, annotations can be created in the interactive digital surface layer without interacting with the underlying application program 107A/107B, and interactions with the underlying application program 107A/107B do not necessarily create annotations within the interactive digital surface layer. For example, with reference to
Additionally or alternatively to the drawing function above, in some implementations, the interactive digital surface layer may be used to provide other indicators, such as indications of the users within the session, the user who is controlling of the interactive digital surface layer, an indicator that a user has joined or exited a collaboration session, whether changes in image quality (JPEG downsizing) have been made, etc. In general, the indicator may be enabled by providing information in the extension state model 314 that is used to trigger the appearance and/or disappearance of the indicator. In some implementations, the indicator may be temporarily displayed to a user. In other implementations, the indicator may remain until a user action is taken, such as a gesture on the tablet device 112N, a click of a mouse, etc.
In accordance with other implementations, the interactive digital surface layer may be provided access to names of users participating in a collaborative session. The name information may be contained in the extension state model 314 and displayed to other users in the collaborative session using the interactive digital surface layer extension 310. As noted above, each user may be associated with a unique color that may be used for pointer location and mark-up within the interactive digital surface layer. Each user may toggle the visibility of other users' cursor positions. Users may clear their own mark-up, and toggle visibility of other users' mark-up. Among the users, a leader may be chosen or default to an initial creator. The leader may clear all mark-up and transfer leadership. If a leader drops from the collaborative session, the leadership may transfer to a next participant in the session.
To enable the above, client side APIs may be provided that respond to changes in the extension state model 314 to provide the appropriate displays and controls. For example, APIs may be provided to build user interface components, render images in the interactive digital surface layer, provide a capture/clipboard feature, provide an undo/redo feature, provide for changes to image/line thickness, provide for font selection/size/attribute, select a color, provide for text boxes, provide for emoticon/icons, and provide for watermarks (e.g., confidential, draft, working copy).
Based on configuration settings, an API may determine whether the interactive digital surface layer will be “on” for everyone when one user turns the interactive digital surface layer on, if individual users may toggle the visibility of the interactive digital surface layer independent of other users, or if the interactive digital surface layer will be on only if the leader toggles it on. Certain configurations may enable an API to determine whether individual clients may toggle visibility of other users' markup, or if only the leader may control a client's visibility on behalf of everyone. Yet other configurations may enable an API to determine whether individual clients may toggle visibility of other users' cursor position, or if only the leader may control a client's cursor visibility on behalf of everyone. It may be specified that only the leader may clear all other users' markup; however, anyone can clear their own markup at any time.
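A configuration-driven permission check of the kind described above might look like the following sketch. The configuration keys and the `can_toggle` helper are assumptions for illustration, not a disclosed API.

```python
# "everyone"/"individual" scopes permit any participant; "leader" restricts
# the action to the session leader.
CONFIG = {
    "surface_layer_scope": "leader",
    "markup_visibility_control": "individual",
    "cursor_visibility_control": "leader",
}


def can_toggle(setting, user_is_leader):
    scope = CONFIG[setting]
    if scope == "leader":
        return user_is_leader
    return True


print(can_toggle("markup_visibility_control", user_is_leader=False))  # True
print(can_toggle("cursor_visibility_control", user_is_leader=False))  # False
```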
In some implementations, the state of the interactive digital surface layer may be saved. For example, the extension state model 314 may be saved at the client computing devices 112A, 112B, 112C or 112N for later retrieval. A replay option may be provided to “replay” the states within the extension state model 314 that were received over a period of time during an interactive session. Users may be enabled to save a drawing in the interactive digital surface layer and select from a list of saved images. In some implementations, the images may be saved as a set of vectors in an XML file that is loaded so a user may continue editing the drawing.
Thus, as described above, the present disclosure provides for an interactive digital surface layer that may be independently provided and used collaboratively by users to provide annotations on top of an application. Also, more generally, the present disclosure provides for application extensions that may be made available through a remote access mechanism. The application extensions provide enhancements to applications through the remote access mechanism without the need to modify the applications themselves.
At 604, a selection is received to activate the interactive digital surface layer. For example, a user may select the draw function 508 in the sharing control 604. At 606, user inputs are received in the interactive digital surface layer. The user inputs may be received as lines, arrows, squiggles, circles, etc., drawn on the interactive digital surface layer by one or more of the users in the collaborative session. The interactive digital surface layer may function to collaboratively receive the inputs from the one or more users in the session.
At 608, the extension state model is updated. For example, the extension state model 314 may be updated to represent the annotations made by each of the users in the collaborative session. Where the user input is a pre-defined shape, such as a circle, line, square or other geometric shape, the extension state model 314 may represent the shape using XY coordinates. For a line, the XY coordinates may define an origin, which together with a length defines the line. For a circle, the coordinate point may be a center, and a radius from the center may define the circle. Squiggles may be represented by a series of small vectors that, when combined, define the shape of the squiggle. The use of coordinates, rather than pixels, provides for scaling of the shapes in the various displays of the client computing devices participating in a collaborative session.
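The coordinate-based shape representation described above can be sketched as follows. The shape encoding below is one plausible reading of the description, not the disclosed format; scaling by a per-display factor is what makes coordinates preferable to pixels.

```python
def scale_point(point, factor):
    return (point[0] * factor, point[1] * factor)


def scale_shape(shape, factor):
    # Shapes carry coordinate data plus scalar dimensions; both scale
    # uniformly to fit the target display.
    scaled = dict(shape)
    if shape["type"] == "circle":
        scaled["center"] = scale_point(shape["center"], factor)
        scaled["radius"] = shape["radius"] * factor
    elif shape["type"] == "line":
        scaled["origin"] = scale_point(shape["origin"], factor)
        scaled["length"] = shape["length"] * factor
    elif shape["type"] == "squiggle":
        scaled["vectors"] = [scale_point(v, factor) for v in shape["vectors"]]
    return scaled


circle = {"type": "circle", "center": (10, 20), "radius": 5}
print(scale_shape(circle, 2.0))
# {'type': 'circle', 'center': (20.0, 40.0), 'radius': 10.0}
```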
At 610, the extension state model is communicated to the participants in the collaborative session. As described in
At 654, the interactive digital surface layer is activated. For example, a user may select a function in the sharing control 604 or the interactive digital surface layer may be automatically activated at step 652.
At 656, notification-related information is processed. For example, the extension state model 314 is received at the client computing device and an application may process information contained in the extension state model 314 to determine if a notification should be presented. For example, timing information in the extension state model 314 may be examined to determine if a participant in the collaborative session is experiencing network latency. As another example, an identity of a new user to a collaborative session may be indicated.
At 658, a notification is displayed, if necessary. If the processing at 656 determines that a notification should be presented, then the notification is presented to the user in the interactive digital surface layer. For example, network latency may be greater than a predetermined threshold; as such, the indicator 520 may be presented to the user.
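The latency check at steps 656 and 658 can be sketched as follows. The threshold value and the field name carried in the extension state model are assumptions for this illustration.

```python
LATENCY_THRESHOLD_MS = 500  # hypothetical predetermined threshold


def should_notify(extension_state_model):
    # Examine timing information carried in the extension state model to
    # decide whether a latency indicator should be presented.
    latency = extension_state_model.get("latency_ms", 0)
    return latency > LATENCY_THRESHOLD_MS


print(should_notify({"latency_ms": 750}))  # True
print(should_notify({"latency_ms": 120}))  # False
```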
The processes at 656 and 658 may be repeated to determine if the notification should be removed or the notification may be removed after a predetermined time-out period has expired.
Numerous other general purpose or special purpose computing system environments or configurations may be used. Examples of well known computing systems, environments, and/or configurations that may be suitable for use include, but are not limited to, personal computers, server computers, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, network personal computers (PCs), minicomputers, mainframe computers, embedded systems, distributed computing environments that include any of the above systems or devices, and the like.
Computer-executable instructions, such as program modules, being executed by a computer may be used. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Distributed computing environments may be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a distributed computing environment, program modules and other data may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computing device 700 may have additional features/functionality. For example, computing device 700 may include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated in
Computing device 700 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by device 700 and includes both volatile and non-volatile media, removable and non-removable media.
Computer storage media include volatile and non-volatile, and removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Memory 704, removable storage 708, and non-removable storage 710 are all examples of computer storage media. Computer storage media include, but are not limited to, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 700. Any such computer storage media may be part of computing device 700.
Computing device 700 may contain communications connection(s) 712 that allow the device to communicate with other devices. Computing device 700 may also have input device(s) 714 such as a keyboard, mouse, pen, voice input device, touch input device, etc. Output device(s) 716 such as a display, speakers, printer, etc. may also be included. All these devices are well known in the art and need not be discussed at length here.
It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the presently disclosed subject matter. In the case of program code execution on programmable computers, the computing device generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes described in connection with the presently disclosed subject matter, e.g., through the use of an application programming interface (API), reusable controls, or the like. Such programs may be implemented in a high level procedural or object-oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language and it may be combined with hardware implementations.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
This application is a continuation of U.S. patent application Ser. No. 13/632,245, filed Oct. 1, 2012, and entitled “Uncoupled Application Extensions Including Interactive Digital Surface Layer for Collaborative Remote Application Sharing and Annotating.” The present application also claims priority to U.S. Provisional Patent Application No. 61/541,540, filed Sep. 30, 2011 and U.S. Provisional Patent Application No. 61/623,108, filed Apr. 12, 2012, each entitled “Uncoupled Application Extensions Including Acetate Layer for Collaborative Remote Application Sharing and Annotating.” The above applications are incorporated herein by reference in their entireties.
Conference Schedule for ADASS XXI, European Southern Observatory, http://www.eso.org/sci/meetings/2011/adass2011/program/schedule.html#day2, Nov. 7, 2011, 4 pages. |
GoInstant shared web technology, http://website.s3.goinstant.com.s3.amazonaws.com/wp-content/uploads/2012/04/GoInstant-Shared-Web-Technology.pdf, 2012, 4 pages. |
Press Release, Calgary Scientific Revolutionizes Application Sharing and Advanced Collaboration with PureWeb 3.0, Jun. 21, 2011, 3 pages. |
Samesurf web real-time co-browser application, http://i.samesurf.com/i/0586021, 2009. |
International Search Report and Written Opinion, dated May 17, 2013, in connection with International Application No. PCT/IB2012/002842. |
International Search Report and Written Opinion, dated Feb. 12, 2013, in connection with International Application No. PCT/IB2012/002417. |
International Search Report and Written Opinion, dated Jan. 30, 2013, received in connection with International Application No. PCT/IB2012/001935. |
International Search Report and Written Opinion, dated Jan. 23, 2013, in connection with International Application No. PCT/IB2012/001931. |
International Search Report and Written Opinion, dated May 16, 2012, in connection with International Application No. PCT/IB2012/000009. |
Luo, Y., et al., “Real Time Multi-User Interaction with 3D Graphics via Communication Networks,” 1998 IEEE Conference on Information Visualization, 1998, 9 pages. |
International Preliminary Report on Patentability and Written Opinion, dated Feb. 17, 2015, received in connection with related International Application No. PCT/IB2013/002776. |
Federl, Pavol, “Remote Visualization of Large Multi-dimensional Radio Astronomy Data Sets,” Institute for Space Imaging Science, University of Calgary, 2012, 22 pages. |
Yang, Lili, et al., “Multirate Control in Internet-Based Control Systems,” IEEE Transactions on Systems, Man, and Cybernetics: Part C: Applications and Reviews, vol. 37, No. 2, 2007, pp. 185-192. |
International Search Report and Written Opinion, dated Jun. 9, 2014, received in connection with International Application No. PCT/IB2013/002776. |
International Preliminary Report on Patentability, dated May 27, 2014, received in connection with International Application No. PCT/IB2012/002417. |
European Search Report, dated Jun. 12, 2014, received in connection with European Application No. 12731899.6. |
Supplementary European Search Report, dated Apr. 10, 2015, received in connection with European Application No. 12837201.8. |
Hong, C., “Multimedia Presentation Authoring and Virtual Collaboration in Medicine,” International Journal of Kimics, vol. 8, No. 6, 2010, pp. 690-696. |
Layers: Capture Every Item on Your Screen as a PSD Layered Image, Internet Website, retrieved on Jun. 30, 2016 at http://web.archive.org/web/20140218111143, 2014, 9 pages. |
Shim, H.S., et al., “Providing Flexible Services for Managing Shared State in Collaborative Systems,” Proceedings of the Fifth European Conference on Computer Supported Cooperative Work, 1997, pp. 237-252. |
International Search Report and Written Opinion, dated Jul. 8, 2016, received in connection with International Patent Application No. PCT/IB2016/051856. |
Number | Date | Country | |
---|---|---|---|
20140207858 A1 | Jul 2014 | US |
Number | Date | Country | |
---|---|---|---|
61541540 | Sep 2011 | US | |
61623108 | Apr 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13632245 | Oct 2012 | US |
Child | 14225584 | US |