The following applications are incorporated by reference:
Haworth, Inc., a Michigan corporation, Thought Stream LLC, a Delaware corporation, and Obscura Digital Incorporated, a California corporation, are parties to a Joint Research Agreement.
The invention relates to apparatuses, methods, and systems for digital collaboration, and more particularly to digital whiteboard systems which facilitate multiple simultaneous users having access to global collaboration data.
Digital whiteboards are often used for interactive presentations and other purposes. Some whiteboards are networked and can be used for collaboration, so that modifications made to the display image on one whiteboard are replicated on another whiteboard or display. Large scale whiteboards offer the opportunity for more than one user to present or annotate simultaneously on the same surface. However, problems can occur in the coordination of the multiple users, and in some circumstances their use of a single whiteboard can restrict their flexibility of expression.
Also, digital whiteboards can comprise large display screens or arrays of screens in a single room, which are configured to provide a large “whiteboard”-like interaction surface. Thus, it is anticipated that large digital whiteboards may be shared by many users at different times for different collaborations. Where the collaboration data for a collaboration is confidential, with access limited to authorized users, but the digital whiteboards at which the users interact are distributed among many sites and not necessarily under the exclusive control of a single user, a problem arises with the security of access to a collaboration.
In addition, the distributed nature of the system leads to the possibility of multiple users in different places who interact with, and can change, the same collaboration data at the same time, and at times when no other user is observing the collaboration data. This creates a problem with concurrency in the multiple locations, and with sharing information about a current state of the collaboration data.
Therefore, it would be desirable to find ways to allow multiple users to share collaboration data in a distributed network of whiteboards, in such a way that each user has maximum freedom of expression and real-time exchange of ideas, while providing security adequate to protect the confidential nature of the collaboration. An opportunity therefore arises to create robust solutions to the problem. Better ideas, collaboration and results may be achieved.
A collaboration system is described that can have many distributed digital whiteboards which are used both to display images based on collaboration data managed by a shared collaboration server, and to accept user input that can contribute to the collaboration data. The system can include management logic providing collaboration data to selected whiteboards based on a protocol that ensures that a user authorized for the collaboration data has physical access to the selected whiteboard. Another aspect enables each whiteboard to rapidly construct an image to display based on session history, real-time local input and real-time input from other whiteboards. Yet another aspect of the technology described herein involves a whiteboard architecture based on federated displays arranged in an array, which cooperate to act as one whiteboard used both to display images based on collaboration data managed by a remote collaboration server, and to accept user input that can contribute to the collaboration data.
The above summary is provided in order to give a basic understanding of some aspects of the collaboration system described herein. This summary is not intended to identify key or critical elements of the invention or to delineate the scope of the invention.
The invention will be described with respect to specific embodiments thereof, and reference will be made to the drawings, which are not drawn to scale.
The following description is presented to enable any person skilled in the art to make and use the invention, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
The application running at the collaboration server 105 can be hosted using Web server software such as Apache or nginx. It can be hosted, for example, on virtual machines running operating systems such as LINUX. The server 105 is heuristically illustrated in the drawings.
The database 106 stores, for example, a digital representation of collaboration data sets for each collaboration, where the collaboration data set can include or identify objects displayable on a whiteboard canvas and events related to such objects. There can be collaboration data sets for many different collaborations. A data set for a given collaboration can be configured in a database, or as a machine readable document linked to the collaboration. The canvas can also be mapped to a region in a collaboration space which can have unlimited or virtually unlimited dimensions. The collaboration data includes data structures identifying objects displayable by a display client in the display area on a display wall, and associates a location in the collaboration space with the objects identified by the data structures. Each device 102 displays only a portion of the overall collaboration space. A display wall has a display area for displaying objects, the display area being mapped to a corresponding area in the collaboration space, such as a region centered on, or otherwise located relative to, a user location in the collaboration space. The mapping of the display area to a corresponding area in the collaboration space is usable by the display client to identify objects in the collaboration data within the display area to be rendered on the display, and to identify objects to which to link user touch inputs at positions in the display area on the display.
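For purposes of illustration only, these data structures might be sketched as follows in TypeScript (matching the HTML5-based clients described below); all names and fields here are assumptions made for exposition, not the format used by the server or any particular embodiment. The sketch shows an object carrying a location in the collaboration space, and a mapping that ties a wall's display area to a region of the space located around a user location.

```typescript
// Illustrative shapes for collaboration data; all names are assumptions.
interface SpacePoint { x: number; y: number; }   // collaboration-space coordinates
interface SpaceRect { x: number; y: number; w: number; h: number; }

// A displayable object identified by the collaboration data, associated
// with a location in the collaboration space.
interface CollaborationObject {
  id: string;
  kind: "image" | "document" | "stroke";
  location: SpaceRect;
}

// Ties a wall's display area (in pixels) to a region of the collaboration
// space located around a user location.
interface DisplayMapping {
  userLocation: SpacePoint;   // center of the mapped region
  displayWidthPx: number;
  displayHeightPx: number;
  scale: number;              // collaboration-space units per pixel
}

// The region of the collaboration space mapped to the display area.
function mappedRegion(m: DisplayMapping): SpaceRect {
  const w = m.displayWidthPx * m.scale;
  const h = m.displayHeightPx * m.scale;
  return { x: m.userLocation.x - w / 2, y: m.userLocation.y - h / 2, w, h };
}

// Converts a touch position on the display to a point in the collaboration
// space, so user touch inputs can be linked to objects at that location.
function touchToSpace(m: DisplayMapping, pxX: number, pxY: number): SpacePoint {
  const r = mappedRegion(m);
  return { x: r.x + pxX * m.scale, y: r.y + pxY * m.scale };
}
```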
As explained in more detail below, the server 105 stores collaboration data sets for a plurality of collaborations, and provides the collaboration data to the display clients participating in a session. The collaboration data is then used by the computer systems 110, with appropriate software 112 including display client software, to determine images to display on the whiteboard, and to assign objects for interaction to locations on the display surface. In some alternatives, the server 105 can keep track of a “viewport” for each device 102, indicating the portion of the canvas viewable on that device, and can provide to each device 102 the data needed to render its viewport. The application software that runs on the client device, and that is responsible for rendering drawing objects, handling user inputs, and communicating with the server, can be HTML5-based and run in a browser environment. This allows for easy support of many different client operating system environments.
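A minimal sketch, under assumed names, of the “viewport” bookkeeping mentioned above: the server remembers a rectangle per device and returns the objects overlapping it.

```typescript
// Illustrative viewport bookkeeping; names are assumptions.
interface Rect { x: number; y: number; w: number; h: number; }
interface CanvasObject { id: string; location: Rect; }

const viewports = new Map<string, Rect>();   // device id -> visible portion of the canvas

function setViewport(deviceId: string, viewport: Rect): void {
  viewports.set(deviceId, viewport);
}

function intersects(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.w && b.x < a.x + a.w &&
         a.y < b.y + b.h && b.y < a.y + a.h;
}

// The data a device needs to render its viewport: the objects whose
// locations overlap the viewport rectangle.
function dataForDevice(deviceId: string, objects: CanvasObject[]): CanvasObject[] {
  const vp = viewports.get(deviceId);
  return vp ? objects.filter(o => intersects(o.location, vp)) : [];
}
```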
The user interface data stored in database 106 includes various types of objects, such as image bitmaps, video objects, multi-page documents, scalable vector graphics, and the like. The devices 102 are each in communication with the collaboration server 105 via a network 104. The network 104 can include all forms of networking components, such as LANs, WANs, routers, switches, WiFi components, cellular components, wired and optical components, and the internet. In one scenario two or more of the users 101 are located in the same room, and their devices 102 communicate via WiFi with the collaboration server 105. In another scenario two or more of the users 101 are separated from each other by thousands of miles and their devices 102 communicate with the collaboration server 105 via the internet. The walls 102c, 102d, 102e can be multi-touch devices which not only display images, but also can sense user gestures provided by touching the display surfaces with either a stylus or a part of the body such as one or more fingers. In some embodiments, a wall (e.g. 102c) can distinguish between a touch by one or more fingers (or an entire hand, for example), and a touch by the stylus. In an embodiment, the wall senses touch by emitting infrared light and detecting light received; light reflected from a user's finger has a characteristic which the wall distinguishes from ambient received light. The stylus emits its own infrared light in a manner that the wall can distinguish from both ambient light and light reflected from a user's finger. The wall 102c may, for example, be an array of Model No. MT553UTBL MultiTaction Cells, manufactured by MultiTouch Ltd, Helsinki, Finland, tiled both vertically and horizontally. In order to provide a variety of expressive means, the wall 102c is operated in such a way that it maintains “state”. That is, it may react to a given input differently depending on (among other things) the sequence of inputs. For example, using a toolbar, a user can select any of a number of available brush styles and colors. Once selected, the wall is in a state in which subsequent strokes by the stylus will draw a line using the selected brush style and color.
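The stateful behavior described above can be pictured as a small amount of state consulted on each stylus stroke. The following sketch is illustrative only; the names and the drawLine helper are assumptions.

```typescript
// Illustrative sketch of wall state; names and the drawLine helper are assumptions.
interface BrushState { style: string; color: string; width: number; }

declare function drawLine(points: Array<[number, number]>, brush: BrushState): void;

let currentBrush: BrushState = { style: "pen", color: "black", width: 2 };

// Toolbar input changes the state...
function onToolbarSelect(style: string, color: string, width: number): void {
  currentBrush = { style, color, width };
}

// ...and subsequent stylus strokes are drawn using the selected brush.
function onStylusStroke(points: Array<[number, number]>): void {
  drawLine(points, currentBrush);
}
```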
In an illustrative embodiment, the array totals on the order of 6′ in height and 30′ in width, which is wide enough for multiple users to stand at different parts of the wall and manipulate it simultaneously. Flexibility of expression on the wall may be restricted in a multi-user scenario, however, since the wall in this embodiment does not distinguish between fingers of different users, or styli operated by different users. Thus, if one user places the wall into one desired state, a second user would be restricted to using that same state, because the wall has no way to recognize that the second user's input is to be treated differently.
In order to avoid this restriction, the system defines “drawing regions” on the wall 102c. A drawing region, as used herein, is a region within which at least one aspect of the wall's state can be changed independently of other regions on the wall. In the present embodiment, the aspects of state that can differ among drawing regions are the properties of a line drawn on the wall using a stylus. The response of the system to finger touch behaviors is not affected by drawing regions.
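Continuing the sketch above, a drawing region can carry its own copy of the line-drawing state, consulted only for stylus input that lands within the region; finger touch behaviors bypass the lookup entirely. Again, all names are assumptions.

```typescript
// Illustrative per-region state; names are assumptions.
interface BrushState { style: string; color: string; width: number; }

interface DrawingRegion {
  left: number;        // horizontal extent of the region on the wall
  right: number;
  brush: BrushState;   // line-drawing state independent of other regions
}

const wallBrush: BrushState = { style: "pen", color: "black", width: 2 };
const regions: DrawingRegion[] = [];

// Stylus input uses the enclosing region's brush, if any. Finger touch
// behaviors are not affected by drawing regions, so they never consult this.
function brushForStylusAt(x: number): BrushState {
  const region = regions.find(r => x >= r.left && x <= r.right);
  return region ? region.brush : wallBrush;
}
```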
The drawing state can be a feature of a region independent of objects 301, 302 displayed in the region, and is defined by the line drawing properties, which in this embodiment include attributes such as the selected brush style and color.
In the illustrated embodiment, if there is plenty of space on either side of the user's touch point, the computer system 110 can set the initial region width to Wideal.
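The placement logic can be sketched as follows; the width Wideal is taken from the text, while the handling at the wall edges is an assumption supplied here for completeness.

```typescript
// Returns the [left, right] extent for a new drawing region on a wall of
// width wallWidth, spawned at touch position touchX. The width defaults to
// wIdeal when space permits; the edge handling below is an illustrative
// assumption, not the behavior defined by any particular embodiment.
function initialRegion(touchX: number, wallWidth: number, wIdeal: number): [number, number] {
  let left = touchX - wIdeal / 2;
  let right = touchX + wIdeal / 2;
  if (left < 0) { right -= left; left = 0; }                               // shift right at left edge
  if (right > wallWidth) { left -= right - wallWidth; right = wallWidth; } // shift left at right edge
  return [Math.max(0, left), Math.min(wallWidth, right)];
}
```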
Drawing regions can also be made to automatically track the movement of the stylus. Although numerous possible tracking algorithms will be apparent to the reader, one that follows these minimum rules is preferred: (1) the region does not move so long as the stylus remains relatively near the center of the region; and (2) as the stylus approaches a region boundary, the region moves so that the boundary remains ahead of the stylus.
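A sketch of a tracking algorithm following the two minimum rules, implemented as a dead zone around the center of the region; the margin value is an illustrative assumption.

```typescript
interface Region { left: number; right: number; }

// Moves the region, if necessary, so its boundary stays ahead of the stylus.
// margin is the distance from a boundary at which the region begins to move;
// its default value is an illustrative assumption.
function trackStylus(region: Region, stylusX: number, margin = 50): Region {
  const width = region.right - region.left;
  if (stylusX < region.left + margin) {
    // Rule 2: stylus nearing the left boundary; slide the region left.
    const left = stylusX - margin;
    return { left, right: left + width };
  }
  if (stylusX > region.right - margin) {
    // Rule 2: stylus nearing the right boundary; slide the region right.
    const right = stylusX + margin;
    return { left: right - width, right };
  }
  return region;  // Rule 1: stylus near the center; the region does not move.
}
```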
Drawing regions provide one example of user interaction that can have an effect at a local display wall, but not have an effect on the global collaboration data. As illustrated in this example, the locations of the objects 301, 302 are not affected by the assignment of drawing regions, the toolbars, and the drawing overlays within the regions. Of course, in other types of user interface interactions, the objects 301, 302 can be moved, and such movements can be events related to objects in the global collaboration data.
A variety of behaviors related to the interpretation of user input based on interaction with a local wall are described in co-pending U.S. application Ser. No. 13/758,984, filed on 4 Feb. 2013, entitled REGION DYNAMICS FOR DIGITAL WHITEBOARD, which is incorporated by reference above. These behaviors are illustrative of local processing of user input and image data at a wall that can be executed by the local computer systems 110, with little or no effect on the shared collaboration data maintained at the collaboration server in some embodiments.
The display client 603 can include a physical or virtual computer system having computer programs stored in accessible memory that provide logic supporting the collaboration session, including an HTML5 client, wall array coordination logic, collaboration data parsing, searching and rendering logic, and a session events application to manage live interaction with the collaboration data and the display wall.
The portal 602 can include a physical or virtual computer system having computer programs stored in accessible memory that provide logic supporting user access to the collaboration server. The logic can include applications to provide initial entry points for users, such as a webpage with login resources, logic to manage user accounts and session participation, logic that provides authorization services, such as OAuth-based services, and account data.
The collaboration service 601 can manage the session event data, coordinate updated events among clients, deliver cacheable history and images to clients, and control access to a database storing the collaboration data.
Each of the display clients 711-714 can maintain a communication channel 721-724 with the collaboration server 105, which in turn is coupled to the whiteboard collaboration database 106. The collaboration server 105 can maintain a user location within the collaboration space for each authorized user. When an authorized user is logged in and has selected a display array such as that illustrated here, the user location maintained by the server can determine the region of the collaboration space to be rendered across the array.
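For a row of side-by-side federated displays such as those served by display clients 711-714, each client's share of the mapped region can be derived from its position in the array, as in this illustrative sketch (names are assumptions):

```typescript
interface SpaceRect { x: number; y: number; w: number; h: number; }

// Given the region of collaboration space mapped to the whole array (e.g.
// located around the user location), returns the sub-region rendered by the
// display client at position `index` of `count` side-by-side displays.
function subRegionFor(arrayRegion: SpaceRect, index: number, count: number): SpaceRect {
  const w = arrayRegion.w / count;
  return { x: arrayRegion.x + index * w, y: arrayRegion.y, w, h: arrayRegion.h };
}
```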
In order to support coordination of a single whiteboard among a plurality of displays, each of the display clients 711-714 can also communicate with each of the other display clients in connection with events that are local to the management of the whiteboard display area, and which do not have an effect on the global collaboration data. Alternatively, the display clients 711-714 can communicate solely with the collaboration server 105, which can then direct local events back to the group of display clients associated with the session, and global events to all of the display clients in active sessions with the collaboration, and to the database storing collaboration data.
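The second alternative, in which all traffic passes through the collaboration server, might be sketched as follows; the scope classification and the send and store helpers are assumptions for illustration.

```typescript
// Illustrative server-side routing; the scope field and helpers are assumptions.
type EventScope = "local" | "global";
interface SessionEvent { scope: EventScope; payload: unknown; }
type Send = (ev: SessionEvent) => void;
interface Store { append(ev: SessionEvent): void; }

// Local events go back only to the display clients forming the same
// whiteboard; global events go to every display client in an active session
// with the collaboration, and to the database storing collaboration data.
function routeEvent(ev: SessionEvent, whiteboardClients: Send[], allClients: Send[], db: Store): void {
  if (ev.scope === "local") {
    whiteboardClients.forEach(send => send(ev));
  } else {
    allClients.forEach(send => send(ev));
    db.append(ev);
  }
}
```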
The display clients at a single whiteboard comprised of federated displays can be implemented using individual computer systems coupled to the corresponding displays, or using a single computer system with virtual machines coupled to the corresponding displays.
In an embodiment of the collaboration system, an application program interface (API) is executed by the collaboration server 105 and the display clients, based on two communication channels for each display client.
Socket Requests in one example can be executed on a channel that maintains connections via Websockets with the collaboration service 601. Messages exchanged can be individual UTF-8 encoded JSON arrays. A display client can establish a socket connection with the server carrying a session identifier, or in a manner otherwise linked to a session. A message structure and some of the message types included in an example system can be understood from the following description.
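In a browser-based display client, the socket channel might be opened as sketched below; the URL, the query parameter, and the element layout of the example message are illustrative assumptions, not the documented wire format.

```typescript
// Hypothetical client-side connection; the URL, query parameter, and message
// layout are assumptions, not the documented wire format.
const sessionId = "example-session";
const socket = new WebSocket(`wss://collaboration.example.com/socket?session=${sessionId}`);

socket.onopen = () => {
  // Messages are individual UTF-8 encoded JSON arrays; the element order
  // shown here (sender id, message type, payload) is an assumption.
  socket.send(JSON.stringify(["client-42", "example-type", { note: "illustrative payload" }]));
};

socket.onmessage = (event: MessageEvent) => {
  const message: unknown[] = JSON.parse(event.data as string);
  console.log("received message array", message);
};
```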
Message Structure
Valid Message Types
Client ID Request:
Client ID Response:
Room Data Arguments:
Room Join Response:
Room List Response
Session Request:
Session List:
Set Card Position, Bounds and Z-index:
Delete Card:
Object Create:
Continue Stroke:
End Stroke:
End the stroke specified by stroke-id
Stroke:
Delete Stroke:
Undo:
Display Array Dimensions:
Tool Change:
Cards Locked:
Region Open:
Region Move:
Region Close:
Pan Array:
Session Change:
Zoom Change:
Map-Mode Change:
Create Card:
Global Mode Toggle:
Save Position:
Stroke IDs
Target IDs
History:
Retrieving a Block of History:
Retrieving Objects:
Card Templates:
These values can be used to send a create card message:
Upload:
The physical hardware components of network interfaces are sometimes referred to as network interface cards (NICs), although they need not be in the form of cards: for instance, they could be in the form of integrated circuits (ICs) and connectors fitted directly onto a motherboard, or in the form of macrocells fabricated on a single integrated circuit chip with other components of the computer system.
User interface input devices 1022 may include a keyboard, pointing devices such as a mouse, trackball, touchpad, or graphics tablet, a scanner, a touch screen incorporated into the display (including the touch sensitive portions of large format digital whiteboard 102c), audio input devices such as voice recognition systems, microphones, and other types of input devices. In general, use of the term “input device” is intended to include all possible types of devices and ways to input information into the computer system or onto computer network 104.
User interface output devices 1020 may include a display subsystem, a printer, a fax machine, or non-visual displays such as audio output devices. The display subsystem may include a cathode ray tube (CRT), a flat panel device such as a liquid crystal display (LCD), a projection device, or some other mechanism for creating a visible image. In the embodiment described herein, the display subsystem includes the display functions of the large format digital whiteboard 102c.
Storage subsystem 1024 stores the basic programming and data constructs that provide the functionality of certain embodiments of the present invention. For example, the various modules implementing the functionality of certain embodiments of the invention may be stored in storage subsystem 1024. These software modules are generally executed by processor subsystem 1014.
Memory subsystem 1026 typically includes a number of memories including a main random access memory (RAM) 1030 for storage of instructions and data during program execution and a read only memory (ROM) 1032 in which fixed instructions are stored. File storage subsystem 1028 provides persistent storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a CD ROM drive, an optical drive, or removable media cartridges. The databases and modules implementing the functionality of certain embodiments of the invention may have been provided on a computer readable medium such as one or more CD-ROMs, and may be stored by file storage subsystem 1028. The host memory 1026 contains, among other things, computer instructions which, when executed by the processor subsystem 1014, cause the computer system to operate or perform functions as described herein. As used herein, processes and software that are said to run in or on “the host” or “the computer”, execute on the processor subsystem 1014 in response to computer instructions and data in the host memory subsystem 1026 including any other local or remote storage for such instructions and data.
Bus subsystem 1012 provides a mechanism for letting the various components and subsystems of computer system communicate with each other as intended. Although bus subsystem 1012 is shown schematically as a single bus, alternative embodiments of the bus subsystem may use multiple busses.
The computer system itself can be of varying types including a personal computer, a portable computer, a workstation, a computer terminal, a network computer, a television, a mainframe, a server farm, or any other data processing system or user device. In one embodiment, the computer system includes several computer systems, each controlling one of the tiles that make up the large format whiteboard 102c. Due to the ever changing nature of computers and networks, the description of computer system 110 is intended only as a specific example for purposes of illustrating embodiments of the invention; many other configurations having more or fewer components are possible.
Certain information about the drawing regions active on the digital whiteboard 102c is stored in a database accessible to the computer system 110 of the display client. The database can take on many forms in different embodiments, including but not limited to a MongoDB database, an XML database, a relational database, or an object oriented database.
In embodiments described herein, each drawing region is considered to be a child of a toolbar. The touching of a point on the wall background spawns a toolbar, which in turn spawns a drawing region (though the toolbar is not necessarily visible until the drawing region opens). Similarly, to close a drawing region, a user touches a ‘close’ icon on the drawing region's toolbar. Thus, in these embodiments, a drawing region's lifecycle is tied to that of its parent toolbar.
The drawing properties include or point to an array 1114 of drawing attributes, each in association with one or more values, such as the brush style and color currently selected for the region.
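The parent-child relationship and the attribute array 1114 might be recorded as in the following sketch, of the kind a document store such as the MongoDB database mentioned above could hold; every field name here is an illustrative assumption.

```typescript
// Illustrative record shapes; all field names are assumptions.
interface DrawingAttribute { name: string; values: string[]; }   // attribute with one or more values

interface DrawingRegionRecord {
  regionId: string;
  toolbarId: string;                 // each drawing region is a child of a toolbar
  left: number;
  right: number;
  attributes: DrawingAttribute[];    // the array of drawing attributes (cf. 1114)
}

const example: DrawingRegionRecord = {
  regionId: "region-1",
  toolbarId: "toolbar-1",
  left: 120,
  right: 960,
  attributes: [
    { name: "brush-style", values: ["pen"] },
    { name: "brush-color", values: ["#1a73e8"] },
  ],
};
```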
In order to draw a line on the whiteboard 102c, a user provides “drawing user input” which indicates the drawing of the line. While other embodiments may allow a user to draw with a finger, in the embodiment described herein lines are drawn with the stylus, which the wall can distinguish from a touch by a finger.
In one example, the process of downloading the collaboration data includes delivering the event objects for the session to each display client. A current user location can be provided with the collaboration data. Alternatively, the collaboration data can be delivered, followed by a sequence of messages which identify to the display client how to compute an offset from a default location, such as the center of the collaboration data, to a current location associated with the user. Each display client can then traverse the event objects to identify those objects having session locations which map to the display area managed by the display client. The logic to traverse the event objects can include an R-TREE search, for example, which is configured to find objects in the collaboration space that map to the display area. The identified objects can then be rendered on the display area managed by the display client, possibly after communicating with the portal to obtain data relevant to the objects.
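The offset computation and the traversal can be sketched as follows; a simple linear scan stands in here for the R-TREE search named in the text, and all names are assumptions.

```typescript
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; w: number; h: number; }
interface EventObject { id: string; location: Rect; }

// Applies the offset messages: moves from the default location (e.g. the
// center of the collaboration data) to the current user location.
function applyOffsets(defaultLoc: Point, offsets: Point[]): Point {
  return offsets.reduce((loc, d) => ({ x: loc.x + d.x, y: loc.y + d.y }), defaultLoc);
}

function overlaps(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.w && b.x < a.x + a.w &&
         a.y < b.y + b.h && b.y < a.y + a.h;
}

// Traverses the event objects to find those whose session locations map to
// the display area managed by this client. A production system would use a
// spatial index such as an R-tree in place of this linear scan.
function objectsInDisplayArea(events: EventObject[], displayArea: Rect): EventObject[] {
  return events.filter(e => overlaps(e.location, displayArea));
}
```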
As used herein, the “identification” of an item of information does not necessarily require the direct specification of that item of information. Information can be “identified” in a field by simply referring to the actual information through one or more layers of indirection, or by identifying one or more items of different information which are together sufficient to determine the actual item of information. In addition, the term “indicate” is used herein to mean the same as “identify”.
Also as used herein, a given signal, event or value is “responsive” to a predecessor signal, event or value if the predecessor signal, event or value influenced the given signal, event or value. If there is an intervening processing element, step or time period, the given signal, event or value can still be “responsive” to the predecessor signal, event or value. If the intervening processing element or step combines more than one signal, event or value, the signal output of the processing element or step is considered “responsive” to each of the signal, event or value inputs. If the given signal, event or value is the same as the predecessor signal, event or value, this is merely a degenerate case in which the given signal, event or value is still considered to be “responsive” to the predecessor signal, event or value. “Dependency” of a given signal, event or value upon another signal, event or value is defined similarly.
The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole in light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that aspects of the present invention may consist of any such feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.
The foregoing description of preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in this art. For example, though the whiteboards described herein are of large format, small format whiteboards can also be arranged to use multiple drawing regions, though multiple drawing regions are more useful for whiteboards that are at least as large as 12′ in width. In particular, and without limitation, any and all variations described, suggested by the Background section of this patent application or by the material incorporated by references are specifically incorporated by reference into the description herein of embodiments of the invention. In addition, any and all variations described, suggested or incorporated by reference herein with respect to any one embodiment are also to be considered taught with respect to all other embodiments. The embodiments described herein were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention for various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Name | Date | Kind |
---|---|---|---|
4686332 | Greanias et al. | Aug 1987 | A |
5008853 | Bly et al. | Apr 1991 | A |
5220657 | Bly et al. | Jun 1993 | A |
5309555 | Akins et al. | May 1994 | A |
5446842 | Schaeffer et al. | Aug 1995 | A |
5537526 | Anderson et al. | Jul 1996 | A |
5563996 | Tchao | Oct 1996 | A |
5727002 | Miller et al. | Mar 1998 | A |
5781732 | Adams | Jul 1998 | A |
5818425 | Want et al. | Oct 1998 | A |
5835713 | FitzPatrick et al. | Nov 1998 | A |
5872924 | Nakayama et al. | Feb 1999 | A |
5938724 | Pommier et al. | Aug 1999 | A |
5940082 | Brinegar et al. | Aug 1999 | A |
6084584 | Nahi et al. | Jul 2000 | A |
6128014 | Nakagawa et al. | Oct 2000 | A |
6167433 | Maples et al. | Dec 2000 | A |
6320597 | Ieperen | Nov 2001 | B1 |
6342906 | Kumar et al. | Jan 2002 | B1 |
6343313 | Salesky et al. | Jan 2002 | B1 |
6518957 | Lehtinen et al. | Feb 2003 | B1 |
6564246 | Varma et al. | May 2003 | B1 |
6911987 | Mairs et al. | Jun 2005 | B1 |
6930673 | Kaye et al. | Aug 2005 | B2 |
6930679 | Wu et al. | Aug 2005 | B2 |
7003728 | Berque | Feb 2006 | B2 |
7043529 | Simonoff | May 2006 | B1 |
7129934 | Luman et al. | Oct 2006 | B2 |
7171448 | Danielsen et al. | Jan 2007 | B1 |
7356563 | Leichtling et al. | Apr 2008 | B1 |
7450109 | Halcrow et al. | Nov 2008 | B2 |
D600703 | LaManna et al. | Sep 2009 | S |
8209308 | Rueben et al. | Jun 2012 | B2 |
D664562 | McCain et al. | Jul 2012 | S |
8402391 | Doray et al. | Mar 2013 | B1 |
8898590 | Okada et al. | Nov 2014 | B2 |
20030020671 | Santoro et al. | Jan 2003 | A1 |
20030058227 | Hara et al. | Mar 2003 | A1 |
20040060037 | Damm et al. | Mar 2004 | A1 |
20040150627 | Luman et al. | Aug 2004 | A1 |
20040155871 | Perski et al. | Aug 2004 | A1 |
20040174398 | Luke et al. | Sep 2004 | A1 |
20050060656 | Martinez et al. | Mar 2005 | A1 |
20050195216 | Kramer et al. | Sep 2005 | A1 |
20050237380 | Kakii et al. | Oct 2005 | A1 |
20050273700 | Champion et al. | Dec 2005 | A1 |
20060012580 | Perski et al. | Jan 2006 | A1 |
20060066588 | Lyon et al. | Mar 2006 | A1 |
20060195507 | Baek et al. | Aug 2006 | A1 |
20060211404 | Cromp et al. | Sep 2006 | A1 |
20060220982 | Ueda | Oct 2006 | A1 |
20060224427 | Salmon | Oct 2006 | A1 |
20070262964 | Zotov et al. | Nov 2007 | A1 |
20080143818 | Ferren et al. | Jun 2008 | A1 |
20080163053 | Hwang et al. | Jul 2008 | A1 |
20080177771 | Vaughn | Jul 2008 | A1 |
20080207188 | Ahn et al. | Aug 2008 | A1 |
20090049381 | Robertson et al. | Feb 2009 | A1 |
20090089682 | Baier et al. | Apr 2009 | A1 |
20090128516 | Rimon et al. | May 2009 | A1 |
20090153519 | Suarez Rovere | Jun 2009 | A1 |
20090160786 | Finnegan | Jun 2009 | A1 |
20090174679 | Westerman | Jul 2009 | A1 |
20090195518 | Mattice et al. | Aug 2009 | A1 |
20090251457 | Walker et al. | Oct 2009 | A1 |
20090278806 | Duarte et al. | Nov 2009 | A1 |
20090282359 | Saul et al. | Nov 2009 | A1 |
20090309846 | Trachtenberg et al. | Dec 2009 | A1 |
20090309853 | Hildebrandt et al. | Dec 2009 | A1 |
20100017727 | Offer et al. | Jan 2010 | A1 |
20100073454 | Lovhaugen et al. | Mar 2010 | A1 |
20100132034 | Pearce et al. | May 2010 | A1 |
20100205190 | Morris et al. | Aug 2010 | A1 |
20100211920 | Westerman et al. | Aug 2010 | A1 |
20100306650 | Oh et al. | Dec 2010 | A1 |
20100306696 | Groth et al. | Dec 2010 | A1 |
20100309148 | Fleizach et al. | Dec 2010 | A1 |
20100315481 | Wijngaarden et al. | Dec 2010 | A1 |
20100318470 | Meinel et al. | Dec 2010 | A1 |
20100318921 | Trachtenberg et al. | Dec 2010 | A1 |
20100328306 | Chau et al. | Dec 2010 | A1 |
20110069184 | Go | Mar 2011 | A1 |
20110109526 | Bauza et al. | May 2011 | A1 |
20110148926 | Koo et al. | Jun 2011 | A1 |
20110154192 | Yang et al. | Jun 2011 | A1 |
20110183654 | Lanier et al. | Jul 2011 | A1 |
20110197147 | Fai | Aug 2011 | A1 |
20110197157 | Hoffman et al. | Aug 2011 | A1 |
20110202424 | Chun et al. | Aug 2011 | A1 |
20110208807 | Shaffer | Aug 2011 | A1 |
20110214063 | Saul | Sep 2011 | A1 |
20110216064 | Dahl et al. | Sep 2011 | A1 |
20110225494 | Shmuylovich et al. | Sep 2011 | A1 |
20110246875 | Parker et al. | Oct 2011 | A1 |
20110264785 | Newman et al. | Oct 2011 | A1 |
20110271229 | Yu | Nov 2011 | A1 |
20120011465 | Rezende | Jan 2012 | A1 |
20120019452 | Westerman | Jan 2012 | A1 |
20120026200 | Okada et al. | Feb 2012 | A1 |
20120030193 | Richberg et al. | Feb 2012 | A1 |
20120038572 | Kim et al. | Feb 2012 | A1 |
20120050197 | Kemmochi | Mar 2012 | A1 |
20120075212 | Park et al. | Mar 2012 | A1 |
20120124124 | Beaty et al. | May 2012 | A1 |
20120127126 | Mattice et al. | May 2012 | A1 |
20120176328 | Brown et al. | Jul 2012 | A1 |
20120179994 | Knowlton et al. | Jul 2012 | A1 |
20120229425 | Barrus et al. | Sep 2012 | A1 |
20120254858 | Moyers et al. | Oct 2012 | A1 |
20120260176 | Sehrer | Oct 2012 | A1 |
20120274583 | Haggerty | Nov 2012 | A1 |
20120275683 | Adler et al. | Nov 2012 | A1 |
20120278738 | Kruse et al. | Nov 2012 | A1 |
20120320073 | Mason | Dec 2012 | A1 |
20130004069 | Hawkins et al. | Jan 2013 | A1 |
20130047093 | Reuschel et al. | Feb 2013 | A1 |
20130218998 | Fischer et al. | Aug 2013 | A1 |
20130320073 | Yokoo et al. | Dec 2013 | A1 |
20130346878 | Mason | Dec 2013 | A1 |
20130346910 | Mason | Dec 2013 | A1 |
20140022334 | Lockhart et al. | Jan 2014 | A1 |
20140033067 | Pittenger et al. | Jan 2014 | A1 |
20140055400 | Reuschel | Feb 2014 | A1 |
20140062957 | Perski et al. | Mar 2014 | A1 |
20140222916 | Foley et al. | Aug 2014 | A1 |
20140223335 | Pearson | Aug 2014 | A1 |
20150084055 | Nagata et al. | Mar 2015 | A1 |
Number | Date | Country |
---|---|---
101630240 | Jan 2010 | CN |
2010079834 | Apr 2010 | JP |
2012043251 | Mar 2012 | JP |
0161633 | Aug 2001 | WO |
2009018314 | Feb 2009 | WO |
2011029067 | Mar 2011 | WO |
2011048901 | Apr 2011 | WO |
2012162411 | Nov 2012 | WO |
Entry |
---|
PCT/US2014/014489—International Search Report and Written Opinion dated May 30, 2014, 13 pages. |
PCT/US2014/014494—International Search Report and Written Opinion dated May 30, 2014, 10 pages. |
PCT/US2014/018375—International Search Report and Written Opinion mailed Jul. 1, 2014, 16 pages. |
Anacore, “Anacore Presents Synthesis”, InfoComm 2012: Las Vegas, NV, USA, Jun. 9-15, 2012, 2 pages, screen shots taken from http://www.youtube.com/watch?v=FbQ9P1c5aHk (visited Nov. 1, 2013). |
PCT/US2012/39176—International Search Report and Written Opinion mailed Sep. 24, 2012, 21 pages. |
Masters Thesis: “The ANA Project, Development of the ANA-Core Software” Ariane Keller, Sep. 21, 2007, ETH Zurich, 92 pages. |
U.S. Appl. No. 13/758,984, filed Feb. 4, 2013, entitled “Region Dynamics for Digital Whiteboard,” Inventor Steve Mason, 28 pages. |
U.S. Appl. No. 13/758,989, filed Feb. 4, 2013, entitled “Toolbar Dynamics for Digital Whiteboard,” Inventor Steve Mason, 26 pages. |
U.S. Appl. No. 13/758,993, filed Feb. 4, 2013, entitled “Line Drawing Behavior for Digital Whiteboard,” Inventor Steve Mason, 29 pages. |
U.S. Appl. No. 13/759,018, filed Feb. 4, 2013, entitled “Collaboration System with Whiteboard with Federated Display,” Inventor Adam Pearson, 48 pages. |
EP12789695.9—Supplemental European Search Report dated Nov. 19, 2014, 9 pages. |
PCT/US2013/058030—International Search Report and Written Opinion mailed Dec. 27, 2013, 11 pgs. |
PCT/US2013/058040—International Search Report and Written Opinion mailed Dec. 18, 2013, 10 pgs. |
PCT/US2013/058249—International Search Report and Written Opinion mailed Dec. 18, 2013, 14 pgs. |
PCT/US2013/058261—International Search Report and Written Opinion mailed Dec. 30, 2013, 14 pgs. |
U.S. Appl. No. 13/758,984—Office Action dated Oct. 8, 2014, 21 pgs. |
U.S. Appl. No. 13/758,989—Office Action dated Oct. 7, 2014, 8 pgs. |
PCT/US2014/014475—International Search Report and Written Opinion dated Nov. 28, 2014, 10 pgs. |
U.S. Appl. No. 13/758,993—Office Action dated Feb. 3, 2015, 22 pgs. |
Villamor, C., et al., “Touch Gesture Reference Guide”, Apr. 15, 2010, retrieved from the internet: http://web.archive.org/web/20100601214053; http://www.lukew.com/touch/TouchGestureGuide.pdf, 7 pages, retrieved on Apr. 10, 2014. |
U.S. Appl. No. 14/018,370—Office Action dated May 21, 2015, 51 pages. |
U.S. Appl. No. 13/478,994—Office Action dated Jul. 8, 2015, 12 pgs. |
U.S. Appl. No. 13/478,994—Office Action dated Sep. 29, 2014, 10 pgs. |
U.S. Appl. No. 13/478,994—Office Action dated Dec. 9, 2013, 7 pgs. |
Albin, T., “Comfortable Portable Computing: The Ergonomic Equation,” Copyright 2008 Ergotron, Inc., 19 pgs. |
“Ergonomics Data and Mounting Heights,” Ergonomic Ground Rules, last revised Sep. 22, 2010, 2 pgs. |
U.S. Appl. No. 13/758,984—Office Action dated Jun. 19, 2015, 25 pgs. |
U.S. Appl. No. 13/758,989—Office Action dated Jun. 19, 2015, 9 pgs. |
U.S. Appl. No. 13/758,989—Office Action dated Feb. 12, 2015, 9 pgs. |
U.S. Appl. No. 13/758,993—Office Action dated Jul. 30, 2015, 43 pgs. |
U.S. Appl. No. 13/759,018—Office Action dated Oct. 22, 2014, 16 pgs. |
U.S. Appl. No. 13/759,018—Office Action dated Apr. 23, 2015, 24 pgs. |
U.S. Appl. No. 13/759,018—Office Action dated Aug. 27, 2015, 22 pgs. |
U.S. Appl. No. 13/758,984—Office Action dated Feb. 13, 2015, 22 pgs. |
Number | Date | Country | |
---|---|---|
20140223334 A1 | Aug 2014 | US |