With the assistance of computers, computing systems, and a variety of software applications, workers, students, and leisure computer users are able to generate a variety of content items, including word processing documents, spreadsheet documents, slide presentations, charts, maps, data tables, photographs, images, and the like. In a typical setting, users may have access to and may interact with a large number of content items. For example, users may interact with a large number of content items as part of a collaborative work environment where tens or hundreds of content items are generated and stored. While an outline or diagram may be prepared to help users organize such content items and/or to understand relationships between or among such content items, users are seldom, if ever, able to appreciate the content associated with the variety of content items or to understand how the content items may relate to each other.
It is with respect to these and other considerations that the present invention has been made.
Embodiments of the present invention solve the above and other problems by providing an interactive visualization of multiple software functionality content items. According to embodiments, a variety of content items are associated with each other according to one or more association attributes. Such association attributes may include association of content items relative to a collaborative work space, association of such content items based on time of generation, author, department or organizational designation, metadata applied to such content items, and the like. A presentation of one or more of the content items may be displayed in a visualization interface.
An ordered navigation interface component may be provided to allow navigation of the displayed content items in a moving visualization such that the content item currently in focus is displayed in a main or primary display position. As the ordered navigation interface component is navigated in either direction, content items move into the main or primary display position as their respective positions in the ordered navigation interface component are approached and ultimately reached. Thus, a user may receive a visualization of each of the content items associated with a given work space or other storage location, and the visualization may be presented on a content item by content item basis in a logical order in association with the ordered navigation interface component.
Organization of content items in association with the ordered navigation interface component may be based on one or more logical attributes. For example, content items may be arranged relative to the ordered navigation interface component based on time of content item creation, time of content item revision, alphabetical order based on author identification, order based on a designated importance parameter, last item viewed, first item viewed, and the like. In addition, content items may be arranged in an order specified by a user.
According to other embodiments, users may interact with each content item displayed in the visualization interface to include providing edits, comments, feedback, and the like. As such changes occur in a given content item, or as new content items are added, or as existing content items are deleted, relevance indicators may be displayed in association with the ordered navigation interface component to notify users that something has changed with respect to a given content item and to allow users to navigate to the changed content item using the ordered navigation interface component.
The details of one or more embodiments are set forth in the accompanying drawings and description below. Other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that the following detailed description is explanatory only and is not restrictive of the invention as claimed.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
As briefly described above, embodiments of the present invention are directed to interactive visualization of multiple software functionality content items. One or more content items, for example, word processing documents, spreadsheet documents, slide presentation application documents, charts, maps, data tables, photographs, images, and the like, may be assembled together according to a prescribed reason for assembly. For example, the one or more content items may be associated with each other because of the data in the content items. That is, if two content items contain data about the same topic or issue, the two content items may be associated with each other, as described herein. For another example, the one or more content items may be associated with each other owing to their generation and storage with respect to a collaborative work environment, owing to their storage in a given storage location, and/or owing to their relationship to one or more prescribed authors, editors, reviewers, or the like. In addition, the one or more content items may be associated with each other owing to a variety of association attributes, for example, time of creation, time of revision, time of interaction by one or more specified users, geographical location, department or work space association, and the like. That is, one or more content items may be assembled together for visualization as described herein according to any attribute that may be applied to or associated with each content item as desired by users of the content items. For example, a collection of content items may include all maps generated by a maps department covering a specified geographical area, for example, the northwest portion of the United States, or the collection may comprise all word processing documents generated by a particular department during a particular time range.
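By way of a non-limiting sketch only (every type, field, and function name below is hypothetical and is not part of the claimed subject matter), one way to assemble a collection of content items from a larger store is to filter on the association attributes described above, for example department, work space, creation-time range, or applied metadata tags:

```typescript
// Hypothetical sketch: a content item with association attributes, and a filter that
// assembles a collection of items sharing one or more of those attributes.
type ContentType = "document" | "spreadsheet" | "slide" | "chart" | "map" | "image";

interface ContentItem {
  id: string;
  type: ContentType;
  title: string;
  author: string;
  department: string;
  workspace: string;
  createdAt: Date;
  revisedAt: Date;
  tags: string[]; // metadata applied to the content item by users
}

interface AssociationCriteria {
  department?: string;
  workspace?: string;
  createdAfter?: Date;
  createdBefore?: Date;
  tag?: string;
}

// Return every item that satisfies all of the supplied association attributes.
function assembleCollection(items: ContentItem[], criteria: AssociationCriteria): ContentItem[] {
  return items.filter(item =>
    (criteria.department === undefined || item.department === criteria.department) &&
    (criteria.workspace === undefined || item.workspace === criteria.workspace) &&
    (criteria.createdAfter === undefined || item.createdAt.getTime() >= criteria.createdAfter.getTime()) &&
    (criteria.createdBefore === undefined || item.createdAt.getTime() <= criteria.createdBefore.getTime()) &&
    (criteria.tag === undefined || item.tags.includes(criteria.tag))
  );
}

// For the map example above, the criteria might be expressed as:
// assembleCollection(allItems, { department: "maps", tag: "northwest-us" });
```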
Each content item comprising a collection of one or more content items may be displayed in a visualization interface and in association with an ordered navigation interface. At any given time, one of the one or more content items may be displayed in a main or primary display position. As a user navigates the associated ordered navigation interface component in a forward or backward direction, the display of one or more content items in the visualization interface may move with the navigation of the ordered navigation interface, and as a particular content item is focused on or interacted with via user interaction with the ordered navigation interface component, that content item may be displayed in the main or primary display position.
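The following sketch, again with hypothetical names, illustrates one way a content visualization application might map a position on the ordered navigation interface component to the content item shown in the main or primary display position, whether the user steps forward or backward or jumps directly to a point along the ordering:

```typescript
// Hypothetical sketch: the visualization ordering and the item in the main or
// primary display position, with navigation in either direction.
interface Visualization<T> {
  orderedItems: T[];    // items already sorted by the visualization ordering
  focusedIndex: number; // index of the item shown in the primary display position
}

// Step the focus forward (+1) or backward (-1), clamping at the ends of the ordering.
function navigate<T>(viz: Visualization<T>, direction: 1 | -1): Visualization<T> {
  const next = Math.min(viz.orderedItems.length - 1, Math.max(0, viz.focusedIndex + direction));
  return { ...viz, focusedIndex: next };
}

// Jump directly to a point along the navigation component expressed as a fraction (0..1),
// as when a user clicks or touches a location along the visualization ordering.
function navigateToFraction<T>(viz: Visualization<T>, fraction: number): Visualization<T> {
  const clamped = Math.min(1, Math.max(0, fraction));
  const index = Math.round(clamped * (viz.orderedItems.length - 1));
  return { ...viz, focusedIndex: index };
}
```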
If desired, a user may interact with any content item displayed in the visualization interface, for example, to edit, revise, comment on, or provide feedback on that content item. As interactions occur with respect to a given content item displayed in the visualization interface, a relevance indicator may be displayed in association with the ordered navigation interface component to notify users that something has occurred with respect to the content item and to allow users to navigate to that content item via the ordered navigation interface component and review the content item in light of the interaction that has been received for it.
The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawing and the following description to refer to the same or similar elements. While embodiments of the invention may be described, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the elements illustrated in the drawings, and the methods described herein may be modified by substituting, reordering, or adding stages to the disclosed methods. Accordingly, the following detailed description does not limit the invention, but instead, the proper scope of the invention is defined by the appended claims. As should be appreciated, user interface components, visualizations of data, and other visualizations of functionalities described herein (as illustrated in
According to embodiments, the functionalities described herein may be provided by a content visualization application 910, illustrated below with respect to
The content items illustrated in
The content items 105, 110, 115 and 120, illustrated on the display screen of the computing device 100 are illustrative of a variety of electronic computer-generated or computer-enabled content items, for example, text-based word processing documents, spreadsheet documents, slide presentation slides, Internet-based web pages, electronic maps, charts, images and the like, enabled according to a variety of suitable software applications, for example, word processing applications, spreadsheet applications, slide presentation applications, Internet-based web browsing applications, mapping applications, charting applications, and the like.
As described above, according to embodiments, the content items illustrated in
As illustrated in
Other displayed content items include a graph 110, a map 115, and a document 120. As illustrated in
Referring now to
Referring still to
According to one embodiment, a slider bar 225 is provided with which a user may navigate along the ordered navigation interface 205 in a right/left or forward/backward direction. Interaction with the slider bar 225 may be according to a variety of input means, for example, cursor/mouse interaction, touch interaction, gesture interaction, voice command, and the like. As the slider bar 225 is moved in any direction, the display of content items presented in the visualization interface 200 moves with movement of the slider bar 225. In addition, navigation may occur without use of the slider bar 225. For example, movement of a finger or stylus across the ordered navigation interface component may cause movement of the content items. Alternatively, gesture commands or voice commands may be used to move the content items in either direction.
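Continuing the hypothetical sketch above (and reusing the Visualization, navigate, and navigateToFraction helpers from it), the several input means described here, such as slider movement, a finger or stylus swipe, or a voice command, might all be funneled into the same navigation action:

```typescript
// Hypothetical sketch reusing the Visualization type and navigate/navigateToFraction
// helpers from the earlier sketch: several input means drive the same navigation.
type NavigationInput =
  | { kind: "sliderDrag"; fraction: number }      // slider bar 225 dragged to a fraction of the track
  | { kind: "swipe"; deltaItems: number }         // finger or stylus moved across the component
  | { kind: "voice"; command: "next" | "previous" };

function applyInput<T>(viz: Visualization<T>, input: NavigationInput): Visualization<T> {
  switch (input.kind) {
    case "sliderDrag":
      return navigateToFraction(viz, input.fraction);
    case "swipe":
      // one step per swipe, in the direction of movement
      return navigate(viz, input.deltaItems >= 0 ? 1 : -1);
    case "voice":
      return navigate(viz, input.command === "next" ? 1 : -1);
  }
}
```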
Referring to
Thus, each content item associated with the present display of content items, for example, all content items obtained from a collaborative work space storage location, may be navigated across the display of the visualization interface by navigating the ordered navigation interface component such that a content item associated with a present navigation point on the ordered navigation interface component is displayed in a main or primary display location, for example, in the center of the visualization interface 200. As should be appreciated, another location in the visualization interface component may be designated as a main or primary display location. For example, the main or primary display location may be located on a right side of the visualization interface 200, on a left side of the visualization interface component, or the like.
Referring still to the ordered navigation interface component 205, in addition to movement of the slider bar 225 or other movement across the ordered navigation interface component, as described above, navigation of the ordered navigation interface component may be performed by selecting the ordered navigation interface component at a desired location along the visualization ordering. For example, a user may click on, touch, or otherwise interact with a particular location on the ordered navigation interface component 205 to cause navigation to that point on the visualization ordering and to cause display of an associated content item in the main or primary display location in the same manner as would occur if the user moves the aforementioned slider bar to that location on the ordered navigation interface component.
As illustrated in the ordered navigation interface component 205, a number of increment notations 210 may be displayed for providing a scaling associated with the ordered navigation interface component. For example, the space between increment notations 210 may be associated with one week of time, one month of time, one year of time, or the like. Alternatively, the spacing between increment notations 210 may be associated with other attributes, for example, increments along an alphabetical scale from A to Z. As should be appreciated, almost any scaling may be used for the increment notations 210 that allows for navigating along the visualization ordering in a manner that is desirable to the user. For example, the scaling associated with the visualization ordering may simply be based on the number of content items (e.g., 1-100, 101-200, etc.).
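As a hypothetical illustration of the scaling choices described above, the increment notations might be generated from a scale descriptor, whether the scale is time-based, alphabetical, or simply based on the number of content items (all names below are illustrative):

```typescript
// Hypothetical sketch: increment notations 210 generated from a scale descriptor.
type Scale =
  | { kind: "time"; start: Date; end: Date; stepDays: number }  // e.g., weekly or monthly ticks
  | { kind: "alphabetical" }                                    // A through Z
  | { kind: "count"; total: number; step: number };             // e.g., 1-100, 101-200, ...

function incrementNotations(scale: Scale): string[] {
  switch (scale.kind) {
    case "time": {
      const labels: string[] = [];
      const stepMs = scale.stepDays * 24 * 60 * 60 * 1000;
      for (let t = scale.start.getTime(); t <= scale.end.getTime(); t += stepMs) {
        labels.push(new Date(t).toISOString().slice(0, 10)); // YYYY-MM-DD
      }
      return labels;
    }
    case "alphabetical":
      return Array.from({ length: 26 }, (_, i) => String.fromCharCode(65 + i)); // "A".."Z"
    case "count": {
      const labels: string[] = [];
      for (let n = 1; n <= scale.total; n += scale.step) {
        labels.push(`${n}-${Math.min(n + scale.step - 1, scale.total)}`);
      }
      return labels;
    }
  }
}
```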
Referring still to the ordered navigation interface component 205, a variety of relevance indicators may be provided in association with the ordered navigation interface component to alert users that a change or other interaction has occurred with respect to a content item available through the visualization interface 200 and associated with the ordered navigation interface component at some prescribed position. For example, a comments indicator 215 is illustrated showing that one or more comments have been provided in a content item associated with a particular position on the ordered navigation interface component. If a user navigates to the position along the ordered navigation interface component associated with the comments indicator 215, the content item, for example, content item 110, may be moved to the main or primary display position, for example, the location of the presently displayed content item 115, to allow the user to review the one or more comments that have been applied to the associated content item 110.
Likewise, navigation to the position on the ordered navigation interface component associated with the comments indicator 235 may allow the user to display a content item associated with the comments indicator 235 to review the one or more comments associated with that content item. Alternatively, a comment or alert may be viewed without navigating to the associated position on the ordered navigation interface component. For example, focusing a mouse or other input means (e.g., a finger for a touch display) over the comment or alert indicator may immediately show the comment or alert associated with that position on the visualization ordering.
An alert indicator 220, 230 is provided to alert a user that something has occurred with respect to a content item associated with that position along the visualization ordering associated with the ordered navigation interface component 205. For example, if another user associated with a collaborative work space has made a change of some kind to a given document, when the change to the given document is stored, an alert indicator 220, 230 may be displayed at a position along the ordered navigation interface component associated with the document to which a change of some type has occurred. Navigation to that position on the interface component by the user causes a display of the associated content item in the main or primary display position to allow the user to learn the nature of the change that has occurred to the associated content item.
The other indicator 240 is illustrative of a variety of other relevance indicators that may be associated with the ordered navigation interface component 205 for alerting one or more users of an event associated with a content item positioned at that location on the ordered navigation interface component 205. For example, the other indicator 240 could include a variety of icons associated with a variety of events, for example, a calendar icon to illustrate a calendar or time based event, a notes icon to indicate application of one or more notes to a given content item, a document icon to indicate the association of a document with a content item, an avatar to indicate the association of one or more users with a content item, and the like. That is, an almost limitless number of indicators may be utilized and may be displayed along the ordered navigation interface component 205 to indicate a change or other event associated with a content item that may be displayed in the visualization interface 200.
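One non-limiting way to represent such relevance indicators, with hypothetical type and icon identifiers, is to record the indicator kind, the position along the visualization ordering, and a short description, and then map each indicator to a marker drawn on the ordered navigation interface component:

```typescript
// Hypothetical sketch: relevance indicators and the markers drawn for them along the
// ordered navigation interface component 205. Kind and icon names are illustrative.
type IndicatorKind = "comment" | "alert" | "calendar" | "note" | "document" | "avatar";

interface RelevanceIndicator {
  itemId: string;      // the content item the indicator refers to
  kind: IndicatorKind;
  position: number;    // 0..1 along the visualization ordering
  description: string; // e.g., "2 new comments", "data revised by another user"
}

const indicatorIcon: Record<IndicatorKind, string> = {
  comment: "comment-bubble",
  alert: "alert-flag",
  calendar: "calendar",
  note: "note",
  document: "document",
  avatar: "avatar",
};

// Produce the markers to draw for a set of indicators.
function markersFor(indicators: RelevanceIndicator[]): { icon: string; position: number; tooltip: string }[] {
  return indicators.map(i => ({ icon: indicatorIcon[i.kind], position: i.position, tooltip: i.description }));
}
```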
Referring now to
Referring again to
As should be appreciated, content items likewise may be dragged onto a previously generated visualization, as illustrated in
Referring now to
According to one embodiment, a user may enter a desired comment into the comments text box 415 in association with some portion or attribute of the displayed content item. For example, referring to the map illustrated in
As described above with reference to
As should be appreciated, other types of interaction in addition to comments may be provided in association with a given content item 410. For example, if the displayed content item 410 is in the form of a word processing type document, interaction with the word processing document may include addition of text, deletion of text, revision of text, addition or deletion of images or other objects to the document, and the like. Likewise, if the content item 410 is in the form of a spreadsheet application document, interaction with the spreadsheet document may include addition of data, deletion of data, changes to spreadsheet formulas, and the like. Such interactions with a given content item may cause display of an alert 220, 230 or notification 240 along the ordered navigation interface component, as described above.
Referring still to
According to embodiments, the ordered navigation interface component illustrated in
For example, a slider bar 450 illustrated in
According to embodiments, the user interaction with the visualizations described herein may include authoring and editing. For example, referring still to
In addition, as a user navigates to a particular content item in the visualization, for example, a word processing document, spreadsheet document, chart or the like, the user may make edits to the navigated content item within the visualization of content items without having to separately launch those items for making edits. That is, the content items provided in a visualization of content items, as described herein, may be presented as live documents for which edits may be made and saved according to the native application functionality with which those documents were created. According to one embodiment, application functionality associated with such content items may be called and retrieved by the content visualization application 910 as needed, or each content item may be linked back to its native application functionality via a linking means, such as object linking and embedding (OLE).
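The sketch below is one hypothetical way to model editing a content item in place through its native application functionality; the NativeApplicationLink interface stands in for whatever linking means is used (for example OLE, or a call back into the content visualization application 910), and none of the names are taken from an actual API:

```typescript
// Hypothetical sketch: editing a navigated content item in place through its native
// application functionality. NativeApplicationLink abstracts the linking means
// (for example OLE, or a call back into the content visualization application 910).
interface EditableDocument {
  applyEdit(edit: string): void; // apply an edit in the document's native format
  save(): Promise<void>;         // persist through the native application functionality
}

interface NativeApplicationLink {
  canOpen(contentType: string): boolean;
  openForEditing(itemId: string): Promise<EditableDocument>;
}

async function editInVisualization(
  links: NativeApplicationLink[],
  itemId: string,
  contentType: string,
  edit: string
): Promise<void> {
  const link = links.find(l => l.canOpen(contentType));
  if (!link) {
    throw new Error(`no native application functionality registered for ${contentType}`);
  }
  const doc = await link.openForEditing(itemId); // the item remains inside the visualization
  doc.applyEdit(edit);
  await doc.save();
}
```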
Referring now to
For example, users associated with a sales organization may wish to annotate a map, such as the map illustrated in
An icon 510, 515, 520 may be applied to the example map object for showing the location of the embedded chart. Subsequent selection of one of the chart icons 510, 515, 520 may cause the display of a text box showing the information associated with the displayed and selected icon, or the information associated with the displayed and selected icon may be displayed in an information interface, for example, as illustrated along the right side of the visualization interface 200. As described above with reference to
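As a purely illustrative sketch (the coordinates, identifiers, and sales figures below are invented), the annotation icons and their embedded information might be modeled as follows, with selection of an icon returning the information to show in a text box or in the information interface:

```typescript
// Hypothetical sketch: annotation icons (e.g., 510, 515, 520) placed on a map object,
// each carrying embedded information such as a sales chart summary for that location.
interface MapAnnotation {
  iconId: string;
  latitude: number;
  longitude: number;
  embedded: { title: string; detail: string }; // e.g., per-product sales figures
}

// Selecting an icon returns the information to show in a text box or in the
// information interface along the side of the visualization interface 200.
function selectAnnotation(
  annotations: MapAnnotation[],
  iconId: string
): { title: string; detail: string } | undefined {
  return annotations.find(a => a.iconId === iconId)?.embedded;
}

// Invented example data:
const annotations: MapAnnotation[] = [
  { iconId: "510", latitude: 47.61, longitude: -122.33, embedded: { title: "Sales, location A", detail: "Product X: 1,200 units" } },
];
console.log(selectAnnotation(annotations, "510")?.title); // "Sales, location A"
```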
Referring still to
Under the “Data Labels Position” section, one or more positions may be selected at which labels will be displayed on a given object. For example, a “Best Fit” option may allow the content visualization application 910 to determine and place a label in a manner that best fits the available display space and that does not cover other information. An “Inside” option may cause all labels to be displayed inside the associated object, and an “Outside” option may cause all labels to be displayed outside the associated object where, for example, it is important that the label not cover displayed information. As should be appreciated, the gallery or menu of options 540 illustrates only a few example display options and should not be considered limiting of a vast number of display options that may be provided in a gallery or menu of display options 540.
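A hypothetical sketch of how the selected data label position might be resolved is shown below; the option names mirror the gallery entries described above, and the space-measurement details are stubbed out:

```typescript
// Hypothetical sketch: resolving the selected "Data Labels Position" option from the
// gallery of display options 540. Measuring the available display space is stubbed out.
type LabelPosition = "bestFit" | "inside" | "outside";

interface LabelContext {
  fitsInside: boolean;  // an inside label would not cover other displayed information
  fitsOutside: boolean; // there is room outside the associated object
}

function resolveLabelPosition(selected: LabelPosition, ctx: LabelContext): "inside" | "outside" {
  switch (selected) {
    case "inside":
      return "inside";
    case "outside":
      return "outside";
    case "bestFit":
      // Prefer whichever placement best fits the space and avoids covering information.
      return ctx.fitsInside ? "inside" : ctx.fitsOutside ? "outside" : "inside";
  }
}
```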
Referring now to
According to embodiments, feedback may be provided by users on the data contained in objects displayed in the visualization of objects, as well as on the nature and quality of the visualization itself. As illustrated in
A menu or gallery of feedback items, as illustrated in the UI component 650, may be provided to allow a user to select from one of a number of feedback options. For example, a user may apply a feedback item of “Report Bad Data” to a chart object displayed in the visualization to alert others that the user believes the data is inaccurate, misleading or otherwise not acceptable. According to one embodiment, if such a feedback item is applied to a given data item, that data item may be removed or hidden from the visualization. As should be appreciated, use of such feedback for a given object may be limited to users having permissions to apply such feedback that may result in a removal or hiding of a data item or content item from the visualization.
If users provide a simple approval, disapproval, satisfaction, or dissatisfaction type of feedback, the icon 640 may be displayed in a manner to provide an immediate indication of such feedback. For example, if ten users approve of the content item or associated information, an approval icon 640, for example, a “thumbs up,” “smiley face,” “happy avatar,” or the like may be displayed along with a number associated with the number of approving feedback providers, for example, the number “10” displayed below, above, or adjacent to the feedback icon 640. In contrast, if a number of users disapprove of, or are dissatisfied with, the content item or the information associated with a particular content item, then a disapproving or dissatisfaction icon, for example, a “thumbs down,” a “frowning face,” an “unhappy avatar,” or the like may be provided along with a number associated with the number of feedback providers providing negative feedback. As should be appreciated, feedback provided by one or more feedback providers may be provided in a feedback interface, as displayed along the right side of the visualization interface 200, in the same manner as comments or other information may be provided.
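The aggregation just described might be sketched, with hypothetical names only, as counting approvals and disapprovals, choosing the icon to display with its count, and hiding an item when a permitted user has reported bad data:

```typescript
// Hypothetical sketch: summarizing approval/disapproval feedback for a content item,
// choosing the icon 640 and the count to display, and hiding the item if a permitted
// user has applied a "Report Bad Data" feedback item.
type FeedbackKind = "approve" | "disapprove" | "reportBadData";

interface FeedbackEntry {
  kind: FeedbackKind;
  mayHideContent: boolean; // whether this provider has permission to remove/hide items
}

interface FeedbackSummary {
  icon: "thumbs-up" | "thumbs-down" | "none";
  count: number;
  hidden: boolean;
}

function summarizeFeedback(entries: FeedbackEntry[]): FeedbackSummary {
  const approvals = entries.filter(e => e.kind === "approve").length;
  const disapprovals = entries.filter(e => e.kind === "disapprove").length;
  const hidden = entries.some(e => e.kind === "reportBadData" && e.mayHideContent);
  if (approvals === 0 && disapprovals === 0) return { icon: "none", count: 0, hidden };
  return approvals >= disapprovals
    ? { icon: "thumbs-up", count: approvals, hidden }
    : { icon: "thumbs-down", count: disapprovals, hidden };
}
```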
Referring still to
Referring to
For example, the node 715 may be illustrative of a collection or storage of insights information, illustrated in
According to one embodiment, a particular path traversed by a given user, for example, from node 714 to node 715 to node 710, may be highlighted to show a path traversed by one user, as opposed to a path traversed by a second user. For example, a path traversed by one user may be highlighted in the color red, a path traversed by a second user may be highlighted in the color blue, and so on. According to embodiments, traversal or review of a particular path, or of items associated with a particular path, may be performed in a number of ways as desired by one or more users. A given path may be traversed by a user wherein he/she views one or more visualizations associated with various insights items. For example, if a user views visualization 710 followed by visualization 717, then a traversal path may be generated showing movement along a path through nodes 715 and 716. For another example, the visualizations themselves or the insights themselves may be linked (e.g., when looking at node 715, the user sees a list of related insights 714 and 716). For another example, a path may be computed algorithmically based on similarities between different insights or visualizations (e.g., all insights with geographical data might be linked together, and insights with geographical data from the same region might branch off each other).
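A minimal, hypothetical sketch of recording and highlighting a user's traversal path through the insight/visualization nodes might look like the following; the node identifiers simply echo the reference numerals used above:

```typescript
// Hypothetical sketch: recording the path a user traverses through insight/visualization
// nodes so that each user's path can be highlighted in its own color.
interface TraversalPath {
  userId: string;
  color: string;     // e.g., "red" for one user, "blue" for another
  nodeIds: string[]; // nodes visited, in order (e.g., ["714", "715", "710"])
}

function recordVisit(path: TraversalPath, nodeId: string): TraversalPath {
  return { ...path, nodeIds: [...path.nodeIds, nodeId] };
}

// Edges to highlight for a given path: consecutive pairs of visited nodes.
function highlightedEdges(path: TraversalPath): [string, string][] {
  const edges: [string, string][] = [];
  for (let i = 0; i + 1 < path.nodeIds.length; i++) {
    edges.push([path.nodeIds[i], path.nodeIds[i + 1]]);
  }
  return edges;
}
```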
A pair of user interface components 730 and 735 may be provided for interacting with a given insights file or visualization file, including adding additional insights files or visualization files. That is, the files, options, and information illustrated in the UI components 730, 735 are illustrative of any of a variety of data, information, content items, and the like that may be applied to the mapping illustrated in
According to embodiments, the insights/visualization menus 730/735 may include a most frequently used set of information or options used in association with a given insight or visualization. For example, if a user has access to a large database with a large number of database columns, the UI component 730/735 may be used to show a list of the most common columns used from the database in the generation of a given insight or visualization. Thus, the generation of visualizations from the example database may be simplified by allowing the user a quick reference of those columns of data most used by the user. In addition, information may be provided via the UI components 730, 735 about which users have interacted with (e.g., viewed, copied from, etc.) a given insight or visualization. By generating a social graph of people that have used a specific insight or visualization, other users, for example, an author of a given insight or visualization, may have more confidence in that insight or visualization (or less confidence if the insight or visualization is never or seldom used).
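As a hypothetical illustration, the most frequently used database columns and the set of users who have interacted with a given insight might be derived from simple usage records such as these (all names are invented for the sketch):

```typescript
// Hypothetical sketch: deriving the most frequently used database columns and the set
// of users who have interacted with a given insight from simple usage records.
interface UsageRecord {
  userId: string;
  insightId: string;
  columnsUsed: string[];
}

function mostUsedColumns(records: UsageRecord[], topN: number): string[] {
  const counts = new Map<string, number>();
  for (const record of records) {
    for (const column of record.columnsUsed) {
      counts.set(column, (counts.get(column) ?? 0) + 1);
    }
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, topN)
    .map(([column]) => column);
}

function usersOfInsight(records: UsageRecord[], insightId: string): string[] {
  return [...new Set(records.filter(r => r.insightId === insightId).map(r => r.userId))];
}
```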
Having described a system architecture for providing an interactive visualization of multiple functionality content items,
At operation 815, a visualization of one or more content items may be generated, as described above. As should be appreciated, a number of visualizations may be generated from different subsets of content items associated with a given content item storage medium. For example, a given collaborative work space may include 50 content items associated with a given project. A first visualization may be generated for ten of the content items, a second visualization may be generated for ten other content items, and so on. For example, each of the visualizations may be associated with different aspects of the project. For example, a first visualization may be associated with a development stage for a software application development process, a second visualization may be prepared with content items associated with a production stage, and a third visualization may be generated with content items associated with a sales stage of the project. For another example, a first visualization may be generated for ten of the 50 content items, and then a second visualization may be generated from the same ten content items using different data from the ten content items or using a different visualization from the example ten content items.
At operation 820, a visualization ordering for the generated visualization may be generated, and an associated ordered navigation interface component may be generated and displayed in a visualization interface in association with a display of one or more content items associated with a visualization for which the visualization ordering is generated. At operation 825, interaction with content items contained in a given visualization is allowed. As described above with reference to
At operation 830, the visualization may be updated in association with interaction with one or more content items contained in the visualizations. For example, new content items may be displayed in the visualization, deleted content items may be removed from the visualization, ordering of displayed content items may be changed, and the like. At operation 835, an indication of updates to content items through interaction with the visualization may be provided in association with the ordered navigation interface component. For example, if comments are applied to a given content item, then a comment indicator may be provided in association with the ordered navigation interface component to notify a user that a comment has been added to a content item and to allow the user to navigate to the content item via the ordered navigation interface component 205 to allow the user to review the comment that has been applied to the associated content item.
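Operations 830 and 835 might be sketched, again with hypothetical names, as a single state transition that updates the displayed items and appends a matching relevance indicator when a comment or edit is received:

```typescript
// Hypothetical sketch of operations 830 and 835: update the visualization after an
// interaction and surface a matching relevance indicator on the navigation component.
interface VisualizationState {
  itemIds: string[]; // ordering of displayed content items
  indicators: { itemId: string; kind: "comment" | "alert" }[];
}

type Interaction =
  | { kind: "comment"; itemId: string }
  | { kind: "edit"; itemId: string }
  | { kind: "add"; itemId: string }
  | { kind: "delete"; itemId: string };

function applyInteraction(state: VisualizationState, interaction: Interaction): VisualizationState {
  switch (interaction.kind) {
    case "add":
      return { ...state, itemIds: [...state.itemIds, interaction.itemId] };
    case "delete":
      return {
        itemIds: state.itemIds.filter(id => id !== interaction.itemId),
        indicators: state.indicators.filter(i => i.itemId !== interaction.itemId),
      };
    case "comment":
      return { ...state, indicators: [...state.indicators, { itemId: interaction.itemId, kind: "comment" }] };
    case "edit":
      return { ...state, indicators: [...state.indicators, { itemId: interaction.itemId, kind: "alert" }] };
  }
}
```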
At operation 840, a mapping 700 may be provided, as described and illustrated above with reference to
The embodiments and functionalities described herein may operate via a multitude of computing systems, including wired and wireless computing systems and mobile computing systems (e.g., mobile telephones, tablet or slate type computers, laptop computers, etc.). In addition, the embodiments and functionalities described herein may operate over distributed systems, where application functionality, memory, data storage and retrieval, and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet. User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which they are projected. Interaction with the multitude of computing systems with which embodiments of the invention may be practiced includes keystroke entry, touch screen entry, voice or other audio entry, gesture entry where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures for controlling the functionality of the computing device, and the like.
Computing device 900 may have additional features or functionality. For example, computing device 900 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
As stated above, a number of program modules and data files may be stored in system memory 904, including operating system 905. While executing on processing unit 902, programming modules 906, such as the content visualization application 910, may perform processes including, for example, one or more of the stages of method 800 described above. The aforementioned process is an example, and processing unit 902 may perform other processes. Other programming modules that may be used in accordance with embodiments of the present invention may include electronic mail and contacts applications, word processing applications, spreadsheet applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc.
Generally, consistent with embodiments of the invention, program modules may include routines, programs, components, data structures, and other types of structures that may perform particular tasks or that may implement particular abstract data types. Moreover, embodiments of the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Furthermore, embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in
Embodiments of the invention, for example, may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 904, removable storage 909, and non-removable storage 910 are all computer storage media examples (i.e., memory storage). Computer storage media may include, but is not limited to, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information and which can be accessed by computing device 900. Any such computer storage media may be part of device 900. Computing device 900 may also have input device(s) 912 such as a keyboard, a mouse, a pen, a sound input device, a touch input device, etc. Output device(s) 914 such as a display, speakers, a printer, etc. may also be included. The aforementioned devices are examples and others may be used.
The term computer readable media as used herein may also include communication media. Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
Mobile computing device 1000 incorporates output elements, such as display 1005, which can display a graphical user interface (GUI). Other output elements include speaker 1025 and LED light 1020. Additionally, mobile computing device 1000 may incorporate a vibration module (not shown), which causes mobile computing device 1000 to vibrate to notify the user of an event. In yet another embodiment, mobile computing device 1000 may incorporate a headphone jack (not shown) for providing another means of providing output signals.
Although described herein in combination with mobile computing device 1000, in alternative embodiments the invention may be used in combination with any number of computer systems, such as in desktop environments, laptop or notebook computer systems, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network; in a distributed computing environment, programs may be located in both local and remote memory storage devices. To summarize, any computer system having a plurality of environment sensors, a plurality of output elements to provide notifications to a user, and a plurality of notification event types may incorporate embodiments of the present invention.
One or more application programs 1066 may be loaded into memory 1062 and run on or in association with operating system 1064. Examples of application programs include phone dialer programs, e-mail programs, PIM (personal information management) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. System 1002 also includes non-volatile storage 1068 within memory 1062. Non-volatile storage 1068 may be used to store persistent information that should not be lost if system 1002 is powered down. Applications 1066 may use and store information in non-volatile storage 1068, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on system 1002 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in non-volatile storage 1068 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into memory 1062 and run on the device 1000, including the content visualization application 910, described herein.
System 1002 has a power supply 1070, which may be implemented as one or more batteries. Power supply 1070 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
System 1002 may also include a radio 1072 that performs the function of transmitting and receiving radio frequency communications. Radio 1072 facilitates wireless connectivity between system 1002 and the “outside world,” via a communications carrier or service provider. Transmissions to and from radio 1072 are conducted under control of OS 1064. In other words, communications received by radio 1072 may be disseminated to application programs 1066 via OS 1064, and vice versa.
Radio 1072 allows system 1002 to communicate with other computing devices, such as over a network. Radio 1072 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
This embodiment of system 1002 is shown with two types of notification output devices: LED 1020, which can be used to provide visual notifications, and an audio interface 1074, which can be used with speaker 1025 to provide audio notifications. These devices may be directly coupled to power supply 1070 so that when activated, they remain on for a duration dictated by the notification mechanism even though processor 1060 and other components might shut down for conserving battery power. LED 1020 may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. Audio interface 1074 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to speaker 1025, audio interface 1074 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with embodiments of the present invention, the microphone 920 may also serve as an audio sensor to facilitate control of notifications, as will be described below. System 1002 may further include video interface 1076 that enables an operation of on-board camera 1030 to record still images, video stream, and the like.
A mobile computing device implementing system 1002 may have additional features or functionality. For example, the device may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
Data/information generated or captured by the device 1000 and stored via the system 1002 may be stored locally on the device 1000, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 1072 or via a wired connection between the device 1000 and a separate computing device associated with the device 1000, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed via the device 1000, via the radio 1072, or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
Embodiments of the present invention, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to embodiments of the invention. The functions/acts noted in the blocks may occur out of the order as shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
While certain embodiments of the invention have been described, other embodiments may exist. Furthermore, although embodiments of the present invention have been described as being associated with data stored in memory and other storage mediums, data can also be stored on or read from other types of computer-readable media, such as secondary storage devices, like hard disks, floppy disks, or a CD-ROM, a carrier wave from the Internet, or other forms of RAM or ROM. Further, the disclosed methods' stages may be modified in any manner, including by reordering stages and/or inserting or deleting stages, without departing from the invention.
It will be apparent to those skilled in the art that various modifications or variations may be made in the present invention without departing from the scope or spirit of the invention. Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein.
Number | Name | Date | Kind |
---|---|---|---|
4831552 | Scully et al. | May 1989 | A |
5297250 | Leroy et al. | Mar 1994 | A |
5495269 | Elrod et al. | Feb 1996 | A |
5566291 | Boulton et al. | Oct 1996 | A |
5675752 | Scott et al. | Oct 1997 | A |
5704029 | Wright, Jr. | Dec 1997 | A |
5717869 | Moran et al. | Feb 1998 | A |
5802299 | Logan et al. | Sep 1998 | A |
5821925 | Carey et al. | Oct 1998 | A |
5821932 | Pittore | Oct 1998 | A |
5893098 | Peters et al. | Apr 1999 | A |
5907324 | Larson et al. | May 1999 | A |
6016478 | Zhang et al. | Jan 2000 | A |
6018346 | Moran | Jan 2000 | A |
6119147 | Toomey | Sep 2000 | A |
6192395 | Lerner et al. | Feb 2001 | B1 |
6208339 | Atlas et al. | Mar 2001 | B1 |
6230185 | Salas et al. | May 2001 | B1 |
6353436 | Reichlen | Mar 2002 | B1 |
6553417 | Gampper | Apr 2003 | B1 |
6546246 | Varma | May 2003 | B1 |
6633315 | Sobeski et al. | Oct 2003 | B1 |
6670970 | Bonura et al. | Dec 2003 | B1 |
6735615 | Iwayama et al. | May 2004 | B1 |
6738075 | Torres et al. | May 2004 | B1 |
7035865 | Doss et al. | Apr 2006 | B2 |
7036076 | Anwar | Apr 2006 | B2 |
7051285 | Harrison et al. | May 2006 | B1 |
7073127 | Zhao et al. | Jul 2006 | B2 |
7075513 | Silfverberg et al. | Jul 2006 | B2 |
7124164 | Chemtob | Oct 2006 | B1 |
7171567 | Bayer et al. | Jan 2007 | B1 |
7203479 | Deeds | Apr 2007 | B2 |
7225257 | Aoike et al. | May 2007 | B2 |
7233933 | Horvitz et al. | Jun 2007 | B2 |
7242389 | Stern | Jul 2007 | B1 |
7246316 | Furlong et al. | Jul 2007 | B2 |
7248677 | Randall et al. | Jul 2007 | B2 |
7251786 | Wynn et al. | Jul 2007 | B2 |
7269787 | Amitay et al. | Sep 2007 | B2 |
7299193 | Cragun et al. | Nov 2007 | B2 |
7299405 | Lee et al. | Nov 2007 | B1 |
7299418 | Dieberger | Nov 2007 | B2 |
7401300 | Murmi | Jul 2008 | B2 |
7426297 | Zhang et al. | Sep 2008 | B2 |
7447713 | Berkheimer | Nov 2008 | B1 |
7451186 | Morinigo et al. | Nov 2008 | B2 |
7454439 | Gansner | Nov 2008 | B1 |
7466334 | Baba | Dec 2008 | B1 |
7469222 | Glazer | Dec 2008 | B1 |
7478129 | Chemtob | Jan 2009 | B1 |
7512906 | Baier et al. | Mar 2009 | B1 |
7554576 | Erol et al. | Jun 2009 | B2 |
7571210 | Swanson et al. | Aug 2009 | B2 |
7590941 | Wee | Sep 2009 | B2 |
7599989 | Stevens et al. | Oct 2009 | B2 |
7606862 | Swearingen et al. | Oct 2009 | B2 |
7627830 | Espinoza et al. | Dec 2009 | B1 |
7636754 | Zhu et al. | Dec 2009 | B2 |
7669141 | Pegg | Feb 2010 | B1 |
7679518 | Pabla et al. | Mar 2010 | B1 |
7730411 | Chotai et al. | Jun 2010 | B2 |
7743098 | Anglin et al. | Jun 2010 | B2 |
7764247 | Blanco et al. | Jul 2010 | B2 |
7770116 | Zhang et al. | Aug 2010 | B2 |
7774221 | Miller et al. | Aug 2010 | B2 |
7818678 | Massand | Oct 2010 | B2 |
7869941 | Coughlin et al. | Jan 2011 | B2 |
7911409 | Chatterjee et al. | Mar 2011 | B1 |
7941399 | Bailor et al. | May 2011 | B2 |
7962525 | Kansal | Jun 2011 | B2 |
7984387 | Batthish et al. | Jul 2011 | B2 |
7992089 | Murray et al. | Aug 2011 | B2 |
8032832 | Russ et al. | Oct 2011 | B2 |
8099458 | Burtner, IV et al. | Jan 2012 | B2 |
8126974 | Lyle et al. | Feb 2012 | B2 |
8150719 | Perella et al. | Apr 2012 | B2 |
8161419 | Palahnuk et al. | Apr 2012 | B2 |
8204942 | Roskind et al. | Jun 2012 | B2 |
8214748 | Srikanth et al. | Jul 2012 | B2 |
8330795 | Iyer et al. | Dec 2012 | B2 |
8358762 | Renner et al. | Jan 2013 | B1 |
8385964 | Haney | Feb 2013 | B2 |
8423883 | Stockmann | Apr 2013 | B1 |
8437461 | Gartner et al. | May 2013 | B1 |
8452839 | Heikes et al. | May 2013 | B2 |
8517888 | Brookins | Aug 2013 | B1 |
8560487 | Jhoney et al. | Oct 2013 | B2 |
8583148 | Ollila et al. | Nov 2013 | B2 |
8606517 | Ehrlacher et al. | Dec 2013 | B1 |
8631119 | Malkin et al. | Jan 2014 | B2 |
8682973 | Kikin-Gil et al. | Mar 2014 | B2 |
8768308 | Kim et al. | Jul 2014 | B2 |
8826117 | Junee | Sep 2014 | B1 |
20010040592 | Foreman et al. | Nov 2001 | A1 |
20020044683 | Deshpande et al. | Apr 2002 | A1 |
20020059604 | Papagan et al. | May 2002 | A1 |
20020062403 | Burnett et al. | May 2002 | A1 |
20020143876 | Boyer et al. | Oct 2002 | A1 |
20020143877 | Hackbarth et al. | Oct 2002 | A1 |
20030020805 | Allen et al. | Jan 2003 | A1 |
20030038831 | Engelfriet | Feb 2003 | A1 |
20030061284 | Mandarino et al. | Mar 2003 | A1 |
20030122863 | Dieberger et al. | Jul 2003 | A1 |
20030137539 | Dees | Jul 2003 | A1 |
20030142133 | Brown et al. | Jul 2003 | A1 |
20030158900 | Santos | Aug 2003 | A1 |
20030172174 | Mihalcheon | Sep 2003 | A1 |
20030222890 | Salesin et al. | Dec 2003 | A1 |
20040024822 | Werndorfer et al. | Feb 2004 | A1 |
20040027370 | Jaeger | Feb 2004 | A1 |
20040030992 | Molsa et al. | Feb 2004 | A1 |
20040034622 | Espinoza et al. | Feb 2004 | A1 |
20040062383 | Sylvain | Apr 2004 | A1 |
20040085354 | Massand | May 2004 | A1 |
20040128350 | Topfl et al. | Jul 2004 | A1 |
20040150627 | Luman et al. | Aug 2004 | A1 |
20040161090 | Digate et al. | Aug 2004 | A1 |
20040169683 | Chiu et al. | Sep 2004 | A1 |
20040175036 | Graham | Sep 2004 | A1 |
20040196286 | Guzik | Oct 2004 | A1 |
20040254998 | Horvitz | Dec 2004 | A1 |
20040263636 | Cutler et al. | Dec 2004 | A1 |
20040267701 | Horvitz et al. | Dec 2004 | A1 |
20050005025 | Harville et al. | Jan 2005 | A1 |
20050018828 | Nierhaus et al. | Jan 2005 | A1 |
20050055625 | Kloss | Mar 2005 | A1 |
20050081160 | Wee et al. | Apr 2005 | A1 |
20050088410 | Chaudhri | Apr 2005 | A1 |
20050091571 | Leichtling | Apr 2005 | A1 |
20050091596 | Anthony et al. | Apr 2005 | A1 |
20050125246 | Muller et al. | Jun 2005 | A1 |
20050125717 | Segal et al. | Jun 2005 | A1 |
20050138109 | Redlich et al. | Jun 2005 | A1 |
20050138570 | Good et al. | Jun 2005 | A1 |
20050171830 | Miller et al. | Aug 2005 | A1 |
20050246642 | Valderas et al. | Nov 2005 | A1 |
20060004911 | Becker | Jan 2006 | A1 |
20060010023 | Tromczynski et al. | Jan 2006 | A1 |
20060010197 | Overden | Jan 2006 | A1 |
20060026253 | Kessen et al. | Feb 2006 | A1 |
20060053380 | Spataro et al. | Mar 2006 | A1 |
20060067250 | Boyer et al. | Mar 2006 | A1 |
20060080610 | Kaminsky | Apr 2006 | A1 |
20060082594 | Vafiadis et al. | Apr 2006 | A1 |
20060132507 | Wang | Jun 2006 | A1 |
20060136828 | Asano | Jun 2006 | A1 |
20060143063 | Braun et al. | Jun 2006 | A1 |
20060146765 | De Sluis et al. | Jul 2006 | A1 |
20060161585 | Clarke et al. | Jul 2006 | A1 |
20060167996 | Orsolini et al. | Jul 2006 | A1 |
20060168533 | Yip et al. | Jul 2006 | A1 |
20060171515 | Hintermeister et al. | Aug 2006 | A1 |
20060184872 | Dontcheva et al. | Aug 2006 | A1 |
20060190547 | Bhogal et al. | Aug 2006 | A1 |
20060195587 | Cadiz et al. | Aug 2006 | A1 |
20060234735 | Digate et al. | Oct 2006 | A1 |
20060239212 | Pirzada et al. | Oct 2006 | A1 |
20060259875 | Collins et al. | Nov 2006 | A1 |
20060265398 | Kaufman | Nov 2006 | A1 |
20060282759 | Collins et al. | Dec 2006 | A1 |
20070005752 | Chawla et al. | Jan 2007 | A1 |
20070011231 | Manion et al. | Jan 2007 | A1 |
20070074268 | Pepper et al. | Mar 2007 | A1 |
20070083597 | Salesky et al. | Apr 2007 | A1 |
20070100937 | Burtner et al. | May 2007 | A1 |
20070106724 | Gorti et al. | May 2007 | A1 |
20070112926 | Brett et al. | May 2007 | A1 |
20070150583 | Asthana et al. | Jun 2007 | A1 |
20070168447 | Chen et al. | Jul 2007 | A1 |
20070174389 | Armstrong | Jul 2007 | A1 |
20070185870 | Hogue et al. | Aug 2007 | A1 |
20070189487 | Sharland et al. | Aug 2007 | A1 |
20070214423 | Teplov et al. | Sep 2007 | A1 |
20070219645 | Thomas et al. | Sep 2007 | A1 |
20070226032 | White | Sep 2007 | A1 |
20070245238 | Fugitt et al. | Oct 2007 | A1 |
20070253424 | Herot et al. | Nov 2007 | A1 |
20070276909 | Chavda et al. | Nov 2007 | A1 |
20070279416 | Cobb et al. | Dec 2007 | A1 |
20070294612 | Drucker et al. | Dec 2007 | A1 |
20070300185 | Macbeth et al. | Dec 2007 | A1 |
20080005235 | Hedge | Jan 2008 | A1 |
20080008458 | Gudipaty et al. | Jan 2008 | A1 |
20080013698 | Holtzberg | Jan 2008 | A1 |
20080022225 | Erl | Jan 2008 | A1 |
20080040187 | Carraher et al. | Feb 2008 | A1 |
20080040188 | Klausmeier | Feb 2008 | A1 |
20080059889 | Parker et al. | Mar 2008 | A1 |
20080065580 | Spence et al. | Mar 2008 | A1 |
20080066016 | Dowdy et al. | Mar 2008 | A1 |
20080084984 | Levy et al. | Apr 2008 | A1 |
20080086688 | Chandratillake | Apr 2008 | A1 |
20080098328 | Rollin et al. | Apr 2008 | A1 |
20080114844 | Sanchez et al. | May 2008 | A1 |
20080115076 | Frank et al. | May 2008 | A1 |
20080136897 | Morishima et al. | Jun 2008 | A1 |
20080141126 | Johnson | Jun 2008 | A1 |
20080177782 | Poston et al. | Jul 2008 | A1 |
20080189624 | Chotai et al. | Aug 2008 | A1 |
20080195981 | Puller et al. | Aug 2008 | A1 |
20080239995 | Lee et al. | Oct 2008 | A1 |
20080244442 | Vaselova et al. | Oct 2008 | A1 |
20080263460 | Altberg | Oct 2008 | A1 |
20080276174 | Hintermeister et al. | Nov 2008 | A1 |
20080288889 | Hunt et al. | Nov 2008 | A1 |
20080300944 | Surazski et al. | Dec 2008 | A1 |
20080303746 | Schlottmann et al. | Dec 2008 | A1 |
20080307322 | Stochosky | Dec 2008 | A1 |
20080320082 | Kuhlke et al. | Dec 2008 | A1 |
20090006980 | Hawley | Jan 2009 | A1 |
20090006982 | Curtis et al. | Jan 2009 | A1 |
20090007014 | Coomer et al. | Jan 2009 | A1 |
20090019367 | Cavagnari et al. | Jan 2009 | A1 |
20090030766 | Denner et al. | Jan 2009 | A1 |
20090037848 | Tewari et al. | Feb 2009 | A1 |
20090043856 | Darby | Feb 2009 | A1 |
20090055739 | Murillo et al. | Feb 2009 | A1 |
20090089055 | Caspi | Apr 2009 | A1 |
20090094367 | Song et al. | Apr 2009 | A1 |
20090109180 | Do | Apr 2009 | A1 |
20090112703 | Brown | Apr 2009 | A1 |
20090119255 | Frank et al. | May 2009 | A1 |
20090119604 | Simard | May 2009 | A1 |
20090129596 | Chavez et al. | May 2009 | A1 |
20090138552 | Johnson et al. | May 2009 | A1 |
20090138826 | Barros | May 2009 | A1 |
20090183095 | Deitsch | Jul 2009 | A1 |
20090204671 | Hawkins et al. | Aug 2009 | A1 |
20090210793 | Yee et al. | Aug 2009 | A1 |
20090222741 | Shaw et al. | Sep 2009 | A1 |
20090228569 | Kalmanje et al. | Sep 2009 | A1 |
20090234721 | Bigelow et al. | Sep 2009 | A1 |
20090235177 | Saul et al. | Sep 2009 | A1 |
20090249223 | Barsook et al. | Oct 2009 | A1 |
20090254843 | Van Wie et al. | Oct 2009 | A1 |
20090265632 | Russ et al. | Oct 2009 | A1 |
20090282339 | Van Melle et al. | Nov 2009 | A1 |
20090309846 | Trachtenberg | Dec 2009 | A1 |
20090313584 | Kerr et al. | Dec 2009 | A1 |
20090319562 | Morten et al. | Dec 2009 | A1 |
20100031152 | Villaron et al. | Feb 2010 | A1 |
20100037140 | Penner et al. | Feb 2010 | A1 |
20100037151 | Ackerman | Feb 2010 | A1 |
20100058201 | Harvey et al. | Mar 2010 | A1 |
20100097331 | Wu | Apr 2010 | A1 |
20100114691 | Wu et al. | May 2010 | A1 |
20100114991 | Chaudhary et al. | May 2010 | A1 |
20100131868 | Chawla et al. | May 2010 | A1 |
20100138756 | Saund et al. | Jun 2010 | A1 |
20100149307 | Iyer et al. | Jun 2010 | A1 |
20100174993 | Pennington et al. | Jul 2010 | A1 |
20100201707 | Rasmussen et al. | Aug 2010 | A1 |
20100235216 | Hehmeyer et al. | Sep 2010 | A1 |
20100235763 | Massand | Sep 2010 | A1 |
20100241968 | Tarara et al. | Sep 2010 | A1 |
20100251140 | Tipirneni | Sep 2010 | A1 |
20100268705 | Douglas et al. | Oct 2010 | A1 |
20100279266 | Laine et al. | Nov 2010 | A1 |
20100306004 | Burtner et al. | Dec 2010 | A1 |
20100306018 | Burtner et al. | Dec 2010 | A1 |
20100312706 | Combet et al. | Dec 2010 | A1 |
20100324963 | Gupta et al. | Dec 2010 | A1 |
20110022967 | Vijayakumar et al. | Jan 2011 | A1 |
20110105092 | Felt et al. | May 2011 | A1 |
20110107241 | Moore | May 2011 | A1 |
20110113348 | Twiss et al. | May 2011 | A1 |
20110113351 | Phillips | May 2011 | A1 |
20110137894 | Narayanan et al. | Jun 2011 | A1 |
20110154180 | Evanitsky et al. | Jun 2011 | A1 |
20110161130 | Whalin et al. | Jun 2011 | A1 |
20110185288 | Gupta et al. | Jul 2011 | A1 |
20110212430 | Smithmier et al. | Sep 2011 | A1 |
20110239142 | Steeves et al. | Sep 2011 | A1 |
20110282871 | Seefeld et al. | Nov 2011 | A1 |
20110289142 | Whalin et al. | Nov 2011 | A1 |
20110289433 | Whalin et al. | Nov 2011 | A1 |
20110295879 | Logis et al. | Dec 2011 | A1 |
20120023418 | Frields et al. | Jan 2012 | A1 |
20120050197 | Kemmochi | Mar 2012 | A1 |
20120075337 | Rasmussen et al. | Mar 2012 | A1 |
20120078708 | Taylor et al. | Mar 2012 | A1 |
20120144325 | Mital et al. | Jun 2012 | A1 |
20120150577 | Berg | Jun 2012 | A1 |
20120150863 | Fish | Jun 2012 | A1 |
20120159347 | Fish | Jun 2012 | A1 |
20120159355 | Fish | Jun 2012 | A1 |
20120166985 | Friend | Jun 2012 | A1 |
20120179980 | Whalin et al. | Jul 2012 | A1 |
20120179981 | Whalin et al. | Jul 2012 | A1 |
20120233543 | Vagell et al. | Sep 2012 | A1 |
20130007103 | Braun et al. | Jan 2013 | A1 |
20130035853 | Stout et al. | Feb 2013 | A1 |
20130091205 | Kotler et al. | Apr 2013 | A1 |
20130091440 | Kotler et al. | Apr 2013 | A1 |
20130097544 | Parker et al. | Apr 2013 | A1 |
20130101978 | Ahl et al. | Apr 2013 | A1 |
20130124978 | Horns et al. | May 2013 | A1 |
20130132886 | Mangini et al. | May 2013 | A1 |
20130154946 | Sakuramata et al. | Jun 2013 | A1 |
20130211980 | Heiferman et al. | Aug 2013 | A1 |
20130212494 | Heiferman et al. | Aug 2013 | A1 |
20130239002 | Maloney et al. | Sep 2013 | A1 |
20130246903 | Mukai | Sep 2013 | A1 |
20130263020 | Heiferman et al. | Oct 2013 | A1 |
20140033068 | Gupta et al. | Jan 2014 | A1 |
20140207867 | Kotler et al. | Jul 2014 | A1 |
20140317561 | Robinson et al. | Oct 2014 | A1 |
20150127628 | Rathod | May 2015 | A1 |
Number | Date | Country |
---|---|---|
1551567 | Dec 2004 | CN |
1723431 | Jan 2006 | CN |
1928859 | Mar 2007 | CN |
1992625 | Jul 2007 | CN |
101689188 | Mar 2010 | CN |
101834905 | Sep 2010 | CN |
1 517 260 | Mar 2005 | EP |
04-257046 | Sep 1992 | JP |
2005139793 | Jun 2007 | RU |
WO 02061682 | Aug 2002 | WO |
2006100475 | Sep 2006 | WO |
2007092470 | Aug 2007 | WO |
Entry |
---|
“Activity Explorer: Activity-centric Collaboration from Research to Product,” IBM Systems Journal, IBM®, 23 pages accessed on Feb. 3, 2009, accessed at: http://www.research.ibm.com/journal/sj/454 /geyer.html. |
Adams et al., “Distributed Research Teams: Meeting Asynchronously in Virtual Space”, Institute of Electrical and Electronics Engineers (1999), 17 pages. |
“Adobe Connect”, Retrieved from: <http://www.adobe.com/acom/connectnow/> on Oct. 11, 2010, (Sep. 16, 2010), 3 pages. |
“Adobe ConnectNow”, Retrieved from: <http://www.adobe.com/acom/connectnow/> on Oct. 13, 2010, (2010), 6 pages. |
“An Overview of Aabel 3 Features”—Retrieved Date: Jul. 21, 2010, http://www.gigawiz.com/Aabel.html, 19 pgs. |
“Aquatic Sugar: The Children's Interface, Translated for Adults,” One Laptop Per Child News, Nov. 7, 2007, 5 pages. |
The Beginner's Guide to Data Visualization, Tableau Software, http://www.tableausoftware.com/beginners-data-visualization, pp. 1-6 (Date Retrieved Jul. 21, 2010). |
Bell, David et al., “Sensory Semantic User Interfaces (SenSUI) (position paper)”, Fluidity Research Group: Brunel University, (Oct. 20, 2009), 14 pages. |
Bunzel, Tom “Using Quindi Meeting Capture”, retrieved from http://www.informit.com/guides/content.aspx?g=msoffice&seqNum=220, (Sep. 1, 2006), 3 pages. |
Cathy, et al., “Mindshift Innovation”, Oct. 4, 2007, 2 pages. |
“Cisco Context-Aware Mobility Solution: Presence Applications”, retrieved from https://www.cisco.com/en/US/solutions/collateral/ns340/ns394/ns348/ns788/brochure_c22-497557.html on Sep. 7, 2010, 5 pages. |
“Collaboration within the Telepresence Experience”—Published Date: Jan. 2010, http://www.wrplatinum.com/Downloads/11056.aspx, 11 pgs. |
“CounterPoint: A Zooming Presentation Tool”; http://web.archive.org/web/20050205082738/www.cs.umd.edu/hcil/counterpoint/, Archive.org 2005 Capture, 3 pgs. |
“Create treemaps using easy drag and drop interactions”—Retrieved Date: Jul. 21, 2010, http://www.magnaview.nl/treemap/, 1 pg. |
“CSS Max-width Property” by W3Schools, archived by Internet Archive WaybackMachine Jun. 8, 2007, downloaded Nov. 16, 2012; 1 pg. |
“Datapoint”. Version 1.1, 1997-2007, FileDudes.com, 2 pages. |
Derthick et al., “An Interactive Visualization Environment for Data Exploration”, Published Date: Aug. 1997, http://www.cs.cmu.edu/˜sage/KDD97.html, 9 pgs. |
“Description for SharePoint Meeting Manager”, Retrieved from: <http://www.softpicks.net/software/Business/Project-Managemen/SharePoint-Meeting-Manager-47146.htm> on Oct. 11, 2010 (Jul. 27, 2009), 2 pages. |
Fernando et al., “Narrowcasting Attributes for Presence Awareness in Collaborative Virtual Environments”, Published Date: 2006, http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4019930, 6 pgs. |
“Free PhotoMesa 3.1.2 (Windows)”, retrieved on Dec. 28, 2007 at <<http://www.windsorinterfaces.com/photomesa.shtml>>, Windsor Interfaces Inc., 3 pages. |
Fruchter, Renate “Brick & Bits & Interaction (BBI)”, http://www.ii.ist.i.kyoto-u.ac.jp/sid/sid2001/papers/positions/bricksbitsinteraction.pdf (2001), 4 pages. |
Gallegos, D., et al. “CounterPoint User Manual” class project for Charles Paine at the University of New Mexico, Downloaded from Archive.org 2005 capture, http://web.archive.org/web/20050205082738/www.cs.umd.edu/hcil/counterpoint/, 21 pgs. |
Good et al. (2001) “CounterPoint: Creating Jazzy Interactive Presentations”; HCIL Tech Report #2001-03, University of Maryland, College Park, MD 20742, 9 pgs. |
“GoToMeeting”, Retrieved from: <http://www.gotomeeting.com/fec/online_meeting> on Oct. 11, 2010, 1 page. |
Greenberg et al.; “Human and Technical Factors of distributed Group Drawing Tools,” Interacting with Computers 4 (1), Dec. 1992, Butterworth-Heinemann (Special edition on CSCW, Tom Rodden ed.) pp. 364-392. |
Hupfer et al., “Introducing Collaboration into an Application Development Environment,” CSCW '04, Nov. 6-10, 2004, 4 pages. |
Ionescu, Arna et al., “Workspace Navigator: Tools for Capture, Recall and Reuse using Spatial Cues in an Interactive Workspace”, Stanford Technical Report TR2002-04, http://hci.stanford.edu/research/wkspcNavTR.pdf (2002), 16 pages. |
Izadi et al., “Dynamo: A public interactive surface supporting the cooperative sharing and exchange of media” Published Date: Apr. 2007, http://hci.stanford.edu/publications/2007/range-wip-final.pdf, 10 pgs. |
Ju, Wendy et al., “Where the Wild Things Work: Capturing Shared Physical Design Workspaces”; Stanford University, CSCW '04, Nov. 6-10, 2004, 9 pgs. |
Kim, Hyun H., et al., “SmartMeeting: CMPT 481/811 Automatic Meeting Recording System”, http://www.cs.usask.ca/grads/hyk564/homePage/811/CMPT%20811%20final.doc, (2004), 7 pages. |
“Meet mimio—The Digital Meeting Assistant”, Mayflower Business Systems Limited; http://www.kda.co.uk/mimio1/whitepaper.html (May 1999), 10 pages. |
“Meeting Center Using Video in Your Meetings”; Retrieved at <<http://www.oucs.ox.ac.uk/webex/Windows/Video.pdf>>, May 13, 2009, 2 pgs. |
“Meeting Management Software”, Thinking Faster: Ideas, tools and processes to improve personal, workgroup and enterprise productivity and innovation; Retrieved from: <http://workingsmarter.typepad.com/myweblog/2004/12/meeting_managem.html> on Oct. 11, 2010, (Dec. 10, 2004), 2 pages. |
“Microsoft Office Communicator 2007 Getting Started Guide”, retrieved from http://www.ittdublin.ie/media/Media,22233,en.pdf (Jul. 2007), 77 pages. |
“Microsoft® Office Live Meeting Feature Guide”, Microsoft Corporation, Available at <http://download.microsoft.com/download/8/0/3/803f9ba6-5e12-4b40-84d9-d8a91073e3dc/LiveMeeting.doc>, (Jan. 2005), 17 pgs. |
Mitrovic, Nikola et al., “Adaptive User Interface for Mobile Devices”, retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.140.4996&rep=rep1&type=pdf, (2002), 15 pages. |
Moran et al., “Tailorable Domain Objects as Meeting Tools for an Electronic Whiteboard”—Published Date: 1998, http://www.fxpal.com/people/chiu/paper-mvc-CSCW98.pdf, 10 pgs. |
“Online Calendar & Group Scheduling”, MOSAIC Technologies, retrieved from http://www.webexone.com/Brandded/ID.asp?brandid=2348&pg=20AppCalendar on Apr. 29, 2009, 4 pgs. |
Peddemors, A.J.H. et al., “Presence, Location and Instant Messaging in a Context-Aware Application Framework”, retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.98.3321&rep=rep1&type=pdf; 4th International Conference on Mobile Data Management MDM (2003), 6 pages. |
“The Platinum Experience of Collaboration—CollaboratorHYPERMAX”, Retrieved Date: Jul. 16, 2010, http://www.businessoctane.com/group_telepresence.php, 7 pgs. |
Photodex Corporation; “ProShow Producer Feature Overview”; http://www.photodex.com/products/producer/features.html; 2008; 2 Pgs. |
Rudnicky, Alexander I., et al., “Intelligently Integrating Information from Speech and Vision to Perform Light-weight Meeting Understanding”, retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.126.1733&rep=rep1&type=pdf, (Oct. 2005), 6 pages. |
Shaw, “Create Pan and Zoom Effects in PowerPoint”, 2007, Microsoft Corporation, 10 pages. |
Thomas, “Through-Walls Collaboration”—Published Date: 2009, http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5165559, 8 pgs. |
Watson, Richard “What is Mobile Presence?”, Retrieved from http://reseller.tmcnet.com/topics/unified-communications/articles/54033-what-mobile-presence.htm, (Apr. 10, 2009), 4 pages. |
Wempen, F., “PowerPoint 2007 Bible”; Feb. 27, 2007, John Wiley & Sons, 27 pgs. Excerpt. |
Weverka, “PowerPoint 2007 All-in-One Desk Reference for Dummies” Jan. 2007, Published by Wiley Publishing, 8 pgs. |
Yu, Shoou-Jong et al., “Who Said What When? Capturing Important Moments of a Meeting”, retrieved from http://repository.cmu.edu/cgi/viewcontent.cgi?article=1003&context=silicon_valley; Technical Report, (Apr. 10-15, 2010), 7 pages. |
Zenghong, Wu et al., “Context Awareness and Modeling in Self-Adaptive Geo-Information Visualization”, retrieved from http://icaci.org/documents/ICC_proceedings/ICC2009/html/refer/17_1.pdf on Aug. 30, 2010, 13 pages. |
ZuiPrezi Ltd.; “ZuiPrezi Nonlinear Presentation Editor”; http://zuiprezi.kibu.hu/; 2007; 2 Pgs. |
Office Action dated Aug. 12, 2013, in U.S. Appl. No. 13/272,832. |
Office Action dated Dec. 30, 2013, in U.S. Appl. No. 13/272,832. |
Chinese Fifth Office Action dated May 30, 2014 in Appln No. 200980131157.5, 9 pgs. |
Office Action dated Jun. 5, 2014, in U.S. Appl. No. 12/965,965. |
Office Action dated Jul. 17, 2014, in U.S. Appl. No. 12/968,332. |
Office Action dated Jul. 18, 2014, in U.S. Appl. No. 14/225,234. |
Office Action dated Jul. 31, 2014, in U.S. Appl. No. 12/473,206. |
Office Action dated Aug. 11, 2014, in U.S. Appl. No. 12/184,174, 50 pgs. |
Office Action dated Aug. 14, 2014, in U.S. Appl. No. 13/253,886, 17 pgs. |
Chinese Office Action dated Nov. 2, 2014 in Appln No. 201210376181.9, 16 pgs. |
Office Action dated Sep. 16, 2014, in U.S. Appl. No. 12/472,101, 14 pgs. |
Office Action dated Oct. 31, 2014, in U.S. Appl. No. 13/272,832, 17 pgs. |
J. Ambrose Little, High-End Business Intelligence with Data Visualization for WPF 4, Published Jun. 29, 2010, http://www.codeproject.com/KB/showcase/DataVisualizationWPF4.aspx, 7 pgs. |
John Nelson, Just Around the Corner: Visual Fusion 4.5, Published Sep. 30, 2009, http://www.idvsolutions.com/press_newsletter_vfx45_silverlight.aspx, 6 pgs. |
Hewagamage, et al.,Interactive Visualization of Spatiotemporal Patterns Using Spirals on a Geographical Map, Published 1999, http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=00795916, 8 pgs. |
Visualize and Map SalesForce Leads with SpatialKey, Retrieved Jul. 19, 2010, http://www.spatialkey.com/support/tutorials/visualize-and-map-salesforce-leads-with-spatialkey-part-ii/, 8 pgs. |
GeoTime, Retrieved Jul. 19, 2010, http://www.geotime.com/Product/GeoTime-(1)/Features---Benefits.aspx, 7 pgs. |
“Microsoft Word's Click and Type Feature”, published by SnipTools, Nov. 12, 2003 downloaded Jun. 28, 2015 from http://sniptools.com/vault/microsoft-words-click-and-type-feature. |
“Second Office Action and Search Report Issued in Chinese Patent Application No. 201210382816.6”, dated Aug. 26, 2015, 14 Pages. |
Karlson, et al., “Courier: A Collaborative Phone-Based File Exchange System”; Technical Report, MSR-TR-2008-05, Jan. 2008. |
Grass Roots Software; “FREEPATH-EDU Nonlinear Presentation Software”;3 pgs. Retrieved May 13, 2008. http://www.fullcompass.com/product/233150.html. |
Werle, et al., “Active Documents Supporting Teamwork in a Ubiquitous Computing Environment”; 4 pgs. In Proceedings of the PCC Workshop (Apr. 3-5, 2001) http://www.pcc.lth.se/events/workshops/2001/pccposters/Werle.pdf. |
Office Action dated Sep. 16, 2015, in U.S. Appl. No. 14/272,832, 17 pgs. |
Chinese Third Office Action dated Feb. 22, 2016 in Appln No. 201210382816.6, 10 pgs. |
Office Action dated Mar. 4, 2016, in U.S. Appl. No. 13/272,832, 27 pgs. |
Chinese Office Action dated Feb. 3, 2015 in Appln No. 201210382816.6, 13 pgs. |
Office Action dated Apr. 20, 2015, in U.S. Appl. No. 14/272,832, 66 pgs. |
Chinese Notice of Allowance dated Nov. 30, 2016, in Appln No. 201210382816.6, 4 pgs. |
Fourth Office Action Issued in Chinese Patent Application No. 201210382816.6, dated Sep. 1, 2016, 6 Pages. |
“The Screen Capture Tool” by Help and Manual, archived Mar. 13, 2006 by the Internet Wayback Machine, downloaded Nov. 28, 2016 from https://web.archive.org/web/20060313150929/http://www.helpandmanual.com/help/help_toc_html?hm_advanced_tools_capture. |
Office Action dated Oct. 4, 2016, in U.S. Appl. No. 13/272,832, 22 pgs. |
Office Action dated Mar. 10, 2017, in U.S. Appl. No. 13/272,832, 18 pgs. |
Number | Date | Country | |
---|---|---|---|
20130091465 A1 | Apr 2013 | US |