The following detailed description of embodiments refers to the accompanying drawings, which illustrate specific embodiments of the invention. Other embodiments having different structures and operations do not depart from the scope of the present invention.
As will be appreciated by one of skill in the art, the present invention may be embodied as a method, system, or computer program product. Accordingly, portions of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
Any suitable computer-usable or computer-readable medium may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
Computer program code for carrying out operations of the present invention may be written in an object oriented programming language such as Java, Smalltalk, C++ or the like. However, the computer program code for carrying out operations of the present invention may also be written in conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The present invention is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The field augmenting device 114 may include a live window 118, or "live feed" portion of a display 120, to present the live or real-time video image 106 of the chosen scene 108. The field augmenting device 114 may also include a support window 122, "office feed," or support portion of the display 120 to present the video image 106′ returned from the support site 112, including any augmentation by the field personnel and any augmentation by the support personnel sent by a support site augmenting device 124 over a network 126.
Augmentation may include at least one of selection of a feature 128 in the video image 106 and association of any attributes 130 with the selected feature 128. Attributes 130 may include, but are not necessarily limited to, any annotations, including text, graphics or the like, highlighting, identifying or otherwise drawing attention to a particular artifact, component or subject of interest in the video image 106. Association may involve assigning, attaching or coupling in some fashion the attribute 130 to the selected component or artifact in the video image 106. As illustrated in the example of
The field augmenting device 114 may include a user interface 132 and/or a touch-screen pen 134 or similar device to permit augmentation of the video image 106 similar to that just described. The touch-screen pen or similar device may be used to contact the display 120 to select features and associate any attributes to the selected features. In at least one embodiment of the present invention, a graphical user interface (GUI) or the like may be presented on the display 120 to facilitate augmenting the video image 106 as described above.
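Selecting a feature by contacting the display with a touch-screen pen or pointing device reduces to a hit test in image coordinates. The following Python sketch illustrates this under the simplifying assumption that each candidate feature is represented by an axis-aligned bounding box; the feature names and coordinates are illustrative, not part of the described system.

```python
from typing import Dict, Optional, Tuple

# Candidate features in the video image, each represented here by an
# axis-aligned bounding box (x, y, width, height) in display
# coordinates -- a simplifying assumption for illustration.
features: Dict[str, Tuple[int, int, int, int]] = {
    "access_panel": (10, 10, 100, 60),
    "fastener": (150, 40, 20, 20),
}

def hit_test(px: int, py: int) -> Optional[str]:
    """Return the feature containing the pen or pointer contact point, if any."""
    for name, (x, y, w, h) in features.items():
        if x <= px < x + w and y <= py < y + h:
            return name
    return None
```

A contact at (155, 45) would select the "fastener" feature, while a contact outside every bounding box selects nothing.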
The field augmenting device 114 may also include a wired or wireless link 136 to a headset 138 or the like for voice or audio communication between a field person and a support person. If the link 136 is wireless, the headset 138 may be a wireless Bluetooth-type headset or the like. In another embodiment of the present invention, a microphone and speaker may be built into the augmenting device 114 to permit audio and voice communications over the network 126 between the field site 110 and the support site 112. The system 100 may use Voice over Internet Protocol (VoIP) or similar technology for either of these embodiments.
The field augmenting device 114 may access the network 126 via a wired connection or as illustrated in the exemplary embodiment of
As previously described, the system 100 may include a support site augmenting device 124 to receive video transmissions over the network 126 from the field augmenting device 114. The video transmissions may include the video image 106″ and any augmentation 144 by the field personnel. The support site augmenting device 124 may include a computing device 146. The computing device 146 may be a desktop PC or similar computer device. An augmentation module 148 may be operable on the computing device 146 to permit augmentation by the support personnel. The augmentation module 148 may also permit sending the video image, including any augmentation by the field personnel and any augmentation by the support personnel, to the field augmenting device 114 over the network 126. An example of a method that may be embodied in the augmentation module 148 will be described with reference to
The support site augmenting device 124 may include a persistence feature 150 to maintain association between the selected feature 128 in the video image 106″ and its associated attribute 130 in response to any movement of the selected feature 128 in the chosen scene 108 or change in perspective of the video camera 104 relative to the chosen scene 108. The persistence feature 150 or some elements of the persistence feature 150 may actually be included in the field augmenting device 114 as will be described in more detail with reference to
The support augmenting device 124 may include a display 152. A field window 154 may be presented in a portion of the display 152 to present the video image 106″ including any augmentation by the field personnel and by the support personnel.
The support augmenting device 124 may further include a user interface 156 to facilitate augmentation of the video image 106″ by the support personnel. The user interface 156 may include a keyboard, a pointing device, such as a mouse or the like, and other apparatus to facilitate augmentation. A graphical user interface 158 or virtual toolbox may be presented in another portion 160 of the display 152 relative to the field window 154 to facilitate augmentation of the video image 106″ by the support personnel. The graphical user interface (GUI) 158 may include features to provide technical support associated with a predetermined kind of equipment or technology, such as a particular type of aircraft. The GUI 158 may include icons or highlighting artifacts that can be dragged and dropped by a computer pointing device to select features in the video image 106″ and to assign or associate attributes with a selected feature. The GUI 158 may include links to technical manuals or the like that may be referenced by support personnel in assisting field personnel on a particular kind of equipment. Augmentation by support personnel may include, but is not necessarily limited to, association of other attributes with the feature or features selected by the field personnel, selection of another feature or features in the video image 106″, and association of any attributes with the other feature or features.
The field augmenting device 200 may include a live window 210 to present the video image of the chosen scene. The live window 210 may be presented in a portion of a display 212. The live window may correspond to the live window 118 or live feed in
The field augmenting device 200 may also include a support window 214 to present a video image including any augmentation by field personnel and any augmentation by support personnel received from a remote support site, such as support site 112 (
A graphical user interface (GUI) 216 may be presented in the display to facilitate augmenting the video image. The GUI 216 may be similar to GUI 158 described above with reference to
The augmenting device 200 may also include a user interface 218 to facilitate augmenting the video image and for controlling operation of the computing device 202. The user interface 218 may include a keyboard, pointing device or the like.
The augmenting device 200 may also include a radio transceiver 220 and a short range radio transceiver 222. The radio transceiver 220 may permit wireless or radio access to a network 224, such as a WLAN or the like. The short range radio transceiver 222 may permit communications with a wireless headset or similar device, such as headset 138 in
The computing device 202 may include a memory 228 and other components 230, such as input devices, output devices or combination input/output devices. The memory 228 may store data, such as data related to the video images, selected features, attributes and other data for later analysis. The other components 230 may include disk drives for inputting and outputting data, reprogramming the field augmenting device 200 or for other purposes.
In blocks 306 and 308, duplex or two-way audio or voice communications may be established between the field site and the support site. The audio communications may be established by any means, such as Voice over Internet Protocol (VoIP), wireless (such as cellular), land line, a combination of these technologies, or other means. The audio communications may also be part of or included in the video signal or data stream.
In block 310, a real-time or live video image of a chosen scene may be acquired. A video camera or similar device may be used to acquire the video image as illustrated in the exemplary system 100 of
In block 312, the real-time video image may be presented in a “live window” or portion of a display of the field augmenting device. In block 314, the video image may be streamed to the support site augmenting device. The video image may also be streamed to other support sites and augmenting devices, if there are multiple sites collaborating for whatever reason, such as training, additional technical support or specific expertise, or for other reasons.
In block 316, the support augmenting device may receive the streaming real-time video image from the field augmenting device. In block 318, the real-time video image may be presented in a “field window” or portion of a display of the support site augmenting device. In block 320, the video image may be streamed back to the field site augmenting device.
In block 322, the video image may be received and presented in a “support window,” office feed, or portion of the display of the field augmenting device similar to that illustrated in the display 120 of the field augmenting device 114 in
In block 326, any attributes may be associated with the selected feature or features in the video image. Associating an attribute with the selected feature may include assigning, coupling or attaching in some way the attribute to the particular feature or artifact in the video image. This may also be considered part of the augmentation process. Associating the attribute with the selected feature may involve contacting the selected feature in a prescribed manner in the video image using a computer pointing device or touch-screen pen, dragging and dropping an icon or symbol from a menu or GUI, or other means commonly used to manipulate items on a computer.
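The relationship between a selected feature and its associated attributes can be sketched as a simple data model. The Python classes below are an illustrative assumption; the class names, fields and example values are not part of the described system.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Attribute:
    """An annotation to be associated with a selected feature."""
    kind: str      # e.g. "text", "highlight", "graphic"
    content: str   # the annotation payload, such as label text

@dataclass
class Feature:
    """A selected artifact or component in the video image."""
    name: str
    anchor: Tuple[int, int]  # position in image coordinates
    attributes: List[Attribute] = field(default_factory=list)

    def associate(self, attribute: Attribute) -> None:
        """Attach (assign, couple) an attribute to this selected feature."""
        self.attributes.append(attribute)

# Example: field personnel select a component and annotate it with a question.
valve = Feature("hydraulic_valve", (120, 80))
valve.associate(Attribute("text", "Is this the leaking fitting?"))
```

Keeping the attributes on the feature object, rather than at fixed screen positions, is what lets the association survive later movement of the feature in the scene.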
In block 328, a persistence feature or function may be operational. As previously described, the persistence feature provides for association between the attribute and selected feature in the video image to be maintained in response to any movement of the subject or artifact in the chosen scene or changes in perspective of the video camera acquiring the image.
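The persistence feature can be sketched as re-anchoring each attribute to its feature's tracked position in every new frame. In the minimal Python sketch below, the per-frame displacements are assumed to come from some feature tracker (real systems might use optical flow or template matching); the names and coordinates are illustrative only.

```python
from typing import Dict, Tuple

# Attribute anchor positions, keyed by selected-feature name, in
# image coordinates.
anchors: Dict[str, Tuple[int, int]] = {"hydraulic_valve": (120, 80)}

def update_anchors(motion: Dict[str, Tuple[int, int]]) -> None:
    """Re-anchor each attribute to its feature's new tracked position.

    `motion` maps each feature to its (dx, dy) displacement between
    frames, as reported by some feature tracker (assumed to exist here).
    """
    for name, (dx, dy) in motion.items():
        if name in anchors:
            x, y = anchors[name]
            anchors[name] = (x + dx, y + dy)

# The selected feature shifts by (15, -4) between frames, for example
# from a change in camera perspective; its annotation follows.
update_anchors({"hydraulic_valve": (15, -4)})
```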
In block 330, the video image and any augmentation by the field personnel may be streamed back to the support site augmenting device. In block 332, the video image and any augmentation may be received by the support site augmenting device. In block 334, the video image and any augmentation by the field personnel may be presented in the field window or portion of the display of the support site augmenting device. In block 336, the video image and any augmentation by field personnel may be streamed back to the field augmenting device.
In block 338, the video image and any augmentation by the field personnel may be received by the field augmenting device. In block 340, the video image and any augmentation by the field personnel may be presented in a “support window,” or office feed portion of the display of the field augmenting device.
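The round trip described in blocks 330 through 340 can be sketched with two one-directional channels. The in-memory queues below stand in for the network streams, purely for illustration; the frame contents and function names are assumptions.

```python
from queue import Queue

# In-memory stand-ins for the two directions of the network stream
# between the field and support sites (illustrative only; the real
# system would stream video over a network).
to_support: Queue = Queue()
to_field: Queue = Queue()

def field_send(frame: dict) -> None:
    """Field device streams a frame plus any field augmentation."""
    to_support.put(frame)

def support_relay() -> dict:
    """Support device receives the frame, presents it, and streams it back."""
    frame = to_support.get()
    to_field.put(frame)
    return frame

# One cycle: field streams an augmented frame, the support site receives
# it and streams it back, and the field device receives the echo.
frame = {"image": "frame_0042", "field_augmentation": ["text: check seal"]}
field_send(frame)
support_relay()
echoed = to_field.get()
```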
In block 342, real-time augmentation by the support personnel of the video image received in block 332 may be enabled. Augmentation by the support personnel may include, but is not necessarily limited to, association of attributes with features selected by field personnel, selection of other features in the video image and association of attributes with those features.
In block 344, a user interface may be used to facilitate augmentation. A GUI or virtual toolbox may be presented in another portion of the display of the support site augmenting device relative to the field window or portion of the display. The GUI may include different tools, icons, links to technical manuals or other features to permit support personnel to provide support or other assistance. The GUI may include icons to generate text blocks to enter labels, descriptions, instructions or similar textual information or graphics. An example of a GUI or toolbox is illustrated in
In block 346, any augmentations may be associated with the selected feature in the video image. Similar to that previously discussed, the association may involve attaching, coupling, assigning or otherwise associating the augmentation with the particular feature of interest in the video image. The association may be made by contacting or touching the selected feature in the video image in a prescribed manner similar to that previously discussed.
In block 348, augmentations by the field personnel may be distinguished from augmentations by the support personnel. Examples of differentiating the augmentations may include different colors, highlighting, different fonts, text blocks with different color background, etc.
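One way to realize block 348 is a per-author style table consulted when each augmentation is rendered. The Python sketch below is illustrative; the particular colors and fonts are assumptions, not choices made by the described system.

```python
# Display styles used to distinguish field augmentations from support
# augmentations; the particular colors and fonts are illustrative.
STYLES = {
    "field": {"color": "yellow", "font": "sans-bold"},
    "support": {"color": "cyan", "font": "sans-italic"},
}

def style_for(author: str) -> dict:
    """Return the rendering style for an augmentation's author."""
    return STYLES.get(author, {"color": "white", "font": "sans"})
```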
In block 350, the support site augmenting device may also include a persistence feature or function to maintain association between assigned attributes and selected features in the event of any movement of the feature or subject of interest in the chosen scene or change in perspective of the camera acquiring the video image.
In block 352, the video image and any augmentations by support personnel and field personnel may be transmitted or streamed to the field site augmenting device. In block 354, the video image including any augmentations may be received by the field augmenting device. The video image on at least the support window or portion of the display may be updated to include the video image and any augmentations by the support personnel and the field personnel. The augmentations of the different personnel may be distinguished similar to that previously discussed.
In block 356, further augmentation, editing of prior augmentations and the like may be performed in the field augmenting device. The method 300 may then return to block 324 and proceed as previously described. Similarly, in block 358, further augmentation, editing of prior augmentations and the like may be performed in the support site augmenting device. The method 300 may then return to block 342 and proceed similar to that previously described.
A field technician may augment the local video stream 408, for example, by annotating the video image, illustrating a question about a part, or the like. The augmented video may be streamed back to the support site 410. A support engineer may see the augmentation about two seconds later and may prepare a response 412 by further augmenting the video image similar to that described with respect to method 300 (
The support team may further mark up or further augment the video in block 418 and the augmented video may be streamed to the field site 420. In block 422, the field technician may see the annotation provided by the support team approximately two seconds after it is streamed from the support site.
The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art appreciate that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown and that the invention has other applications in other environments. This application is intended to cover any adaptations or variations of the present invention. The following claims are in no way intended to limit the scope of the invention to the specific embodiments described herein.