This disclosure relates generally to video collaboration systems, and, more particularly, to methods, apparatus and articles of manufacture to protect sensitive information in video collaboration systems.
Many modern conference and collaboration rooms are equipped with video collaboration equipment. Video collaboration systems enable people at different locations to see each other and have more lively, emotional and/or productive interactions.
Wherever possible, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts. Connecting lines or connectors shown in the various figures presented are intended to represent example functional relationships and/or physical or logical couplings between the various elements.
During video collaborations, people may want and/or need to prevent sensitive information from being exposed to another person. For example, a person may want and/or need to prevent sensitive documents, exposed computer screens, exposed mobile device screens, whiteboards, sketches on whiteboards, etc. from being exposed to another person. In current solutions, users are required to manually start and stop video transmission. Such current solutions are often associated with a poor user experience: some video equipment does not make starting and stopping video user friendly, users may forget to manually stop video transmission, leading to inadvertent information leakage, and meeting productivity may be reduced while video is disabled. Some current solutions are performed off-line, after a person has already seen the stream. Thus, in some current solutions, a person can maliciously edit a video stream by, for example, copying content before sensitive information is obscured.
Disclosed example masking streamers overcome these and other deficiencies of existing solutions. Disclosed example masking streamers protect sensitive information in video collaboration systems by recognizing (e.g., detecting, identifying, etc.) features (e.g., objects, persons, content, etc.) in a video that are associated with sensitive information, and obscuring (e.g., masking, covering, blurring, etc.) those features in the video stream. In some examples, the features are automatically recognized in real-time (e.g., as each frame is captured), and sensitive information is automatically obscured in the video stream in accordance with policies before each frame is streamed, stored, etc., that is, before a frame can be seen by a person. Accordingly, sensitive information in frames can be obscured before the frames are seen by anyone, further reducing the chances of inadvertent exposure of sensitive information. Moreover, because obscuration is performed before the video is seen, the video stream can be protected against malicious editing, e.g., copying of the non-obscured video stream. In some examples, the masking streamer is implemented in hardware, in a trusted execution environment, etc. to prevent tampering with frames, object recognition and/or masking.
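By way of a non-limiting illustration, the per-frame flow described above (recognize features, consult the policies, obscure before the frame leaves the pipeline) can be sketched as follows. The function names, frame layout, feature labels and policy table are assumptions of this sketch, not elements of the disclosure.

```python
# Illustrative sketch of the per-frame masking pipeline described above.
# Frames are modeled as 2D lists of grayscale pixels; the feature labels,
# bounding boxes and policy table are hypothetical examples.

def recognize_features(frame):
    """Stand-in for the analytics engine: return (label, bounding box)
    pairs. A real implementation would run trained detectors on the frame."""
    return [("whiteboard", (1, 1, 3, 3)), ("face", (0, 0, 2, 2))]

def apply_policy(label, policies):
    """Stand-in for the policy enforcer: decide whether to obscure."""
    return policies.get(label, "mask")  # unknown features fail closed

def obscure(frame, box):
    """Stand-in for the masker: black out the box (x0, y0, x1, y1),
    exclusive upper bounds."""
    x0, y0, x1, y1 = box
    for y in range(y0, y1):
        for x in range(x0, x1):
            frame[y][x] = 0
    return frame

def process_frame(frame, policies):
    """Obscure sensitive features before the frame is encoded, stored
    or streamed, i.e., before any person can see it."""
    for label, box in recognize_features(frame):
        if apply_policy(label, policies) == "mask":
            frame = obscure(frame, box)
    return frame
```

Because obscuration happens inside `process_frame`, a frame never reaches the encoder or a viewer with its sensitive regions intact, and unrecognized feature labels default to being masked.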
Reference will now be made in detail to non-limiting examples of this disclosure, examples of which are illustrated in the accompanying drawings. The examples are described below by referring to the drawings.
The example masking streamer 102 of
In the illustrated example, the masking streamer 102 and the video camera 104 are located at a SITE A, and the projector 114 is located at a SITE B. SITE A and SITE B may be any type(s) of sites (e.g., rooms, auditoriums, personal computers, mobile devices, etc.) separated by any geographic distance (e.g., locations in a building, different buildings, different cities, etc.). Data representing the modified video stream 112 may be conveyed using any number and/or type(s) of private and/or public networks 113, computer-readable storage medium or, more generally, any number and/or type(s) of communicative couplings. The data may be conveyed in real-time (e.g., as the video stream 106 is captured) or delayed (e.g., storage on a computer-readable storage medium).
The example media source 106 provides the media 104 and the audio 110 to the example media presentation device 114 using any number and/or type(s) of example private and/or public network(s) 116 or, more generally, any number and/or type(s) of communicative couplings.
In some examples, a computing device 120 (e.g., a personal computer, a mobile device, a tablet, a gaming console, etc.) is communicatively coupled to the masking streamer 102 to provide (e.g., define, etc.) information on what features to identify, and/or what masking policy(-ies) (e.g., mask vs. not mask, blur vs. completely mask, etc.) to apply to the features. In some examples, the masking streamer 102 is integrated with the computing device 120, e.g., in a same housing.
In the illustrated example of
To enforce sensitive information protection policies 216, the example masking streamer 102 includes an example policy enforcer 218. The example policies 216 of
For each feature identified by the example analytics engine 204, the example policy enforcer 218 of
To apply the obscuration, the example masking streamer 102 includes an example masker 222. The example masker 222 of
To encode video, the example masking streamer 102 includes an example video encoder 224. In the example of
To stream video encoded by the example video encoder 224 to, for example, the example projector 114, the example masking streamer 102 includes an example streamer 226. The example streamer 226 streams video in accordance with any number and/or type(s) of video streaming specifications and/or standards.
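One way a masker might obscure a recognized feature, consistent with the notion of combining a frame with an image that at least partially obscures the feature, is a per-pixel blend. The grayscale pixel layout and the alpha map below are assumptions of this sketch, not the disclosed implementation.

```python
# Minimal sketch of obscuring by combining a frame with a masking image:
# where the alpha map is opaque (1.0), the masking image's pixel replaces
# the frame's; intermediate opacities blend the two (partial obscuration).
# Frames are 2D lists of grayscale pixels; the layout is assumed.

def combine(frame, mask_img, alpha_map):
    """Per-pixel blend: out = alpha * mask + (1 - alpha) * frame.
    alpha_map holds opacities in [0.0, 1.0] per pixel."""
    h, w = len(frame), len(frame[0])
    return [
        [round(alpha_map[y][x] * mask_img[y][x]
               + (1 - alpha_map[y][x]) * frame[y][x])
         for x in range(w)]
        for y in range(h)
    ]
```

An alpha of 1.0 yields a solid cover (e.g., a black box), while a fractional alpha yields a partial obscuration of the underlying content.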
Because, in the illustrated example of
While shown separately in
In some examples, a person can use an example user interface 228 to manage the feature list 206 and/or the policies 216. The feature list 206 corresponds to the features that the analytics engine 204 has been trained to detect. Accordingly, the feature list 206 is, in some examples, only modifiable by an administrator. In some examples, an attendee of a video collaboration session can manage the policies 216 on a call-by-call basis based on, for example, the topic of a meeting, attendees, etc. In some examples, only an administrator can manage the policies 216.
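A hypothetical representation of the policies 216 (the feature labels and actions below are illustrative assumptions, not taken from the disclosure) is a mapping from feature label to obscuration action, with per-meeting adjustments layered over administrator defaults:

```python
# Hypothetical policy table: each feature label the analytics engine can
# recognize maps to an action the masker understands.
DEFAULT_POLICIES = {
    "face": "none",             # faces remain visible for collaboration
    "whiteboard": "blur",       # whiteboard content is blurred
    "computer_screen": "mask",  # screens are fully blacked out
}

def decide(label, policies=DEFAULT_POLICIES):
    """Return the action for a recognized feature; unknown labels are
    masked so that unanticipated sensitive content fails closed."""
    return policies.get(label, "mask")

def with_meeting_overrides(base, overrides):
    """Per-meeting policy adjustments (e.g., chosen on a call-by-call
    basis per the meeting topic or attendees) layered over defaults."""
    merged = dict(base)
    merged.update(overrides)
    return merged
```

Restricting who may edit the base table versus the overrides mirrors the administrator-only and attendee-managed configurations described above.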
While an example manner of implementing the masking streamer 102 of
Turning to
Turning to
A flowchart representative of example machine-readable instructions for implementing the masking streamer 102 of
As mentioned above, the example processes of
The example program of
The processor platform 600 of the illustrated example includes a processor 610. The processor 610 of the illustrated example is hardware. For example, the processor 610 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor based (e.g., silicon based) device. In this example, the processor implements the analytics engine 204, the example object recognizer 210, the example person recognizer 212, the example content recognizer, the example policy enforcer 218, the example mask calculator 220, the example masker 222, the example video encoder 224, the example streamer 226, and the example user interface 228.
The processor 610 of the illustrated example includes a local memory 612 (e.g., a cache). The processor 610 of the illustrated example is in communication with a main memory including a volatile memory 614 and a non-volatile memory 616 via a bus 618. The volatile memory 614 may be implemented by Synchronous Dynamic Random-access Memory (SDRAM), Dynamic Random-access Memory (DRAM), RAMBUS® Dynamic Random-access Memory (RDRAM®) and/or any other type of random-access memory device. The non-volatile memory 616 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 614, 616 is controlled by a memory controller (not shown).
The processor platform 600 of the illustrated example also includes an interface circuit 620. The interface circuit 620 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB) interface, a Bluetooth® interface, a near field communication (NFC) interface, and/or a peripheral component interface (PCI) express interface.
In the illustrated example, one or more input devices 622 are connected to the interface circuit 620. The input device(s) 622 permit(s) a user to enter data and/or commands into the processor 610. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 624 are also connected to the interface circuit 620 of the illustrated example. The output devices 624 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-plane switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer, and/or speakers. The interface circuit 620 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 620 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, and/or network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 626 (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, a coaxial cable, a cellular telephone system, a Wi-Fi system, etc.). In some examples of a Wi-Fi system, the interface circuit 620 includes a radio frequency (RF) module, antenna(s), amplifiers, filters, modulators, etc.
The processor platform 600 of the illustrated example also includes one or more mass storage devices 628 for storing software and/or data. Examples of such mass storage devices 628 include floppy disk drives, hard drive disks, CD drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and DVD drives.
Coded instructions 632 including the coded instructions of
From the foregoing, it will be appreciated that example methods, apparatus and articles of manufacture have been disclosed that protect sensitive information in video collaboration systems. It will further be appreciated that the disclosed methods, apparatus and articles of manufacture enhance the operation of video collaboration systems by automatically recognizing, in real-time, features in video frames that may contain sensitive information as each video frame is captured, and automatically obscuring those features in the video frame before the video frame is encoded, stored, streamed, etc. Such systems reduce the opportunity for a person to be exposed to sensitive information in a video frame before it is obscured.
Example methods, apparatus, and articles of manufacture to protect sensitive information in video collaboration systems are disclosed herein. Further examples and combinations thereof include at least the following.
Example 1 is a masking video streamer including:
an analytics engine to recognize a feature in a first frame of a first video stream;
a policy enforcer to apply an obscuration policy to the recognized feature to identify whether to mask the recognized feature; and
a masker to obscure the recognized feature in the first frame to form a second frame in a second video stream.
Example 2 is the masking video streamer of example 1, further including a video camera to capture the first frame of the first video stream, and a streamer to stream the second video stream.
Example 3 is the masking video streamer of example 2, further including a housing, the video camera, the analytics engine, the policy enforcer, the masker, and the streamer implemented in the housing.
Example 4 is the masking video streamer of example 1, wherein the analytics engine, the policy enforcer, and the masker are implemented in a trusted execution environment.
Example 5 is the masking video streamer of example 1, wherein the masker is to obscure the recognized feature by combining the first frame with an image that at least partially obscures the recognized feature.
Example 6 is the masking video streamer of example 1, further including a non-transitory computer-readable storage medium storing a list of features, wherein the analytics engine recognizes the feature based on the list of features.
Example 7 is the masking video streamer of example 1, further including a video encoder to encode the second frame.
Example 8 is the masking video streamer of example 1, wherein the recognized feature includes at least one of a face, a whiteboard, a portion of a whiteboard, a computer screen, a phone screen, or a projection screen.
Example 9 is the masking video streamer of example 1, wherein the policy enforcer includes a mask calculator to determine a portion of the first frame to obscure.
Example 10 is the masking video streamer of example 1, wherein obscuring the recognized feature includes at least one of blurring or a black box.
Example 11 is the masking video streamer of example 1, wherein the analytics engine is to recognize the feature based on a generic object definition.
Example 12 is a method including:
recognizing, by executing an instruction with a processor, a feature in a first image;
querying a policy, by executing an instruction with the processor, to determine whether to obscure the feature;
when the feature is to be obscured, modifying, by executing an instruction with the processor, the first image to obscure the feature in the first image to form a second image; and
sending, by executing an instruction with the processor, the second image for playback by a projector.
Example 13 is the method of example 12, wherein modifying the first image to obscure the feature includes combining the first image and a third image, the third image including a portion that obscures the feature.
Example 14 is the method of example 12, wherein modifying the first image to obscure the feature includes combining the first image with an image that at least partially obscures the feature.
Example 15 is the method of example 12, wherein modifying the first image to obscure the feature includes encoding the second image.
Example 16 is the method of example 12, wherein modifying the first image to obscure the feature includes determining a portion of the first image to obscure.
Example 17 is the method of example 12, wherein recognizing the feature in the first image is based on a generic object definition.
Example 18 is a non-transitory computer-readable storage medium comprising instructions that, when executed, cause a machine to perform operations including:
recognizing, by executing an instruction with a processor, a feature in a first image;
querying a policy, by executing an instruction with the processor, to determine whether to obscure the feature;
when the feature is to be obscured, modifying, by executing an instruction with the processor, the first image to obscure the feature in the first image to form a second image; and
sending, by executing an instruction with the processor, the second image for playback by a projector.
Example 19 is the non-transitory computer-readable storage medium of example 18, wherein the operations further include recognizing the feature in the first image based on a generic object definition.
Example 20 is the non-transitory computer-readable storage medium of example 18, wherein the operations further include modifying the first image to obscure the feature by combining the first image with an image that at least partially obscures the feature.
Example 21 is a masking video streamer including:
an analytics engine to recognize a feature in a first frame of a first video stream;
a policy enforcer to apply an obscuration policy to the recognized feature to identify whether to mask the recognized feature; and
a masker to obscure the recognized feature in the first frame to form a second frame in a second video stream.
Example 22 is the masking video streamer of example 21, further including:
a video camera to capture the first frame of the first video stream; and
a streamer to stream the second video stream.
Example 23 is the masking video streamer of example 22, further including a housing, the video camera, the analytics engine, the policy enforcer, the masker, and the streamer implemented in the housing.
Example 24 is the masking video streamer of any of examples 21 to 23, wherein the analytics engine, the policy enforcer, and the masker are implemented in a trusted execution environment.
Example 25 is the masking video streamer of any of examples 21 to 24, wherein the masker is to obscure the recognized feature by combining the first frame with an image that at least partially obscures the recognized feature.
Example 26 is the masking video streamer of any of examples 21 to 25, further including a non-transitory computer-readable storage medium storing a list of features, wherein the analytics engine recognizes the feature based on the list of features.
Example 27 is the masking video streamer of any of examples 21 to 26, further including a video encoder to encode the second frame.
Example 28 is the masking video streamer of any of examples 21 to 27, wherein the recognized feature includes at least one of a face, a whiteboard, a portion of a whiteboard, a computer screen, a phone screen, or a projection screen.
Example 29 is the masking video streamer of any of examples 21 to 28, wherein the policy enforcer includes a mask calculator to determine a portion of the first frame to obscure.
Example 30 is the masking video streamer of any of examples 21 to 29, wherein the analytics engine is to recognize the feature based on a generic object definition.
Example 31 is a method including:
recognizing, by executing an instruction with a processor, a feature in a first image;
querying a policy, by executing an instruction with the processor, to determine whether to obscure the feature;
when the feature is to be obscured, modifying, by executing an instruction with the processor, the first image to obscure the feature in the first image to form a second image; and
sending, by executing an instruction with the processor, the second image for playback by a projector.
Example 32 is the method of example 31, wherein modifying the first image to obscure the feature includes combining the first image and a third image, the third image including a portion that obscures the feature.
Example 33 is the method of any of examples 31 to 32, wherein modifying the first image to obscure the feature includes combining the first image with an image that at least partially obscures the feature.
Example 34 is the method of any of examples 31 to 33, wherein modifying the first image to obscure the feature includes encoding the second image.
Example 35 is the method of any of examples 31 to 34, wherein modifying the first image to obscure the feature includes determining a portion of the first image to obscure.
Example 36 is a non-transitory computer-readable storage medium comprising instructions that, when executed, cause a computer processor to perform the method of any of examples 31 to 35.
Example 37 is a system including:
means for recognizing, by executing an instruction with a processor, a feature in a first image;
means for querying a policy, by executing an instruction with the processor, to determine whether to obscure the feature;
means for, when the feature is to be obscured, modifying, by executing an instruction with the processor, the first image to obscure the feature in the first image to form a second image; and
means for sending, by executing an instruction with the processor, the second image for playback by a projector.
Example 38 is the system of example 37, wherein the means for modifying the first image to obscure the feature combines the first image and a third image, the third image including a portion that obscures the feature.
Example 39 is the system of any of examples 37 to 38, wherein the means for modifying the first image to obscure the feature combines the first image with an image that at least partially obscures the feature.
Example 40 is the system of any of examples 37 to 39, wherein the means for modifying the first image to obscure the feature encodes the second image.
Example 41 is the system of any of examples 37 to 40, wherein the means for modifying the first image to obscure the feature determines a portion of the first image to obscure.
Example 42 is the system of any of examples 37 to 41, wherein the means for recognizing the feature in the first image recognizes the feature based on a generic object definition.
“Including” and “comprising” (and all forms and tenses thereof) are used herein to be open ended terms. Thus, whenever a claim lists anything following any form of “include” or “comprise” (e.g., comprises, includes, comprising, including, has, etc.), it is to be understood that additional elements, terms, etc. may be present without falling outside the scope of the corresponding claim. As used herein, when the phrase “at least” is used as the transition term in a preamble of a claim, it is open-ended in the same manner as the term “comprising” and “including” are open ended. Conjunctions such as “and,” “or,” and “and/or” are inclusive unless the context clearly dictates otherwise. For example, “A and/or B” includes A alone, B alone, and A with B. In this specification and the appended claims, the singular forms “a,” “an” and “the” do not exclude the plural reference unless the context clearly dictates otherwise.
Any references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
Although certain example methods, apparatus and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.