A variety of security, monitoring, and control systems equipped with a plurality of cameras and/or sensors have been used to detect various threats, such as intrusions, fire, smoke, flood, etc., at a monitored location (e.g., a home or office). For a non-limiting example, motion detection is often used to detect intruders in vacated homes or buildings, wherein the detection of an intruder may trigger an audible or silent alarm and contact of security personnel. Video monitoring is also used to provide additional information about persons living in, for a non-limiting example, an assisted living facility.
Currently, home or office security monitoring systems can be artificial intelligence (AI)- or machine learning (ML)-driven, processing video and/or audio streams collected from video cameras and/or other sensors to differentiate and detect abnormal activities/events by persons, as distinguished from their normal daily routines, at a monitored location. However, since the video streams often include images and representations of the persons at the monitored location, which may be in private settings such as inside of their homes and/or offices, such video stream-based security monitoring systems may raise privacy concerns with respect to the persons' images and activities in private.
The foregoing examples of the related art and limitations related therewith are intended to be illustrative and not exclusive. Other limitations of the related art will become apparent upon a reading of the specification and a study of the drawings.
Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It is noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
The following disclosure provides many different embodiments, or examples, for implementing different features of the subject matter. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
A new approach is proposed that contemplates systems and methods to support efficient user privacy protection for security monitoring. Under the proposed approach, a privacy mode is deployed to a security monitoring system that captures privacy information of a user (the person being monitored), including but not limited to video, audio, and other privacy information captured during security monitoring. Under the privacy mode, a set of stick figures/skeletons depicting/representing postures of the human body of the user is extracted from a set of still images in a captured video stream. In some embodiments, at least a portion of the human body of the user is pixelized to ensure protection of the user's privacy data while still enabling the security monitoring system to effectively perform its security monitoring functions. In addition, the captured privacy data of the user is securely stored at a local site (e.g., a local database), and boundaries of the user in the images are computed to not only reduce latency of user data processing in real-time security monitoring but also to further ensure privacy of the user.
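The stick-figure extraction described above can be sketched as follows. The joint names, coordinates, and skeleton topology below are illustrative assumptions (a real system would obtain the keypoints from a pose-estimation model), not details taken from the disclosure.

```python
# Sketch: reduce a detected human pose to a stick figure (skeleton).
# SKELETON_EDGES is an assumed joint topology, not from the disclosure.

SKELETON_EDGES = [
    ("head", "neck"), ("neck", "left_shoulder"), ("neck", "right_shoulder"),
    ("left_shoulder", "left_elbow"), ("left_elbow", "left_wrist"),
    ("right_shoulder", "right_elbow"), ("right_elbow", "right_wrist"),
    ("neck", "hip"), ("hip", "left_knee"), ("left_knee", "left_ankle"),
    ("hip", "right_knee"), ("right_knee", "right_ankle"),
]

def extract_stick_figure(keypoints):
    """Convert pose keypoints {joint: (x, y)} into a list of line
    segments ((x1, y1), (x2, y2)) forming the stick figure; joints
    missing from the detection are simply skipped."""
    segments = []
    for a, b in SKELETON_EDGES:
        if a in keypoints and b in keypoints:
            segments.append((keypoints[a], keypoints[b]))
    return segments

# Example: even a partial detection still yields a usable stick figure.
pose = {"head": (50, 10), "neck": (50, 25), "hip": (50, 60),
        "left_shoulder": (35, 27), "right_shoulder": (65, 27)}
figure = extract_stick_figure(pose)
```

Because only line segments between joints are retained, the original pixels of the user's body never need to leave this step.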
Under the proposed approach, body images and other privacy data of the user are uniquely handled to provide the highest privacy for the user in a security monitoring environment, e.g., in elderly care facilities, homes, and/or workplaces (e.g., factories, construction sites, retail shops, offices, public transport, etc.) or other private settings where the privacy of residents, workers, or customers is sensitive and expected to be protected by laws and/or regulations. Specifically, the privacy mode is a novel application deployed for human activity monitoring (specifically in elderly home care) to detect possible abnormalities in the users' activities. In the meantime, the proposed approach ensures that the security monitoring system can still perform its monitoring functions accurately in real time while protecting the user's privacy data.
Although security monitoring systems have been used as non-limiting examples to illustrate the proposed approach to efficient user privacy protection, it is appreciated that the same or similar approach can also be applied to efficient privacy protection in other types of AI-driven systems that utilize a user's privacy data.
In some embodiments, the collected privacy or sensitive information (e.g., images, video, and/or audio) of the users is maintained in a secured local user data database 104, which can be a data cache associated with the user data privacy engine 102, to ensure privacy of the user. For example, the live video stream from the cameras can be stored locally as a video archive file. The data locally maintained in the local user data database 104 can be accessed by the user data privacy engine 102 and/or the human activity detection engine 106 via one or more Application Programming Interfaces (APIs) under strict data access control policies (e.g., accessible only to authorized personnel or devices) to protect the user's privacy. In some embodiments, information retrieved from the local user data database 104 is encrypted before such information is transmitted over a network for processing. The local user data database 104 guarantees that the user being monitored at the location has full control of his/her data, which is particularly important in sensitive or private areas such as a bathroom or a bedroom.
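The strict access-control policy around the local user data database 104 can be sketched as follows; the `LocalUserDataStore` class, its method names, and the client identifiers are hypothetical, and a real deployment would also encrypt records before any network transmission as noted above.

```python
# Sketch of API-gated access to the local user data database: records
# are released only to callers on an access-control list. All names
# below are illustrative assumptions, not from the disclosure.

class LocalUserDataStore:
    def __init__(self, authorized_clients):
        self._records = {}                       # kept local, never replicated
        self._authorized = set(authorized_clients)

    def put(self, record_id, payload):
        self._records[record_id] = payload

    def get(self, record_id, client_id):
        """API entry point: refuse access unless the caller is on the
        access-control list, regardless of whether the record exists."""
        if client_id not in self._authorized:
            raise PermissionError(f"client {client_id!r} is not authorized")
        return self._records.get(record_id)

store = LocalUserDataStore(authorized_clients=["privacy_engine_102"])
store.put("cam1/frame-0001", b"<encrypted video frame>")
frame = store.get("cam1/frame-0001", client_id="privacy_engine_102")
```

Refusing unauthorized callers before even checking whether the record exists avoids leaking which recordings are held.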
In the next step of the approach, the human activity detection engine 106 is configured to accept and match/compare the stick figure extracted by the user data privacy engine 102 from a still image currently taken from the video stream with a stick figure extracted from an image previously taken from the video stream at the same monitored location to identify or recognize an activity of the user. In some embodiments, the human activity detection engine 106 is located remotely from the user data privacy engine 102 and/or the monitored location. In some embodiments, the human activity detection engine 106 is configured to retrieve the stick figures extracted from the current and/or the previous image of the user from the local user data database 104. In some embodiments, the human activity detection engine 106 is configured to determine the probability that the stick figure from the current image matches the stick figure from the previous image by calculating one or more similarity metrics between the two stick figures.
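One plausible matching metric can be sketched as follows; the disclosure's specific metric list is not reproduced here, so the choice below (mean Euclidean distance between corresponding joints, mapped to a probability) and the `scale` normalization are illustrative assumptions.

```python
import math

# Sketch: match probability between two stick figures, computed from
# the mean Euclidean distance between corresponding joints. The metric
# and the scale factor are assumptions, not the disclosure's own list.

def match_probability(fig_a, fig_b, scale=100.0):
    """fig_a, fig_b: {joint: (x, y)}. Only joints present in both
    figures are compared; `scale` normalizes the distance (assumed to
    be roughly the subject's size in pixels)."""
    common = fig_a.keys() & fig_b.keys()
    if not common:
        return 0.0
    mean_dist = sum(math.dist(fig_a[j], fig_b[j]) for j in common) / len(common)
    return max(0.0, 1.0 - mean_dist / scale)

prev = {"head": (50, 10), "neck": (50, 25), "hip": (50, 60)}
curr = {"head": (53, 12), "neck": (52, 26), "hip": (51, 61)}
p = match_probability(prev, curr)   # small movement, probability near 1
```

A high probability indicates the same person continuing the same posture or motion; a low probability suggests a new posture and hence a candidate activity boundary.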
In some embodiments, the human activity detection engine 106 is configured to track and analyze the activity, behavior, and/or movement of the user based on the set of stick figures of the user identified over time. If the human activity detection engine 106 determines that the most recent activity of the user, as represented by the latest set of stick figures, deviates from the user's activity at the same or a similar monitored location in the past, the human activity detection engine 106 is configured to identify the most recent activity of the user as abnormal and to alert an administrator at the monitored location about the recognized abnormal activity. In some embodiments, the human activity detection engine 106 is configured to request or subscribe to information of the user from the local user data database 104 and/or the user data privacy engine 102 directly for tracking and analyzing the activity of the user, wherein the requested or subscribed information includes but is not limited to the video and/or audio stream, still images from the video stream, and stick figures created from the still images. Since the human activity detection engine 106 is configured to train the ML models and to detect human activities by interpreting the stick figures representing the human body of the user, neither the performance nor the functionality of the security monitoring system 100 is compromised by the use of stick figures, whilst the privacy features are provided.
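The deviation test described above can be sketched as follows; the per-frame movement score and the three-standard-deviation threshold are illustrative assumptions, not parameters from the disclosure.

```python
import statistics

# Sketch of the abnormality test: flag the most recent activity when
# its movement score falls outside the range the user has shown at
# this location in the past. The score (e.g., total joint displacement
# per frame) and the threshold k are assumptions.

def is_abnormal(history, latest, k=3.0):
    """history: past movement scores at this monitored location;
    latest: score of the most recent set of stick figures. Returns
    True when `latest` deviates more than k standard deviations
    from the historical mean."""
    mean = statistics.fmean(history)
    std = statistics.pstdev(history)
    if std == 0:
        return latest != mean
    return abs(latest - mean) > k * std

history = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3]   # normal daily movement
normal = is_abnormal(history, 2.2)               # within usual range
alert = is_abnormal(history, 9.5)                # sudden large deviation
```

Only when `alert` is raised would the administrator at the monitored location be notified; the underlying images never need to leave the local site.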
In some cases, the camera generating the video stream may be switched to a “private mode,” which triggers processing of the video stream in private mode, wherein the live video stream is not recorded or shared with the security monitoring system 100. Under such private mode, the user data privacy engine 102 is configured to continue to track the stick figures in the video stream. However, the user data privacy engine 102 takes the last available background image of the monitored location instead of the real image from the actual video stream. The user data privacy engine 102 then draws a stick figure at a specific place and time on top of the background image, and uses different color variations of the stick figures to track and monitor the user at the monitored location. The result is a set of color-coded private-mode images that represent the user in the video stream.
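The private-mode rendering described above can be sketched as follows; the tiny grid standing in for the stored background image and the color cycle per time step are illustrative assumptions.

```python
# Sketch of private-mode rendering: instead of the real video frame,
# the tracked stick figure is painted on a stored background image,
# with a different color per time step. The grid "image" and the
# COLOR_CYCLE are illustrative assumptions.

COLOR_CYCLE = ["red", "green", "blue"]           # one color per time step

def render_private_frame(background, keypoints, t):
    """background: 2D grid of color names; keypoints: {joint: (x, y)}.
    Returns a copy of the background with the stick figure's joints
    painted in the color assigned to time step t."""
    frame = [row[:] for row in background]       # never mutate the background
    color = COLOR_CYCLE[t % len(COLOR_CYCLE)]
    for x, y in keypoints.values():
        frame[y][x] = color
    return frame

background = [["gray"] * 8 for _ in range(8)]
pose_t0 = {"head": (3, 1), "hip": (3, 4)}
frame0 = render_private_frame(background, pose_t0, t=0)
```

Because the real frame is never stored, the output conveys only where and how the user moved, not what the camera actually saw.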
In some embodiments, the user data privacy engine 102 is configured to pixelize the human body of the user in the set of still images taken from the video stream by blurring (e.g., by applying blocks or mosaics over) at least a portion of the human body of the user in the still images frame by frame (e.g., one still image at a time) to further protect the user's privacy and/or identity. Note that the size of the blocks used for pixelization can be varied.
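The block-based pixelization can be sketched as follows on a plain 2D grayscale grid; a production system would operate on real video frames (e.g., via an image-processing library), and, as noted above, the block size is a tunable parameter.

```python
# Sketch of block pixelization (mosaic): every block_size x block_size
# tile is replaced by its mean intensity, destroying detail inside the
# block while preserving the coarse silhouette. Operates on a plain
# 2D grid of grayscale values for illustration.

def pixelize(image, block_size):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block_size):
        for bx in range(0, w, block_size):
            block = [image[y][x]
                     for y in range(by, min(by + block_size, h))
                     for x in range(bx, min(bx + block_size, w))]
            mean = sum(block) // len(block)      # integer mean intensity
            for y in range(by, min(by + block_size, h)):
                for x in range(bx, min(bx + block_size, w)):
                    out[y][x] = mean
    return out

img = [[0, 0, 100, 100],
       [0, 0, 100, 100],
       [50, 50, 200, 200],
       [50, 50, 200, 200]]
mosaic = pixelize(img, block_size=2)
```

Larger block sizes remove more identifying detail at the cost of a coarser silhouette, which is the trade-off behind varying the block size.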
In some embodiments, the user data privacy engine 102 is configured to transform one frame from the video stream at a time for pixelization.
One embodiment may be implemented using a conventional general-purpose or specialized digital computer or microprocessor(s) programmed according to the teachings of the present disclosure, as will be apparent to those skilled in the computer art. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art. The invention may also be implemented by the preparation of integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be readily apparent to those skilled in the art.
The methods and system described herein may be at least partially embodied in the form of computer-implemented processes and apparatus for practicing those processes. The disclosed methods may also be at least partially embodied in the form of tangible, non-transitory machine-readable storage media encoded with computer program code. The media may include, for example, RAMs, ROMs, CD-ROMs, DVD-ROMs, BD-ROMs, hard disk drives, flash memories, or any other non-transitory machine-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the method. The methods may also be at least partially embodied in the form of a computer into which computer program code is loaded and/or executed, such that the computer becomes a special-purpose computer for practicing the methods. When implemented on a general-purpose processor, the computer program code segments configure the processor to create specific logic circuits. The methods may alternatively be at least partially embodied in a digital signal processor formed of application-specific integrated circuits for performing the methods.
This application is a continuation application of United States Patent Application No. PCT/US21/24302, filed Mar. 26, 2021, entitled “System and Method for Efficient Privacy Protection for Security Monitoring,” which claims the benefit of U.S. Provisional Patent Application No. 63/001,844, filed Mar. 30, 2020, both of which are incorporated herein by reference in their entireties.
Number | Date | Country
---|---|---
63001844 | Mar 2020 | US

Relation | Number | Date | Country
---|---|---|---
Parent | PCT/US21/24302 | Mar 2021 | US
Child | 17353210 | | US