The present invention relates generally to the field of computing, human-machine interfacing, surveillance, security, media and automated control systems. More specifically, the present invention relates to methods, systems, apparatuses, circuits and associated computer executable code for providing video based subject characterization, categorization, identification, tracking, monitoring and/or presence response.
Video based observation of human subjects (subjects) dates back to the 1940s. Computer vision is a field that includes methods for acquiring, processing, analyzing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. Image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Applications of computer vision range from tasks such as industrial machine vision systems which can inspect bottles speeding by on a production line, to research into artificial intelligence and computers or robots that can comprehend the world around them. Computer vision and/or derivatives thereof may also be used as part of video based human machine interfacing systems which provide users the ability to control or interact with computerized devices by gesturing while in line-of-sight of a video camera associated with the computerized device. Computer vision and/or derivatives thereof may further be used as part of video surveillance based security systems able to identify individuals and optionally to track and/or characterize their activity within a video feed or recording.
Computerized and dynamic control of various aspects and devices associated with a home, premises, facility, perimeter and/or any type of location is desirable. It is further desirable to provide subject specific computerized or automated control of various aspects and devices associated with a home, premises, facility, perimeter and/or any other type of location, such that control of the device or aspect is responsive to an identity or characteristic of a subject or subjects at or in proximity of the home, premises, facility, perimeter and/or any other type of location. It is yet further desirable to provide subject specific computerized or automated control of various security devices associated with a home, premises, facility, perimeter and/or any type of location, such that control or state of the security device is responsive to an identity or characteristic of a subject or subjects at or in proximity of the home, premises, facility, perimeter and/or any other type of location. It is also desirable to monitor and track subjects efficiently in areas where large volumes of people are present and in motion.
Permission to access a resource is called authorization. The act of accessing may mean consuming, entering, or using the resource.
Access control is the selective restriction of access to a place or other resource. This might be physical, as with doors, gates, etc. or logical when accessing virtual resources. Locks and login credentials are two analogous mechanisms of access control.
There are three types (factors) of authenticating information:
Something the user knows, e.g. a password, pass-phrase or PIN.
Something the user has, such as a smart card.
Something the user is, such as a fingerprint, verified by biometric measurement.
Behavioral Analytics is a subset of business analytics that focuses on how and why users of eCommerce platforms, online games, and web applications behave. Behavioral analytics utilizes user data captured by analytic platforms while the web application, game, or website is in use. The data points are then compiled and analyzed, looking at the timeline progression from when a user first entered the platform. Behavioral analysis allows future actions and trends to be predicted based on all the data collected.
A behavioral biometric is based on a behavioral trait of an individual. Examples of behavioral biometrics may include speech patterns, signatures, keystrokes, gait, facial movements (to an extent), etc. Contrary to physical biometrics, a behavioral biometric considers not only the physical traits of the subject, but also the way the subject uses his or her body to complete different tasks.
Biometrics (or biometric authentication) refers to the identification of humans by their characteristics or traits. Biometric identifiers are the distinctive, measurable characteristics used to label and describe individuals. Biometric identifiers are often categorized as physiological versus behavioral characteristics.
Biometric parameters are the values of measured physical characteristics of an individual. Biometric parameters can be divided into Static Biometric parameters and Dynamic Biometric parameters, wherein static biometric parameters are values of dimensions or other physical characteristics of physical elements of the individual unrelated to movement (e.g. height, weight, eye color, hand size and shape, etc.), whereas dynamic biometric parameters are values of dimensions, movements and ratios of movements of physical elements of an individual during performance of different physical movements (e.g. walking speed, ratio of limb movements when walking, speed of arm movement when throwing a rock, etc.).
An eigenface is one of a set of eigenvectors used in the computer vision problem of human face recognition. Eigenfaces are used in biometric identification and classification of human subjects.
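As an illustration of how such a set of eigenvectors may be used, the minimal sketch below projects a face (here a toy 4-pixel image) onto a hypothetical, precomputed mean face and eigenface basis to obtain a compact descriptor, and compares descriptors by Euclidean distance. The mean face and eigenvector values are placeholder assumptions; in practice they are obtained by principal component analysis over a training set of face images.

```python
from math import sqrt

# Toy illustration: each "face" is a flattened 4-pixel grayscale image.
# MEAN_FACE and EIGENFACES are hypothetical placeholders standing in for
# values learned by PCA over a real face training set.
MEAN_FACE = [0.5, 0.5, 0.5, 0.5]
EIGENFACES = [
    [0.5, 0.5, -0.5, -0.5],   # hypothetical 1st eigenvector
    [0.5, -0.5, 0.5, -0.5],   # hypothetical 2nd eigenvector
]

def project(face):
    """Project a face onto the eigenface basis -> compact descriptor."""
    centered = [p - m for p, m in zip(face, MEAN_FACE)]
    return [sum(c * e for c, e in zip(centered, ef)) for ef in EIGENFACES]

def descriptor_distance(a, b):
    """Euclidean distance between two eigenface descriptors."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

probe = project([0.9, 0.8, 0.2, 0.1])
reference = project([0.85, 0.8, 0.25, 0.1])   # same face, slightly changed
stranger = project([0.1, 0.9, 0.1, 0.9])      # an unrelated face

# The probe lies much closer to its own reference than to the stranger,
# which is the basis for eigenface identification.
```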
Electronic access control systems are systems using one or more technologies for managing, granting and recording access authorizations to secured resources. Technologies which may be found in such systems include biometrics, Radio Frequency ID (RFID), secured keypads, etc.
Extreme Motion is a unique motion capture engine, developed by Extreme Reality ltd., which extracts the 3D position of the user in front of a regular camera in every frame and creates a real time 3D model of the user, represented by its joints' XYZ coordinates. This model is then analyzed and gestures are extracted according to skeleton position and/or trajectories. Extreme Motion is the core technology behind the entire line of products offered by Extreme Reality ltd., including those discussed in this document.
Gait is defined as a person's manner of walking.
Multi-factor authentication is a security approach in which more than one form of authentication is used, such as something you know (a password), something you have (a smart card), and something you are (a biometric). The combination of these factors provides a high degree of security and convenience, which helps ensure confidentiality of personal information. This is superior to traditional passwords/PINs, which are easily guessed, forgotten or copied. Multi-factor authentication may also include biometric technology that uses biological characteristics or features, which are inseparable from a person, thereby reducing the threat of loss or theft.
Multimodal biometric systems use multiple biometric inputs to overcome the limitations of uni-modal biometric systems, such as noisy data, intra-class variations, restricted degrees of freedom, non-universality, spoof attacks, and unacceptable error rates. Multimodal biometric systems address some of these problems by consolidating the evidence obtained from different sources of information, which may be multiple sensors for the same biometric, multiple instances of the same biometric, multiple snapshots of the same biometric, multiple representations and matching algorithms for the same biometric, or multiple biometric traits.
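One common way to consolidate evidence from multiple sources is score-level fusion, sketched minimally below. Each matcher is assumed to return a similarity score in [0, 1]; the modality names, weights, and acceptance threshold are hypothetical values that would normally be tuned on validation data.

```python
# Hypothetical per-modality weights for weighted-sum score fusion.
MATCHER_WEIGHTS = {"gait": 0.5, "face": 0.3, "height": 0.2}

def fuse_scores(scores, weights=MATCHER_WEIGHTS):
    """Weighted-sum fusion of per-modality similarity scores.

    Normalizes by the weight of the modalities actually present, so a
    missing modality (e.g. face occluded) does not penalize the subject.
    """
    total_weight = sum(weights[m] for m in scores)
    return sum(weights[m] * s for m, s in scores.items()) / total_weight

def accept(scores, threshold=0.7):
    """Accept the identity claim if the fused score clears the threshold."""
    return fuse_scores(scores) >= threshold

# A strong gait match can compensate for a mediocre face match:
decision = accept({"gait": 0.95, "face": 0.6, "height": 0.8})
```

The normalization step is one simple way to handle the partial-evidence case; real systems may instead use likelihood-ratio or rank-level fusion.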
A physical biometric is based on a physical trait of an individual. Examples of physical biometrics include fingerprints, hand geometry, retinal scans, and DNA. Contrary to behavioral biometrics, the physical approach considers only physical traits of the subject and not the way they are used.
Video surveillance is observation from a distance by means of electronic equipment (such as CCTV cameras) for the purpose of monitoring behavior, activities, or other changing information, usually of people, in order to influence, manage, direct, or protect them. Surveillance can be the observation of individuals or groups by government or commercial organizations, as well as for domestic use.
The present invention includes methods, systems, apparatuses, circuits and associated computer executable code for providing video based subject characterization, categorization, identification, recognition, tracking, monitoring, authentication and/or presence response. According to some embodiments, there may be provided one or more Image Based Biometric Extrapolation (IBBE) methods, systems and apparatuses adapted to extrapolate static and/or dynamic biometric parameters of one or more subjects, from one or more images or video segments including the subjects. According to some embodiments, extrapolated biometric parameters of subjects may be used to identify, recognize, track/monitor and/or authenticate the subjects. According to further embodiments, extrapolated biometric parameters may be used to determine physical positions of subjects and may further be used to identify one or more subjects exhibiting suspicious behavior based on their physical positions.
According to some embodiments, biometric parameters may be extrapolated from images and/or video segments. The images and/or video segments may be acquired by one or more native image sensors of the system and/or may be received from third parties. Accordingly, there may be provided one or more video/image acquisition sub-systems which may include cameras, video circuits and ports, and which may be adapted to capture one or more images/videos of subjects (e.g. person, adult, child, etc.). Furthermore, there may be provided one or more interfaces adapted to receive images and/or video data from third party image/video sensors and/or image/video processing systems. Third party data may be analyzed in conjunction with, or separately from, image/video data received from native video/image acquisition sub-systems. The one or more video acquisition sub-systems may be integral or otherwise functionally associated with a video analytics sub-system, which video analytics sub-system may include various image processing modules, logic and circuits, and which may be adapted to extract, extrapolate or estimate from the images/videos biological parameters of subjects appearing in the images/video, static and/or dynamic, such as: height, width, volume, limb sizes, ratio of limb sizes, ratio of limb to torso sizes, limb/head shape, heights of different body parts during motion (e.g. max/min head heights when walking) or change of heights during motion, angular or distantial range of motion between different limbs and limb parts, angular or distantial range of motion between limb parts and torso or other body parts, head size, head movements, movement parameters of body parts when performing certain movements (e.g. parameters of hand motions when waving hello), hair color and configuration, facial features, shape and dimensions of specific body parts (e.g. ear shape and/or dimensions), distinguishing marks, clothing type and arrangement, unique wearable accessories, and so on. The extracted subject features may be analyzed and characterized/quantified to generate one or more subject indicators or subject biometric parameters which may include biometric parameters indicative of the subject's: (1) age, (2) gender, (3) ethnicity, (4) demographic, (5) physical attributes, (6) motion attributes, (7) physical condition, (8) mental state, (9) behavioral characteristics, (10) current intention(s) and/or (11) any other biometric parameters. Some indicators/biometric-parameters or sets of indicators/biometric-parameters derived from dynamic (motion) features of the subject may include body part motion profiles, combinations of body part motion profiles and derivatives thereof, including motion frequency coefficients and amplitudes.
According to some embodiments, biological parameters of subjects may be extrapolated from images by first correlating one or more skeletal models (2D or 3D) to the subjects. Based on a correlation of a skeletal model, dimensions and relations of body elements may be extrapolated. Further, based on a correlated skeletal model of a subject, body elements of a subject may be tracked to extrapolate motion parameters of the body elements and relations between them. A description of methods and systems for correlating skeletal models to subjects appearing in images is provided in U.S. Pat. No. 8,114,172, titled “SYSTEM AND METHOD FOR 3D SPACE-DIMENSION BASED IMAGE PROCESSING”, which is hereby incorporated into the present application in its entirety. Further, a correlation of a skeletal model to a subject in one or more images of a sequence may then be used to correlate the model to previous/future images in the sequence. According to further embodiments, skeletal model correlation and tracking may be joint based. In other words, a skeletal model may be correlated to an individual by correlating the locations and angles of the joints of the model to the joints of the individual. Accordingly, tracking of body parts of an individual and their motion may be performed by tracking the locations, positions and angles of the joints during the motion. Further, in relation to the description below, it should be understood that all descriptions relating to parameters of body parts and their motions may equally relate to parameters of joints and their motion, such that a portion or all of the analysis described herein may be performed entirely or partially based on joint identification, modeling and tracking alone.
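A minimal sketch of deriving static biometric parameters from a correlated skeletal model follows. The joint names and 2D coordinates are hypothetical illustrative values; a real system would obtain 2D or 3D joint positions per frame from the skeletal-model correlation step described above, and the same arithmetic extends directly to 3D.

```python
from math import dist

# Hypothetical joint positions of one correlated skeletal model,
# as (x, y) coordinates in metres.
skeleton = {
    "head":     (0.00, 1.75),
    "shoulder": (0.00, 1.50),
    "elbow":    (0.05, 1.20),
    "wrist":    (0.08, 0.95),
    "hip":      (0.00, 1.00),
    "knee":     (0.02, 0.55),
    "ankle":    (0.03, 0.10),
}

def limb_length(skel, a, b):
    """Euclidean distance between two joints of the model."""
    return dist(skel[a], skel[b])

def static_parameters(skel):
    """Dimensions and limb-to-torso ratios, per the description above."""
    upper_arm = limb_length(skel, "shoulder", "elbow")
    forearm   = limb_length(skel, "elbow", "wrist")
    torso     = limb_length(skel, "shoulder", "hip")
    leg       = limb_length(skel, "hip", "knee") + limb_length(skel, "knee", "ankle")
    return {
        "height":       skel["head"][1] - skel["ankle"][1],
        "arm_length":   upper_arm + forearm,
        "arm_to_torso": (upper_arm + forearm) / torso,
        "leg_to_torso": leg / torso,
    }

params = static_parameters(skeleton)
```

Because these ratios depend only on relative joint positions, they are largely invariant to the subject's distance from the camera once the model is correlated.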
A Subject Identification/Recognition/Categorization/Tracking/Monitoring/Authentication Sub-System may comprise modules, logic and circuits adapted to receive the extracted subject indicators/biometric-parameters, access an indicators/biometric-parameter reference database of subject indicators/biometric-parameters, and identify and/or categorize the subject. The identification/categorization sub-system may attempt to correlate the received subject indicators/biometric-parameters with reference subject/category indicators/biometric-parameters in the indicators/biometric-parameter reference database and/or may store received subject indicators/biometric-parameters for future reference. The reference subject indicators/biometric-parameters stored in the reference database may be associated with either a specific subject (e.g. an individual by the name of Dor Givon) or with one or more groups, types or categories of subjects (e.g. Male, Asian, Adult, Men who previously passed by entrance X, etc.).
Correlation between a set of received subject indicators/biometric-parameters and reference indicators/biometric-parameters stored on the reference database may be absolute or partial. In certain situations, the set of subject indicators/biometric-parameters received for a subject (individual) being tracked may be smaller than the set of reference indicators/biometric-parameters stored for that subject in the reference database. In such an event, a partial match between the received indicators/biometric-parameters and the reference indicators/biometric-parameters may suffice for an identification, authentication, monitoring or categorization (optionally: only as long as the match between corresponding indicators is sufficiently high). In other cases, the set of subject indicators/biometric-parameters received for a subject (individual) being tracked may be larger than the set of reference indicators/biometric-parameters stored for that subject in the reference database. In such an event, once a match between the received indicators/biometric-parameters and the reference indicators/biometric-parameters is made (optionally: only when a match between corresponding indicators/biometric-parameters is sufficiently high) and the individual subject is identified, that subject's records in the reference indicators/biometric-parameters database may be updated to include the new and/or updated subject indicators/biometric-parameters received from the analytics sub-system. According to further embodiments, certain types of indicators/biometric-parameters may receive different weights in correlation, such that, for example, a correlated physical height indicator/biometric-parameter may be factored more heavily than a clothing related indicator/biometric-parameter.
Furthermore, one or more indicator/biometric-parameter type correlations may allow for deviations in parameters which will still be considered matching in degrees which may vary from indicator/biometric-parameter type to indicator/biometric-parameter type (for example, deviations in walking speed of less than 10% may still be considered matches while deviations of 10% in height will not). Yet further, allowed deviations in parameters may vary from individual to individual, possibly based on a degree of deviation determined when obtaining a reference profile for the individual. For example, it may be determined that individual x varies the degree of motion of his arms as much as 20% during walking whereas individual y only varies the degree of motion of his arms 2% or not at all. Accordingly, individual x's reference biometric profile may allow deviations of 20% in arm motion, whereas individual y's reference biometric profile may only allow deviations of 2%. Allowed deviations may also be situation based. According to some embodiments, multi-factor biometric identification/authentication may be performed by an IBBE system, i.e. a set of different types of biometric parameters of a subject may be compared to a set of different types of reference biometric parameters in a stored reference biometric profile. In such cases, different types of biometric parameters may receive different weights in evaluating a match between the subject parameters and a reference profile. Further, the weights assigned to different biometric parameter types may be situation/circumstance dependent. For example, in low lighting conditions the value of color based parameters may be reduced (as color may deviate in poor lighting) or in extremely cold conditions the value of a walking speed parameter may be reduced (as people tend to change their walking speed in extreme cold).
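The weighted, tolerance-aware, partial matching just described can be sketched minimally as follows. The parameter names, weights, and allowed relative deviations are hypothetical; per the text above, tolerances may instead be stored per individual in the reference profile, or adjusted per situation.

```python
# Hypothetical per-parameter-type weights and allowed relative deviations.
WEIGHTS    = {"height": 3.0, "walking_speed": 2.0, "clothing_color": 1.0}
TOLERANCES = {"height": 0.02, "walking_speed": 0.10, "clothing_color": 0.30}

def parameter_matches(name, observed, reference, tolerances=TOLERANCES):
    """A parameter matches if it deviates from the reference by no more
    than its type-specific allowed relative deviation."""
    return abs(observed - reference) <= tolerances[name] * abs(reference)

def match_score(observed, reference):
    """Weighted fraction of matching parameters, computed only over
    parameters present in both sets (supporting partial matches)."""
    common = observed.keys() & reference.keys()
    total = sum(WEIGHTS[n] for n in common)
    hit = sum(WEIGHTS[n] for n in common
              if parameter_matches(n, observed[n], reference[n]))
    return hit / total if total else 0.0

ref = {"height": 1.80, "walking_speed": 1.40, "clothing_color": 0.5}
obs = {"height": 1.81, "walking_speed": 1.47}   # partial set, both in tolerance
score = match_score(obs, ref)
```

A deviation of 5% in walking speed still matches here, while the same relative deviation in height would not, reflecting the differing tolerances per indicator type.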
According to some embodiments, correlation of subject indicators/biometric-parameters to reference indicators/biometric-parameters may be performed in a multistage process, wherein one or more indicators/parameters may be used to first eliminate and/or identify a group of possible matches, such that correlation based on other parameters is performed in relation to a smaller group of reference indicators/biometric-parameters. For example, a first stage of correlation may comprise correlation of static biometric parameters, and a second stage correlation of dynamic parameters between the subject indicators/parameters and the group of reference indicators/parameters correlated to the subject indicators/parameters based on the static parameters.
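The multistage process above can be sketched as a two-stage filter: static parameters first prune the reference database to a small candidate group, and dynamic parameters are then compared only against that group. All subject identifiers, parameter values, and thresholds below are hypothetical.

```python
# Hypothetical reference database of static + dynamic parameters.
reference_db = {
    "subject_a": {"height": 1.80, "stride_frequency": 1.9},
    "subject_b": {"height": 1.79, "stride_frequency": 1.6},
    "subject_c": {"height": 1.60, "stride_frequency": 1.9},
}

def stage1_static(observed_height, db, tol=0.03):
    """Stage 1: keep only references whose height is within tol metres."""
    return {sid: ref for sid, ref in db.items()
            if abs(ref["height"] - observed_height) <= tol}

def stage2_dynamic(observed_freq, candidates, tol=0.1):
    """Stage 2: among the static candidates, match on stride frequency."""
    return [sid for sid, ref in candidates.items()
            if abs(ref["stride_frequency"] - observed_freq) <= tol]

candidates = stage1_static(1.80, reference_db)   # eliminates subject_c
matches = stage2_dynamic(1.9, candidates)        # leaves only subject_a
```

The benefit is that the (typically more expensive) dynamic comparison runs against a much smaller group than the full database.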
According to further embodiments, Subject Identification/Recognition/Categorization/Tracking/Monitoring/Authentication Sub-Systems may interact with other Subject Identification/Recognition/Categorization/Tracking/Monitoring/Authentication Systems to facilitate multi-modal identification/recognition/categorization/tracking/monitoring/authentication, which other systems may be third party systems. For example, a biometric subsystem configured to track subjects based on motion parameters may interact with a facial recognition system to facilitate multi-modal tracking of subjects.
According to some embodiments, visually detected features may be static and/or dynamic features. Any combination of static and/or dynamic features may be acquired and analyzed to estimate a subject indicator or subject biometric parameter. The acquired static/dynamic features or combination thereof may include the subject's: height, width, volume, limb sizes, ratio of limb sizes, ratio of limb to torso sizes, limb/head shape, heights of different body parts during motion (e.g. max/min head heights when walking) or change of heights during motion, angular or distantial range of motion between different limbs and limb parts, angular or distantial range of motion between limb parts and torso or other body parts, head size, head movements, movement parameters of body parts when performing certain movements (e.g. parameters of hand motions when waving hello), hair color and configuration, facial features, shape and dimensions of specific body parts (e.g. ear shape and/or dimensions), distinguishing marks, clothing type and arrangement, unique wearable accessories and so on. Any other visually detectable features or combinations of features known today or to be discovered or devised in the future are applicable to the present invention.
Further embodiments of the present invention may include methods, circuits, apparatuses, systems and associated computer executable code for providing video based surveillance, identification, recognition, monitoring, tracking and/or categorization of individuals based on visually detectable dynamic features of the subject, such as the subject's motion dynamics. According to some embodiments, spatiotemporal characteristics of an instance of a given individual moving in a video sequence may be converted into: (1) one or more Body Part Specific Motion Profiles (BPSMP), and/or (2) a set of Body Part Specific Frequency Coefficients (BPSFC) or Body Part Specific Motion Amplitudes (BPSMA). Any of the BPSMP, BPSFC or BPSMA may be stored as subject indicators and used as reference(s) for identifying another instance of the given subject/individual in another image/video sequence. According to further embodiments, one or more limbs, a torso and optionally the head/neck (referred to as Body Parts) of an individual in a video sequence may be individually tracked while the individual is in motion. Tracking of body part movements may be analyzed and used to generate a motion profile for one or more of the tracked body parts. The body part specific motion profiles may indicate recurring patterns of body part motion while the subject is walking, running or otherwise moving. Additionally, one or more of the motion profiles may be used to generate one or a set of motion related frequency coefficients and/or amplitudes for each of the tracked body parts. Motion related frequency coefficients associated with a given body part may be referred to as Body Part Specific Frequency Coefficients, and one or more BPSFC's may be generated for each tracked body part. The one or more BPSFC's for each given tracked limb/torso/head may be indicative of spatiotemporal patterns (e.g. cyclic/repeating movements) present in the given tracked part while the individual subject is in motion, for example during walking or running.
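One plausible way to derive frequency coefficients of this kind is sketched below: the vertical position of a tracked body part, sampled once per frame, is transformed with a plain discrete Fourier transform, and the dominant non-DC coefficients summarize the cyclic component of the motion. The trajectory here is synthetic (a 1 Hz bob sampled at a hypothetical 10 frames per second); a real system would use the tracked per-frame positions.

```python
from math import cos, sin, pi, hypot

FPS = 10  # hypothetical frame rate
# Synthetic head-height trajectory: 1.7 m baseline with a 1 Hz, 5 cm bob.
trajectory = [1.7 + 0.05 * sin(2 * pi * 1.0 * t / FPS) for t in range(40)]

def dft_amplitudes(samples):
    """One-sided amplitude spectrum of a real-valued signal (naive DFT)."""
    n = len(samples)
    amps = []
    for k in range(n // 2 + 1):
        re = sum(s * cos(2 * pi * k * t / n) for t, s in enumerate(samples))
        im = -sum(s * sin(2 * pi * k * t / n) for t, s in enumerate(samples))
        amps.append(hypot(re, im) / n)
    return amps

amps = dft_amplitudes(trajectory)
# Skip bin 0 (the DC component, i.e. the mean height).
dominant_bin = max(range(1, len(amps)), key=lambda k: amps[k])
dominant_hz = dominant_bin * FPS / len(trajectory)   # -> 1.0 Hz here
```

The resulting coefficients (here, the dominant frequency and its amplitude) would be stored per body part as BPSFC/BPSMA entries of the subject's signature.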
One or an aggregation of body part specific motion profiles BPSMP of an individual (e.g.: (1) right arm, right leg and head; (2) right arm, left arm, left leg and right shoulder) may be stored, indexed, and later referenced as part of a Motion Signature Vector (MSV). Combinations or an aggregation of BPSFC's relating to different body parts of the same given individual (e.g.: (1) right arm, right leg and head; (2) right arm, left arm, left leg and right shoulder) may also be stored, indexed, and later referenced as part of the same or another Motion Signature Vector (MSV) for the same given individual. Accordingly, matches or substantial matches between corresponding BPSMP's and/or BPSFC's and/or BPSMA's of a stored reference MSV and corresponding profiles and/or BPSFC's derived from an individual being video tracked may indicate that the person being video tracked is the same person who was the source of the reference MSV.
Reference BPSMP value ranges and reference BPSFC/BPSMA value ranges, or any combination thereof, may be indicative of a specific subject categorization, including: age ranges, genders, races, etc. Accordingly, an MSV derived from a video sequence of a given subject and including BPSMP values and/or BPSFC/BPSMA values within specific reference ranges defined to be associated with a specific category (e.g. age range, gender, race, etc.) may indicate that the given subject in the video sequence belongs to the specific category (e.g. age range, gender, race, etc.).
According to some embodiments, one or more BPSMP of a walking pattern of a subject may be extrapolated from a video sequence. It should be noted that a body/head of a human walking typically moves in a spiral pattern, going up and down and left to right simultaneously in a cyclic motion as the person walks. This motion can be tracked, by tracking one or more points on the subject's body/head, as the subject walks. Accordingly, one or more BPSMP's of a walking pattern of a subject may include one or more of:
a. a max/min height of the subject when walking;
b. an amplitude and/or frequency of the spiral pattern;
c. a shape of the spiral pattern;
d. an amplitude and/or frequency of the sideways motion of the subject when walking;
e. a speed of progression; and
f. relations between one or more of the above parameters; and/or
g. any other parameter of the spiral pattern.
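Several of the parameters listed above can be extracted directly from a tracked head trajectory, as in the minimal sketch below. The trajectory is synthetic (steady forward progression with a vertical bob and lateral sway approximating the spiral), and the frame rate is a hypothetical 10 frames per second.

```python
from math import sin, cos, pi

FPS = 10  # hypothetical frame rate
# Synthetic head positions per frame: (x_forward, y_lateral, z_height).
head = [
    (1.2 * t / FPS,                              # steady forward progression
     0.03 * sin(2 * pi * 0.9 * t / FPS),         # lateral sway
     1.70 + 0.04 * cos(2 * pi * 1.8 * t / FPS))  # vertical bob
    for t in range(50)
]

def walking_parameters(points, fps):
    """Extract spiral-pattern parameters a, b, d and e from the list above."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    duration = (len(points) - 1) / fps
    return {
        "max_height":         max(zs),
        "min_height":         min(zs),
        "vertical_amplitude": (max(zs) - min(zs)) / 2,
        "lateral_amplitude":  (max(ys) - min(ys)) / 2,
        "speed":              (xs[-1] - xs[0]) / duration,
    }

params = walking_parameters(head, FPS)
```

Frequencies of the vertical and lateral oscillations (parameter b) could additionally be recovered with the DFT approach sketched earlier.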
Furthermore, characteristics of a subject's spiral pattern when walking may indicate a mood/intention/state of the subject. For example, a man who is angry and determined may exhibit specific types of patterns, discernible from regular walking patterns, which may be used to identify subjects about to commit a crime of violence.
It can further be noted that the hands/arms of a human typically swing while the human is walking, moving forward and back along the subject's sides as the subject walks. Accordingly, one or more BPSMP's of a walking pattern of a subject may include one or more of:
a. a max/min height of the arm/hand when walking, i.e. amplitude of the motion;
b. a frequency/speed of the arm/hand motion;
c. angles between the arm/hand and other body parts in different stages of the motion;
d. positions of the palms and their relations to the arm in different stages of the motion;
e. a relationship between the arm/hand motion and the other bodily motions;
f. relations between one or more of the above parameters;
g. positions and angles of joints in different stages of the motion; and/or
h. any other parameter of the arm/hand motion.
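Parameter c above, the angle between the arm and other body parts at a given stage of the motion, can be computed from three tracked joint positions, as in this minimal sketch. The coordinates are hypothetical 2D examples; the same dot-product formula applies unchanged to 3D joint positions from a correlated skeletal model.

```python
from math import acos, degrees, hypot

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, between segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return degrees(acos(dot / (hypot(*v1) * hypot(*v2))))

# Hypothetical joint positions at one stage of an arm swing.
shoulder, elbow, wrist = (0.0, 1.5), (0.0, 1.2), (0.3, 1.2)
elbow_angle = joint_angle(shoulder, elbow, wrist)   # forearm raised: 90 degrees
```

Sampling this angle at each frame yields the per-stage angle trajectory that a BPSMP of the arm motion would record.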
According to further embodiments, movements of other limbs and body parts during motion may be similarly analyzed such that BPSMP's of a walking pattern of a subject may include parameters of these motions as well.
According to some embodiments, BPSMP's of a walking pattern of a subject appearing in a video may be used to identify the subject by comparing the BPSMP's to reference BPSMP's of walking patterns of subjects stored in a reference database, either alone or in concert with other parameters associated with the subject. According to further embodiments, BPSMP's of a walking pattern of a subject may be used to track the subject when moving through an area including multiple video acquisition subsystems, e.g. in an airport.
According to some embodiments, one or more BPSMP's of one or more gestures of a subject may be extrapolated from a video sequence and used to identify and/or authenticate a subject. For example, one or more BPSMP's of a hand waving motion of a subject may be used to identify/authenticate the subject, alone or in concert with other identification/authentication parameters. It should be understood that the following description is presented in relation to a hand waving motion by way of example and that any other gesture may be equally used with the appropriate modifications. One or more BPSMP's of a hand waving motion of a subject may include one or more parameters analogous to those listed above for arm/hand motion during walking (e.g. amplitude and frequency of the motion, angles between the arm/hand and other body parts in different stages of the motion, positions and angles of joints in different stages of the motion, and relations between such parameters).
According to some embodiments, there may be provided a biometric based authentication system which may include IBBE functionalities and may authenticate a subject by comparing one or more BPSMP's of the subject to a record or table of records of reference BPSMP's of registered/permitted users/subjects. Such a system may include an image sensor (e.g. webcam) or may be functionally associated with such a sensor. The biometric based authentication system may receive one or more images of a subject attempting authentication from the image sensor. The biometric based authentication system may extrapolate BPSMP's of the subject from the received images and compare them to reference BPSMP's of registered/permitted users/subjects to verify the subject is permitted access to the requested resource, i.e. authenticate the user. For the purpose of authentication, the subject may be required to perform one or more gestures or motions from which the system may extrapolate the relevant BPSMP's. For example, the subject may be required to wave to the camera when attempting authentication. According to further embodiments, a biometric based authentication system may interact with another type of authentication system or process to supplement the other authentication process. For example, a subject waving to the camera may also be required to enter a password to provide for multi-factor authentication.
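The verification step of such a flow can be sketched minimally as follows: BPSMP's extrapolated from the subject's wave are compared against a table of registered users' reference BPSMP's. The user names, parameter names, values, and tolerance are all hypothetical; the real BPSMP values would come from the video analytics sub-system described above.

```python
# Hypothetical table of reference BPSMP's for registered users.
registered_users = {
    "alice": {"wave_frequency": 2.1, "wave_amplitude": 0.25},
    "bob":   {"wave_frequency": 1.4, "wave_amplitude": 0.40},
}

def authenticate(claimed_user, observed_bpsmp, tolerance=0.15):
    """Accept if every reference parameter of the claimed user is matched
    by the observed parameters within a relative tolerance."""
    ref = registered_users.get(claimed_user)
    if ref is None:
        return False
    return all(abs(observed_bpsmp[name] - value) <= tolerance * value
               for name, value in ref.items())

# Alice waving naturally is accepted; Bob's wave, claimed as Alice, is not.
ok = authenticate("alice", {"wave_frequency": 2.0, "wave_amplitude": 0.27})
impostor = authenticate("alice", {"wave_frequency": 1.4, "wave_amplitude": 0.40})
```

Combining this check with a password check, as the text suggests, would yield a two-factor scheme (something you are plus something you know).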
According to some embodiments, the systems and methods described herein may be used to determine motion parameters of tracked subjects in an area and to compare them to reference ranges of natural human motion parameters, thereby identifying subjects exhibiting unnatural motion parameters. Oftentimes, persons with unlawful or malicious/suspicious intentions will exhibit unnatural motion parameters. In this fashion, such individuals can be identified in a group of people and the appropriate authorities alerted. This feature can be especially useful in monitoring areas containing large groups of people, e.g. airports, large events, public speeches, etc.
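The flagging step can be sketched as a simple range check of each tracked subject's motion parameters against natural-range references. The ranges and the tracked-subject values below are hypothetical illustrative numbers, not measured population data.

```python
# Hypothetical natural ranges of human motion parameters.
NATURAL_RANGES = {
    "walking_speed":    (0.5, 2.5),   # m/s
    "stride_frequency": (1.2, 2.4),   # Hz
}

def is_suspicious(params, ranges=NATURAL_RANGES):
    """True if any tracked parameter lies outside its natural range."""
    return any(not (lo <= params[name] <= hi)
               for name, (lo, hi) in ranges.items() if name in params)

# Hypothetical parameters for two tracked subjects in the monitored area.
tracked = {
    "subject_1": {"walking_speed": 1.4, "stride_frequency": 1.8},
    "subject_2": {"walking_speed": 3.2, "stride_frequency": 2.0},  # too fast
}

flagged = [sid for sid, p in tracked.items() if is_suspicious(p)]
```

In a deployment, the flagged identifiers would be routed to the alerting mechanism rather than merely collected in a list.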
The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings in which:
It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present invention.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as “processing”, “computing”, “calculating”, “determining”, or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
Embodiments of the present invention may include apparatuses for performing the operations herein. This apparatus may be specially constructed for the desired purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions, and capable of being coupled to a computer system bus.
The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the desired method. The desired structure for a variety of these systems will appear from the description herein. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
The present invention includes methods, systems, apparatuses, circuits and associated computer executable code for providing video based subject characterization, categorization, identification, recognition, tracking, monitoring, authentication and/or presence response. According to some embodiments, there may be provided one or more Image Based Biometric Extrapolation (IBBE) methods, systems and apparatuses adapted to extrapolate static and/or dynamic biometric parameters of one or more subjects, from one or more images or video segments including the subjects. According to some embodiments, extrapolated biometric parameters of subjects may be used to detect, identify, recognize, track/monitor and/or authenticate the subjects. According to further embodiments, extrapolated biometric parameters may be used to determine physical positions of subjects and may further be used to identify one or more subjects exhibiting suspicious behavior (or otherwise interesting biometric parameters) based on their physical positions and/or other biometric parameters.
According to some embodiments, biometric parameters may be extrapolated from images and/or video segments. The images and/or video segments may be acquired by one or more native image sensors of the system and/or may be received from third parties. Accordingly, there may be provided one or more video/image acquisition sub-systems, which may include cameras, video circuits and ports, and which may be adapted to capture one or more images/videos of subjects (e.g. person, adult, child, etc.) [see
According to some embodiments, biological parameters of subjects may be extrapolated from images by first correlating one or more skeletal models (2D or 3D) to the subjects [see FIGS. 1 and 4-6]. Based on a correlation of a skeletal model, dimensions and relations of body elements may be extrapolated [see
Referring to
According to some embodiments, biometric parameters of subjects may be extrapolated by passively monitoring subjects (e.g. from surveillance images and video) [see
A Subject Identification/Recognition/Categorization/Tracking/Monitoring/Authentication Sub-System may comprise modules, logic and circuits adapted to receive the extracted subject biometric parameters, access a biometric parameter reference database of stored subject biometric parameters, and identify and/or categorize/characterize the subject. The identification/categorization sub-system may attempt to correlate the received subject biometric parameters with reference subject/category biometric parameters and/or biometric parameter profiles in the biometric parameter reference database and/or may store received subject biometric parameters/profiles for future reference. The reference subject biometric parameters stored in the reference database may either be associated with specific subjects (e.g. individual by the name of Dor Givon) or with one or more groups, types or categories of subjects (e.g. Male, Asian, Adult, Men who previously passed by entrance X, suspicious humans, etc.).
Correlation between a set of received subject biometric parameters and reference biometric parameters stored on the reference database may be absolute or partial. In certain situations, the set of subject biometric parameters received for a subject (individual) being tracked may be smaller than the set of reference biometric parameters stored for that subject in the reference database. In such an event, a partial match between the received biometric parameters and the reference biometric parameters may suffice for an identification, recognition, authentication, monitoring or categorization (optionally: only as long as the match between corresponding biometric parameters is sufficiently high). Different degrees of match between the determined biometric parameters of a subject and reference parameters may be required for different uses of the system; e.g. a higher matching degree may be required for authentication purposes than for tracking purposes. In other cases, the set of subject biometric parameters received for a subject (individual) being tracked may be larger than the set of reference indicators stored for that subject in the reference database. In such an event, once a match between the received indicators and the reference indicators is made (optionally: only when match between corresponding indicators being sufficiently high) and the individual subject is identified, that subject's records in the reference indicator database may be updated to include the new and/or updated subject indicators received from the analytics sub-system. According to further embodiments, certain types of indicators may receive different weights in correlation, such that, for example, a correlated physical height indicator may be factored more heavily than a clothing related indicator. 
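The partial-match logic described above can be sketched as follows. This is an illustrative Python sketch and not part of the disclosure; the function name `partial_match`, the parameter names, and the tolerance and minimum-fraction values are assumptions chosen for demonstration:

```python
def partial_match(subject, reference, min_fraction=0.6, tolerance=0.05):
    """Decide whether observed biometric parameters match a reference profile.

    subject and reference are dicts mapping parameter names to numeric values;
    the subject dict may contain only a subset of the reference keys (a partial
    observation). A parameter matches when its relative deviation is within
    tolerance; the profile matches when a sufficient fraction of the shared
    parameters agree.
    """
    shared = [k for k in subject if k in reference]
    if not shared:
        return False
    hits = sum(
        1 for k in shared
        if abs(subject[k] - reference[k]) <= tolerance * abs(reference[k])
    )
    return hits / len(shared) >= min_fraction
```

A stricter `min_fraction` and `tolerance` would model authentication, while looser values would model tracking, as described above.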
Furthermore, one or more indicator types correlations may allow for deviations in parameters which will still be considered matching in degrees which may vary from indicator type to indicator type (for example, deviations in walking speed of less than 10% may still be considered matches while deviations of 10% in height will not). Yet further, allowed deviations in parameters may vary from individual to individual, possibly based on a degree of deviation determined when obtaining a reference profile for the individual. For example, it may be determined that individual x varies the degree of motion of his arms as much as 20% during walking whereas individual y only varies the degree of motion of his arms 2% or not at all. Accordingly, individual x's reference biometric profile may allow deviations of 20% in arm motion, whereas individual y's reference biometric profile may only allow deviations of 2% in arm motion. Accordingly, extrapolating reference profiles of individuals may include both recording parameter values/ranges for different biometric parameters and extrapolating degrees of deviation in each biometric parameter for the individual. Allowed deviations may also be situation based.
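The per-individual deviation allowances described above (individual x tolerating 20% arm-motion deviation, individual y only 2%) can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; the names and numeric values are assumptions taken from the example in the text:

```python
# Each reference entry stores both the expected value and the allowed
# relative deviation observed when the reference profile was obtained.
REFERENCE = {
    "individual_x": {"arm_motion": (45.0, 0.20),   # varies up to 20% when walking
                     "height":     (178.0, 0.01)},
    "individual_y": {"arm_motion": (40.0, 0.02),   # varies only 2%
                     "height":     (165.0, 0.01)},
}

def within_profile(subject_params, profile):
    """Check each observed parameter against its individually allowed deviation."""
    for name, value in subject_params.items():
        expected, allowed = profile[name]
        if abs(value - expected) > allowed * expected:
            return False
    return True
```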
According to some embodiments, multi-factor biometric identification/authentication may be performed by an IBBE system, i.e. a set of different types of biometric parameters of a subject may be compared to a set of different types of reference biometric parameters in a stored reference biometric profile. For example, a set of subject dynamic walking related biometric parameters may be compared to walking related parameters of a reference profile alongside comparison of the subject's static biometric parameters (e.g. height and head shape). In such cases, the number of matching parameters required for identification/authentication and the degree of matching required may depend on the purpose of identification/authentication. For example, at the entrance to a nuclear facility a large number of matching parameters along with a high degree of match in each parameter may be required, whereas for purposes of tracking individuals moving through an amusement park to determine their park facility usage habits, a much smaller number and degree of matching parameters may suffice. According to some embodiments, multi-biometric-factor type identification/authentication may include calculation of a match score between a set of extrapolated subject biometric parameters (static and dynamic) and a reference profile of biometric parameters. The match score may be calculated by first determining a degree of correlation between each specific extrapolated biometric parameter and the corresponding reference biometric parameter in the reference profile to determine a match score for the specific type of biometric parameter. As stated, the match score for each specific type of biometric parameter equated with each degree of correlation may differ between parameter types (e.g. a 2% deviation in walking speed between a subject's walking speed and a reference profile walking speed may result in a higher match score for walking speed than a 2% deviation in height). 
Further, the match score for a given type of biometric parameter equated with each degree of correlation may differ in different conditions/environments, e.g. a given degree of correlation in walking speed may result in a higher match score in cold weather (where people are expected to change their walking speed) than in normal weather or a different degree of correlation may be expected in low quality images, etc. Further examples of distinguishing conditions may include: lighting, the quality of the reference profile, the purpose of the identification/authentication, the angle of image capture, the nature of the activity in which the subject is involved as opposed to the nature of the activity the subject was involved in during capture of the reference profile parameters (e.g. a subject walking while speaking to a friend as opposed to a subject walking alone or a subject at a business function as opposed to a subject at a social event, etc.). According to further embodiments, the matching algorithm may include predefined adaptations/mappings/normalizations of parameters for given conditions, e.g. walking speed at extremely cold temperatures may automatically be reduced by 10% or walking/standing parameters may include automatic adaptations/mappings/normalizations between business and social conditions. Such automatic adaptations/mappings/normalizations may be based on a statistical analysis of multiple subject biometric parameters in different conditions, i.e. an IBBE system may include self-learning algorithms. Further, experimentation with known subjects in different conditions may be performed to collect statistical data for this purpose. Once a match score for each type of extrapolated biometric parameter has been determined a total biometric profile match score may be determined by aggregating the match scores for each type of biometric parameter. 
When aggregating the match scores, different types of biometric parameters may receive different weights in evaluating a match between the subject parameters and a reference profile. For example, correlation in height may be more influential than correlation in walking speed. Further, the weights assigned to different biometric parameter types may be situation/circumstance dependent. For example, in low lighting conditions the value of color based parameters may be reduced (as color may deviate in poor lighting) or in extremely cold conditions the value of a walking speed parameter may be reduced (as people tend to change their walking speed in extreme cold). Further examples of distinguishing conditions may include: lighting, the quality of the reference profile or the quality of each of the types of biometric parameters in the reference profile and/or the specific extrapolated profile (for this purpose, an accuracy score may be given to reference and/or extrapolated biometric parameters during their determination by the system), the purpose of the identification/authentication, the angle of image capture, the nature of the activity in which the subject is involved as opposed to the nature of the activity the subject was involved in during capture of the reference profile parameters (e.g. a subject walking while speaking to a friend as opposed to a subject walking alone or a subject at a business function as opposed to a subject at a social event, etc.).
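The weighted, condition-dependent aggregation described above can be sketched as follows. This is an illustrative Python sketch and not part of the disclosure; the condition tags, the discount factors and the parameter names are assumptions modeled on the examples in the text (color discounted in low light, walking speed discounted in extreme cold):

```python
def aggregate_score(per_param_scores, weights, conditions=()):
    """Aggregate per-parameter match scores into a total profile match score.

    per_param_scores and weights are dicts keyed by biometric parameter type.
    conditions is a collection of situation tags; parameters known to be
    unreliable under a given condition have their weight reduced before the
    weighted average is taken.
    """
    discounts = {
        "low_light":    {"shirt_color": 0.3},    # colors drift in poor lighting
        "extreme_cold": {"walking_speed": 0.5},  # people change speed in cold
    }
    w = dict(weights)
    for cond in conditions:
        for param, factor in discounts.get(cond, {}).items():
            if param in w:
                w[param] *= factor
    total_w = sum(w[p] for p in per_param_scores)
    return sum(per_param_scores[p] * w[p] for p in per_param_scores) / total_w
```

Note that discounting an unreliable parameter's weight raises the influence of the remaining, more trustworthy parameters on the total score.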
According to some embodiments, correlation of subject indicators to reference indicators may be performed in a multistage process, wherein one or more indicators/parameters may be used to first eliminate and/or identify a group of possible matches, such that correlation based on other parameters is performed in relation to a smaller group of reference indicators [see
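The multistage process described above can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; using height as the coarse first-stage filter and a relative-deviation distance for the second stage are assumptions chosen for demonstration:

```python
def multistage_match(subject, database, coarse_key="height", coarse_tol=0.03):
    """Two-stage correlation: a cheap coarse filter, then full comparison.

    Stage 1 keeps only reference profiles whose coarse parameter (here,
    height) is close to the subject's, shrinking the candidate group.
    Stage 2 scores the surviving candidates on all shared parameters and
    returns the best-matching reference name, or None if no candidate survives.
    """
    h = subject[coarse_key]
    candidates = {
        name: prof for name, prof in database.items()
        if abs(prof[coarse_key] - h) <= coarse_tol * h
    }

    def distance(prof):
        shared = [k for k in subject if k in prof]
        return sum(abs(subject[k] - prof[k]) / abs(prof[k]) for k in shared)

    if not candidates:
        return None
    return min(candidates, key=lambda name: distance(candidates[name]))
```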
According to further embodiments, Subject Identification/Recognition/Categorization/-Tracking/Monitoring/Authentication Sub-Systems may interact with other Subject Identification/-Recognition/Categorization/Tracking/Monitoring/Authentication Systems to facilitate multi-modal identification/recognition/categorization/tracking/monitoring/authentication, which other systems may be third party systems [see
According to some embodiments, visually detected features may be static and/or dynamic features. Any combination of static and/or dynamic features may be acquired and analyzed to estimate a subject indicator or subject biometric parameter/profile. The acquired static/dynamic features or combination thereof may include the subject's: height, width, volume, limb sizes, ratio of limb sizes, ratio of limb to torso sizes, limb/head shape, heights of different body parts during motion (e.g. max/min head heights when walking) or change of heights during motion, angular or distantial range of motion between different limbs and limb parts, angular or distantial range of motion between limb parts and torso or other body parts, ratios between the described parameters, head size, head movements, acceleration and velocity of movements, movement parameters of body parts when performing certain movements (e.g. parameters of hand motions when waving hello), hair color and configuration, facial features, shape and dimensions of specific body parts (e.g. ear shape and/or dimensions), distinguishing marks, clothing type and arrangement, unique wearable accessories and so on. Any other visually detectable features or combinations of features known today or to be discovered or devised in the future are applicable to the present invention.
Further embodiments of the present invention may include methods, circuits, apparatuses, systems and associated computer executable code for providing video based surveillance, identification, monitoring, tracking and/or categorization of individuals based on visually detectable dynamic features of the subject, such as the subject's motion dynamics [see
One or an aggregation of body part specific motion profiles BPSMP of an individual (e.g.: (1) right arm, right leg and head; (2) right arm, left arm, left leg and right shoulder) may be stored, indexed, and later referenced as part of a Motion Signature Vector (MSV). Combinations or an aggregation of BPSFC's relating to different body parts of the same given individual (e.g.: (1) right arm, right leg and head; (2) right arm, left arm, left leg and right shoulder) may also be stored, indexed, and later referenced as part of the same or another Motion Signature Vector (MSV) for the same given individual. Accordingly, matches or substantial matches between corresponding BPSMP's and/or BPSFC's and/or BPSMA's of a stored reference MSV and corresponding profiles and/or BPSFC's derived from an individual being video tracked may indicate that the person being video tracked is the same person who was the source of the reference MSV.
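The MSV comparison described above can be sketched as follows. This is an illustrative Python sketch and not part of the disclosure; representing each body part's motion profile as a coefficient vector and using cosine similarity with a fixed threshold are assumptions chosen for demonstration:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length coefficient vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def msv_match(tracked_msv, reference_msv, threshold=0.95):
    """Compare Motion Signature Vectors body part by body part.

    Each MSV maps a body part name to its motion-profile coefficients; the
    vectors match when every body part observed in the tracked MSV is
    sufficiently similar to the corresponding stored reference profile.
    """
    shared = [part for part in tracked_msv if part in reference_msv]
    return bool(shared) and all(
        cosine(tracked_msv[p], reference_msv[p]) >= threshold for p in shared
    )
```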
Reference BPSMP value ranges and reference BPSFC/BPSMA value ranges, or any combination thereof, may be indicative of a specific subject categorization, including: age ranges, genders, races, etc. Accordingly, an MSV derived from a video sequence of a given subject and/or BPSMP values and/or BPSFC/BPSMA values within specific reference ranges defined to be associated with a specific category (e.g. age range, gender, race, etc.) may indicate that the given subject in the video sequence belongs to the specific category (e.g. age range, gender, race, etc.). Similarly, reference BPSMP value ranges and reference BPSFC/BPSMA value ranges, or any combination thereof, may be indicative of a specific subject intention/behavior categorization. For example, individuals attempting or preparing to attempt a criminal/violent behavior may exhibit identifiable BPSMP value ranges and reference BPSFC/BPSMA value ranges.
Reference is now made to
Referring to
An example of the function of such an IBBE system is illustrated in
Another example of the function of such an IBBE system is illustrated in
Yet another example of the function of such an IBBE system may be the monitoring of a retail establishment. In such systems, biometric profiles of customers and employees may be monitored, possibly by using standard/existing surveillance cameras. Once a human is detected in the retail establishment, the IBBE system may extrapolate Static Biometric Parameters of the human, possibly by use of models (e.g. skeletal models), as described herein. The system may further track one or more of the detected features over a series of frames to extrapolate dynamic biometric parameters of the individual. The system may then extrapolate biometric profiles of detected humans (static and/or dynamic) and compare the extrapolated profiles to reference suspicious/interesting/normal profiles stored in an associated database. For example, customers placing merchandise in their pockets or bags may be immediately detected. Further, employee and customer behavior at points of sale can be monitored to detect suspicious behavior (e.g. sweethearting, tampering, return fraud, skimming, etc.). Such a system may issue alerts when detecting suspicious behavior (suspicious biometric profiles) and/or may mark the time and location of the suspicious event to facilitate efficient review of surveillance footage by appropriate personnel.
According to further embodiments, an IBBE system may interact and receive data from other relevant systems and use this data to determine certain events. For example, an IBBE system monitoring a retail establishment may also receive data from the retail establishment's registers. In such embodiments, the IBBE system may compare visual data to the register data to identify different forms of fraud, e.g. a person identified in the video as leaving the store with a diamond while the register has charged for an apple may be identified by the system. Similarly, if 3 persons are viewed entering a concert while only one ticket is charged at the register an alert may be issued.
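The cross-check between visual data and register data described above can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; modeling both data streams as simple item lists is an assumption chosen for demonstration:

```python
from collections import Counter

def register_mismatches(observed_items, charged_items):
    """Return items observed in the video (e.g. leaving the store, or entering
    a venue) that were never charged at the register.

    Counter subtraction keeps only the positive excess, so an item charged at
    least as many times as it was observed produces no alert.
    """
    return sorted((Counter(observed_items) - Counter(charged_items)).elements())
```

In the examples from the text, a diamond leaving the store against an apple on the receipt, or three persons entering against one ticket, would both surface as mismatches.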
Some examples of known methods for stealing from retail establishments are presented below, along with methods by which an IBBE system may detect such acts:
According to further embodiments, other behaviors monitored at a POS may include: (1) Checkout manned—Cashier arriving at checkout and sitting at position, (2) Checkout unmanned—Cashier leaving checkout position, (3) Customer present—A customer standing at the checkout lane, (4) No customer present—No customer at the checkout lane, (5) Erratic cashier—Cashier turning and moving excessively in the chair, and (6) Hidden hand(s)—Cashier hands are not fully visible.
According to some embodiments, any subject exhibiting a biometric profile sufficiently distinct/deviating from normal biometric profiles may be monitored, an alert may be sent to the appropriate party and/or the relevant section of the video footage may be marked for review. In this manner, as people committing unlawful/improper acts tend to behave/move/stand differently than normal people, unlawful/improper acts may be identified.
From the above examples, it can be understood that suspicious biometric profiles may be situation and/or area specific: what is considered suspicious at a casino may not be suspicious at a social gathering and vice versa, and what may be suspicious at a dentist conference may not be suspicious at a nightclub. Equally, suspicious biometric profiles may be gender and/or age specific, time specific, geographically specific, ethnic specific and so on. Further, as hinted above, interesting profiles may not necessarily be interesting due to suspicion. For example, in the casino scene, a proprietor of the system may also wish to identify biometric profiles of high-rollers or celebrities. According to embodiments of the present invention, any set of biometric parameters indicative of an individual which is desired to be identified may be stored in an IBBE system database as a reference profile, such that any individual matching the profile appearing in an area monitored by the IBBE system will then be identified by the system. Such a reference profile may be indicative of a specific individual (such as in the bank robber example) or indicative of a class or characteristic of individuals (such as in the casino example).
In some circumstances, it may be desirable to implement an IBBE system to alert of any human presence in an area, or of any human presence other than that of a given list of humans authorized to be in the area (e.g. in home security).
According to some embodiments, comparison of biometric profiles of individuals to reference biometric profiles may be divided into 6 categories:
Referring to
a. a max/min height of the subject when walking, as illustrated in
b. an amplitude and/or frequency of the spiral pattern;
c. a shape of the spiral pattern;
d. an amplitude and/or frequency of the sideways motion of the subject when walking;
e. a speed of progression;
f. relations between one or more of the above parameters; and
g. any other parameter of the spiral pattern.
Furthermore, characteristics of a subject's spiral pattern when walking may indicate a mood/intention/state of the subject. For example, an angry and determined man may exhibit specific types of patterns, discernible from regular walking patterns, which may be used to identify subjects about to commit a violent crime.
It can further be noted that a walking human's hands/arms typically move, going forward and back along the subject's side as the subject walks, as illustrated in
a. a max/min height of the arm/hand when walking, i.e. amplitude of the motion;
b. a frequency/speed of the arm/hand motion;
c. angles between the arm/hand and other body parts in different stages of the motion;
d. positions of the palms and their relations to the arm in different stages of the motion;
e. joint positions and angles during the motion;
f. a relationship between the arm/hand motion and the other bodily motions;
g. relations between one or more of the above parameters; and/or
h. any other parameter of the arm/hand motion.
According to some embodiments, BPSMP's of a walking pattern of a subject appearing in a video may be used to identify a subject by comparing the BPSMP's to reference BPSMP's of walking pattern of subjects stored in a reference database, either alone or in concert with other parameters associated with the subject. According to further embodiments, BPSMP's of a walking pattern of a subject may be used to track the subject when moving through an area including multiple video acquisition subsystems, e.g. in an airport [see
Referring to
According to some embodiments, there may be provided a biometric based authentication system which may include IBBE functionalities and may authenticate a subject by comparing one more BPSMP's of the subject to a record or table of records of reference BPSMP's of registered/permitted users/subjects [see
According to further embodiments, an IBBE authentication may implement passive biometric detection for authentication. In such IBBE systems, the IBBE authentication system may detect humans approaching or in front of an associated computational platform requiring authentication, detect features of these humans, extrapolate static biometric parameters of these humans (possibly by use of models) and further track the detected features during motions of these humans to extrapolate dynamic biometric parameters of their motion (e.g. extrapolate dynamic biometric parameters of their walk as they approach the system). The extrapolated biometric parameters may then be compared to reference biometric profiles of authorized users to authenticate the user (in the event of a match) or deny access (in the event there isn't a match). For example, the approach to a particularly sensitive terminal in a facility may be monitored by an IBBE authentication system, which may extrapolate biometric parameters of people approaching the terminal and compare the extrapolated parameters to reference biometric profiles of users authorized to use the terminal. In this fashion, by the time the user reaches the terminal the terminal will already “know” if the user is authorized or not.
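The passive authentication flow described above can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; the profile parameter names, the similarity scoring, and the acceptance threshold are assumptions chosen for demonstration:

```python
def authenticate(approach_profile, authorized_profiles, threshold=0.9):
    """Return the name of the matching authorized user, or None to deny access.

    approach_profile holds biometric parameters extrapolated while the person
    approaches the terminal; authorized_profiles maps user names to reference
    profiles. Each shared parameter contributes a similarity in [0, 1] based on
    its relative deviation from the reference value.
    """
    def score(candidate):
        shared = [k for k in approach_profile if k in candidate]
        return sum(
            max(0.0, 1 - abs(approach_profile[k] - candidate[k]) / abs(candidate[k]))
            for k in shared
        ) / len(shared)

    best = max(authorized_profiles, key=lambda name: score(authorized_profiles[name]))
    return best if score(authorized_profiles[best]) >= threshold else None
```

By the time the person reaches the terminal, the returned name (or None) already determines whether access is granted.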
According to some embodiments, the systems and methods described herein may be used to determine static and dynamic biometric parameters of tracked subjects in an area and compare the determined biometric parameters to a reference of natural ranges of human motion parameters. Thereby, subjects exhibiting unnatural motion parameters may be identified. Oftentimes, persons with unlawful or malicious/suspicious intentions will exhibit unnatural motion parameters. In this fashion, such individuals can be identified in a group of people and the appropriate authorities alerted. This feature can be especially useful in the monitoring of areas containing large groups of people, e.g. airports, large events, public speeches, etc. Alternatively, subjects exhibiting unnatural behavior may be identified and further analysis performed in relation to the identified subjects and their biometric parameters to determine if the identified subjects are actually suspicious.
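The screening against natural ranges described above can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; the range table is purely illustrative (not measured data), and the parameter names are assumptions:

```python
# Illustrative natural ranges only; a deployed system would derive these
# from a reference database of observed human motion parameters.
NATURAL_RANGES = {
    "walking_speed_mps": (0.8, 2.5),
    "stride_length_m": (0.4, 1.1),
    "gait_frequency_hz": (1.4, 2.5),
}

def unnatural_parameters(subject_params, ranges=NATURAL_RANGES):
    """Return the names of parameters falling outside the natural ranges,
    flagging subjects who may warrant an alert or further analysis."""
    return [
        name for name, value in subject_params.items()
        if name in ranges and not (ranges[name][0] <= value <= ranges[name][1])
    ]
```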
The present disclosure is described in relation to human identification/tracking/monitoring. It should be understood, however, that the principles, components and methods described herein can also be implemented in relation to other objects (e.g. motor vehicles on a road or animals in a zoo) with the appropriate modifications.
Turning now to
The analytics sub-system also includes a subject indicator generation module adapted to analyze one or more static and/or dynamic features of a subject and to quantify the features into one or more subject indicators. A subject identification/categorization sub-system may then use the indicators generated for a subject in a video sequence to reference a database including indicator sets associated with various individuals and/or various types/categories of individuals. A subject presence response sub-system may reference an identified individual subject or category of subjects in a database in order to determine a response to trigger due to the subject's presence in the video sequence.
Turning now to
Turning now to
Turning now to
Turning now to
Turning now to
According to some embodiments, there may be provided a video based subject response system including: (1) a video analytics module to extract subject features (biometric parameters) from an instance of a subject in a video sequence and to generate one or more subject indicators based on the extracted features; (2) an identification or categorization module adapted to correlate the generated one or more subject indicators with reference indicators in an indicator reference database, wherein specific sets of reference indicators are associated with either a specific subject or with a group of subjects; and (3) a presence response module adapted to generate a system response to an identification of a specific subject or group of subjects. At least one of the indicators may indicate subject motion dynamics. The indicator may include at least one body part specific motion profile. The indicator may include at least one body part specific frequency coefficient.
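The three-module structure described above (analytics, identification/categorization, presence response) can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; the class and method names, the pass-through analytics stand-in, and the tolerance value are assumptions chosen for demonstration:

```python
class SubjectResponseSystem:
    """Minimal sketch of the pipeline: analytics -> identification -> response."""

    def __init__(self, reference_db, responses):
        self.reference_db = reference_db  # reference indicator sets per subject/group
        self.responses = responses        # response to trigger per subject/group

    def analyze(self, frame_features):
        # Stand-in for the video analytics module: quantify extracted
        # features into subject indicators (here they pass through unchanged).
        return dict(frame_features)

    def identify(self, indicators, tolerance=0.05):
        """Correlate generated indicators with reference indicator sets."""
        for name, ref in self.reference_db.items():
            shared = [k for k in ref if k in indicators]
            if shared and all(
                abs(indicators[k] - ref[k]) <= tolerance * abs(ref[k])
                for k in shared
            ):
                return name
        return None

    def respond(self, frame_features):
        """Generate a system response to an identified subject or group."""
        subject = self.identify(self.analyze(frame_features))
        return self.responses.get(subject, "no_action")
```

With a reference set for a family group member mapped to, e.g., an "unlock_door" response, the presence response module's behavior in the embodiments below follows directly.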
According to further embodiments, the system may include a video acquisition sub-system and the presence response module may be adapted to generate signals intended to alter a security condition of a location associated with said system. A set of reference indicators may be associated with a specific banned or restricted group or set of subjects.
According to further embodiments, the system may include a video acquisition sub-system and the presence response module may be adapted to generate signals intended to alter an environmental condition of a location associated with said system. A set of reference indicators may be associated with a family group member.
According to further embodiments, the system may include a video acquisition sub-system and the presence response module may be adapted to generate signals intended to trigger or alter content presented on a display associated with said system. A set of reference indicators may be associated with a specific demographic of people.
It should be understood by one of skill in the art that some of the functions described as being performed by a specific component of the system may be performed by a different component of the system in other embodiments of this invention.
The present invention can be practiced by employing conventional tools, methodology and components. Accordingly, the details of such tools, components and methodology are not set forth herein in detail. In the previous descriptions, numerous specific details are set forth, in order to provide a thorough understanding of the present invention. However, it should be recognized that the present invention might be practiced without resorting to the details specifically set forth.
In the description and claims of embodiments of the present invention, each of the words, “comprise” “include” and “have”, and forms thereof, are not necessarily limited to members in a list with which the words may be associated.
Only exemplary embodiments of the present invention and but a few examples of its versatility are shown and described in the present disclosure. It is to be understood that the present invention is capable of use in various other combinations and environments and is capable of changes or modifications within the scope of the inventive concept as expressed herein.
While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
The present application claims priority from U.S. Provisional Patent Application No. 61/869,109, filed by the inventors of the present invention, titled “METHOD, SYSTEM AND APPARATUS FOR BIOMETRIC BODY RECOGNITION AND IDENTIFICATION”, filed on Aug. 23, 2013; The present application is a continuation in part of U.S. patent application Ser. No. 14/128,710, filed by the inventor of the present invention, titled “Methods Systems Apparatuses Circuits and Associated Computer Executable Code for Video Based Subject Characterization, Categorization, Identification and/or Presence Response”, filed on Dec. 23, 2013; U.S. patent application Ser. No. 14/128,710 is a U.S. National Stage application of International Application PCT/IL2012/050562, filed on Dec. 31, 2012 by the inventor of the present application and titled: “Methods Systems Apparatuses Circuits and Associated Computer Executable Code for Video Based Subject Characterization, Categorization, Identification and/or Presence Response”; International Application PCT/IL2012/050562 claims the benefit of U.S. Provisional Application No. 61/559,090, filed on Nov. 13, 2011; all of the aforementioned applications are hereby incorporated herein by reference in their entirety.
Number | Date | Country
---|---|---
61869109 | Aug 2013 | US
61559090 | Nov 2011 | US
Relation | Number | Country
---|---|---
Parent | 14128710 | US
Child | 14341936 | US