1. Field of the Disclosure
The present disclosure relates to user profiling, recognition, and authentication. In particular, it relates to user profiling, recognition, and authentication using videophone systems or image capturing devices.
2. General Background
Audiovisual conferencing capabilities are generally implemented using computer-based systems, such as personal computers (“PCs”) or videophones. Some videophones and other videoconferencing systems offer the capability of storing user preferences. Generally, user preferences in videophones and other electronic devices are set up such that the device utilizes whatever preferences were set by the last user. In addition, these systems typically require substantial interaction by the user. Such interaction may be burdensome and time-consuming.
Furthermore, images captured by cameras in videophones are simply transmitted over a videoconferencing network to the destination videophone. As such, user facial expressions and features are not recorded for any purpose other than transmission to the other videoconferencing parties. Finally, current videophones and other electrical devices only permit setting up user preferences for a single user.
A method and system of providing user profiling for an electrical device is disclosed. Face representation data is captured with an imaging device. The imaging device focuses on the face of the user to capture the face representation data. A determination is made as to whether a facial feature database includes user facial feature data that matches the face representation data. User preference data is loaded on a memory module of the electrical device when the face representation data matches user facial feature data in the facial feature database. A new user profile is added to the user profile database when the face representation data does not match user facial feature data in the facial feature database.
A user profiling system is also disclosed that includes a facial recognition module, a facial feature database, a user profiling module, and a user profiling database. The facial recognition module receives face representation data, the face representation data being captured by an imaging device. The imaging device focuses on the face of the user to capture the face representation data. The facial feature database stores a plurality of user records, each of the plurality of user records storing face representation data. In addition, each of the plurality of user records may correspond to each of a plurality of users of an electrical device. The user profiling module loads user preference data on a memory module of the electrical device. The user preference data is loaded on the electrical device when the face representation data matches user facial feature data in the facial feature database. The user profiling module creates a new user profile when the face representation data does not match user facial feature data in the facial feature database. Finally, the user profiling database stores a plurality of user profiles and corresponding user preference data, the user profiles corresponding to each of the plurality of users of the electrical device.
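The following is a minimal, illustrative sketch in Python of one possible arrangement of the records and databases described above; the class and field names are hypothetical and chosen only for illustration.

```python
# Hypothetical data structures for the facial feature database and user
# profile database described above; not the disclosed implementation.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class UserRecord:
    """One record in the facial feature database (one per user)."""
    user_id: str
    # Face representation data, e.g. a feature vector derived from the
    # image captured by the imaging device.
    face_features: List[float] = field(default_factory=list)


@dataclass
class UserProfile:
    """One record in the user profile database."""
    user_id: str
    # User preference data loaded onto the electrical device on recognition.
    preferences: Dict[str, object] = field(default_factory=dict)


class FacialFeatureDatabase:
    """Stores a plurality of user records keyed by user identifier."""

    def __init__(self) -> None:
        self.records: Dict[str, UserRecord] = {}

    def add(self, record: UserRecord) -> None:
        self.records[record.user_id] = record


class UserProfileDatabase:
    """Stores a plurality of user profiles and corresponding preference data."""

    def __init__(self) -> None:
        self.profiles: Dict[str, UserProfile] = {}

    def add(self, profile: UserProfile) -> None:
        self.profiles[profile.user_id] = profile
```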
By way of example, reference will now be made to the accompanying drawings.
A method and apparatus for automated facial recognition and user profiling is disclosed. The system and method may be applied to one or more electrical systems that provide the option of setting up customized preferences. These systems may be personal computers, telephones, videophones, automated teller machines, personal data assistants, media players, and others.
Electrical systems do not generally store and manage settings and user-specific information for multiple users. Rather, current systems provide user interfaces with limited interfacing capabilities. The method and apparatus disclosed herein automatically maintain preferences and settings for multiple users based on facial recognition. Unlike current systems, which are cumbersome to operate and maintain, the system and method disclosed herein automatically generate user preferences and settings based on user actions, commands, order of accessing information, etc. Once a facial recognition module recognizes a returning user's face, a user-profiling module may collect user-specific actions to generate and learn user preferences for the returning user. If the user is not recognized by the facial recognition module, a new profile may be created and settings, attributes, preferences, etc., may be stored as part of the new user's profile.
In one example, the videophone 102 captures the face of the user only when the user is in a videoconference communicating with other videophone users. Thus, facial recognition and profiling are performed without disturbing the user's videoconferencing session; the recognition and profiling processes are carried out transparently with respect to the user. While the user is on a videoconference, the facial recognition and profiling unit 100 may generate user preferences and settings based on the user's actions. In another embodiment, the videophone 102 captures the face of the user when the user is operating the videophone 102, and not necessarily during a videoconference. As such, the facial recognition and profiling unit 100 collects user action and behavior data corresponding to any interaction between the user and the videophone 102.
For example, during a videoconference call the user may set the volume at a certain level. This action is recorded by the facial recognition and profiling unit 100 and associated with the user's profile. Then, when the user returns to make another videoconference call, the user's face is recognized by the facial recognition and profiling unit 100, and the volume is automatically set to the level at which the user set it on the previous conference call.
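The following illustrative sketch assumes a simple key-value profile store and shows how a volume change made during one call might be recorded under a user's profile and restored when that user is recognized on a later call; the class, function, and variable names are hypothetical.

```python
# Hypothetical profiling unit that records a volume setting under the
# recognized user's profile and restores it on a later call.
class ProfilingUnit:
    def __init__(self):
        self.profiles = {}  # user_id -> {preference name: value}

    def record_action(self, user_id, name, value):
        # Associate the observed action (e.g. a volume change) with the profile.
        self.profiles.setdefault(user_id, {})[name] = value

    def apply_profile(self, user_id, device_settings):
        # On recognition, push the stored preferences back to the device.
        for name, value in self.profiles.get(user_id, {}).items():
            device_settings[name] = value


device_settings = {"volume": 5}
unit = ProfilingUnit()
unit.record_action("alice", "volume", 8)   # user raises the volume mid-call
device_settings["volume"] = 3              # another user changes it later
unit.apply_profile("alice", device_settings)
print(device_settings["volume"])           # 8 -- restored on the next call
```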
In another example, during a videoconference call, both the near-end caller and the far-end caller are recognized by the facial recognition and profiling unit 100. The near-end user may be a user that has been recognized in the past by the facial recognition and profiling unit 100. When the near-end user receives a call from a far-end caller, the facial recognition and profiling unit 100 searches for the far-end caller's profile and loads the near-end user's preferences with respect to communication with the far-end user. In addition, the far-end caller's preferences and data may also be loaded for quick retrieval or access by the facial recognition and profiling unit 100. The facial recognition and profiling unit 100 may be configured to load any number of user profiles for the parties of a conference call. The profiles, data, and other information associated with the users participating in the conference call may or may not be available to other users in the conference call, depending on security settings, etc.
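A minimal sketch of how profiles for the recognized call parties might be loaded subject to per-profile security settings; the profile store, the "shareable" flag, and the party identifiers are illustrative assumptions.

```python
# Hypothetical loading of call-party profiles, filtered by a per-profile
# security setting before they are exposed to other parties on the call.
def load_call_profiles(profile_db, party_ids, requesting_user):
    """Return the profiles of recognized call parties that the requesting
    user is permitted to access."""
    loaded = {}
    for party in party_ids:
        profile = profile_db.get(party)
        if profile is None:
            continue  # this caller has no stored profile yet
        # Honor per-profile security settings before sharing with other parties.
        if profile.get("shareable", False) or party == requesting_user:
            loaded[party] = profile
    return loaded


profiles = {
    "near_end_user": {"shareable": False, "volume": 7},
    "far_end_caller": {"shareable": True, "display_name": "Remote Office"},
}
print(load_call_profiles(profiles, ["near_end_user", "far_end_caller"],
                         "near_end_user"))
```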
In yet another example, the outgoing videophone call log may be recorded for each user. The contact information for the parties in communication with each user is automatically saved. When the user returns to engage in another video conference call, the contact information for all of the contacted parties in the call log may be automatically loaded. In one embodiment, the facial recognition and profiling unit 100 stores user profiles for multiple users. Thus, if a second user engages in a video conference call at the same videophone 100, the videophone 100 may recognize the second user's face, and immediately load the contact list pertinent to the second user. As such, by performing facial recognition and automatically generating user profiles, minimal user interaction is required.
The facial features database 102 may store facial feature data for each user in the user profile database 104. In one embodiment, each user has multiple associated facial features. In another embodiment, each user has a facial feature image stored in the facial features database 102. The facial recognition module 106 includes logic to store the facial features associated with each user. In one embodiment, the logic includes a comparison of the facial features of a user with the facial features captured by the camera 110. If a threshold of similarity is surpassed by a predefined number of facial features, then the captured face is authenticated as belonging to the user associated with the facial features deemed similar to the captured face. In another embodiment, if a threshold of similarity is surpassed by at least one facial feature, then the captured face is authenticated as being the user associated with the facial feature deemed similar to the facial features in the user's face. In another embodiment, the facial recognition module 106 includes logic that operates based on template matching algorithms. Pre-established templates for each user may be configured as part of the facial recognition module 106, and a comparison may be made to determine the difference percentage.
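A minimal sketch of the threshold-based comparison described above; the similarity measure, threshold value, and required number of matching features are illustrative assumptions rather than the disclosed implementation.

```python
# Hypothetical per-feature similarity check: authenticate the captured face
# when a predefined number of facial features surpass a similarity threshold.
def feature_similarity(a, b):
    """Toy similarity in [0, 1] between two scalar feature measurements."""
    return 1.0 / (1.0 + abs(a - b))


def matches_user(captured, stored, threshold=0.8, required_matches=3):
    """Return True if at least `required_matches` corresponding facial
    features surpass the similarity threshold."""
    hits = sum(
        1
        for name, value in captured.items()
        if name in stored and feature_similarity(value, stored[name]) >= threshold
    )
    return hits >= required_matches


captured_face = {"left_eye": 3.1, "right_eye": 3.0, "nose": 5.2, "mouth": 4.1}
stored_face = {"left_eye": 3.0, "right_eye": 3.1, "nose": 5.0, "mouth": 4.0}
print(matches_user(captured_face, stored_face))  # True for this toy data
```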
A new user, and associated facial features and characteristics, may be added if the user is not recognized as an existing user. In one embodiment, if a threshold of similarity is not surpassed by a predefined number of facial features, then the captured face is added as a new user with the newly captured facial characteristics. In another embodiment, if a threshold of similarity is not surpassed by at least one facial feature, then the captured face is added as a new user with the newly captured facial characteristics.
In one example, the facial recognition module 106 stores images for five facial features of the user (e.g. eyes, nose, mouth, and chin) in the facial features database 102. In another example, the facial recognition module 106 stores measurements of each of the facial features of a user. In yet another example, the facial recognition module 106 stores blueprints of each of the facial features of a user. In another example, the facial recognition module 106 stores a single image of the user's face. In another example, the facial recognition module 106 stores new facial feature data if the user is a new user. One or more pre-existing facial recognition schemes may be used to perform facial recognition.
The user profile database 104 may store user preferences, alternative identification codes, pre-defined commands, and other user-specific data. The user maintenance module 108 includes logic to perform user profiling. In one embodiment, the user maintenance module 108 includes logic to extract a user profile based on a user identifier. The user identifier may be, for example, the user facial features stored in the facial features database 102. In another embodiment, the user maintenance module 108 includes logic to save user settings under the user's profile. In another embodiment, the user maintenance module 108 includes logic to interpret user operations as a user preference and save the user preference under the user's profile. In yet another embodiment, the user maintenance module 108 includes logic to add a new user if the user is not associated with an existing user profile.
The facial recognition and profiling unit 100 may be connected to one or more peripheral devices for input and output. For example, a camera 110 is coupled with the facial recognition and profiling unit 100 through a communications bus 116. The camera 110 captures the face of a person and generates an image of the user's face. In one embodiment, the camera 110 streams captured data to the facial recognition module 106 without any presorting or pre-processing of the captured images. In another embodiment, the camera 110 is configured to only transmit to the facial recognition module 106 images that resemble a human face. In another example, a keypad 120, a microphone 118, a display 122, and a speaker 124 are connected to the facial recognition and profiling unit 100 via the communications bus 116. Various other input and output devices may be in communication with the facial recognition and profiling unit 100. The inputs from the various input devices may be utilized to monitor and learn user behavior and preferences.
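One way the camera-side pre-filtering embodiment (transmitting only images that resemble a human face) might be sketched is with an off-the-shelf detector such as OpenCV's stock Haar cascade; the use of OpenCV and the callback interface shown here are assumptions for illustration only.

```python
# Hypothetical camera-side filter: forward a frame to the facial recognition
# module only when a face-like region is detected in it.
import cv2

_face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def frame_contains_face(frame_bgr) -> bool:
    """Return True if the captured frame appears to contain a frontal face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = _face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces) > 0


def forward_if_face(frame_bgr, recognition_module):
    """Send the frame onward only when a face-like region is present, so
    non-face frames are discarded at the camera (hypothetical callback)."""
    if frame_contains_face(frame_bgr):
        recognition_module(frame_bgr)
```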
In one embodiment, the facial recognition and profiling unit 100 is separated into two components in two separate housings. The facial recognition module 106 and the facial features database 102 are housed in a first housing. The user profile database 104 and the user maintenance module 108 may be housed in a second housing.
In one embodiment, facial recognition entails receiving a captured image of a user's face, for example through the camera 110, and verifying that the provided image corresponds to an authorized user by searching for the provided image in the facial features database 102. If the user is not recognized, the user is added as a new user based on the captured facial characteristics. The determination of whether the facial features in the captured image correspond to facial features of an existing user in the facial features database 102 is performed by the facial recognition module 106. As previously stated, the facial recognition module 106 may include operating logic for comparing the captured user's face with the facial feature data representing authorized users' faces stored in the facial features database 102. In one embodiment, the facial features database 102 includes a relational database that includes facial feature data for each of the users profiled in the user profile database 104. In another embodiment, the facial features database 102 may be a read only memory (ROM) lookup table for storing data representative of an authorized user's face.
Furthermore, user profiling may be performed by a user maintenance module 108. In another embodiment, the user profile database 104 is a read-only memory in which user preferences, pre-configured function commands, associated permissions, etc., are stored. Stored settings may include, for example, whether the preview inset is turned on or off, user interface preferences, ring-tone preferences, call history logs, phonebook and contact lists, buddy list records, preferred icons, preferred emoticons, chat-room history logs, email addresses, schedules, etc. The user maintenance module 108 retrieves and stores data on the user profile database 104 to update the pre-configured commands, preferences, etc. As stated above, the user maintenance module 108 includes operating logic to determine which user actions are included in the user profile.
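An illustrative profile record holding the kinds of settings listed above; the field names and types are hypothetical.

```python
# Hypothetical per-user settings record for the user profile database.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class VideophoneProfile:
    user_id: str
    preview_inset_on: bool = True
    ui_preferences: Dict[str, str] = field(default_factory=dict)
    ring_tone: str = "default"
    call_history: List[str] = field(default_factory=list)
    contacts: Dict[str, str] = field(default_factory=dict)   # name -> address
    buddy_list: List[str] = field(default_factory=list)
    preferred_emoticons: List[str] = field(default_factory=list)
    email_addresses: List[str] = field(default_factory=list)


profile = VideophoneProfile(user_id="alice", ring_tone="classic")
profile.call_history.append("outgoing: bob@example.com")
```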
In addition, the facial recognition and profiling unit 100 includes a computer processor 112, which exchanges data with the facial recognition module 106 and the user maintenance module 108. The computer processor 112 executes operations such as comparing incoming images through the facial recognition module 106, and requesting user preferences, profile data, and other data associated with an existing user through the user maintenance module 108.
At process block 306, data representing the image of the scanned face is compared against the facial feature data stored in the facial features database 102 according to logic configured in the facial recognition module 106. As such, at decision block 306, a determination is made as to whether the data representing the image of the scanned face matches facial feature data stored in the facial features database 102. The process 300 then continues to process block 308.
At process block 308, if the data representing the image of the scanned face matches data representing an image of at least one reference facial feature stored in the facial features database 102, user preferences are loaded on the electrical device. In one embodiment, a determination is made as to whether or not there are user preferences pre-set and stored in the user profile database 104. If there are user preferences already in place, then the user profile and corresponding preferences are loaded on the electrical device. In another embodiment, if there are no pre-established user preferences, the user's subsequent requests, actions, commands, and input are collected in order to generate and maintain the user profile. In one embodiment, user preferences are automatically generated. Facial expressions, actions, commands, etc., corresponding to recognized user faces are automatically collected and stored in a user profile database. The data stored for each user may include call history logs, user data, user contact information, and other information learned while the user is using the videophone. User profiles may be generated without the need for user interaction. The process 300 then continues to process block 310.
At process block 310, if the data representing the image of the scanned face does not match data representing an image of at least one reference facial feature stored in the facial features database 102, the user is added as a new user to the user profile database 104. Facial feature data representing the user's face is added to the facial features database 102. In addition, the user profile database 104 includes a new record that may be keyed on the user's face or facial features. Thus, every time a new user is added, a new record with associated facial features and preferences is created. Multiple users may access the system and establish a user account based on user-specific facial features.
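A hedged end-to-end sketch of process blocks 306 through 310: the scanned face is compared against stored facial feature data, preferences are loaded or learned on a match, and a new user is added otherwise. The database, device, and matcher interfaces shown here are hypothetical.

```python
# Hypothetical flow for process blocks 306, 308, and 310.
def handle_scanned_face(face_data, feature_db, profile_db, device, matcher):
    # Block 306: compare the scanned face against stored facial feature data.
    user_id = next(
        (uid for uid, stored in feature_db.items() if matcher(face_data, stored)),
        None,
    )

    if user_id is not None:
        # Block 308: recognized user -- load pre-set preferences if any exist,
        # otherwise start collecting actions to generate a profile automatically.
        preferences = profile_db.get(user_id, {})
        if preferences:
            device.update(preferences)
        else:
            profile_db[user_id] = {}  # learn from subsequent actions
        return user_id

    # Block 310: unrecognized face -- add a new user keyed by the facial features.
    new_id = f"user_{len(feature_db) + 1}"
    feature_db[new_id] = face_data
    profile_db[new_id] = {}
    return new_id
```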
The personal data assistant 502 may communicate with the facial recognition and profiling unit 100 to provide user facial features, user operations, and other data as discussed above. In addition, the facial recognition and profiling unit 100 stores user profiles, recognizes new and existing user facial features, and exchanges other data with the personal data assistant 502.
Thus, the facial recognition and profiling system 600 comprises a processor (CPU) 112; a memory 114, e.g., random access memory (RAM) and/or read only memory (ROM); a facial recognition module 106; and various input/output devices 602 (e.g., storage devices, including but not limited to a tape drive, a floppy drive, a hard disk drive, or a compact disk drive; a receiver; a transmitter; a speaker; a display; an image capturing sensor, e.g., those used in a digital still camera or digital video camera; a clock; an output port; and a user input device such as a keyboard, a keypad, a mouse, and the like, or a microphone for capturing speech commands).
It should be understood that the facial recognition module 106 may be implemented as one or more physical devices that are coupled to the processor 112 through a communication channel. Alternatively, the facial recognition module 106 may be represented by one or more software applications (or even a combination of software and hardware, e.g., using application specific integrated circuits (ASIC)), where the software is loaded from a storage medium, (e.g., a magnetic or optical drive or diskette) and operated by the processor 112 in the memory 114 of the facial recognition and profiling system 600. As such, the facial recognition module 106 (including associated data structures) of the present invention may be stored on a computer readable medium, e.g., RAM memory, magnetic or optical drive or diskette and the like.
Although certain illustrative embodiments and methods have been disclosed herein, it will be apparent from the foregoing disclosure to those skilled in the art that variations and modifications of such embodiments and methods may be made without departing from the true spirit and scope of the art disclosed. Many other examples of the art disclosed exist, each differing from others in matters of detail only. Accordingly, it is intended that the art disclosed shall be limited only to the extent required by the appended claims and the rules and principles of applicable law.