1. Technical Field
Embodiments of the present disclosure relate to data analysis technology, and particularly to an electronic device and method for analyzing interpersonal relationships of persons in digital images.
2. Description of Related Art
Social network sites (e.g., FACEBOOK, GOOGLE+) provide an image sharing function to persons. A person may upload images to the social network sites, and add tag information (e.g., names) to each uploaded image. The social network sites may help the person find his/her friends in a plurality of images using face detection technology. However, the social network sites cannot determine an interpersonal relationship between two persons (i.e., an association between two people that may range from short-lived to long-lasting), and cannot determine a variation tendency of the interpersonal relationship between the two persons. If a person wants to know the variation tendency of the interpersonal relationship with his/her friend (e.g., in which years the relationship was the best), the person has to look through all of the images with his/her friend in albums, to determine which years have the most images with the friend (a number of the images can be used to represent a period of the best relationship). Therefore, a more efficient method for analyzing interpersonal relationships of persons in digital images is desired.
All of the processes described below may be embodied in, and fully automated via, functional code modules executed by one or more general purpose electronic devices or processors. The code modules may be stored in any type of non-transitory computer-readable medium or other storage device. Some or all of the methods may alternatively be embodied in specialized hardware. Depending on the embodiment, the non-transitory computer-readable medium may be a hard disk drive, a compact disc, a digital video disc, a tape drive or other storage medium.
The display device 20 displays digital images (hereinafter referred to as “images”) of different persons and other digital information, and the input device 22 may be a mouse or a keyboard for data input. The storage device 23 may be a non-volatile storage device, such as a hard disk or a flash memory card that can be electrically erased and reprogrammed.
In one embodiment, the data analysis system 24 is used to analyze interpersonal relationships of specified persons based on the images of the specified persons, determine a tendency chart of the interpersonal relationships of the specified persons, and display the tendency chart of the interpersonal relationship on the display device 20. The data analysis system 24 may include computerized instructions in the form of one or more programs that are executed by the at least one processor 25 and stored in the storage device 23 (or memory). A detailed description of the data analysis system 24 is given in the following paragraphs.
In step S10, the data receiving module 240 receives search keywords of a second person input by a first person, and a time length of a preset time period for analyzing a variation tendency of an interpersonal relationship between the first person and the second person. The search keywords may be a name of the second person, and the time length of the preset time period may be one week, one month, or one quarter. In one embodiment, the first person is a person who uses the data analysis system 24. As shown in
In step S11, the image obtaining module 241 obtains images within every preset time period from an album of the storage device 23. In one embodiment, each image includes a time stamp. For example, if the image includes exchangeable image file format (EXIF) information, the time recorded in the EXIF information is set as the time stamp of the image. If the image does not include the EXIF information, the time when the image is uploaded to a storage device of the social network site (upload time) is set as the time stamp of the image.
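The time-stamp selection described above (the EXIF capture time when present, otherwise the upload time) can be sketched as follows. This is a minimal illustration, not the patented implementation; the `exif_time` and `upload_time` parameter names are hypothetical.

```python
from datetime import datetime
from typing import Optional

def image_timestamp(exif_time: Optional[datetime],
                    upload_time: datetime) -> datetime:
    """Return the time stamp of an image: the time recorded in the EXIF
    information if the image carries EXIF data, otherwise the time the
    image was uploaded to the social network site's storage device."""
    return exif_time if exif_time is not None else upload_time
```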
For example, if the time length of the preset time period is set as one month by the first person, the image obtaining module 241 obtains the images within every month from the album of the storage device 23 according to the time stamp of each image. For example, the image obtaining module 241 obtains ten images in January, 2012, fifteen images in February, and so on. In other embodiments, the time length of the preset time period may be a default duration (e.g., one month), so that the first person does not need to set the time length of the preset time period.
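The monthly bucketing described above can be sketched as follows; this assumes each image's time stamp has already been determined as described in step S11.

```python
from collections import defaultdict
from datetime import datetime

def group_by_month(timestamps):
    """Bucket image time stamps into one-month preset time periods,
    mirroring how the image obtaining module collects the images
    within every month according to the time stamp of each image."""
    buckets = defaultdict(list)
    for ts in timestamps:
        buckets[(ts.year, ts.month)].append(ts)
    return dict(buckets)
```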
In step S12, the face detecting module 242 determines images from the obtained images which include the first person and the second person within every preset time period. For example, the face detecting module 242 determines six images including the first person and the second person in January, 2012 from the ten images in January, 2012, and determines eight images including the first person and the second person in February, 2012 from the fifteen images in February, 2012.
In detail, the face detecting module 242 detects one or more face blocks in each image within every preset time period, and determines whether one image includes the first person and the second person by comparing the detected face blocks in the one image with a first face template of the first person and a second face template of the second person. In one embodiment, the first face template may be a first head portrait of the first person in the social network site, and the second face template may be a second head portrait of the second person in the social network site.
If one image includes a first face block matching the first face template of the first person and includes a second face block matching the second face template of the second person, the face detecting module 242 determines that the one image includes the first person and the second person.
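The decision rule of step S12 can be sketched as follows. The face comparison itself is abstracted into a hypothetical `matches` predicate, since the disclosure does not fix a particular face recognition algorithm.

```python
def includes_both(face_blocks, first_template, second_template, matches):
    """Return True if the detected face blocks of one image include a
    first face block matching the first person's face template and a
    second face block matching the second person's face template."""
    has_first = any(matches(block, first_template) for block in face_blocks)
    has_second = any(matches(block, second_template) for block in face_blocks)
    return has_first and has_second
```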
In step S13, the interpersonal relationship analyzing module 243 calculates a distance between the first person and the second person in each determined image within every preset time period, and calculates a relationship weight between the first person and the second person within every preset time period according to the distance between the first person and the second person in the determined images. In one embodiment, the distance is a relative value that indicates how close two persons stand in each determined image. For example, if the first person is adjacent to the second person in one determined image, the distance between the first person and the second person is “1”; if a number of persons between the first person and the second person is “n”, the distance between the first person and the second person is “n+1”.
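The distance rule above (adjacent persons have distance 1; n persons in between give distance n+1) can be sketched as follows, assuming the persons in an image have been ordered left to right from the detected face blocks.

```python
def distance(positions, a, b):
    """Distance between persons a and b in one determined image:
    1 if they are adjacent, n+1 if n persons stand between them.
    `positions` is the left-to-right order of persons in the image."""
    i, j = positions.index(a), positions.index(b)
    # n persons between a and b means the indices differ by n + 1
    return abs(i - j)
```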
In detail, the interpersonal relationship analyzing module 243 obtains each determined image in every preset time period, determines a number “U” of persons included in each determined image according to the detected face blocks in each determined image, and calculates a distance “D” between the first person and the second person in each determined image.
In addition, the interpersonal relationship analyzing module 243 further calculates a relationship strength “E(n)” between the first person and the second person in each determined image according to the number “U” of persons in each determined image and the distance “D” between the first person and the second person using a preset relationship function. In one embodiment, the preset relationship function is “E(n)=1/f(U, D)”. One example of the preset relationship function is “E(n)=1/(U*D)”, where, “*” is a multiplication sign.
When the determined images within every preset time period are processed, the interpersonal relationship analyzing module 243 calculates a relationship weight between the first person and the second person within every preset time period by totaling the relationship strength “E(n)” between the first person and the second person in each determined image within every preset time period. In one embodiment, a relationship weight between the first person and the second person within every preset time period represents an interpersonal relationship between the first person and the second person within every preset time period. A formula for calculating the relationship weight between the first person and the second person is as follows.

ETt(a,b)=Σ(n=1 to PTt) 1/(Un*Dn(a,b))  (I)
In the formula (I), “ETt(a,b)” represents a relationship weight between a first person “a” and a second person “b” within a preset time period “Tt”, “PTt” represents a number of determined images which include the first person “a” and the second person “b” within the preset time period “Tt”, “Un” represents a number of persons included in an nth determined image within the preset time period “Tt”, and “Dn(a,b)” represents a distance “D” between the first person “a” and the second person “b” in the nth determined image within the preset time period “Tt”. For example, the interpersonal relationship analyzing module 243 determines that the relationship weight between the first person “a” and the second person “b” within January, 2012 is 80, and the relationship weight between the first person “a” and the second person “b” within February, 2012 is 90. In one embodiment, a higher relationship weight within one preset time period represents a better relationship between the first person “a” and the second person “b” within the preset time period.
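The relationship weight of formula (I), the sum over the determined images of 1/(Un*Dn(a,b)), can be computed as sketched below; each entry of `images` is a hypothetical (Un, Dn) pair for one determined image.

```python
def relationship_weight(images):
    """Compute ETt(a,b) = sum over the PTt determined images of
    1/(Un * Dn), where Un is the number of persons in the nth image
    and Dn is the distance between the two persons in that image."""
    return sum(1.0 / (u * d) for u, d in images)
```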
In step S14, the interpersonal relationship displaying module 244 determines a tendency chart 30 of the relationship weight between the first person and the second person according to the relationship weight between the first person and the second person within every preset time period, and displays the tendency chart 30 on the display device 20.
For example, as shown in
In other embodiments, the tendency chart 30 of the relationship weight may further include a movable time block 32 which may be moved along the horizontal axis of the tendency chart 30. The movable time block 32 includes one or more preset time periods and a plurality of determined images including the first person and the second person within each preset time period. As shown in
In other embodiments, the data receiving module 240 may receive search keywords of a second person and a third person (or more persons) input by a first person, where the first person is the person who uses the data analysis system 24. As shown in
In other embodiments, when the data receiving module 240 receives the search keywords of the second person and the third person input by the first person, one relationship curve which records a variation of the relationship weight between the second person and the third person may be also displayed in the tendency chart 30.
In other embodiments, the step S13 may be executed as follows. The interpersonal relationship analyzing module 243 calculates a relationship weight between the first person and the second person within every preset time period according to a number of determined images which include the first person and the second person within every preset time period. For example, a larger number of the determined images within one preset time period represents a higher relationship weight between the first person and the second person within the one preset time period (i.e., a better relationship between the first person and the second person within the one preset time period).
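The simpler variant of step S13 described above can be sketched as follows; the period keys are hypothetical labels for the preset time periods.

```python
def count_based_weight(determined_images_by_period):
    """Alternative to formula (I): the relationship weight within each
    preset time period is simply the number of determined images that
    include both persons -- more shared images, higher weight."""
    return {period: len(images)
            for period, images in determined_images_by_period.items()}
```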
It should be noted that the accuracy of the relationship weight calculated using the distance between the first person and the second person is greater than the accuracy of the relationship weight calculated using the number of the determined images which include the first person and the second person. For example, as shown in
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations, set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) of the disclosure without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Number | Date | Country | Kind
---|---|---|---
101146000 | Dec 2012 | TW | national