Claims
- 1. An imaging method comprising the steps of:
capturing an image of a scene; collecting affective information at capture; and associating the affective information with the scene image.
- 2. The method of claim 1 further comprising the steps of:
collecting user identification data; and associating the affective information and the user identification data with the scene image.
- 3. The method of claim 1, further comprising the steps of:
collecting user identification data; identifying a user based on the user identification data; and associating the affective information and the scene image with the user.
- 4. The method of claim 2 wherein the step of associating the affective information with the scene image comprises storing the scene image, the affective information, and the user identification data in a common digital file.
- 5. The method of claim 1, wherein the step of associating the affective information with the image comprises storing the affective information within the scene image.
- 6. The method of claim 2, wherein the affective information and the user identification data are stored in association with the scene image.
- 7. The method of claim 2 further comprising the step of using the collected affective information and user identification data to build a personal user profile.
- 8. The method of claim 7 further comprising the step of:
identifying a user based on the user identification data.
- 9. The method of claim 1 wherein the step of collecting affective information comprises monitoring the physiology of the user.
- 10. The method of claim 1 wherein the collected affective information is used to build a personal user profile.
- 11. The method of claim 9, wherein the step of collecting affective information comprises the step of interpreting the collected physiological information to determine the relative degree of importance of the scene image.
- 12. The method of claim 1, wherein the step of collecting affective information at capture comprises collecting an image of a user from which affective information can be obtained.
- 13. The method of claim 12, wherein an electronic image of at least a part of a face of the user is captured and affective information is derived therefrom.
- 14. The method of claim 1, wherein the step of collecting affective information at capture comprises determining a relative degree of importance.
- 15. The method of claim 14 wherein the step of associating the affective information with the image comprises storing the relative degree of importance in association with the scene image.
- 16. The method of claim 11 wherein the relative degree of importance of the scene image is established at least in part based upon a personal user profile.
- 17. The method of claim 1 wherein the step of collecting affective information includes monitoring the eye gaze of the user.
- 18. The method of claim 1, further comprising the step of obtaining non-affective information at capture.
- 19. The method of claim 18, further comprising the step of interpreting the affective information and non-affective information to determine the relative degree of importance of the scene image.
- 20. The method of claim 19, wherein the non-affective information includes the captured image and the relative degree of importance is at least in part determined by analysis of the scene image.
- 21. The method of claim 1 wherein the step of collecting affective information comprises collecting manually entered affective information.
- 22. The method of claim 1 wherein the step of collecting affective information at capture comprises detecting composition of an image, and collecting affective information during composition.
- 23. The method of claim 1 wherein the step of collecting affective information at capture comprises detecting verification of a scene image and collecting affective information during verification.
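By way of illustration only, the association step recited in claims 1-5 (storing the scene image, the affective information, and user identification data in a common digital file, per claim 4) might be sketched as follows. The function names, the JSON container, and the hex encoding are assumptions for the sketch; the claims do not prescribe any file format.

```python
import json

def associate_affective_info(scene_image_bytes, affective_info, user_id):
    """Store the scene image, affective information, and user
    identification data together in a common digital file (claim 4).
    All names here are illustrative; the claims mandate no format."""
    record = {
        # Hex-encode the image payload so the record stays valid JSON.
        "image": scene_image_bytes.hex(),
        "affective_information": affective_info,  # e.g. {"importance": 0.8}
        "user_identification": user_id,
    }
    return json.dumps(record)

def load_associated(record_json):
    """Recover the image and its associated metadata from the file."""
    record = json.loads(record_json)
    return bytes.fromhex(record["image"]), record

record = associate_affective_info(b"\x00\x01", {"importance": 0.8}, "user-42")
image, meta = load_associated(record)
```

An equally valid embodiment under claim 5 would embed the affective information within the image file itself (for example, in a metadata segment) rather than in a sidecar record.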
- 24. An imaging method comprising the steps of:
capturing an image of a scene; collecting affective signals at capture; determining a relative degree of importance of the scene image based at least in part upon the collected affective signals; and associating the relative degree of importance with the scene image.
- 25. The method of claim 24, wherein the affective signals comprise physiological characteristics.
- 26. The method of claim 24, wherein the affective signals comprise facial characteristics.
- 27. The method of claim 24 wherein the affective signals comprise physiological characteristics and facial characteristics.
- 28. A photography method, comprising the steps of:
capturing an image of a scene; obtaining an image of a photographer at capture; determining affective information based at least in part on interpretation of the image of the photographer; and associating the affective information with the scene image.
- 29. The method of claim 28, wherein non-affective information is collected at capture and the step of determining affective information further comprises determining affective information based at least in part upon the non-affective information.
- 30. The method of claim 28, wherein information regarding the physiology of the photographer is collected at capture and the step of determining affective information further comprises determining affective information based at least in part upon the physiology of the photographer.
- 32. The method of claim 28, wherein the image of the photographer is captured in the non-visible portion of the spectrum.
- 33. An imaging method comprising the steps of:
capturing a stream of images; collecting a stream of affective information during image capture; and associating the stream of affective information with the stream of images.
- 34. The method of claim 33 wherein the stream of affective information includes user identification data.
- 35. The method of claim 33, wherein the stream of affective information is examined to determine when relevant changes in affective information occur, and the step of associating the stream of affective information with the stream of images comprises associating data representing the relevant changes with the stream of images at points in the stream of images that correspond to the occurrence of the relevant changes.
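The change-detection approach of claim 35 might be sketched as follows. The scalar affective signal and the threshold value are assumptions for the sketch, not part of the claims: the idea is simply to retain only those points in the affective stream where a relevant change occurs, tagged with the index of the corresponding frame.

```python
def relevant_changes(affective_stream, threshold=0.2):
    """Return (frame_index, value) pairs at which the affective signal
    changed by more than `threshold` relative to the last retained
    value, per the approach of claim 35. Threshold is illustrative."""
    changes = []
    last = None
    for frame_index, value in enumerate(affective_stream):
        if last is None or abs(value - last) > threshold:
            changes.append((frame_index, value))
            last = value
    return changes

stream = [0.10, 0.12, 0.50, 0.52, 0.05]
# Only the first sample and the two large excursions are retained.
print(relevant_changes(stream))  # [(0, 0.1), (2, 0.5), (4, 0.05)]
```

Storing only the retained change points, rather than every sample, keeps the affective metadata compact while preserving its alignment with the image stream.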
- 36. A method for determining affective information comprising the steps of:
obtaining affective signals including facial characteristics and physiological characteristics of a person; analyzing the facial characteristics; analyzing the physiological characteristics; and, determining an emotional state based upon the analysis of the facial and physiological characteristics of the person.
- 37. The method of claim 36 wherein the step of obtaining affective signals comprises obtaining the affective signals at capture of an image.
- 38. The method of claim 37, further comprising the step of storing the affective information with the captured image.
- 39. The method of claim 36 wherein the step of obtaining affective signals includes manually entering a user's reaction.
- 40. The method of claim 36 further comprising the steps of capturing an image of a scene being observed by the person at the time that the affective signals are obtained, wherein the step of determining an emotional state is further based upon analysis of the scene image.
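The fusion recited in claims 36-40 (determining an emotional state from the combined analysis of facial and physiological characteristics) can be illustrated with a minimal sketch. The feature names, the equal weighting, and the two-state output are all assumptions; the claims specify no particular classifier or signal set.

```python
def analyze_facial(smile_intensity):
    """Map a facial characteristic to a valence score in [0, 1].
    The smile-intensity input is an illustrative assumption."""
    return max(0.0, min(1.0, smile_intensity))

def analyze_physiological(skin_conductance):
    """Map a physiological characteristic (an arousal proxy such as
    skin conductance, assumed here) to a score in [0, 1]."""
    return max(0.0, min(1.0, skin_conductance))

def emotional_state(smile_intensity, skin_conductance):
    """Combine the facial and physiological analyses into an emotional
    state, as in claim 36. Equal weighting and labels are illustrative."""
    score = (0.5 * analyze_facial(smile_intensity)
             + 0.5 * analyze_physiological(skin_conductance))
    return "excited-positive" if score >= 0.5 else "neutral"

print(emotional_state(0.9, 0.8))  # excited-positive
print(emotional_state(0.1, 0.2))  # neutral
```

Under claim 40, analysis of the scene image itself could contribute a further term to the same combination.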
- 41. An imaging system comprising:
an image capture system adapted to capture an image selected by a user; a memory which stores the image; a set of sensors adapted to capture affective signals from the user at capture; and a processor adapted to associate affective information derived from the affective signals with the captured image.
- 42. The imaging system of claim 41 wherein the set of sensors is further adapted to capture user identification data.
- 43. The imaging system of claim 42, wherein the processor is adapted to determine a user identity based upon analysis of the user identification data.
- 44. The imaging system of claim 43 wherein the processor further comprises a transmitter transmitting a signal containing at least one captured image in association with affective information and the user identification data.
- 45. The imaging system of claim 41 wherein the processor is further adapted to process affective information captured by the set of sensors to provide further affective information.
- 46. The imaging system of claim 41 wherein the imaging system further comprises at least one source of non-affective information.
- 47. The imaging system of claim 46, wherein the processor is further adapted to receive non-affective information from the at least one source and to determine a degree of relative importance based upon the affective information and the non-affective information.
- 48. The imaging system of claim 41 further comprising a second image capture system adapted to capture an image of at least a part of the user.
- 49. The imaging system of claim 48 wherein the second image capture system captures an image of the user in a non-visible portion of the spectrum.
- 50. The imaging system of claim 48 wherein the processor is further adapted to analyze the image of at least a part of the user to determine the identity of the user.
- 51. The imaging system of claim 48 wherein the processor is further adapted to analyze the image of at least a part of the user to determine affective information therefrom.
- 52. The imaging system of claim 41 wherein the processor is further adapted to analyze the captured image to determine non-affective information therefrom and to determine a degree of relative importance based upon the affective information and the non-affective information.
- 53. The imaging system of claim 43 wherein the processor is further adapted to create and update a personal user profile that comprises affective information.
- 54. The imaging system of claim 53 wherein a personal user profile further comprises non-affective information.
- 55. The imaging system of claim 41 wherein at least one sensor is adapted to collect affective information from signals carried by the nervous system of a person.
- 56. The imaging system of claim 41 wherein the imaging system is packaged in a wearable form.
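The system of claims 41-56 can be outlined as a small object model. Every class, method, and attribute name below is hypothetical, and the trivial averaging of sensor readings stands in for whatever processing the processor of claim 45 performs; the claims name no concrete parts.

```python
class ImagingSystem:
    """Sketch of the system of claim 41: an image capture system,
    a memory, a set of affective sensors, and a processor that
    associates affective information with the captured image."""

    def __init__(self):
        self.memory = []  # stores (image, affective info, user id) records

    def capture(self, scene_image, sensor_readings, user_id=None):
        # The "processor" step: derive affective information from the
        # raw sensor signals (here, an illustrative average) and
        # associate it with the stored image (claims 41-42, 45).
        affective_info = sum(sensor_readings) / len(sensor_readings)
        record = {"image": scene_image,
                  "affective_information": affective_info,
                  "user_identification": user_id}
        self.memory.append(record)
        return record

camera = ImagingSystem()
rec = camera.capture("scene-001", [0.4, 0.6], user_id="user-42")
```

A second image capture system (claim 48) aimed at the user, or sensors reading signals carried by the nervous system (claim 55), would simply supply additional entries to `sensor_readings` in this sketch.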
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] Reference is made to commonly assigned U.S. patent application Ser. No. 09/721,222, entitled “Method For Adding Personalized Metadata to a Collection of Digital Images” filed by Parulski et al. on Nov. 22, 2000; Ser. No. 10/036,113, entitled “Method For Creating and Using Affective Information in a Digital Imaging System” filed by Matraszek et al. on Dec. 26, 2001; Ser. No. 10/036,123, entitled “Method for Using Affective Information Recorded With Digital Images for Producing an Album Page” filed by Matraszek et al. on Dec. 26, 2001; Ser. No. ______ (Docket No. 85243), entitled “Camera System With Eye Monitoring” filed by Miller et al.; Ser. No. ______ (Docket No. 84897), entitled “Method And System For Creating And Using Affective Information In An Image Capture Device For Health Monitoring And Personal Security” filed by Fedorovskaya et al.; and Ser. No. ______ (Docket No. 85575), entitled “Method and Computer Program Product For Determining an Area of Importance In An Image Using Eye Monitoring Information” filed by Miller et al., filed herewith, the disclosures of which are incorporated herein by reference.