The present invention relates to an electronic device, a selection method, an acquisition method, an electronic apparatus, a synthesis method, and a synthesis program.
In the related art, a technology that extracts a predetermined object (for example, the face of a person) from a moving picture by pattern matching is disclosed (for example, refer to Japanese Unexamined Patent Application, First Publication No. 2009-31469). According to Japanese Unexamined Patent Application, First Publication No. 2009-31469, an image region of a predetermined object can be displayed on a display screen.
In the related art, a method that extracts rhythm of music is known. For example, in Japanese Unexamined Patent Application, First Publication No. 2006-192276, a video game apparatus is disclosed which includes sensor means attached to a human body for detecting motion of the attached location, tempo extraction means for extracting detection tempo of a detection value detected by the sensor means, music output means for outputting music, rhythm extraction means for extracting rhythm of the music, and evaluation means for evaluating whether or not the detection tempo extracted by the tempo extraction means synchronizes with the rhythm of the music extracted by the rhythm extraction means.
In the technology disclosed in Japanese Unexamined Patent Application, First Publication No. 2009-31469, since a captured image itself or an object itself is not digitized (indexed), there is a problem that captured images or objects cannot be compared to each other. In addition, since captured images or objects cannot be compared to each other, there is a problem that various application processes which apply comparison results between the captured images or between the objects (for example, grouping of the captured image or the object based on similarity of the captured image or the object, grouping of the imaging device based on the similarity of the captured image or the object by each imaging device, extraction of the captured image or the object similar to the reference captured image or the reference object, and extraction of similar points from different images) cannot be realized.
Moreover, in the related art, in the technology disclosed in Japanese Unexamined Patent Application, First Publication No. 2006-192276, the tempo can be extracted from a signal detected by a sensor which detects motion, such as an acceleration sensor included in the electronic apparatus, and information that indicates the tempo can be displayed on the display apparatus. However, since the tempo reflects only information such as the motion of an operator, there is a problem that there is little variation in expression.
An aspect according to the present invention provides an electronic device that can easily compare captured images or objects and can easily obtain numerical values (indexes) indicating the captured image itself or the object itself.
An object of another aspect of the present invention is to provide a technology that can express detected information in a more expressive way.
According to an aspect of the present invention, an electronic device includes: a storage unit that is configured to store rhythm information which indicates a pattern of a spatial change in an image, associated with a pattern of a spatial change in a unit region in the image; an imaging unit; a calculation unit that is configured to calculate a pattern of a change in a unit region in an image captured by the imaging unit; and a selection unit that is configured to select, from the storage unit, the rhythm information corresponding to the pattern of the change in the unit region calculated by the calculation unit.
In the above mentioned electronic device, the storage unit may store the rhythm information associated with a combination of a first pattern and a second pattern, the first pattern being a pattern of a change in a unit region and the second pattern being a pattern of a change in a unit region, wherein the calculation unit may calculate a pattern of a change in a unit region which configures a main object in the captured image, and a pattern of a change in a unit region which configures a portion other than the main object, and wherein the selection unit may select the rhythm information, in which the first pattern corresponds to a pattern of a change in a unit region which configures the main object calculated by the calculation unit and the second pattern corresponds to a pattern of a change in a unit region which configures the portion other than the main object calculated by the calculation unit, from the storage unit.
In the above mentioned electronic device, the unit region may be a pixel group configured of adjacent pixels having a predetermined number, and a pattern of a change in a unit region may be information that indicates a spatial change in an average pixel value, a maximum pixel value, a minimum pixel value, or a median value of pixel values for each pixel group.
In the above mentioned electronic device, the unit region may be a pixel group configured of adjacent pixels having a predetermined number, and a pattern of a change in a unit region may be information in which changes in a frequency domain and a time domain are extracted as rhythm from information of each pixel within the pixel group.
In the above mentioned electronic device, the unit region may be a pixel group configured of adjacent pixels in which a difference of pixel values is a predetermined value or less, and a pattern of a change in a unit region may be information that indicates a spatial change of an average pixel value, a maximum pixel value, a minimum pixel value, or a median value of pixel values for each pixel group.
In the above mentioned electronic device, a pattern of a change in a unit region may be information that indicates a distribution of a pixel group configured of adjacent pixels in which a difference of pixel values is a predetermined value or less.
According to another aspect of the present invention, a selection method that selects rhythm information of an image captured by an imaging unit, in an electronic device including a storage unit that is configured to store the rhythm information that indicates a pattern of a spatial change in an image associated with a pattern of a spatial change in a unit region in the image, includes: calculating a pattern of a change in a unit region in the captured image by using a calculation unit of the electronic device, and selecting, by using a selection unit of the electronic device, the rhythm information from the storage unit, the rhythm information corresponding to the pattern of the change in the unit region calculated by the calculation unit.
According to still another aspect, an electronic device includes: an imaging unit; an extraction unit that is configured to extract an object graphic, which is a graphic indicating a region of an object, from a moving picture captured by the imaging unit; and an acquisition unit which acquires a variation of an area of the object graphic of an object or a period of a change in the area of the object graphic of the object as rhythm information indicating a temporal change in the object, the object being extracted by the extraction unit.
In the above mentioned electronic device, the extraction unit may extract a circumscribing rectangle, which circumscribes an object, as the object graphic.
In the above mentioned electronic device, the acquisition unit may acquire a variation of an aspect ratio of the circumscribing rectangle or a period of a change in the aspect ratio of the circumscribing rectangle as the rhythm information, instead of or in addition to a variation of an area of the circumscribing rectangle or a period of a change in the area of the circumscribing rectangle, the circumscribing rectangle being extracted as the object graphic.
According to still another aspect of the present invention, an electronic device includes: an imaging unit; an extraction unit that is configured to extract a circumscribing rectangle, which circumscribes an object, as an object graphic from a moving picture captured by the imaging unit, the object graphic being a graphic indicating a region of an object; and an acquisition unit that is configured to acquire a variation of a length of a long side or a short side of the circumscribing rectangle or a period of a change in the length of the circumscribing rectangle as rhythm information indicating a temporal change in the object, the circumscribing rectangle being extracted as the object graphic of an object by the extraction unit.
In the electronic device, the acquisition unit may acquire a variation of an aspect ratio of the circumscribing rectangle or a period of a change in the aspect ratio of the circumscribing rectangle as the rhythm information, instead of or in addition to a variation of the length of the long side or the short side, or a period of a change of the length of the circumscribing rectangle.
According to still another aspect of the present invention, an acquisition method of rhythm information in an electronic device which acquires the rhythm information from a moving picture, the rhythm information indicating a temporal change in an object in the moving picture, includes: extracting an object graphic from the moving picture by using an extraction unit of the electronic device, the object graphic being a graphic indicating a region of an object, and acquiring a variation of an area of the object graphic of a first object or a period of the change in the area of the object graphic of the first object as rhythm information by using an acquisition unit of the electronic device, the first object being extracted by the extraction unit and the rhythm information indicating a temporal change in the object.
According to still another aspect of the present invention, an electronic device includes: an imaging unit; and an extraction unit that is configured to extract rhythm information indicating a pattern of a color change of an object in a moving picture which is captured by the imaging unit.
In the above mentioned electronic device, the device may further include a correction unit that is configured to correct a color of the moving picture to a color that would be obtained if the moving picture were captured under a predetermined reference light, wherein the extraction unit extracts the rhythm information from the moving picture corrected by the correction unit.
In the above mentioned electronic device, the extraction unit may include: a storage unit that is configured to store the rhythm information associated with a pattern of a color change of a unit region configuring the object; a calculation unit that is configured to calculate the pattern of the color change of the unit region in the moving picture; and a selection unit that is configured to select, from the storage unit, the rhythm information corresponding to the pattern of the color change of the unit region calculated by the calculation unit.
In the above mentioned electronic device, the unit region may be a pixel group configured of adjacent pixels having a predetermined number, and the pattern of the color change of the unit region may be information that indicates a temporal change of an average pixel value, a maximum pixel value, a minimum pixel value, or a median value of pixel values for each pixel group.
In the above mentioned electronic device, the unit region may be a pixel group configured of adjacent pixels in which a difference of pixel values is a predetermined value or less, and the pattern of the color change of the unit region may be information that indicates a temporal change of an average pixel value, a maximum pixel value, a minimum pixel value, or a median value of pixel values for each pixel group.
In the electronic device, the unit region may be a pixel group configured of adjacent pixels in which a difference of pixel values is a predetermined value or less, and the pattern of the color change of the unit region may be information indicating a temporal change of a distribution of the pixel group.
In the electronic device, the color change may include changes in any one or more of hue, chroma, brightness, chromaticity, and a contrast ratio.
According to still another aspect of the present invention, a selection method that selects rhythm information of a moving picture captured by an imaging unit, in an electronic device comprising a storage unit that is configured to store the rhythm information that indicates a pattern of a color change in an object in the moving picture associated with a pattern of a color change of a unit region configuring the object in the moving picture, includes: calculating the pattern of the color change of the unit region in the moving picture by using a calculation unit of the electronic device, and selecting, by using a selection unit of the electronic device, the rhythm information from the storage unit, the rhythm information corresponding to the pattern of the color change of the unit region calculated by the calculation unit.
According to still another aspect of the present invention, an electronic apparatus includes: a plurality of detection units that is configured to detect a plurality of signals from a detection target, the plurality of signals indicating characteristics of the target; an extraction unit that is configured to extract each of the patterns of the signals, which repeatedly appear, from the plurality of signals detected by the plurality of detection units; and a synthesis unit that is configured to synthesize each extracted pattern.
In addition, according to still another aspect of the present invention, a synthesis method includes: a plurality of detection steps that detect a plurality of signals indicating characteristics of a target from a detection target; an extraction step that extracts each of the patterns of the signals, which repeatedly appear, from the plurality of signals detected at the plurality of detection steps; and a synthesis step that synthesizes each extracted pattern.
According to still another aspect of the present invention, a synthesis program causing a computer, which comprises a storage unit in which information indicating a plurality of signals detected by a plurality of detection units is stored, to execute: an extraction step that reads the information, which indicates the plurality of signals, from the storage unit, and extracts each of the patterns of the signals, which repeatedly appear, from the read information indicating the plurality of signals; and a synthesis step of synthesizing each extracted pattern.
According to the aspects of the present invention, numerical values (rhythm information) indicating the captured image itself or the object itself can be simply acquired from the captured image or the object. Moreover, the captured images or the objects can be simply compared to each other using the numerical values. Furthermore, comparison results of the captured images or the objects can be utilized for various application processes (for example, grouping of the captured images or objects based on similarity of the captured images or objects, grouping of the imaging devices based on similarity of the captured images or the objects by each imaging device, extraction of a captured image or an object similar to the reference captured image or the reference object, and extraction of similar points from images different from one another).
Moreover, according to other aspects of the present invention, the detected information can be expressed in a more expressive way.
Hereinafter, a first embodiment of the present invention will be described with reference to the drawings.
For example, the electronic device 1 is a digital camera, and as shown in
The imaging unit 10 is a camera which captures a still image and a moving picture. The extraction unit 20 extracts rhythm information that indicates a pattern of a spatial change in a captured image (still image) captured by the imaging unit 10. The second storage unit 40 stores the rhythm information extracted by the extraction unit 20.
The first storage unit 22 stores the above-described rhythm information associated with a pattern of a spatial change in a unit region (hereinafter, referred to as a “pixel group”) in an image. Specifically, the first storage unit 22 stores the rhythm information associated with a combination of a first pattern which is a pattern of a change in the pixel group and a second pattern which is a pattern of a change in the pixel group.
The pixel group is configured of adjacent pixels having a predetermined number. The pattern of the change in the pixel group in the image is information that indicates the spatial change in an average pixel value (average value of pixel values of a plurality of pixels in the pixel group) for each pixel group. The spatial change means a change according to a position in the image.
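As an illustrative sketch (not part of the claimed subject matter), the pattern of the spatial change in the average pixel value for each pixel group described above might be computed as follows, assuming a grayscale image represented as a two-dimensional array; the function name and the group size are hypothetical:

```python
import numpy as np

def block_average_pattern(image, block=4):
    """Divide an image into block x block pixel groups and return the
    average pixel value of each group as a 2-D spatial-change pattern."""
    h, w = image.shape
    # Crop so the image divides evenly into pixel groups.
    h, w = h - h % block, w - w % block
    groups = image[:h, :w].reshape(h // block, block, w // block, block)
    # Average over each block -> one value per pixel group.
    return groups.mean(axis=(1, 3))

img = np.arange(64, dtype=float).reshape(8, 8)
pattern = block_average_pattern(img, block=4)
```

The resulting array, read in image order, is one concrete representation of "a change according to a position in the image".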
As described above, the first pattern is the pattern of the change in the pixel group, and mainly indicates the pattern of the change in the pixel group which configures a main object (for example, an object captured in a center region) in the image. On the other hand, the second pattern is the pattern of the change in the pixel group which mainly configures portions other than the main object in the image.
An aspect, which stores the rhythm information associated with the combination of the first pattern and the second pattern, is not particularly limited, and in the present embodiment, the aspect may be an aspect in which the first storage unit 22 stores each of the first pattern and the second pattern, and stores the rhythm information for each combination of identification information (hereinafter, referred to as “first pattern identification information”) identifying the first pattern and identification information (hereinafter, referred to as “second pattern identification information”) identifying the second pattern. Moreover, the aspect, in which each of the first pattern and the second pattern is stored and the rhythm information is stored for each combination of the first pattern identification information and the second pattern identification information, is advantageous for maintenance of each information piece (first pattern, second pattern, and rhythm information).
Moreover, each information piece stored by the first storage unit 22 may be information which is prepared by the electronic device 1, information which is acquired by the electronic device 1 from the outside, or information which is input by a user of the electronic device 1. Moreover, as an aspect in which the information is prepared by the electronic device 1, the calculation unit 24 may calculate (generate) the first pattern and the second pattern in advance based on a sample image (which may be an image captured by the imaging unit 10 or an image acquired from the outside), and the first pattern, the second pattern, and the rhythm information may be stored in the first storage unit 22.
Hereinafter, a process performed by the extraction unit 20 will be described in detail with reference to
An image P shown in
1O (Xj, Yn), 1O (Xj, Yo), 1O (Xk, Yn), and 1O (Xk, Yo) in
1B (Xj, Yl), 1B (Xj, Ym), 1B (Xj, Yp), 1B (Xk, Yl), 1B (Xk, Ym), and 1B (Xk, Yp) in
The first storage unit 22 stores the first pattern shown in
That is, as shown in
As described above, as shown in
In the first storage unit 22, as shown in
Having extracted the main object, the calculation unit 24 calculates the average pixel value for each pixel group configuring the main object. That is, the calculation unit 24 calculates the pattern of the spatial color change of the pixel groups configuring the main object. Moreover, the calculation unit 24 calculates the average pixel value for each pixel group configuring portions other than the main object. That is, the calculation unit 24 calculates the pattern of the spatial color change of the pixel group configuring portions other than the main object. The calculation unit 24 supplies the calculated pattern of the spatial color change of the pixel group configuring the main object, and the calculated pattern of the spatial color change of the pixel group configuring portions other than the main object, to the selection unit 26.
The selection unit 26 acquires the pattern of the spatial color change of the pixel group configuring the main object and the pattern of the spatial color change of the pixel group configuring portions other than the main object, from the calculation unit 24, and the selection unit 26 selects the rhythm information, in which the first pattern corresponds to the pattern of the spatial color change of the pixel group configuring the main object and the second pattern corresponds to the pattern of the spatial color change of the pixel group configuring portions other than the main object, from the first storage unit 22.
For example, the selection unit 26 selects the first pattern which coincides with or is most similar to the pattern of the spatial color change of the pixel group configuring the main object, selects the second pattern, which coincides with or is most similar to the pattern of the spatial color change of the pixel group configuring portions other than the main object, among the second patterns which constitutes a pair with the first pattern, and acquires the rhythm information corresponding to the combination of the selected first pattern and second pattern. The selection unit 26 stores the acquired rhythm information in the second storage unit 40. Moreover, the rhythm information stored in the second storage unit 40 is used for comparison of captured images, or the like.
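As an illustrative sketch of the selection described above, the stored combination of the first pattern and the second pattern that coincides with or is most similar to the calculated patterns might be chosen by a nearest-neighbor comparison such as the following; the store layout, names, and distance measure are assumptions, not the claimed implementation:

```python
import numpy as np

# Hypothetical pattern store: each entry pairs a (first, second) pattern
# combination with its rhythm information.
pattern_store = [
    {"first": np.array([10.0, 20.0]), "second": np.array([5.0, 5.0]),
     "rhythm": "rhythm-A"},
    {"first": np.array([80.0, 90.0]), "second": np.array([40.0, 60.0]),
     "rhythm": "rhythm-B"},
]

def select_rhythm(main_pattern, other_pattern, store):
    """Select the rhythm information whose stored first/second patterns
    are most similar (smallest total Euclidean distance) to the patterns
    calculated for the main object and for the remaining portion."""
    def distance(entry):
        return (np.linalg.norm(entry["first"] - main_pattern)
                + np.linalg.norm(entry["second"] - other_pattern))
    return min(store, key=distance)["rhythm"]

selected = select_rhythm(np.array([12.0, 19.0]), np.array([6.0, 4.0]),
                         pattern_store)
```

Exact coincidence is simply the zero-distance case of the same comparison.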
Hereinafter, an operation of the electronic device 1 will be described with reference to a flowchart.
The calculation unit 24 extracts the main object from the captured image (Step S10). The calculation unit 24 calculates the pattern of the spatial color change of the pixel group configuring the main object (Step S12). Specifically, the calculation unit 24 calculates the average pixel value for each pixel group which configures the main object. The calculation unit 24 supplies the pattern of the spatial color change of the pixel group configuring the main object to the selection unit 26.
In addition, the calculation unit 24 calculates the pattern of the spatial color change of the pixel group configuring portions other than the main object (Step S14). Specifically, the calculation unit 24 calculates the average pixel value for each pixel group which configures portions other than the main object. The calculation unit 24 supplies the pattern of the spatial color change of the pixel group configuring portions other than the main object to the selection unit 26.
The selection unit 26 acquires the pattern of the spatial color change of the pixel group configuring the main object and the pattern of the spatial color change of the pixel group configuring portions other than the main object, from the calculation unit 24, and the selection unit 26 selects the rhythm information, in which the first pattern corresponds to the pattern of the spatial color change of the pixel group configuring the main object and the second pattern corresponds to the pattern of the spatial color change of the pixel group configuring portions other than the main object, from the first storage unit 22 (Step S16), and stores the selected rhythm information in the second storage unit 40. Then, the flowchart ends.
In addition, in the flowchart shown in
As described above, according to the electronic device 1, the rhythm information, which is a numerical value indicating the captured image itself, can be simply acquired from the captured image. Moreover, the captured images can be simply compared to each other using the rhythm information which is indicated by the numerical value. Moreover, comparison results of the captured images can be utilized for various application processes (for example, grouping of the captured images based on similarity of the captured images, grouping of the imaging devices based on similarity of the captured images by each imaging device, extraction of a captured image similar to the reference captured image, and extraction of similar points from images different from one another).
Moreover, in the electronic device 1, the pixel group which includes pixels configuring the main object and the pixel group which includes pixels configuring portions other than the main object are distinguished from each other, and the rhythm information of the captured image is extracted. That is, since the rhythm information of the captured image is extracted considering the pattern of the spatial color change of portions other than the main object in addition to the pattern of the spatial color change of the main object, the rhythm information can be extracted with high accuracy.
In addition, the embodiment is the example in which the information indicating the spatial change in the average pixel value (average value of the pixel values of the plurality of pixels in the pixel group) for each pixel group is used as the pattern of the spatial color change of the pixel group. However, the value used as the pattern of the spatial color change of the pixel group is not limited to this. For example, information that indicates a spatial change in a maximum pixel value (maximum value of pixel values of a plurality of pixels in a pixel group) for each pixel group, information that indicates a spatial change in a minimum pixel value (minimum value of pixel values of a plurality of pixels in a pixel group) for each pixel group, information that indicates a spatial change in a median pixel value (median value of pixel values of a plurality of pixels in a pixel group) for each pixel group, or the like may be used as the pattern of the spatial color change of the pixel group.
Moreover, instead of the information that indicates the spatial change in the average pixel value (maximum pixel value, minimum pixel value, or median pixel value) for each pixel group, information that indicates changes in a frequency domain and a time domain for each pixel group may be used as the pattern of the spatial color change of the pixel group. In other words, the pattern of the change in the pixel group (unit region) configured of adjacent pixels having a predetermined number may be information in which the changes in the frequency domain and the time domain are extracted as rhythm from the information of each pixel in the pixel group.
Moreover, for example, as a method that extracts the changes in the frequency domain and the time domain, the changes can be obtained by performing a multi-resolution analysis through a discrete wavelet transform on the imaging information of each pixel in the unit region, or can be obtained by dividing the imaging information of each pixel in the unit region into segments of a certain fixed period and performing a window Fourier transform on the divided information for each set frequency band. As a result, the image can be represented as a rhythmic change in the frequency domain and the time domain, and extraction and comparison of characteristics of the rhythm become possible.
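As an illustrative sketch of the multi-resolution analysis described above, a simple Haar discrete wavelet transform (one common choice of wavelet, assumed here for illustration) can split a sequence of pixel values into low-frequency (approximation) and high-frequency (detail) parts at successive levels:

```python
import numpy as np

def haar_multiresolution(signal, levels):
    """Simple Haar-wavelet multi-resolution analysis: at each level,
    split the signal into an approximation (pairwise averages) and a
    detail (pairwise differences), then recurse on the approximation."""
    approx = np.asarray(signal, dtype=float)
    details = []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2))  # high-frequency part
        approx = (even + odd) / np.sqrt(2)         # low-frequency part
    return approx, details

# Pixel values along a scan line of a unit region (illustrative data).
line = np.array([1.0, 1.0, 4.0, 4.0, 2.0, 2.0, 8.0, 8.0])
approx, details = haar_multiresolution(line, levels=2)
```

The detail coefficients at each level give the frequency-domain part of the rhythm, while their position along the sequence preserves the time-domain part.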
Moreover, the embodiment has the aspect in which adjacent pixels having a predetermined number are set as the pixel group, the information indicating the spatial change in the average pixel value (maximum pixel value, minimum pixel value, or median value of the pixel values) for each pixel group is set as the pattern of the spatial color change of the pixel group, and the rhythm information corresponding to the captured image is extracted based on the pattern. However, the aspect that extracts the rhythm information corresponding to the captured image is not limited to this.
As an example, adjacent pixels, in which a difference of the pixel values is a predetermined value or less, may be set as the pixel group, the information indicating the spatial change in the average pixel value (maximum pixel value, minimum pixel value, or median value of the pixel values) for each pixel group may be set as the pattern of the spatial color change of the pixel group, and the rhythm information corresponding to the captured image may be extracted based on the pattern.
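As an illustrative sketch, grouping adjacent pixels whose difference of pixel values is a predetermined value or less can be realized by a flood-fill (connected-component) labeling such as the following; the 4-connectivity and the function names are assumptions:

```python
import numpy as np
from collections import deque

def group_similar_pixels(image, threshold):
    """Label pixel groups: adjacent (4-connected) pixels whose pixel-value
    difference is `threshold` or less join the same group."""
    h, w = image.shape
    labels = -np.ones((h, w), dtype=int)
    current = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            # Breadth-first flood fill from an unlabeled seed pixel.
            queue = deque([(sy, sx)])
            labels[sy, sx] = current
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and labels[ny, nx] == -1
                            and abs(image[ny, nx] - image[y, x]) <= threshold):
                        labels[ny, nx] = current
                        queue.append((ny, nx))
            current += 1
    return labels, current

img = np.array([[10.0, 11.0, 50.0],
                [10.0, 12.0, 52.0],
                [10.0, 11.0, 51.0]])
labels, n_groups = group_similar_pixels(img, threshold=3.0)
```

Here the left region and the right region become two distinct pixel groups, since the value jump between them exceeds the threshold.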
In addition, space information 1 to space information n (n is an integer of 1 or more) shown in
That is, in the extraction unit 20, as shown in
Moreover, if information contents of the space information of
Moreover, as the information indicating the spatial change in the distribution of the pixel group, for example, instead of the positions and shapes of each pixel group, information with respect to the pixel group for each space may be used. Examples of the information with respect to the pixel group for each space include the number of pixel groups, the size of each pixel group, and the distribution of the color (pixel values) of each pixel group configuring the main object and each pixel group configuring portions other than the main object, for each predetermined region (for example, ¼ region in the upper left, ¼ region in the upper right, ¼ region in the lower left, and ¼ region in the lower right) which is determined in advance in the image.
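As an illustrative sketch, the information with respect to the pixel group for each predetermined region (for example, the number of pixel groups in each ¼ region) might be computed from a label map such as one produced by a pixel-grouping step; the names are hypothetical:

```python
import numpy as np

def quadrant_group_counts(labels):
    """Count how many distinct pixel groups appear in each quarter of the
    image (upper-left, upper-right, lower-left, lower-right)."""
    h, w = labels.shape
    quarters = {
        "upper_left":  labels[:h // 2, :w // 2],
        "upper_right": labels[:h // 2, w // 2:],
        "lower_left":  labels[h // 2:, :w // 2],
        "lower_right": labels[h // 2:, w // 2:],
    }
    return {name: len(np.unique(region)) for name, region in quarters.items()}

# Illustrative label map: four pixel groups, one per quadrant.
labels = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [2, 2, 3, 3],
                   [2, 2, 3, 3]])
counts = quadrant_group_counts(labels)
```

Analogous per-region statistics (group sizes, color distributions) follow the same pattern of slicing the image into the predetermined regions first.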
Hereinafter, a second embodiment of the present invention will be described with reference to the drawings.
For example, the electronic device 201 is a digital camera, and as shown in
The extraction unit 220 extracts an object from an image captured by the imaging unit 210. For example, as shown in
In addition, the extraction unit 220 extracts a graphic (hereinafter, referred to as an object graphic) showing a region of the object which is extracted from the moving picture. For example, as shown in
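As an illustrative sketch, the circumscribing rectangle of an object in one frame might be computed from a binary mask of the extracted object region as follows; the mask representation is an assumption:

```python
import numpy as np

def circumscribing_rectangle(mask):
    """Return (x_min, y_min, x_max, y_max) of the axis-aligned rectangle
    that circumscribes the True pixels of an object mask."""
    ys, xs = np.nonzero(mask)
    return xs.min(), ys.min(), xs.max(), ys.max()

# Illustrative object mask for one frame of a moving picture.
mask = np.zeros((6, 8), dtype=bool)
mask[2:5, 3:7] = True   # object occupies rows 2-4, columns 3-6
rect = circumscribing_rectangle(mask)
```

Applying this per frame yields the time series of rectangles from which the rhythm information is later acquired.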
As shown in
In addition, the extraction unit 220 may extract, along with the main subject, other subjects which move integrally with the main subject, and may extract an object graphic showing a region of an object which combines the object of the extracted main subject and the objects of the other subjects. For example, as shown in
Furthermore, the extraction unit 220 may extract other graphics as the object graphic, instead of the circumscribing rectangle. For example, as shown in
The acquisition unit 230 acquires a variation of an area, a variation in the length of the long side or the short side (in a case of a circumscribing rectangle), a variation of an aspect ratio (in a case of a circumscribing rectangle), a period of the change in the area, a period of the change in the length (in a case of a circumscribing rectangle), or a period of the change in the aspect ratio (in a case of a circumscribing rectangle) of the object graphic of one object extracted by the extraction unit 220, as rhythm information indicating a temporal change in the object. Moreover, since the rhythm information indicates the temporal change in each object, the rhythm information is a numerical value (index) that indicates the object itself.
When the object graphic is a circumscribing rectangle, the acquisition unit 230 acquires values of one or more parameters among parameters 1 to 12 (hereinafter, referred to as prm 1 to prm 12) exemplified below, as the rhythm information. In addition, when the object graphic is a graphic other than the circumscribing rectangle, the acquisition unit 230 acquires one or more parameters among prm 1 to prm 6 exemplified below, as the rhythm information. Moreover, for example, a predetermined time in prm 1 to prm 12 is a time (for example, one period) based on a period of the shape change of the object graphic. Moreover, the long sides and the short sides in prm 7-1 to prm 9-2 are determined based on the length at a certain reference time (for example, the beginning of one period). In addition, for convenience, the Y axis direction (or the X axis direction) may simply be determined as the long side.
(Object Graphic=Circumscribing Rectangle and Graphics other than Circumscribing Rectangle)
prm 1: difference between maximum area and minimum area of circumscribing rectangle within a predetermined time
prm 2: area ratio between maximum area and minimum area of circumscribing rectangle within a predetermined time
prm 3-1: difference between average area and maximum area of circumscribing rectangle within a predetermined time
prm 3-2: difference between average area and minimum area of circumscribing rectangle within a predetermined time
prm 4-1: area ratio between average area and maximum area of circumscribing rectangle within a predetermined time
prm 4-2: area ratio between average area and minimum area of circumscribing rectangle within a predetermined time
prm 5: distribution condition (example: standard deviation) of area of circumscribing rectangle within a predetermined time
prm 6: period of change in area of circumscribing rectangle within a predetermined time
prm 7-1: maximum variation of long side of circumscribing rectangle within a predetermined time
prm 7-2: maximum variation of short side of circumscribing rectangle within a predetermined time
prm 8-1: distribution condition (example: standard deviation) of long side of circumscribing rectangle within a predetermined time
prm 8-2: distribution condition (example: standard deviation) of short side of circumscribing rectangle within a predetermined time
prm 9-1: period of change in long side of circumscribing rectangle within a predetermined time
prm 9-2: period of change in short side of circumscribing rectangle within a predetermined time
prm 10: maximum variation of aspect ratio of circumscribing rectangle within a predetermined time
prm 11: distribution condition (example: standard deviation) of aspect ratio of circumscribing rectangle within a predetermined time
prm 12: period of change in aspect ratio of circumscribing rectangle within a predetermined time
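As a hedged illustration, a few of the area-based parameters above (prm 1 to prm 5) could be computed from the circumscribing rectangles sampled within the predetermined time along the following lines. The function name `rhythm_parameters`, the `(width, height)` sampling format, and the example values are assumptions made for this sketch, not part of the specification.

```python
# Illustrative sketch: computing prm 1 to prm 5 from the circumscribing
# rectangles of one object sampled over the predetermined time.
from statistics import mean, pstdev

def rhythm_parameters(rects):
    """rects: list of (width, height) of the circumscribing rectangle
    at each imaging timing within the predetermined time."""
    areas = [w * h for (w, h) in rects]
    avg = mean(areas)
    return {
        "prm1": max(areas) - min(areas),   # difference of max and min area
        "prm2": max(areas) / min(areas),   # ratio of max and min area
        "prm3_1": max(areas) - avg,        # difference of average and max area
        "prm3_2": avg - min(areas),        # difference of average and min area
        "prm4_1": max(areas) / avg,        # ratio of average and max area
        "prm4_2": avg / min(areas),        # ratio of average and min area
        "prm5": pstdev(areas),             # distribution condition (std. dev.)
    }

# Example: a rectangle whose area pulses as 6, 8, 10, 8
params = rhythm_parameters([(2, 3), (2, 4), (2, 5), (2, 4)])
```

Rounding or scoring the resulting values, as described below, could then be applied to each entry of the returned dictionary.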
Hereinafter, acquisition of the rhythm information by the acquisition unit 230 will be described in detail with reference to
The acquisition unit 230 calculates each value shown in
Moreover, the acquisition unit 230 may thereafter round the calculated value of each parameter appropriately, or may replace it with another value (that is, convert it into a score), so that the objects can be compared easily.
The acquisition unit 230 stores the acquired rhythm information in the storage unit 240. For example, as shown in
Hereinafter, an operation of the electronic device 201 will be described with reference to a flowchart.
The extraction unit 220 extracts the object from the moving picture (Step S210). The extraction unit 220 extracts the object graphic that indicates the region of the extracted object (Step S212), and temporarily stores the object graphic.
The extraction unit 220 determines whether or not the object graphics for one period have been extracted (Step S214). When the extraction unit 220 determines that the object graphics for one period have not been extracted yet (Step S214: No), the process returns to Step S210. That is, the extraction unit 220 repeats Steps S210 and S212 until periodicity in the change of the object graphics is found.
On the other hand, when the extraction unit 220 determines that the object graphics for one period have been extracted (Step S214: Yes), the acquisition unit 230 acquires the rhythm information based on the object graphics for one period which are temporarily stored (Step S216), and stores the acquired rhythm information in the storage unit 240. Then, the flowchart ends.
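The loop of Steps S210 to S216 can be sketched as follows. The moving picture is simulated here by a canned sequence of object-graphic areas, and "one period" is detected naively when the area returns to its initial value; the actual periodicity test is not specified in this detail, so this criterion is an assumption.

```python
# Simplified sketch of Steps S210-S216: extract and store object graphics
# until one period of the change is found, then hand it to acquisition.
def acquire_one_period(frames):
    """frames: iterable of object-graphic areas, one per frame."""
    stored = []                          # temporarily stored object graphics
    for area in frames:                  # Steps S210/S212
        if stored and area == stored[0]:
            return stored                # Step S214: Yes, one period found
        stored.append(area)
    return None                          # no periodicity found in the input

period = acquire_one_period([6, 8, 10, 8, 6, 8, 10, 8])
# one period of the area change: [6, 8, 10, 8]
```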
In addition, the flowchart shown in
As described above, according to the electronic device 201, the rhythm information, which is a numerical value indicating the object itself, can easily be acquired from the object. Moreover, the objects can easily be compared to each other using the rhythm information, since it is expressed as a numerical value. In addition, the comparison results of the objects can be utilized for various application processes (for example, grouping of the objects based on similarity of the objects, grouping of the imaging devices based on similarity of the objects captured by each imaging device, and extraction of an object similar to a reference object).
Hereinafter, a third embodiment of the present invention will be described with reference to the drawings.
For example, the electronic device 301 is a digital camera, and as shown in
The imaging unit 310 is a camera which captures still images and moving pictures. The extraction unit 320 extracts an object from a moving picture captured by the imaging unit 310, and extracts rhythm information that indicates a pattern of a change in the color of the extracted object. The second storage unit 340 stores the rhythm information extracted by the extraction unit 320. Hereinafter, the process performed by the extraction unit 320 will be described in detail with reference to
Moreover, for convenience of explanation, when the signal is blue, the color of the blue lamp during lighting-on is set to bluish green, and the colors of the yellow lamp and the red lamp during lighting-off are set to black. That is, in
When the signal is yellow, the color of the yellow lamp during lighting-on is set to yellow, and the colors of the blue lamp and the red lamp during lighting-off are set to black. That is, in
When the signal is red, the color of the red lamp during lighting-on is set to red, and the colors of the blue lamp and the yellow lamp during lighting-off are set to black. That is, in
Moreover, when the signal is any one of blue, yellow, and red, all the regions other than the lamp are set to gray.
A pixel group ID (a-4, a-5, . . . ) shown in
Each time (t1, t2, . . . ) shown in
t4 is the imaging timing when the signal is yellow as shown in
Each value (D1 to D7) shown in
That is, as described above,
Characteristic 1: the color of the region (the region r1-1-1 shown in
Characteristic 2: the color of the center region in the region of the main portion of the object (3O1) is periodically changed to black (D4) and yellow (D5).
Characteristic 3: the color of the region (the region r1-3-1 shown in
Characteristic 4: among the region of the main portion of the object (3O1), the center region, the region positioned at the left side of the center region, and the region positioned at the right side of the center region (regions except for the region r1-1-1, the region r1-2-1, and the region r1-3-1 of the region r1 shown in
Characteristic 5: the color of the region (the region r2 shown in
Characteristic 6: after the region (region r1-1-1) positioned at the left side of the center region is changed from bluish green (D2) to black (D3), the center region (region r1-2-1) is changed from black (D4) to yellow (D5).
Characteristic 7: after the center region (the region r1-2-1) is changed from yellow (D5) to black (D4), the region (the region r1-3-1) positioned at the right side of the center region is changed from black (D6) to red (D7).
Characteristic 8: after the region (region r1-3-1) positioned at the right side of the center region is changed from red (D7) to black (D6), the region (the region r1-1-1) positioned at the left side of the center region is changed from black (D3) to bluish green (D2).
Characteristic 9: the region (region r1-1-1) which is changed to bluish green (D2) and is positioned at the left side of the center region, the center region (region r1-2-1) which is changed to yellow (D5), and the region (region r1-3-1) which is changed to red (D7) and is positioned at the right side of the center region have approximately the same size as one another.
Characteristic 10: the time in which the region (region r1-1-1) positioned at the left side of the center region is bluish green (D2) and the time in which the region (region r1-3-1) positioned at the right side of the center region is red (D7) are the same as each other, and each is approximately three times the time in which the center region (region r1-2-1) is yellow (D5).
As shown in
Moreover, the information stored in the first storage unit 322 (the pattern of the color change for each pixel group configuring the object) may be information prepared by the electronic device 301 itself, information acquired by the electronic device 301 from the outside, or information input by a user of the electronic device 301. When the information is prepared by the electronic device 301 itself, the calculation unit 324 calculates the pattern of the color change of the pixel groups configuring the object based on a moving picture set as a sample in advance (which may be a moving picture captured by the imaging unit 310), and stores the calculated pattern in the first storage unit 322.
As shown in
Having extracted the object at each imaging timing, the calculation unit 324 calculates the average pixel value for each pixel group configuring the object at each imaging timing. That is, the calculation unit 324 calculates the pattern of the color change of the pixel groups configuring the object. The calculation unit 324 then supplies the calculated pattern of the color change to the selection unit 326.
Having acquired the pattern of the change from the calculation unit 324, the selection unit 326 selects the rhythm information corresponding to the pattern of the change from the first storage unit 322. More specifically, the selection unit 326 compares one period of the pattern of the change acquired from the calculation unit 324 with one period of the pattern of the change for each piece of rhythm information stored in the first storage unit 322, selects the one pattern of the change which coincides with, or is most similar to, the pattern of the change acquired from the calculation unit 324, and acquires the rhythm information corresponding to the selected pattern of the change. The selection unit 326 stores the acquired rhythm information in the second storage unit 340. In addition, the rhythm information stored in the second storage unit 340 is used for comparison of the objects, or the like.
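The selection step just described can be sketched as a nearest-pattern lookup. The similarity measure (sum of squared differences over one period) and the tiny pattern table are assumptions for illustration; the specification only requires that the coinciding or most similar stored pattern be selected.

```python
# Hedged sketch of the selection unit 326: pick the stored rhythm
# information whose one-period color-change pattern is closest to the
# pattern calculated by the calculation unit 324.
def select_rhythm(calculated, stored_patterns):
    """calculated: one period of average pixel values.
    stored_patterns: dict mapping rhythm-information ID -> one-period pattern."""
    def distance(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(stored_patterns,
               key=lambda rid: distance(calculated, stored_patterns[rid]))

patterns = {"signal": [0.1, 0.9, 0.1], "lamp": [0.5, 0.5, 0.5]}
best = select_rhythm([0.2, 0.8, 0.1], patterns)   # closest to "signal"
```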
Hereinafter, an operation of the electronic device 301 will be described with reference to a flowchart.
The calculation unit 324 extracts the object from the moving picture (Step S310). The calculation unit 324 calculates the average pixel value for each pixel group configuring the extracted object (Step S312), and the average pixel value is associated with the imaging timing (time) so as to be temporarily stored.
The calculation unit 324 determines whether or not the color change of the object for one period has been extracted (Step S314). In other words, the calculation unit 324 determines whether or not periodicity has been found in the pattern of the color change of the pixel groups configuring the object. When the calculation unit 324 determines that the color change of the object for one period has not been extracted yet (Step S314: No), the process returns to Step S310. That is, the calculation unit 324 repeats Steps S310 and S312 until periodicity in the color change is found.
On the other hand, when the calculation unit 324 determines that the color change of the object for one period has been extracted (Step S314: Yes), the temporarily stored average pixel value at each imaging timing for each pixel group configuring the object (that is, the pattern of the color change of the pixel groups configuring the object) is supplied to the selection unit 326.
The selection unit 326, which acquires the pattern of the change from the calculation unit 324, selects the rhythm information corresponding to the pattern of the change from the first storage unit 322 (Step S316), and stores the selected rhythm information in the second storage unit 340. Then, the flowchart ends.
In addition, the flowchart shown in
As described with reference to
Moreover, the embodiment is an example in which information indicating the temporal change in the average pixel value for each pixel group (the average of the pixel values of the plurality of pixels in the pixel group) is used as the pattern of the color change of the pixel group. However, the value used as the pattern of the color change of the pixel group is not limited to this. For example, information that indicates the temporal change in the maximum pixel value for each pixel group (the maximum of the pixel values of the plurality of pixels in the pixel group), information that indicates the temporal change in the minimum pixel value for each pixel group (the minimum of the pixel values of the plurality of pixels in the pixel group), information that indicates the temporal change in the median pixel value for each pixel group (the median of the pixel values of the plurality of pixels in the pixel group), or the like may be used as the pattern of the color change of the pixel group.
Moreover, the embodiment is an aspect in which a predetermined number of adjacent pixels are set as a pixel group, information indicating the temporal change in the average pixel value (or the maximum pixel value, the minimum pixel value, or the median pixel value) for each pixel group is set as the pattern of the color change of the pixel group, and the rhythm information corresponding to the color change of the pixel group is extracted. However, the aspect for extracting the rhythm information corresponding to the color change of the pixel group is not limited to this.
As an example, adjacent pixels whose difference in pixel value is a predetermined value or less may be set as a pixel group, and information indicating the temporal change in the average pixel value (or the maximum pixel value, the minimum pixel value, or the median pixel value) for each pixel group may be set as the pattern of the color change of the pixel group, so that the rhythm information corresponding to the color change of the pixel group is extracted.
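Forming pixel groups from adjacent pixels whose pixel-value difference is a predetermined value or less can be sketched as a connected-components labeling. The flood fill over 4-connected neighbors below is one possible realization; the specification does not prescribe a particular grouping algorithm.

```python
# Illustrative sketch: group 4-connected adjacent pixels whose pixel-value
# difference is at most `threshold`, returning a group label per pixel.
def group_pixels(image, threshold):
    """image: 2-D list of pixel values."""
    h, w = len(image), len(image[0])
    labels = [[None] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy][sx] is not None:
                continue
            stack = [(sy, sx)]           # flood fill from an unlabeled pixel
            labels[sy][sx] = next_label
            while stack:
                y, x = stack.pop()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and labels[ny][nx] is None
                            and abs(image[ny][nx] - image[y][x]) <= threshold):
                        labels[ny][nx] = next_label
                        stack.append((ny, nx))
            next_label += 1
    return labels

img = [[10, 12, 90],
       [11, 13, 92]]
lab = group_pixels(img, 5)
# the four similar dark pixels form one group; the bright column another
```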
In
That is, as shown in
As another example, the rhythm information corresponding to the color change of the pixel group may be extracted by setting the adjacent pixels, in which the difference between the pixel values is a predetermined value or less, as the pixel group, and setting the information indicating the temporal change in the distribution of the pixel groups as the pattern of the color change of the pixel group.
In
In
In
Each value (S1 to S7) in the Table of
That is, as shown in
As described above, according to the electronic device 301, the rhythm information, which is a numerical value indicating the object itself, can easily be acquired from the object. Moreover, the aspect shown in
Moreover, the aspect shown in
In addition, the aspect shown in
In addition, as shown in
Specifically, the electronic device 301 may further include a correction unit 311 (not shown) which corrects the colors of the moving picture captured by the imaging unit 310. That is, the correction unit 311 may correct the colors of the moving picture captured by the imaging unit 310 to the colors that would be obtained if the moving picture were captured under predetermined reference light, and may output the corrected moving picture to the extraction unit 320, and the extraction unit 320 may extract the rhythm information from the moving picture corrected by the correction unit 311. Accordingly, stable rhythm information can be extracted regardless of the condition of the external light when the moving picture is captured.
Hereinafter, a fourth embodiment of the present invention will be described in detail with reference to the drawings.
The electronic apparatus 401 includes a detection unit 410, a control unit 420, a pattern storage unit 425, and an output unit 430.
First, an outline of the electronic apparatus 401 of the present embodiment will be described. When the electronic apparatus 401 is grasped and swung by an operator, the electronic apparatus detects its own motion and the pressure applied to its side surfaces, extracts the patterns of the signals which repeatedly appear from the signal indicating the detected motion and the signal indicating the pressure, and synthesizes the extracted patterns. Accordingly, in the electronic apparatus 401, variation in the synthesized pattern can be increased by synthesizing the plurality of patterns, and the synthesized pattern is reported to the outside via the output unit 430, so that the information detected by the electronic apparatus can be expressed in a more expressive way.
Hereinafter, the processes of each unit will be described. The detection unit 410 detects, from a detection target (for example, the electronic apparatus itself), a plurality of signals that indicate characteristics of the detection target (for example, the motion of the electronic apparatus and the pressure applied to its side surfaces). Here, the detection unit 410 includes a motion detection unit 411 and a pressure detection unit 412.
The motion detection unit 411 detects the motion of the electronic apparatus and supplies signals indicating the detected motion to a pattern extraction unit 423. Specifically, for example, the motion detection unit 411 detects the motion of the electronic apparatus when the electronic apparatus is grasped and operated by the operator. For example, an acceleration sensor is provided as the motion detection unit 411.
The pressure detection unit 412 is disposed on the side surfaces of the electronic apparatus 401, detects the pressure applied to the side surfaces, and outputs signals indicating the detected pressure to the pattern extraction unit 423. Specifically, for example, the pressure detection unit 412 detects the pressure applied to the side surfaces of the electronic apparatus in a predetermined number of stages (for example, 256 stages) when the electronic apparatus is grasped and moved by the operator. For example, if the pressure detection unit 412 is divided into five sensing points, it can detect the pressure at each of the five points.
For example, a capacitance type pressure sensitive sensor is provided as the pressure detection unit 412.
The process of the motion detection unit 411 and the pressure detection unit 412 will be described with reference to a specific example of
For example, in the example of
The pressure detection unit 412 detects the pressure applied to the side surface of the electronic apparatus when the electronic apparatus is grasped by the operator (user), and outputs the signals indicating the detected pressure to the pattern extraction unit 423.
Returning to
Subsequently, an outline of a process according to pattern extraction unit 423 will be described. The pattern extraction unit 423 extracts the patterns, which repeatedly appear, as the pattern of the motion and the pattern of the pressure, from the signal indicating the motion and the signal indicating the pressure. The pattern extraction unit 423 outputs the information indicating the extracted pattern of the motion and the information indicating the extracted pattern of the pressure, to the normalization unit 424.
An example of the process of the pattern extraction unit 423 will be described with reference to
In the upper side of
In the lower side of
Subsequently, the process of the pattern extraction according to the pattern extraction unit 423 will be described in detail.
In
For example, input data A, which is shown in
[Expression 1]
A={a1,a2, . . . ,an} (1)
Moreover, an arrangement A′ of m terms (m=n/2) which is a portion of the arrangement of the input data A shown in
[Expression 2]
A′={a1,a2, . . . ,am} (2)
Here, A′ is fixed. Moreover, an arrangement B of m terms, in which the arrangement of the input data A shown in
[Expression 3]
B={b1,b2, . . . ,bm}={a1+t,a2+t, . . . ,am+t} (3)
Here, B is a variable.
The pattern extraction unit 423 calculates the autocorrelation function obtained for the shift width t from the arrangement A′ and the arrangement B, according to the following Equation (4).
Here, i is the index of the elements of each arrangement. Moreover, the value approaches 1 as the waveform of the elements of the arrangement A′ and the waveform of the elements of the arrangement B, which are separated by the predetermined interval, become more similar to each other.
The pattern extraction unit 423 extracts the apex data (peak values) of the autocorrelation function R(t). In addition, when an extracted peak value exceeds a predetermined threshold value, the pattern extraction unit 423 extracts the sample number (or time) at which that peak value is taken. The pattern extraction unit 423 then extracts the sample interval (or time interval) between successive peaks as the period τ.
The pattern extraction unit 423 divides the input data A into segments of one period each, using the period τ obtained from the autocorrelation function. Moreover, when the repeat count is defined as num, the pattern extraction unit 423 calculates the average data for one period, ave(n), according to the following Equation (5).
Here, k is an integer. The average data for one period ave(n) is the output data of the pattern extraction unit 423 and corresponds to the curve W43 of
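The two steps above can be sketched in code. Since Equation (4) itself is not reproduced in this text, a standard normalized correlation between A′ and the shifted arrangement B is assumed for R(t); the one-period averaging follows the form of Equation (5), ave(n) = (1/num) Σ_k a(n + kτ).

```python
# Sketch of the pattern extraction unit 423: autocorrelation over the
# fixed arrangement A' and the shifted arrangement B, then averaging the
# input data over the detected period tau.
def autocorrelation(a, t):
    """Normalized correlation (assumed form of Equation (4)) at shift t."""
    m = len(a) // 2
    a_fixed = a[:m]                      # arrangement A' = {a1, ..., am}
    b = a[t:t + m]                       # arrangement B  = {a(1+t), ..., a(m+t)}
    num = sum(x * y for x, y in zip(a_fixed, b))
    den = (sum(x * x for x in a_fixed) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

def average_one_period(a, tau):
    """Equation (5): ave(n) = (1/num) * sum_k a(n + k*tau)."""
    num = len(a) // tau                  # repeat count
    return [sum(a[n + k * tau] for k in range(num)) / num
            for n in range(tau)]

data = [0, 1, 0, -1] * 4                 # periodic input, period tau = 4
ave = average_one_period(data, 4)        # average data for one period
```

At shift t = τ the correlation value is 1 for a perfectly periodic input, which is how the period would be detected from the peaks of R(t).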
The normalization unit 424 normalizes the information indicating the pattern of the motion and the information indicating the pattern of the pressure to a value of a predetermined range (for example, values from −1 to 1) in parallel, and stores the information indicating the pattern of the motion after normalization and the information indicating the pattern of the pressure after normalization in the pattern storage unit 425.
An example of the process of the normalization unit 424 will be described with reference to
In the example of
Moreover, in the present embodiment, the normalization unit 424 normalizes the signal indicating the motion and the signal indicating the pressure in parallel. However, the normalization is not limited to this, and may be performed in series. In this case, the normalization unit 424 delays either the signal indicating the motion or the signal indicating the pressure by a delay element included in the normalization unit 424, and can thus be configured using only hardware. Alternatively, the normalization unit 424 may convert either the signal indicating the motion or the signal indicating the pressure into a digital signal, temporarily save the converted digital signal in a buffer included in the normalization unit 424, sequentially read the saved digital signal, and normalize the read digital signal.
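A minimal sketch of the normalization into the range from -1 to 1 follows. Scaling each pattern by its maximum absolute value is an assumption; the specification states only the target range, not the exact mapping.

```python
# Hedged sketch of the normalization unit 424: scale a pattern so that its
# values fall within the range -1 to 1.
def normalize(pattern):
    peak = max(abs(v) for v in pattern)
    return [v / peak for v in pattern] if peak else list(pattern)

motion = normalize([2.0, -4.0, 1.0])   # -> [0.5, -1.0, 0.25]
```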
The synthesis unit 426 reads the information indicating the pattern of the motion after normalization and the information indicating the pattern of the pressure after normalization from the pattern storage unit 425. When the amplitude of any of the patterns is larger than a predetermined threshold value (for example, 0.5), the synthesis unit 426 determines the amplitude of the synthesized pattern based on the amplitudes of the individual patterns and synthesizes the patterns.
An example of the process of the synthesis unit 426 will be described with reference to
For example, the synthesis unit 426 adds the value 0.6 on the curve W51 of the pattern of the motion after normalization and the value 0.8 on the curve W52 of the pattern of the pressure after normalization, and multiplies the resulting value 1.4 by a coefficient 0.8 corresponding to the combination (0.6, 0.8) of the amplitudes of the patterns. The obtained value 1.12 is set as the amplitude of a peak P54 on the curve W53 indicating the synthesized signal.
Similarly, for example, the synthesis unit 426 adds the value 0.8 on the curve W51 of the pattern of the motion after normalization and the value 0.8 on the curve W52 of the pattern of the pressure after normalization, and multiplies the resulting value 1.6 by a coefficient 0.85 corresponding to the combination (0.8, 0.8) of the amplitudes of the patterns. The obtained value 1.36 is set as the amplitude of a peak P55 on the curve W53 indicating the synthesized signal.
Similarly, for example, the synthesis unit 426 adds the value 1.0 on the curve W51 of the pattern of the motion after normalization and the value 0.8 on the curve W52 of the pattern of the pressure after normalization, and multiplies the resulting value 1.8 by a coefficient 0.9 corresponding to the combination (1.0, 0.8) of the amplitudes of the patterns. The obtained value 1.62 is set as the amplitude of a peak P56 on the curve W53 indicating the synthesized signal.
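The amplitude synthesis just worked through can be sketched as follows. The coefficient table below merely reproduces the three combinations given in the text; the full table of coefficients is not disclosed in the specification and is therefore an assumption.

```python
# Sketch of the synthesis unit 426: add the two pattern values and multiply
# the sum by a coefficient determined by the combination of amplitudes.
COEFF = {(0.6, 0.8): 0.8, (0.8, 0.8): 0.85, (1.0, 0.8): 0.9}

def synthesize(motion_value, pressure_value):
    coeff = COEFF[(motion_value, pressure_value)]
    return (motion_value + pressure_value) * coeff

p54 = synthesize(0.6, 0.8)   # (0.6 + 0.8) * 0.8  = 1.12
p55 = synthesize(0.8, 0.8)   # (0.8 + 0.8) * 0.85 = 1.36
p56 = synthesize(1.0, 0.8)   # (1.0 + 0.8) * 0.9  = 1.62
```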
The synthesis unit 426 outputs the image data based on the synthesized pattern to a display unit 431, which will be described below, of the output unit 430. Moreover, the synthesis unit 426 generates an electric signal based on the synthesized pattern, and outputs the electric signal to the audio output unit 432.
The output unit 430 provides information to the outside of the electronic apparatus based on the pattern synthesized by the synthesis unit 426. Here, the output unit 430 includes the display unit 431 and the audio output unit 432.
The display unit 431 displays the image data based on the input from the synthesis unit 426.
The audio output unit 432 outputs audio to the outside based on the electric signal supplied from the synthesis unit 426.
Next, the normalization unit 424 normalizes the pattern of the motion (Step S404). In parallel with this, the normalization unit 424 normalizes the pattern of the pressure (Step S405). Next, the synthesis unit 426 synthesizes the pattern of the motion and the pattern of the pressure (Step S406). Next, the display unit 431 displays the image based on the synthesized pattern (Step S407). Next, the audio output unit 432 outputs the audio based on the synthesized pattern (Step S408). In this way, the process of the flowchart ends.
As described above, when the electronic apparatus 401 is grasped and swung by the operator, the electronic apparatus detects its own motion and the pressure applied to its side surfaces, and extracts the patterns of the signals which repeatedly appear from the signal indicating the detected motion and the signal indicating the pressure. Moreover, in the electronic apparatus 401, each extracted pattern is normalized, and the normalized patterns are synthesized based on the amplitude of each pattern.
Accordingly, the electronic apparatus 401 can increase variation in the synthesized pattern by synthesizing the plurality of patterns, and can report the synthesized pattern to the outside through the output unit 430, so that the information detected by the electronic apparatus can be expressed in a more expressive way.
Subsequently, a communication system 502 in a fifth embodiment will be described.
In
Moreover, in
With respect to the configuration of the electronic apparatus 401 of
With respect to the configuration of the detection unit 410 of
The image sensor 413 captures a subject. Specifically, for example, assuming a case in which one or a plurality of other electronic apparatuses 500 are swung in a predetermined direction by one or a plurality of operators, the image sensor 413 captures the one or the plurality of other electronic apparatuses 500-J (J is a positive integer other than I) as the subject.
The image sensor 413 supplies a video signal obtained by the imaging to a data extraction unit 422, which will be described below, of the extraction unit 421b. For example, as the image sensor 413, a CCD image sensor is provided.
With respect to the configuration of the control unit 420 of
The extraction unit 421b includes the data extraction unit 422, a normalization unit 424b, and a pattern extraction unit 423b.
The data extraction unit 422 extracts the signal corresponding to the pixels on a diagonal line of each frame from the video signal supplied from the image sensor 413. In addition, the data extraction unit 422 outputs the extracted signal (extraction video signal) to the pattern extraction unit 423b.
The process of the data extraction unit 422 will be described with reference to
In the right side of
In the example of
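The diagonal extraction performed by the data extraction unit 422 can be sketched as follows. A square frame is assumed here for simplicity; for a rectangular frame the diagonal indices would need to be interpolated.

```python
# Illustrative sketch of the data extraction unit 422: keep only the pixels
# on the diagonal line of a frame as the extraction video signal.
def extract_diagonal(frame):
    """frame: 2-D list (H x H) of pixel values; returns the diagonal pixels."""
    return [frame[i][i] for i in range(len(frame))]

frame = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
diag = extract_diagonal(frame)   # -> [1, 5, 9]
```

Extracting only the diagonal reduces each frame to a one-dimensional signal, which can then be fed to the same autocorrelation-based pattern extraction as the motion signal.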
Returning to
Moreover, according to a method similar to that of the pattern extraction unit 423 of the fourth embodiment, the pattern extraction unit 423b calculates the autocorrelation function R(t) from the extraction video signal supplied from the data extraction unit 422, and calculates the pattern of the video based on the calculated autocorrelation function R(t). In addition, the pattern extraction unit 423b outputs the information indicating the pattern of the calculated video to the normalization unit 424b.
Similar to the normalization unit 424 of the fourth embodiment, the normalization unit 424b normalizes the information indicating the pattern of the motion input from the pattern extraction unit 423b to the values from −1 to 1. Moreover, the normalization unit 424b stores information Rm_I indicating the pattern of the motion after normalization, in the pattern storage unit 425.
In addition, the normalization unit 424b normalizes the information indicating the pattern of the video input from the pattern extraction unit 423b to the values from −1 to 1. Moreover, the normalization unit 424b stores information Rv indicating the pattern of the video after normalization, in the pattern storage unit 425.
The control unit 420b reads the information Rm_I indicating the pattern of the motion after normalization from the pattern storage unit 425 and outputs the read information Rm_I to the communication unit 440. Moreover, the control unit 420b performs control such that the information Rm_I indicating the pattern of the motion after normalization is sent from the communication unit 440 to the other electronic apparatuses 500-J (J is an integer other than I).
The communication unit 440 is configured to communicate with the other electronic apparatuses 500-J by wire or wirelessly. The communication unit 440 receives the information Rm_J, which indicates the pattern of the motion after normalization in the other electronic apparatuses 500-J, from the other electronic apparatuses 500-J, and outputs the received information Rm_J to the synthesis unit 426b.
Similar to the synthesis unit 426 of the fourth embodiment, the synthesis unit 426b reads the information Rm_I, which indicates the pattern of the motion after normalization, from the pattern storage unit 425. Moreover, according to a method similar to that of the synthesis unit 426 in the fourth embodiment, the synthesis unit 426b synthesizes the read information Rm_I, which indicates the pattern of the motion after normalization, and the information Rm_J, which indicates the pattern of the motion after normalization input by the communication unit 440, according to each amplitude value.
Accordingly, the synthesis unit 426b can generate the pattern in which the pattern of the motion of the electronic apparatus itself and the patterns of the motions of other electronic apparatuses 500-J are synthesized.
In addition, the synthesis unit 426b outputs the pattern obtained by the synthesis to the motion video synthesis unit 427 as information Ra indicating the pattern of the aggregated motion.
The motion video synthesis unit 427 reads the information Rv, which indicates the pattern of the video after normalization, from the pattern storage unit 425. The motion video synthesis unit 427 synthesizes the pattern of the aggregated motion, which is synthesized by the synthesis unit 426b, and the pattern of the extracted video. Specifically, the motion video synthesis unit 427 synthesizes the information Ra, which indicates the pattern of the aggregated motion input from the synthesis unit 426b, and the read information Rv, which indicates the pattern of the video after normalization, according to the amplitude.
The process of the motion video synthesis unit 427 will be described with reference to
For example, the motion video synthesis unit 427 adds the value 0.6, which is a value on the curve W121 indicating the pattern of the aggregated motion, and the value 0.8, which is a value on the curve W122 indicating the pattern of the video after normalization, and multiplies the value 1.4 obtained by this addition by a coefficient 0.8, which corresponds to the combination (0.6 and 0.8) of the amplitudes of the two patterns. The obtained value 1.12 is set as the amplitude of a peak P124 on the curve W123 indicating the synthesis pattern.
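The arithmetic in this example can be written out directly. The coefficient table below contains only the single combination stated above; modeling the lookup as a table is an assumption for illustration.

```python
# Worked example of the amplitude synthesis above: add the two pattern
# values, then multiply by the coefficient associated with that pair of
# amplitudes. Only the (0.6, 0.8) -> 0.8 entry is given in the text;
# the table structure itself is a hypothetical model of the lookup.

COEFFICIENTS = {(0.6, 0.8): 0.8}

def synthesize_peak(motion_value, video_value):
    coefficient = COEFFICIENTS[(motion_value, video_value)]
    return (motion_value + video_value) * coefficient

peak_p124 = synthesize_peak(0.6, 0.8)  # (0.6 + 0.8) * 0.8 = 1.12
```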
Returning to
The information Rp indicating the pattern of the location and information A indicating atmosphere are associated with each other and are stored in the atmosphere data storage unit 428.
In Table T1, identification information (ID) unique to each pattern of the location, the pattern of the location, and the atmosphere are associated with one another. For example, when the ID is 1, a bright atmosphere is associated with the pattern of the location (0.1, 0.3, . . . , 0.1).
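Table T1 can be modeled as a simple keyed store. Only the "bright" label for ID 1 reflects the example above; the concrete pattern values (the original is truncated) and the record with ID 2 are hypothetical placeholders.

```python
# Minimal model of the atmosphere data storage unit 428: each record
# associates an ID, a pattern of the location, and an atmosphere label.
# The pattern tuples and the ID-2 record are invented for illustration.

ATMOSPHERE_TABLE = {
    1: {"pattern": (0.1, 0.3, 0.1), "atmosphere": "bright"},
    2: {"pattern": (0.7, 0.2, 0.6), "atmosphere": "calm"},  # hypothetical
}

def atmosphere_for(record_id):
    return ATMOSPHERE_TABLE[record_id]["atmosphere"]
```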
Returning to
Moreover, when the information A, indicating the atmosphere corresponding to the information Rp which indicates the pattern of the location, does not exist in the record of the atmosphere data storage unit 428, the collation unit 429 may extract the pattern of the location nearest to the information Rp indicating the pattern of the location, and may read the information A, indicating the atmosphere corresponding to the extracted pattern of the location, from the atmosphere data storage unit 428.
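This nearest-pattern fallback might look like the following sketch. Euclidean distance is an assumption, since the embodiment does not name a distance measure, and the record values are invented for illustration.

```python
import math

# Fallback collation: when no stored location pattern exactly matches the
# pattern Rp, pick the stored pattern nearest to Rp (Euclidean distance is
# an assumed measure) and return its associated atmosphere.

def nearest_atmosphere(rp, records):
    """records: iterable of (location_pattern, atmosphere) pairs."""
    pattern, atmosphere = min(records, key=lambda rec: math.dist(rp, rec[0]))
    return atmosphere

records = [((0.1, 0.3, 0.1), "bright"), ((0.8, 0.7, 0.9), "lively")]
result = nearest_atmosphere((0.2, 0.3, 0.2), records)  # nearest is "bright"
```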
The display unit 431 displays the information indicating the atmosphere based on the information A indicating the atmosphere input from the collation unit 429. Moreover, the audio output unit 432 outputs audio based on the electric signal input from the collation unit 429.
Next, the pattern extraction unit 423b extracts the pattern of the motion from the motion of the electronic apparatus 500-I (Step S502). In parallel with this, the pattern extraction unit 423b extracts the pattern of the acquired video (Step S503).
Next, the normalization unit 424b normalizes the pattern of the extracted motion (Step S504). In parallel with this, the normalization unit 424b normalizes the pattern of the extracted video (Step S505).
Next, the communication unit 440 receives the information indicating the patterns of the motions of other electronic apparatuses after normalization from other electronic apparatuses (Step S506). Next, the synthesis unit 426b generates the pattern of the aggregated motion which is obtained by synthesizing the pattern of the motion of the electronic apparatus 500-I after normalization and the patterns of the motions of other electronic apparatuses after normalization (Step S507).
Next, the motion video synthesis unit 427 synthesizes the pattern of the aggregated motion and the pattern of the video (Step S508). The collation unit 429 reads the information indicating the atmosphere corresponding to the pattern of the location which is synthesized by the motion video synthesis unit 427 (Step S509). Next, the display unit 431 displays the read information indicating the atmosphere (Step S510). Next, the audio output unit 432 outputs audio based on the information indicating the atmosphere (Step S511). In this way, the process of the flowchart ends.
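Taken together, Steps S502 through S511 can be sketched as a single pipeline. Every function body below is a hypothetical stand-in chosen so the sketch runs (including the averaging rules and the 0.5 collation threshold); only the ordering of the calls mirrors the flowchart.

```python
# High-level sketch of the flow of Steps S502-S511. All bodies are
# assumed placeholders; only the call order follows the flowchart.

def normalize(pattern):                                      # Steps S504/S505
    peak = max(pattern)
    return [v / peak for v in pattern] if peak else pattern

def synthesize(own, others):                                 # Step S507
    group = [own] + others
    return [sum(p[t] for p in group) / len(group) for t in range(len(own))]

def synthesize_motion_video(motion, video):                  # Step S508
    return [(m + v) / 2 for m, v in zip(motion, video)]

def collate(location_pattern):                               # Step S509
    # Stand-in for the lookup in the atmosphere data storage unit 428;
    # the 0.5 threshold and the two labels are hypothetical.
    return "bright" if sum(location_pattern) / len(location_pattern) > 0.5 else "calm"

def estimate_atmosphere(motion, video, received_patterns):
    motion = normalize(motion)            # after extraction, Steps S502/S504
    video = normalize(video)              # after extraction, Steps S503/S505
    aggregated = synthesize(motion, received_patterns)   # Steps S506/S507
    location = synthesize_motion_video(aggregated, video)  # Step S508
    return collate(location)              # Step S509; then displayed/output

atmosphere = estimate_atmosphere([0.3, 0.9], [0.5, 1.0], [[0.8, 0.6]])
```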
As described above, the electronic apparatus 500-I in the fifth embodiment extracts the pattern of the motion from its own motion. Moreover, the electronic apparatus 500-I generates the pattern of the aggregated motion by synthesizing the motion pattern of the electronic apparatus 500-I and the motion patterns of the other electronic apparatuses 500-J. The electronic apparatus 500-I then generates the pattern of the location in which the subjects exist by further synthesizing the generated pattern of the aggregated motion with the video pattern, which is based on the luminance change obtained from the video capturing the other electronic apparatuses 500-J, which are the subjects. Moreover, the electronic apparatus 500-I reads the information indicating the atmosphere corresponding to the pattern of the location from the atmosphere data storage unit 428.
Accordingly, the electronic apparatus 500-I can generate the pattern of the location from the video signal capturing the location and from the information indicating the motions of the electronic apparatus 500-I and the other electronic apparatuses 500-J. As a result, the electronic apparatus 500-I can estimate the atmosphere of the location from the generated pattern of the location.
In addition, in the present embodiment, the electronic apparatus 500-I synthesizes the pattern of the motion of the electronic apparatus 500-I and the patterns of the motions of other electronic apparatuses. However, the synthesis is not limited to this. For example, the electronic apparatus 500-I may synthesize the pattern of the pressure applied to the side surface of the electronic apparatus 500-I and the pattern of the pressure applied to the side surfaces of other electronic apparatuses 500-J.
In the present embodiment, when the amplitudes of all of the patterns are larger than a predetermined threshold value, the motion video synthesis unit 427 determines the amplitude of the pattern obtained by the synthesis based on the amplitude of each pattern. However, the determination is not limited to this. The motion video synthesis unit 427 may set the average value of the amplitudes of the patterns as the amplitude of the pattern obtained by the synthesis.
Moreover, in the fourth embodiment and the fifth embodiment, when the amplitudes of all of the patterns are larger than a predetermined threshold value, the synthesis units (426 and 426b) determine the amplitude of the pattern obtained by the synthesis based on the amplitude of each pattern. However, the determination is not limited to this. The synthesis units (426 and 426b) may set the average value of the amplitudes of the patterns as the amplitude of the pattern obtained by the synthesis.
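The averaging variant described here reduces to a one-line rule. The threshold value of 0.5 and the function name are assumptions for illustration; the text only states that the amplitudes must exceed a predetermined threshold.

```python
# Averaging variant: instead of a coefficient lookup, set the synthesized
# amplitude to the mean of the input amplitudes. The threshold check is a
# stand-in for the "larger than a predetermined threshold" condition; the
# value 0.5 is a hypothetical threshold.

THRESHOLD = 0.5

def synthesize_by_average(amplitudes, threshold=THRESHOLD):
    """Return the averaged amplitude, or None when the condition fails."""
    if all(a > threshold for a in amplitudes):
        return sum(amplitudes) / len(amplitudes)
    return None

avg = synthesize_by_average([0.6, 0.8])  # (0.6 + 0.8) / 2 = 0.7
```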
In all of the embodiments, the output unit 430 provides notification to the outside using images and audio. However, the output unit is not limited to this, and may provide notification to the outside using light or vibration.
Moreover, the various processes described above relating to the electronic devices (1, 201, and 301) and the control units (420 and 420b) may be performed by recording a program for executing each process of the electronic devices (1, 201, and 301) and the control units (420 and 420b) according to an embodiment of the present invention on a computer-readable recording medium, and by causing a computer system to read and execute the program recorded on the recording medium. In this case, the information indicating the plurality of signals detected by the plurality of detection units is stored in the recording medium. Moreover, the above-mentioned “computer system” may include an OS and hardware such as peripheral equipment. Moreover, when a WWW system is used, the “computer system” also includes a homepage provision environment (or display environment). In addition, the “computer-readable recording medium” means a writable non-volatile memory such as a flexible disk, a magneto-optical disk, a ROM, or a flash memory, a portable medium such as a CD-ROM, or a storage apparatus such as a hard disk built into the computer system.
Furthermore, the “computer-readable recording medium” may include a volatile memory (for example, a Dynamic Random Access Memory (DRAM)) inside a computer system which serves as a server or a client when the program is transmitted via a network such as the Internet or a communication channel such as a telephone line, in which case the volatile memory holds the program for a certain period of time. In addition, the program may be transmitted from the computer system, which stores the program in a storage apparatus or the like, to another computer system via a transmission medium or by transmission waves in the transmission medium. Here, the “transmission medium” which transmits the program means a medium having a function of transmitting information, such as a network (communication network) like the Internet or a communication channel (communication line) such as a telephone line. Moreover, the program may realize only a portion of the above-described functions. Moreover, the program may be a so-called differential file (differential program), in which the above-described functions are realized in combination with a program already recorded on the computer system.
As described above, the embodiments of the present invention have been described with reference to the drawings. However, the specific configurations are not limited to these embodiments, and also include designs and the like within a scope which does not depart from the gist of the present invention.
Number | Date | Country | Kind |
---|---|---|---|
2011-067757 | Mar 2011 | JP | national |
2011-083595 | Apr 2011 | JP | national |
2011-089063 | Apr 2011 | JP | national |
2011-095986 | Apr 2011 | JP | national |
Priority is claimed on Japanese Patent Application No. 2011-067757, filed on Mar. 25, 2011, Japanese Patent Application No. 2011-083595, filed on Apr. 5, 2011, Japanese Patent Application No. 2011-089063, filed on Apr. 13, 2011, and Japanese Patent Application No. 2011-095986, filed on Apr. 22, 2011. This is a Continuation Application of International Application No. PCT/JP2012/057134, filed Mar. 21, 2012. The contents of the aforementioned applications are incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2012/057134 | Mar 2012 | US |
Child | 14029421 | US |