Tracking camera device

Information

  • Patent Grant
  • 6717612
  • Patent Number
    6,717,612
  • Date Filed
    Friday, July 31, 1998
  • Date Issued
    Tuesday, April 6, 2004
Abstract
An integrated data detecting circuit 14 integrates digital data for each of detecting blocks set within a movement detecting area. A tripod-head control circuit 16 detects a correlation value for this integrated data to previously-registered integrated data, and compares the correlation value with a threshold to determine a first detecting block corresponding to an object. The tripod-head control circuit 16 further determines a first detecting area including the most first detecting blocks among a plurality of detecting areas set in the movement detecting area, and supplies to a drive device 18 a tripod-head control signal corresponding to the position. Accordingly, the drive device 18 controls a tripod head 20 such that a camera 12 is shifted in position to a direction of the first detecting area. Since determination is made for a first detecting area including the most first detecting blocks, the camera can be stabilized when an object is moving finely.
Description




BACKGROUND OF THE INVENTION




1. Field of the Invention




The present invention relates to tracking camera devices, and more particularly to a tracking camera device adapted to track an object that is imaged by a camera and captured at a detecting area thereof.




2. Description of the Related Art




One example of a tracking camera device that tracks an object imaged by a camera and captured within a detecting area is proposed in Japanese Patent Application No. H6-267347, filed on Oct. 31, 1994. This technology involves detecting, among a plurality of detecting blocks set within a motion vector detecting area, the detecting blocks considered to be the same in color as a particular detecting block. A motion vector for the object is estimated depending upon a weight center of the same-colored detecting blocks, thereby tracking the object according to the motion vector.




In such prior art, however, even where the object is moving only finely within the motion vector detecting area, the camera will track the object in response to that movement, resulting in a problem that the camera becomes unstable in position.




SUMMARY OF THE INVENTION




The present invention is a tracking camera device for tracking an object that is imaged by a camera and captured within a movement detecting area, the tracking camera device comprising: the movement detecting area including a plurality of detecting areas; each of the detecting areas including a plurality of detecting blocks; an integrating means for integrating pixel data for each of the detecting blocks; a first determining means for determining a plurality of first detecting blocks corresponding to the object based on a result of integration; a second determining means for determining a first detecting area including the most first detecting blocks; and a control means for controlling a position of the camera to a direction of the first detecting area.




Accordingly, the integrating means integrates, for example, chrominance-related data for each detecting block included in the movement detecting area. Based on a result of the integration, the first determining means determines a plurality of first detecting blocks corresponding to an object. That is, the first determining means, for example, calculates a correlation value of the integration result to a previously-registered reference value, and compares the correlation value with a threshold to determine a first detecting block according to a result of the comparison. The second determining means then determines a first detecting area that contains the most first detecting blocks. The control means, in turn, controls the camera to a direction of the first detecting area.




Therefore, according to this invention, since the camera position is controlled to a direction of a first detecting area containing the most first detecting blocks, it is possible to stabilize the camera position when an object is finely moving or so.











BRIEF DESCRIPTION OF THE DRAWINGS





FIG. 1 is a block diagram showing a preferred exemplary embodiment of the present invention;

FIG. 2 is an illustrative view showing a movement detecting area set on a screen;

FIG. 3 is an illustrative view showing detecting blocks;

FIG. 4 is an illustrative view showing a movement detecting area set on the screen;

FIG. 5 is an illustrative view showing a plurality of detecting areas included in the movement detecting area;

FIG. 6 is an illustrative view showing part of the operation of the preferred exemplary embodiment shown in FIG. 1 to FIG. 5;

FIG. 7 is an illustrative view showing part of the operation of the preferred exemplary embodiment shown in FIG. 1 to FIG. 5;

FIG. 8 is an illustrative view showing part of the operation of the preferred exemplary embodiment shown in FIG. 1 to FIG. 5;

FIG. 9 is a flowchart showing part of the operation shown in FIG. 1 to FIG. 8; and

FIG. 10 is a flowchart showing a subroutine of the flowchart shown in FIG. 9.











DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT




Referring to FIG. 1, a preferred exemplary embodiment of the present invention will be explained. A tracking camera device 10 in this embodiment includes a camera 12 that images an object to output digital data on a pixel-by-pixel basis. This digital data contains chrominance signals such as the y, r−y and b−y signals. These signals are integrated for each detecting block 16 as shown in FIG. 2, by an integrated data detecting circuit 14. Explaining the detecting blocks 16 using FIG. 2 and FIG. 3, the detecting blocks 16 are set to have vertically 8 blocks and horizontally 8 blocks, i.e. totally 64 blocks, within a movement detecting area 20 set on a screen 18. Each detecting block 16 contains pixels in number of i×j. A detecting block 16 positioned at (m, n)=(5, 4) is taken as a particular detecting block 16a. The detecting blocks 16 set in this manner have integrated data as expressed by Equation 1.











Ymn = Σij ymn
RYmn = Σij (r−y)mn
BYmn = Σij (b−y)mn   [Equation 1]
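The per-block integration of Equation 1 can be sketched in Python. This is an illustrative sketch only: the array names, NumPy representation, and the 8×8 block grid are assumptions drawn from the description, not the patent's circuit implementation.

```python
import numpy as np

def integrate_blocks(y, ry, by, blocks=8):
    """Sum y, r-y and b-y data over each of blocks x blocks detecting blocks.

    y, ry, by: 2-D arrays covering the movement detecting area, with
    dimensions divisible by `blocks` (i x j pixels per block).
    Returns three (blocks, blocks) arrays of integrated data (Equation 1).
    """
    h, w = y.shape
    i, j = h // blocks, w // blocks
    def integrate(a):
        # Reshape so each (m, n) cell holds one i x j detecting block, then sum.
        return a.reshape(blocks, i, blocks, j).sum(axis=(1, 3))
    return integrate(y), integrate(ry), integrate(by)
```

For a 16×16-pixel area with 8×8 blocks, each detecting block is 2×2 pixels and each integrated value is the sum of 4 pixels.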













A tripod-head control circuit 16 first registers integrated data as to the particular detecting block 16a as a reference value, and then receives integrated data concerning each detecting block 16. Then, a calculation is made for a correlation value Dmn between the integrated data previously registered and the integrated data thereafter given, according to Equation 2. Further, the correlation value Dmn is compared with a predetermined threshold to thereby determine a detecting block 16 lower than the threshold, i.e. a first detecting block.

Dmn = |Ymn − Y| + |RYmn − RY| + |BYmn − BY|   [Equation 2]






Y, RY, BY: registered integrated data
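The correlation test of Equation 2 can be sketched as follows; this is a minimal sketch, assuming NumPy arrays of integrated data, and the function name and shapes are illustrative rather than the patent's implementation.

```python
import numpy as np

def first_detecting_blocks(Y, RY, BY, ref, threshold):
    """Equation 2: Dmn = |Ymn - Y| + |RYmn - RY| + |BYmn - BY|.

    Y, RY, BY: (8, 8) integrated-data arrays for the current field.
    ref: (Y0, RY0, BY0) registered for the particular detecting block.
    Returns a boolean (8, 8) mask marking first detecting blocks,
    i.e. blocks whose correlation value is at or below the threshold.
    """
    Y0, RY0, BY0 = ref
    D = np.abs(Y - Y0) + np.abs(RY - RY0) + np.abs(BY - BY0)
    return D <= threshold
```

A small correlation value means the block's integrated chrominance closely matches the registered reference, which is why blocks *below* the threshold are taken as belonging to the object.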




The tripod-head control circuit 16 further determines a detecting area 22 containing the most first detecting blocks, i.e., a first detecting area, among a plurality of detecting areas arranged in the movement detecting area 20, as shown in FIG. 4. Here, the plurality of detecting areas 22 are arranged such that each area partly overlaps with an adjacent detecting area 22 except for a center in the movement detecting area 20, as shown in FIG. 4 and FIG. 5.
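Selecting the first detecting area can be sketched as a count over overlapping windows. The exact number, size, and overlap of the detecting areas in FIG. 4 and FIG. 5 are not given numerically, so this sketch assumes a hypothetical 3×3 grid of 4×4-block areas at a stride of 2 blocks inside the 8×8 movement detecting area.

```python
import numpy as np

# Hypothetical layout (assumption): nine 4x4-block detecting areas at
# stride 2, so each area partly overlaps its neighbours.
AREA_SIZE, STRIDE, GRID = 4, 2, 8

def first_detecting_area(mask):
    """Return ((top, left), count) of the detecting area containing the
    most first detecting blocks, given a boolean (8, 8) block mask."""
    best, best_count = None, -1
    for top in range(0, GRID - AREA_SIZE + 1, STRIDE):
        for left in range(0, GRID - AREA_SIZE + 1, STRIDE):
            count = int(mask[top:top + AREA_SIZE,
                             left:left + AREA_SIZE].sum())
            if count > best_count:
                best, best_count = (top, left), count
    return best, best_count
```

Using the area with the *most* first detecting blocks, rather than the blocks' centroid, is what keeps the control output from chasing small movements of the object.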




Consequently, if integrated data on a chest of a person shown in FIG. 6 is registered at a certain field, then at a next field a hatched portion shown in FIG. 7 is determined as a first detecting block based on the integrated data detected at this field and the integrated chrominance data being registered. Further, a hatched portion shown in FIG. 8 is then determined as a first detecting area. The tripod-head control circuit 16 supplies to a drive device 18 a tripod-head control signal depending upon a position of the first detecting area so that the drive device 18 makes control on a tripod head 20. Consequently, the camera 12 is controlled in position to a direction of the first detecting area. Incidentally, in the case of FIG. 8, the camera 12 keeps stable without shifting its position.




The tripod-head control circuit 16 is configured by a microcomputer, to process according to the flowcharts shown in FIG. 9 and FIG. 10. That is, it is determined at a first step S1 whether a tracking switch 22a is turned on or not. If "NO", the process proceeds to a next field process, while if "YES", the process is considered to be in a tracking mode and it is determined at a step S3 whether it is in a waiting state for integrated chrominance data of a particular detecting block 16a or not. If "YES", it is determined at a step S5 whether a registering switch 22b is turned on or not. If the determination here is "NO", the process proceeds to a next field process, while if "YES", the integrated data is registered at a step S7 and the process proceeds to a next field. On the other hand, if "NO" at the step S3, a first detecting block is determined at a step S9, a first detecting area is determined at a step S11, and a tripod-head control signal is generated at a step S13. Then the process proceeds to the next field process.
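The per-field flow of FIG. 9 can be sketched as one function. The state layout, field representation, and the three injected callables (standing in for steps S9, S11 and S13) are assumptions for illustration, not the microcomputer's actual program.

```python
def process_field(state, field, detect_blocks, detect_area, make_signal):
    """One-field pass of the FIG. 9 flowchart (hypothetical names).

    state: dict with 'tracking_on', 'registering_on', and 'reference'
           (None until integrated data has been registered).
    field: per-field data, including the particular block's integrated data.
    Returns a tripod-head control signal, or None when nothing to drive.
    """
    if not state['tracking_on']:                 # step S1: tracking switch off
        return None                              # proceed to next field
    if state['reference'] is None:               # step S3: waiting for data
        if state['registering_on']:              # step S5: registering switch
            state['reference'] = field['particular_block']   # step S7
        return None
    mask = detect_blocks(field, state['reference'])          # step S9
    area = detect_area(mask)                                 # step S11
    return make_signal(area)                                 # step S13
```

Once a reference has been registered, every subsequent field falls through the "NO" branch of step S3 and produces a control signal.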




At the step S9, the tripod-head control circuit 16 makes processing according to a subroutine shown in FIG. 10. That is, a correlation value Dmn is first detected by using Equation 2 at a step S901, and it is then determined at a step S903 whether the correlation value Dmn ≦ a threshold or not. If "YES", the detecting block 16 is determined as a first detecting block at a step S905, and the process proceeds to a step S907. However, if "NO", the process directly proceeds to the step S907. At the step S907 it is determined whether all the detecting blocks have been completed of processing or not. If "NO", the process returns to the step S901, while if "YES", the process returns.




According to this embodiment, detection is made for a first detecting area containing the most first detecting blocks, to which direction the camera 12 is controlled in position. Accordingly, when an object is moving finely, the first detecting area will always be positioned at a center, thereby making it possible to stabilize the position of the camera 12.




Incidentally, in the preferred exemplary embodiment the digital data of the y, r−y and b−y signals was employed to detect a first detecting block. However, it is of course possible to use digital data of r, g and b signals or digital data of y, cb and cr signals. Here, integrated data as to the r, g and b signals can be expressed by Equation 3, and a correlation value thereof is represented by Equation 4. Also, integrated data as to the y, cb and cr signals is expressed by Equation 5, and a correlation value thereof by Equation 6.











Rmn = Σij rmn
Gmn = Σij gmn
Bmn = Σij bmn   [Equation 3]

Dmn = |Rmn − R| + |Gmn − G| + |Bmn − B|   [Equation 4]




R, G, B: registered integrated data











Ymn = Σij ymn
Cbmn = Σij cbmn
Crmn = Σij crmn   [Equation 5]

Dmn = |Ymn − Y| + |Cbmn − Cb| + |Crmn − Cr|   [Equation 6]




Y, Cb, Cr: registered integrated data




Further, where using digital data of normalized chrominance signals such as (r−y)/y and (b−y)/y signals; r/y, g/y and b/y signals; or cb/y and cr/y signals, there is no variation in hue in an image captured by the camera 12 even if there is variation in brightness. Accordingly, it is possible to successfully track an object even when there is a change in the amount of solar light or a variation in brightness due to entering into or exiting from a room or the like. The integrated data and correlation value, when using digital data of (r−y)/y and (b−y)/y signals, are respectively expressed by Equation 7 and Equation 8. The integrated data and correlation value, when using digital data of r/y, g/y and b/y signals, are respectively represented by Equation 9 and Equation 10. The integrated data and correlation value, when using digital data of cb/y and cr/y signals, are respectively expressed by Equation 11 and Equation 12.












{(R−Y)/Y}mn = Σij {(r−y)/y}mn
{(B−Y)/Y}mn = Σij {(b−y)/y}mn   [Equation 7]

Dmn = |{(R−Y)/Y}mn − (R−Y)/Y| + |{(B−Y)/Y}mn − (B−Y)/Y|   [Equation 8]




(R−Y)/Y, (B−Y)/Y: registered integrated data












(R/Y)mn = Σij (r/y)mn
(G/Y)mn = Σij (g/y)mn
(B/Y)mn = Σij (b/y)mn   [Equation 9]

Dmn = |(R/Y)mn − R/Y| + |(G/Y)mn − G/Y| + |(B/Y)mn − B/Y|   [Equation 10]




R/Y, G/Y, B/Y: registered integrated data












(Cb/Y)mn = Σij (cb/y)mn
(Cr/Y)mn = Σij (cr/y)mn   [Equation 11]

Dmn = |(Cb/Y)mn − Cb/Y| + |(Cr/Y)mn − Cr/Y|   [Equation 12]




Cb/Y, Cr/Y: registered integrated data
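The brightness invariance of the normalized variant can be illustrated with a short sketch of Equations 11 and 12; the function name, shapes, and the small epsilon guarding against division by zero are assumptions for illustration.

```python
import numpy as np

def normalized_correlation(cb, cr, y, ref, eps=1e-6):
    """Equations 11/12 sketch: integrate cb/y and cr/y over one detecting
    block and compare with the registered values.

    cb, cr, y: per-pixel arrays for one i x j detecting block.
    ref: registered ((Cb/Y), (Cr/Y)) integrated data.
    """
    CbY = (cb / (y + eps)).sum()    # (Cb/Y)mn = sum_ij (cb/y)mn
    CrY = (cr / (y + eps)).sum()    # (Cr/Y)mn = sum_ij (cr/y)mn
    return abs(CbY - ref[0]) + abs(CrY - ref[1])    # Equation 12
```

If a brightness change scales y, cb and cr by the same factor, the per-pixel ratios, and hence the correlation value Dmn, are essentially unchanged, which is the invariance the passage above relies on.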




Incidentally, the figure pictures employed in FIG. 6 to FIG. 8 to explain this embodiment are those quoted from a software "Hanako" made by Justsystem Corporation.



Claims
  • 1. A tracking camera device for tracking an object, comprising: a first former for forming a plurality of detecting blocks on a screen; a second former for forming a plurality of fixed detecting areas arranged on the screen simultaneously in a manner such that each of the detecting areas includes two or more detecting blocks out of said plurality of detecting blocks; an integrator for integrating pixel data for each of said detecting blocks; a first determiner for determining a plurality of first detecting blocks corresponding to the object based on a result of integration; a second determiner for determining a first detecting area including the most first detecting blocks out of said plurality of fixed detecting areas; and a controller for controlling a position of said camera to a direction of said first detecting area, wherein whenever said first detecting area is changed to another detecting area, said controller controls the position of said camera to a direction of said another area, thereby tracking the object on an area-by-area basis.
  • 2. A tracking camera device according to claim 1, wherein said first determining means includes a calculating means for calculating a correlation value of the integration result to a reference value previously registered, a comparing means for comparing the correlation value with a threshold, and a detecting block determining means for determining the first detecting blocks according to a result of comparison.
  • 3. A tracking camera device according to claim 1 or 2, wherein the detecting areas are arranged so as to partly overlap with one another except for a center of said movement detecting area.
  • 4. A tracking camera device according to claim 1 or 2, wherein the pixel data contains chrominance-related data.
  • 5. A tracking camera device according to claim 3, wherein the chrominance-related data contains chrominance signal data.
  • 6. A tracking camera device according to claim 3, wherein the chrominance-related data contains normalized chrominance signal data.
Priority Claims (1)
Number Date Country Kind
8-017498 Feb 1996 JP
PCT Information
Filing Document Filing Date Country Kind
PCT/JP97/00252 WO 00
Publishing Document Publishing Date Country Kind
WO97/28645 8/7/1997 WO A
US Referenced Citations (10)
Number Name Date Kind
5243418 Kuno et al. Sep 1993 A
5371536 Yamaguchi Dec 1994 A
5442397 Yoshimura et al. Aug 1995 A
5512974 Abe et al. Apr 1996 A
5574498 Sakamoto et al. Nov 1996 A
5627586 Yamasaki May 1997 A
5757422 Matsumura May 1998 A
5870141 Matsumura et al. Feb 1999 A
6002428 Matsumura et al. Dec 1999 A
6144405 Toba Nov 2000 A