Control device for image input apparatus

Information

  • Patent Grant
  • Patent Number
    6,677,990
  • Date Filed
    Tuesday, September 17, 1996
  • Date Issued
    Tuesday, January 13, 2004
Abstract
A control device for an image input apparatus which is equipped with an optical system having a magnification varying lens, includes a monitor for displaying input images, an input device which enables an arbitrary position on a display screen of the monitor to be designated, a calculation device for calculating the distance between a predetermined position on the display screen and the arbitrary position on the basis of zooming information of the optical system, and a controller for controlling the image input apparatus in accordance with the calculation results obtained by the calculation device.
Description




BACKGROUND OF THE INVENTION




1. Field of the Invention




This invention relates to a control device for an image input apparatus and, more specifically, to a control device which is suitable for remotely operating a camera as in the case of a video conference system.




2. Description of the Related Art




With the improvement of computers in terms of image processing capability, there have been proposed various techniques in which camera operations, such as zooming, panning and tilting, are performed by operating a computer while the photographic image is being displayed on a monitor screen of the computer. In particular, in a video conference system, it is desirable that the orientation (pan, tilt), magnification, etc., of the camera at the other end of the communications line be remotely controllable. For that purpose, there has been proposed, for example, a system in which camera operation factors, which are to be controlled by using a mouse or the like, are displayed on a part of the monitor screen.




However, in the above system, it is rather difficult to perform fine adjustment. Moreover, it is by no means easy to determine which factor is to be controlled, and to what degree, so that the operator has to resort to trial and error. When a camera is to be remotely controlled, as in a video conference system, the time lag involved in transmitting the control signal must also be taken into account. In addition, the image is subjected to high-efficiency coding before being transmitted, so the image quality is generally rather poor while the image information is changing rapidly, as during panning, making the camera operation still more difficult; fine adjustment becomes effectively impossible.




SUMMARY OF THE INVENTION




It is an object of the present invention to provide an image-input-apparatus control device improved in operability.




In accordance with an embodiment of the present invention, there is provided a control device for an image input apparatus which is equipped with an optical system having a magnification varying lens, the control device comprising display means for displaying input images, input means which enables an arbitrary position on a display screen of the display means to be designated; calculation means for calculating the distance between a predetermined position on the display screen and the arbitrary position on the basis of zooming information of the optical system, and control means for controlling the image input apparatus in accordance with calculation results obtained by the calculation means.




Other objects and features of the present invention will become apparent from the following detailed description of the invention and the accompanying drawings.











BRIEF DESCRIPTION OF THE DRAWINGS





FIG. 1

is a schematic block diagram showing an embodiment of the present invention;





FIG. 2

is a schematic diagram of the embodiment connected to a communications network;





FIG. 3

is a flowchart illustrating a first operation of this embodiment;





FIG. 4

is a diagram illustrating how distances as displayed on a monitor screen are related to the corresponding angles of view;





FIG. 5

is a flowchart illustrating a second operation of this embodiment;





FIG. 6

is a flowchart illustrating a third operation of this embodiment; and





FIG. 7

is a flowchart illustrating a fourth operation of this embodiment.











DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT




An embodiment of the present invention will now be described with reference to the drawings.





FIG. 1 is a schematic block diagram showing an embodiment of the present invention as applied to a terminal in a video conference system. Numeral 10 indicates a camera for photographing a user of the system; numeral 12 indicates a photographing zoom lens unit; numeral 14 indicates a zoom control circuit for moving a zooming lens 12a of the lens unit 12 in the direction of the optical axis; numeral 16 indicates a focus control circuit for moving a focusing lens 12b of the lens unit 12 in the direction of the optical axis; numeral 20 indicates an image sensor which converts optical images obtained by the lens unit 12 and an aperture 18 to electric signals; and numeral 22 indicates a camera signal processing circuit for converting the electric signals obtained by the image sensor 20 to video signals.




Numeral 24 indicates a pan control circuit which is on a pan head 23 and which moves the photographic optical axis of the camera 10 to the right and left; numeral 26 indicates a tilt control circuit which is on the pan head 23 and which moves the photographic optical axis of the camera 10 up and down; and numeral 28 indicates a camera control circuit for controlling the camera 10 as a whole.




Numeral 30 indicates a computer constructed in the same way as ordinary computers; numeral 32 indicates a CPU for overall control; numeral 34 indicates a ROM; numeral 36 indicates a RAM; numeral 38 indicates a video interface to which output video signals from the camera 10 are input; and numeral 40 indicates a communications interface which transmits and receives data and control signals to and from an external communications network and transmits and receives control signals to and from the camera 10.




Numeral 42 indicates a coordinate input device consisting of a digitizer, a mouse or the like; numeral 44 indicates an interface for the coordinate input device 42; numeral 46 indicates a video memory (VRAM); and numeral 48 indicates a display control device for controlling the image display of a monitor 50 consisting of a CRT, a liquid crystal display or the like.




As shown in FIG. 2, a number of terminals as shown in FIG. 1 are connected to each other through the intermediation of a communications network.




Next, the operation of this embodiment will be described with reference to FIGS. 1 and 3. This embodiment functions in a particularly effective manner when applied to a case where it is used to remotely control a camera at an image transmission end terminal from an image reception end terminal in a video conference which is being executed between terminals. However, for convenience of description, the operation of this embodiment will be explained with reference to a case where the camera 10 is operated within the system shown in FIG. 1. The features of this embodiment as utilized in the above two cases are the same except for the fact that the processes of image coding and decoding are omitted in the latter case.




A photographic image obtained by the camera 10 is written to the video memory 46 through the video interface 38. The display control device 48 successively reads image data stored in the video memory 46, whereby the monitor 50 is controlled to display the image.




The user designates an arbitrary position (x, y), which he or she intends to be the center, through the coordinate input device 42 (S1). The CPU 32 calculates the disparity (ΔX, ΔY) between the designated position (x, y) and the coordinates (a, b) of the center of the photographic image displayed on the screen (when no window display system is adopted, it is the center of the screen of the monitor 50 and, when a window display system is adopted, it is the center of the display window of the photographic image) (S2). That is, the CPU 32 calculates the following values:

ΔX = x − a

ΔY = y − b








Then, the CPU 32 transfers a movement command, e.g., a command Mov(ΔX, ΔY), to effect movement through a distance corresponding to the disparity (ΔX, ΔY), to the camera control circuit 28 of the camera 10 through the communications interface 40 (S3).


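The disparity calculation of steps S1 through S3 can be sketched in a few lines of Python (a minimal illustration; the function names and the tuple encoding of the Mov command are assumptions for this sketch, not taken from the patent):

```python
# Sketch of steps S1-S3: compute the on-screen disparity between a
# designated point (x, y) and the image center (a, b), then form a
# movement command for transfer to the camera control circuit.

def disparity(designated, center):
    """Return (dX, dY) = designated - center, in screen pixels."""
    (x, y), (a, b) = designated, center
    return (x - a, y - b)

def make_mov_command(dX, dY):
    """Encode a Mov(dX, dY) command; the tuple format is illustrative."""
    return ("Mov", dX, dY)

# Example: the user clicks (420, 180) on a display whose photographic
# image is centered at (320, 240).
dX, dY = disparity((420, 180), (320, 240))
cmd = make_mov_command(dX, dY)
```

In a windowed display, `center` would be the center of the photographic-image window rather than of the whole screen, exactly as the text above distinguishes.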


Upon receiving this movement command (S4), the camera control circuit 28 first obtains zooming position information of the zooming lens 12a from the zoom control circuit 14 (S5), and determines the conversion factor k of the amount of movement from the zooming position information thus obtained (S6). That is, the photographic image is displayed in a size corresponding to the variable magnification of the lens unit 12. For example, as shown in FIG. 4, distances which appear the same when displayed on the screen of the monitor 50 are different from each other in the actual field depending upon the magnification, i.e., the angle of view. In view of this, it is necessary to convert a distance on the monitor screen to an amount of movement corresponding to the angle of view (the pan angle and the tilt angle). For this purpose, the camera control circuit 28 is equipped with a conversion table for determining the conversion factor k.


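Such a conversion table can be sketched as a mapping from discrete zooming positions to a degrees-per-pixel factor k. The table values below are invented for illustration; the patent specifies no numbers, only that k depends on the zooming position:

```python
# Hypothetical conversion table: zoom position index -> factor k
# (degrees of pan/tilt per screen pixel). At the wide-angle end the
# same on-screen distance spans a larger angle of view, so k is larger.
K_TABLE = {0: 0.10, 1: 0.05, 2: 0.025, 3: 0.0125}  # wide -> tele

def conversion_factor(zoom_position):
    """Step S6: look up k for the current zooming position."""
    return K_TABLE[zoom_position]

k_wide = conversion_factor(0)   # wide-angle end
k_tele = conversion_factor(3)   # telephoto end
```

A real table would be calibrated per lens unit; the point is only that k shrinks as the lens zooms in, consistent with FIG. 4.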


The determined conversion factor k is multiplied by the parameters ΔX and ΔY of the movement command Mov(ΔX, ΔY) from the computer 30 to calculate the actual amount of movement (ΔXr, ΔYr) (S7). That is, the following values are calculated:

ΔXr = kΔX

ΔYr = kΔY

The camera control circuit 28 determines the pan and tilt angles of rotation in accordance with the actual amount of movement (ΔXr, ΔYr) (S8) to control the pan control circuit 24 and the tilt control circuit 26, thereby pointing the photographic optical axis of the camera 10 in the designated direction (S9).


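Steps S7 through S9 can be sketched as follows; the scaling by k comes from the text above, while the clamp to a mechanical pan/tilt range is an added assumption of this sketch:

```python
def actual_movement(k, dX, dY):
    """Step S7: convert screen disparity to pan/tilt angles via k."""
    return (k * dX, k * dY)

def clamp(angle, limit):
    """Keep a commanded angle within an assumed mechanical range."""
    return max(-limit, min(limit, angle))

# Example with k = 0.05 (a mid-zoom value from the illustrative table)
# and the disparity (100, -60) computed earlier.
dXr, dYr = actual_movement(0.05, 100, -60)
pan = clamp(dXr, 170.0)   # hypothetical +/-170 degree pan range
tilt = clamp(dYr, 90.0)   # hypothetical +/-90 degree tilt range
```

The pan control circuit 24 and tilt control circuit 26 would then be driven by these two angles (S8, S9).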


While in FIG. 3 the camera control circuit 28 of the camera 10 calculates an amount of movement of the camera 10 with respect to the amount of movement as designated on the monitor screen, it is naturally also possible for this calculation to be performed by the CPU 32 of the computer 30. FIG. 5 shows a flowchart for the latter case, which differs from the above-described case in that the zooming position information of the zooming lens 12a is transferred from the camera 10 to the computer 30, which calculates the pan and tilt angles of movement and transfers them to the camera 10.




That is, the user designates an arbitrary position (x, y), which he or she intends to be the center of the photographic image displayed on the monitor 50, by using the coordinate input device 42 (S11). The CPU 32 calculates the disparity (ΔX, ΔY) between the designated position (x, y) and the coordinates (a, b) of the center of the photographic image as displayed on the screen (when no window display system is adopted, it is the center of the screen of the monitor 50 and, when a window display system is adopted, it is the center of the display window of the photographic image) (S12). Then, the CPU 32 requests the camera 10 to provide zooming position information (S13).




Upon the request for zooming position information from the computer 30, the camera control circuit 28 of the camera 10 obtains zooming position information from the zoom control circuit 14 (S14), and transfers it to the computer 30 (S15).




The CPU 32 determines the conversion factor k from the zooming position information from the camera 10 (S16, S17). In this example, the CPU 32 is equipped with a conversion factor table for converting a distance on the screen of the monitor 50 to an amount of movement corresponding to the angles of view (the pan and tilt angles), and determines the conversion factor k.




The CPU 32 multiplies the determined conversion factor k by the previously calculated ΔX and ΔY to calculate the actual amount of movement (ΔXr, ΔYr) (S18), and determines the pan and tilt angles of rotation corresponding to the calculated actual amount of movement (ΔXr, ΔYr), transmitting a movement command of that angle of rotation to the camera 10 (S20).




The camera control circuit 28 of the camera 10 receives the movement command from the computer 30 (S21), and controls the pan control circuit 24 and the tilt control circuit 26 in accordance with the command to point the photographic optical axis of the camera 10 in the designated direction (S22).


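The computer-side variant of FIG. 5 (steps S11 through S22) might be outlined as below, with the camera modeled as a toy object; the class and function names, and the table values, are illustrative assumptions rather than anything defined in the patent:

```python
class Camera:
    """Toy stand-in for camera 10: reports its zoom position and
    accepts pan/tilt movement commands."""
    def __init__(self, zoom_position):
        self.zoom_position = zoom_position
        self.pan = 0.0
        self.tilt = 0.0

    def get_zoom_position(self):           # S14-S15: report zoom info
        return self.zoom_position

    def move(self, pan_deg, tilt_deg):     # S21-S22: execute command
        self.pan += pan_deg
        self.tilt += tilt_deg

K_TABLE = {0: 0.10, 1: 0.05}  # invented zoom-position -> k values

def remote_point_to_center(camera, designated, center):
    """S11-S20: computer-side disparity, conversion, and command."""
    dX = designated[0] - center[0]          # S12
    dY = designated[1] - center[1]
    k = K_TABLE[camera.get_zoom_position()] # S13-S17
    camera.move(k * dX, k * dY)             # S18-S21

cam = Camera(zoom_position=1)
remote_point_to_center(cam, (420, 180), (320, 240))
```

The division of labor is the only difference from FIG. 3: here the conversion table lives in the computer 30, and the camera receives ready-made angles.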


Next, a case in which a photographic range is designated by two points on the screen of the monitor 50 will be described with reference to FIG. 6. The user designates two points (x1, y1) and (x2, y2) in the photographic image plane, which is displayed on the monitor 50, by the coordinate input device 42 (S31). The CPU 32 calculates a middle point (x0, y0) thereof from the designated two points (x1, y1) and (x2, y2) (S32). The CPU 32 calculates the difference (ΔX, ΔY) between the middle point (x0, y0) and the coordinates (a, b) of the center of the photographic-image displaying portion of the monitor 50 (when no window display system is adopted, it is the center of the entire screen of the monitor 50, and when a window display system is adopted, it is the center of the photographic image display window) (S33). That is, the CPU 32 calculates the following values:




ΔX = x0 − a

ΔY = y0 − b

Further, the CPU calculates the difference (Δx, Δy) between the two designated points, that is, the following values (S34):






Δx = x1 − x2

Δy = y1 − y2


The CPU 32 transfers a movement command in which the differences (ΔX, ΔY) and (Δx, Δy) are used as parameters to the camera 10 (S35). The camera control circuit 28 of the camera 10 receives the command thus transferred (S36), and obtains zooming position information of the zooming lens 12a from the zoom control circuit 14 (S37), determining the conversion factor k of the pan and tilt amounts of movement from the zooming position information thus obtained (S38).




The camera control circuit 28 multiplies the determined conversion factor k by the parameters ΔX and ΔY of the movement command from the computer 30 to calculate the actual amount of movement (ΔXr, ΔYr), that is, the following values (S39):






ΔXr = kΔX

ΔYr = kΔY








Further, the camera control circuit 28 calculates the amount of movement to a desired zooming position, Δz, from the parameters Δx and Δy from the computer 30 and the factor k (S40).




The camera control circuit 28 determines the pan and tilt angles of rotation corresponding to the actual amount of movement (ΔXr, ΔYr) calculated in step S39, and the amount of zooming movement corresponding to the amount of movement Δz calculated in step S40 (S41). It then controls the pan control circuit 24, the tilt control circuit 26 and the zoom control circuit 14, pointing the photographic optical axis of the camera 10 in the designated direction and changing the magnification of the lens unit 12 (S42).


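For the two-point operation of FIG. 6, the midpoint gives the new direction and the span between the points gives the new field of view. A sketch follows; the zoom formula (ratio of the display width to the designated span) is an assumption of this sketch, since the patent states only that Δz is derived from Δx, Δy and the factor k:

```python
def two_point_command(p1, p2, center):
    """S31-S34: midpoint disparity and span between two designated points."""
    x0 = (p1[0] + p2[0]) / 2           # S32: middle point
    y0 = (p1[1] + p2[1]) / 2
    dX, dY = x0 - center[0], y0 - center[1]   # S33: (dX, dY)
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]     # S34: (dx, dy)
    return (dX, dY), (dx, dy)

def zoom_amount(span, screen_width):
    """Assumed form of S40: magnify so the designated span fills the screen."""
    return screen_width / abs(span[0])

# Example: a box designated around the screen center on a 640-wide display.
(dXY, dxy) = two_point_command((220, 140), (420, 340), (320, 240))
zoom = zoom_amount(dxy, screen_width=640)
```

With the two points symmetric about the center, the pan/tilt disparity is zero and only the zooming movement remains, which matches the intent of designating a photographic range.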


The above-described embodiment is not effective when the user wishes to observe ranges other than that displayed on the monitor screen. In such a case, the user has to move the zooming lens 12a of the camera 10 to the wide-angle end through another operation, and then perform the above operation. This procedure can be simplified as follows. The CPU sets imaginary screens above and below, and on the right and left-hand sides, of the display screen displaying the image that is being photographed. These imaginary screens may be adjacent to, partly overlap, or be spaced apart from, the actual display screen. FIG. 7 shows the flowchart of an operation utilizing such imaginary screens.


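The imaginary screens of FIG. 7 can be modeled as off-screen rectangles whose centers stand in for the designated position. The sketch below assumes four screens exactly adjacent to the display; the layout, dimensions and names are illustrative:

```python
# Four imaginary screens adjacent to a W x H display. Selecting one
# yields the disparity from its center (a1, b1) to the display
# center (a, b), as in steps S51-S52.
W, H = 640, 480
CENTER = (W / 2, H / 2)  # (a, b)

IMAGINARY_SCREENS = {    # name -> center (a1, b1); assumed layout
    "left":  (CENTER[0] - W, CENTER[1]),
    "right": (CENTER[0] + W, CENTER[1]),
    "up":    (CENTER[0], CENTER[1] - H),
    "down":  (CENTER[0], CENTER[1] + H),
}

def imaginary_disparity(name):
    """S52: (dX, dY) = (a1 - a, b1 - b) for the selected screen."""
    a1, b1 = IMAGINARY_SCREENS[name]
    return (a1 - CENTER[0], b1 - CENTER[1])

d_right = imaginary_disparity("right")   # pan one screen width to the right
```

The resulting disparity feeds the same Mov(ΔX, ΔY) path as before, so no wide-angle detour is needed to reach a range outside the current view.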


The user selects one of the imaginary screens by the coordinate input device 42 or the like (S51). The CPU 32 calculates the disparity between the center (a1, b1) of the selected imaginary screen and the coordinates (a, b) of the center of the displayed photographic image (when no window display system is adopted, it is the center of the screen of the monitor 50 and, when a window display system is adopted, it is the center of the display window of the photographic image), that is, the CPU calculates the following values (S52):






ΔX = a1 − a

ΔY = b1 − b








Then, the CPU 32 transfers a movement command corresponding to the difference (ΔX, ΔY), e.g., Mov(ΔX, ΔY), to the camera control circuit 28 of the camera 10 through the communications interface 40 (S53).




The camera control circuit 28, which receives this movement command (S54), first obtains the zooming position information of the zooming lens 12a from the zoom control circuit 14 (S55), and then determines the conversion factor k of the amount of movement from the obtained zooming position information, as in the above-described case (S56).




The camera control circuit 28 multiplies the determined conversion factor k by the parameters ΔX and ΔY of the movement command Mov(ΔX, ΔY) to calculate the actual amount of movement (ΔXr, ΔYr) (S57). That is,






ΔXr = kΔX

ΔYr = kΔY








Further, the camera control circuit 28 determines the pan and tilt angles of rotation corresponding to the actual amount of movement (ΔXr, ΔYr) (S58), and controls the pan control circuit 24 and the tilt control circuit 26 to point the photographic optical axis of the camera 10 in the designated direction.




As stated above, the present invention is obviously also applicable to a case where a camera at an image-transmission-end terminal is to be remotely controlled from an image-reception-end terminal in a video conference or the like.




As can be readily understood from the above description, in accordance with this embodiment, a visual and intuitive camera operation can be realized, thereby attaining an improvement in operability. In particular, this embodiment proves markedly effective in cases where remote control is to be performed.




While the above embodiment has been described as applied to optical zooming, which is effected by moving a zooming lens, it is also applicable to electronic zooming, in which captured image data is electronically processed.



Claims
  • 1. A control apparatus for controlling a camera, comprising:a display device arranged to display a current image picked-up by said camera and an imaginary screen which is set to be adjacent to or spaced apart from the current image already picked-up on a same display, wherein said imaginary screen corresponds to a range which shows that said camera can pickup an image by controlling an image pickup direction of said camera; and a control device arranged to control the image pickup direction of said camera on the basis of information on the input of an arbitrary position designated on said imaginary screen by an input device.
  • 2. An apparatus according to claim 1, wherein said control device controls the image pickup direction of said camera in such a way that the image at the arbitrary position comes to coincide with a center of the image.
  • 3. An apparatus according to claim 1, wherein said camera has a zoom lens, and said control device controls an image pickup direction of said camera on the basis of information on the input of the arbitrary position designated on said imaginary screen and a present status of said zoom lens.
  • 4. An apparatus according to claim 1, wherein display device displaying the image by a window display system.
  • 5. An apparatus according to claim 1, wherein said imaginary screen overlaps said current image picked-up by said camera.
  • 6. A method of controlling a camera, comprising:displaying a current image picked-up by said camera and an imaginary screen which is set to be adjacent to or spaced apart from the current image already picked up on a same display, wherein said imaginary screen corresponds to a range which shows that said camera can pickup an image by controlling an image pickup direction of said camera; enabling input of an arbitrary position designated on said imaginary screen; and controlling the image pickup direction of said camera on the basis of information on the input of the arbitrary position designated on said imaginary screen.
  • 7. A method according to claim 6, wherein said controlling controls the image pickup direction of said camera in such a way that the image at the arbitrary position comes to coincide with a center of the image.
  • 8. A method according to claim 6, wherein said camera has a zoom lens, and said controlling controls an image pickup direction of said camera on the basis of information on the input of the arbitrary position designated on said imaginary screen and a present status of said zoom lens.
Priority Claims (1)
Number Date Country Kind
5-185366 Jul 1993 JP
Parent Case Info

This is a continuation application under 37 CFR 1.62 of prior application Ser. No. 08/278,750, filed Jul. 22, 1994, abandoned.

US Referenced Citations (16)
Number Name Date Kind
3579072 Plummer May 1971 A
4081830 Mick et al. Mar 1978 A
4720805 Vye Jan 1988 A
5068735 Tuchiya et al. Nov 1991 A
5164827 Paff Nov 1992 A
5206721 Ashida et al. Apr 1993 A
5278643 Takemoto et al. Jan 1994 A
5396287 Cho Mar 1995 A
5434617 Bianchi Jul 1995 A
5479206 Ueno et al. Dec 1995 A
5523769 Lauer et al. Jun 1996 A
5523783 Cho Jun 1996 A
5530796 Wang Jun 1996 A
5631697 Nishimura et al. May 1997 A
5801770 Paff et al. Sep 1998 A
5815199 Palm et al. Sep 1998 A
Foreign Referenced Citations (10)
Number Date Country
3933255 May 1991 DE
149118 Jul 1985 EP
458373 Nov 1991 EP
59228473 Dec 1984 JP
60-152193 Aug 1985 JP
02-117291 May 1990 JP
04-068893 Mar 1992 JP
4-302587 Oct 1992 JP
04-323990 Nov 1992 JP
04-361494 Dec 1992 JP
Non-Patent Literature Citations (1)
Entry
JEE Journal of Electronic Engineering, “Canon's Variangle Prism Optically Corrects Image Blur,” vol. 29, No. 306, Jun. 1992, Tokyo, Japan, pp. 84-85.
Continuations (1)
Number Date Country
Parent 08/278750 Jul 1994 US
Child 08/714962 US