This invention relates to camera imaging.
According to one or more embodiments of the present invention, there is provided a video imaging system comprising a low resolution colour digital video camera and a high resolution monochromatic digital video camera operably connected to a digital processing system. The system can further comprise an object motion module for detecting objects moving within the fields of view of the cameras, and an object position module for determining the position of an object in the overlapping field of view of the cameras.
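By way of illustration only, an object motion module of this kind could be sketched as follows; the background subtraction approach, the OpenCV calls, and the thresholds below are assumptions for demonstration, not the patented design:

```python
import cv2

# Background-subtraction motion detector (illustrative sketch only; the
# source does not specify how the object motion module is implemented).
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

def moving_object_boxes(frame, min_area=200):
    """Return bounding boxes of regions that moved relative to recent frames."""
    mask = subtractor.apply(frame)
    # MOG2 marks shadows as 127; keep only confident foreground pixels.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```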
According to one or more embodiments of the present invention, there is provided a method comprising providing an image frame from a low resolution colour digital video camera and a corresponding image frame from a high resolution monochromatic digital video camera, and fusing the two image frames to obtain a colour image having higher resolution than the image frame from the low resolution colour digital video camera. The method can further comprise providing a three dimensional coordinate system for determining the position of a moving object in the overlapping fields of view of the cameras, whereby the two dimensional position of the moving object is determined according to its position in the images, while the distance from the cameras to the object along the axis perpendicular to the plane of the images is derived from the parallax error between the two image frames to be fused.
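For illustration, under standard pinhole stereo assumptions not stated in the source (parallel optical axes, focal length $f$ in pixels, baseline $b$ between the cameras, and disparity $d$ between the positions of the object in the two image frames), the distance $z$ along the axis perpendicular to the image plane follows the usual triangulation relation:

$$z = \frac{f\,b}{d}, \qquad x = \frac{z\,u}{f}, \qquad y = \frac{z\,v}{f},$$

where $(u, v)$ is the object's position in the image measured from the principal point. All symbols here are illustrative assumptions rather than definitions from the source.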
According to one or more embodiments of the present invention, there is provided a camera imaging system comprising a low resolution colour digital sensor chip, a high resolution monochromatic digital sensor chip, a beam splitter, and a lens, wherein the lens gathers incident light towards the beam splitter, and the beam splitter splits the light towards the two sensor chips. The system further comprises a digital processing system which fuses a low resolution colour image from the colour sensor with a high resolution monochromatic image from the monochromatic sensor to produce a high resolution colour image.
Referring to
The cameras 2 and 4 employ charge-coupled device (“CCD”) sensors or complementary metal-oxide-semiconductor (“CMOS”) sensors. Camera 2 is a low resolution colour (“LC”) video camera, while camera 4 is a high resolution monochrome (“HM”) video camera. Cameras 2 and 4 are capable of providing streaming video signals as part of a security, surveillance or monitoring system. It will be understood, however, that the applications for the cameras 2 and 4 are not limited to such systems.
Camera 2 has a field of view defined by light rays 8, while camera 4 has a field of view defined by light rays 10. Colour camera 2 and monochrome camera 4 produce separate streaming video signals which are then supplied to the digital processing system (“DPS”) 12. The cameras 2 and 4 are adjacent and can be housed together in a single camera housing (not shown).
The low resolution colour streaming video signals from camera 2 are fused by an image fusing module (“FM”) 26 in the DPS 12 with the corresponding high resolution monochrome streaming video signals from camera 4 to produce a fused high resolution colour streaming video signal (“HC”) 28. Fusing the colour and monochrome video signals gives the dual camera system improved sensitivity: the monochrome camera contributes high resolution detail, the colour camera contributes colour information, and together they allow high resolution colour video to be acquired even under poor lighting conditions.
The colour and monochrome video signals are comprised of individual image frames. Corresponding pairs of video image frames from cameras 2 and 4 are isolated and then fused. Various methods can be used to fuse the frame pairs. For example, image fusion methods for fusing low resolution multispectral satellite images with high resolution panchromatic satellite images are known in the field of remote sensing and can be adapted to fuse video image frames from cameras 2 and 4. One such fusion method is disclosed in U.S. Pat. No. 7,340,099 (Zhang), which is incorporated herein by reference in its entirety. Other image fusion methods used for satellite imagery include arithmetic based, statistics based, ratio based and wavelet based methods. By substituting the colour and monochrome video image frame pairs of the present invention for the multispectral and panchromatic images respectively, such prior art image fusion methods can be adapted to fuse the video image frames acquired by camera 2 with those acquired by camera 4.
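As a minimal sketch of such an adaptation, assuming an intensity-substitution scheme in the spirit of the pan-sharpening methods cited above (the colour space choice and the OpenCV calls are illustrative assumptions, not the method of the cited patent):

```python
import cv2
import numpy as np

def fuse_frames(lc_bgr: np.ndarray, hm_gray: np.ndarray) -> np.ndarray:
    """Fuse a low resolution colour frame (lc_bgr) with a corresponding
    high resolution monochrome frame (hm_gray) by intensity substitution."""
    h, w = hm_gray.shape[:2]
    # Upsample the colour frame to the monochrome frame's resolution.
    lc_up = cv2.resize(lc_bgr, (w, h), interpolation=cv2.INTER_CUBIC)
    # Separate intensity (Y) from chromaticity (Cr, Cb).
    ycrcb = cv2.cvtColor(lc_up, cv2.COLOR_BGR2YCrCb)
    # Substitute the high resolution monochrome image for the low resolution luma.
    ycrcb[:, :, 0] = hm_gray
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
```

Applied frame pair by frame pair, this yields a fused high resolution colour stream; the arithmetic, statistics, ratio and wavelet based methods mentioned above would replace the substitution step.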
In a further aspect, referring to
In a still further aspect, referring to
The plane of the image frames 32 in
In one or more embodiments, the dual camera system according to the present invention provides colour video with improved sensitivity compared with a conventional video camera, detection of moving objects, and determination of the three dimensional position of those objects in the common field of view of the cameras 2 and 4.
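A minimal sketch of the distance determination described above, assuming rectified frames and known calibration (the OpenCV block matcher and all constants here are illustrative assumptions, not values from the source):

```python
import cv2
import numpy as np

# Assumed calibration constants (illustrative, not from the source).
FOCAL_LENGTH_PX = 800.0   # focal length in pixels
BASELINE_M = 0.06         # separation between cameras 2 and 4, in metres

def object_position(left_gray, right_gray, u, v):
    """Estimate the 3D position of an object at pixel (u, v), measured
    from the principal point. The two inputs are rectified 8-bit
    grayscale frames of equal size, e.g. the monochrome frame and the
    upsampled colour frame converted to grayscale."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    d = disparity[v, u]
    if d <= 0:
        return None  # no reliable match at this pixel
    z = FOCAL_LENGTH_PX * BASELINE_M / d   # distance along the camera axis
    x = z * u / FOCAL_LENGTH_PX            # horizontal offset
    y = z * v / FOCAL_LENGTH_PX            # vertical offset
    return (x, y, z)
```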
According to one or more embodiments of the present invention, the methods described herein can also be applied to image frames from two corresponding still cameras.
In a still further aspect, referring to
Sensors 42 and 44 are positioned such that when the light splitter 40 splits the incoming light into two directions, about half of the incident light is directed towards the colour digital sensor 42 and about half towards the monochromatic digital sensor 44. Because both sensors receive light through the same lens, the two images exhibit little or no parallax, and the capacity to detect the distance from the camera to a moving object is accordingly reduced in this embodiment.
Separate streaming video signals from sensors 42 and 44 are then supplied to the DPS 12 in a similar manner to the signals from cameras 2 and 4 in the system described above.
Low resolution colour streaming video signals from sensor 42 are fused by the FM 26 in the DPS 12 with corresponding high resolution monochrome streaming video signals from sensor 44 to produce a fused high resolution colour streaming video signal (“HC”) 28 using the methods described herein.
In a still further embodiment, referring to
It is understood that other devices can be used in place of the splitter 40 or the mirror 50, provided they can direct incident light from the lens 20 towards both sensor 42 and sensor 44 simultaneously.
This application claims priority from International Patent Application No. PCT/CA2011/050666, filed on 24 Oct. 2011, which claims priority from U.S. Patent Application Ser. No. 61/405,941, filed on 22 Oct. 2010.
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/CA2011/050666 | 10/24/2011 | WO | 00 | 8/16/2013
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2012/051720 | 4/26/2012 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
3785469 | Stumpf | Jan 1974 | A |
4101916 | Gottschalk et al. | Jul 1978 | A |
4652909 | Glenn | Mar 1987 | A |
5828913 | Zanen | Oct 1998 | A |
5852502 | Beckett | Dec 1998 | A |
6785469 | Ide | Aug 2004 | B1 |
6788338 | Dinev | Sep 2004 | B1 |
7340099 | Zhang | Mar 2008 | B2 |
20020110376 | MacLean et al. | Aug 2002 | A1 |
20020122113 | Foote | Sep 2002 | A1 |
20060119710 | Ben-Ezra et al. | Jun 2006 | A1 |
20060125936 | Gruhlke et al. | Jun 2006 | A1
20070212056 | Nagata | Sep 2007 | A1 |
20070279494 | Aman | Dec 2007 | A1 |
20080024390 | Baker | Jan 2008 | A1 |
20090231447 | Paik | Sep 2009 | A1 |
20090309987 | Kimura | Dec 2009 | A1 |
20100149338 | Aggarwal | Jun 2010 | A1 |
20100177162 | Macfarlane | Jul 2010 | A1 |
Number | Date | Country |
---|---|---|
WO 2012051720 | Apr 2012 | WO |
Entry |
---|
PCT International Search Report, dated Dec. 1, 2011. |
Number | Date | Country
---|---|---
20130335599 A1 | Dec 2013 | US
Number | Date | Country
---|---|---
61405941 | Oct 2010 | US