Field of the Invention
The present invention relates generally to video camera systems.
Description of the Background Art
Video camera systems are commonly used for video surveillance of prescribed areas. For example, such systems are used for surveillance of parking lots, department stores, casinos, banks, and other areas of interest.
Conventional video cameras commonly used in such systems include fixed-type and movable-type cameras. The fixed-type cameras may be configured to observe a fixed area, and the movable-type cameras may be configured with pan-and-tilt motor units to observe a wide range of areas.
One embodiment relates to a method of outputting multiple views from a networked camera. Each imager of an array of imagers in the camera captures image frames and transmits the captured image frames to an associated image flow processor. Each image flow processor processes the captured image frames and transmits the processed image frames to a multi-imager video processor. The parameters for said processing by each image flow processor are updated on a frame-by-frame basis.
Another embodiment relates to a video camera including a plurality of lenses, a plurality of imagers, a plurality of image flow processors, a multi-imager video processor, and a plurality of update queues. Each imager includes a sensor array that is configured to capture image frames from light projected by one of said lenses, and each image flow processor is configured to receive and process the image frames captured by at least one said sensor array. The multi-imager video processor is configured to receive the processed image frames from the plurality of image flow processors, and each update queue is associated with an image flow processor.
Other embodiments and features are also disclosed.
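As a rough illustration of the architecture summarized above, the following C sketch models an array of imagers, one image flow processor (IFP) per imager, a single multi-imager video processor (MIVP), and one update queue per IFP whose entries change the processing parameters on a frame-by-frame basis. All type, field, and function names and the queue depth are assumptions made for illustration; the five-imager count is taken from the embodiment described later. This is a sketch of the data model, not the disclosed implementation.

```c
/*
 * Schematic data model of the camera summarized above: an array of
 * imagers, one image flow processor (IFP) per imager, one update
 * queue per IFP, and a single multi-imager video processor (MIVP).
 * All names are hypothetical; this is a sketch, not firmware.
 */
#include <stdio.h>

#define NUM_IMAGERS 5   /* five-imager array, per the embodiment below */
#define QUEUE_DEPTH 8   /* assumed depth of each per-IFP update queue  */

/* Per-frame processing parameters that an update queue can change. */
typedef struct {
    int crop_x, crop_y, crop_w, crop_h;  /* crop window   */
    int out_w, out_h;                    /* resize target */
} ifp_params_t;

/* One update queue per IFP: parameter sets applied frame by frame. */
typedef struct {
    ifp_params_t entries[QUEUE_DEPTH];
    int head, tail, count;
} update_queue_t;

typedef struct {
    int imager_id;          /* which sensor feeds this IFP        */
    ifp_params_t current;   /* parameters used for the next frame */
    update_queue_t queue;   /* pending frame-by-frame updates     */
} ifp_t;

typedef struct {
    ifp_t ifp[NUM_IMAGERS]; /* processed streams arriving at MIVP */
} mivp_t;

/* Pop the next parameter set (if any) before each captured frame,
 * so processing parameters can change on a frame-by-frame basis.  */
static void apply_next_update(ifp_t *ifp)
{
    update_queue_t *q = &ifp->queue;
    if (q->count > 0) {
        ifp->current = q->entries[q->head];
        q->head = (q->head + 1) % QUEUE_DEPTH;
        q->count--;
    }
}

int main(void)
{
    mivp_t mivp = {0};
    for (int i = 0; i < NUM_IMAGERS; i++) {
        mivp.ifp[i].imager_id = i;
        apply_next_update(&mivp.ifp[i]);  /* no-op here: queues are empty */
    }
    printf("modeled %d imager/IFP pairs feeding one MIVP\n", NUM_IMAGERS);
    return 0;
}
```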
Within each single-imager field of view 110, a select-angle field of view 112 may be selected. In a preferred implementation, the single-imager field of view 110 is 38 degrees wide, and the select-angle field of view 112 is 36 degrees wide.
As further shown, there is a small vertical slice 114 (approximately an inch wide, for example) which is between select-angle fields of view 112 of adjacent imagers. As seen in
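For the five-imager embodiment described later, the figures above imply a panorama of roughly 180 degrees assembled from the 36-degree select-angle fields, with a 1-degree margin on each side of every 38-degree single-imager field. The short calculation below is only a sketch of that geometry; the panorama total and the per-side margin are inferences from the stated angles, not figures quoted in the text.

```c
/* Back-of-the-envelope field-of-view arithmetic for the numbers above.
 * The five-imager count is an assumption taken from the embodiment
 * described later; the rest follows from the 38/36 degree figures.   */
#include <stdio.h>

int main(void)
{
    const double single_fov_deg = 38.0; /* per-imager field of view   */
    const double select_fov_deg = 36.0; /* select-angle field of view */
    const int    num_imagers    = 5;    /* assumed array size         */

    double margin_per_side = (single_fov_deg - select_fov_deg) / 2.0;
    double panorama_deg    = select_fov_deg * num_imagers;

    printf("margin per side : %.1f degrees\n", margin_per_side); /* 1.0   */
    printf("panorama width  : %.1f degrees\n", panorama_deg);    /* 180.0 */
    return 0;
}
```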
In one embodiment, the first imager 212 may be configured as a “day” imager that is optimized for good color fidelity and sharpness, and the second imager 214 may be configured as a “night” imager that is optimized for good low-light performance. In this embodiment, the first imager may be implemented, for example, as a CMOS image sensor array with more than 1.2 million pixels and a pixel size less than 4 microns in width. The image output by the first imager 212 may have a signal-to-noise ratio of greater than 10 dB when scene lighting is greater than 200 Lux. The second imager may be implemented, for example, as a CMOS image sensor array with less than 1.2 million pixels and a pixel size greater than 3 microns in width. The image output by the second imager 214 may have a signal-to-noise ratio of greater than 10 dB when scene lighting is less than 2 Lux.
The first and second imagers 212 and 214 may be controlled by way of control signals received via a control bus 215. The control bus 215 may be implemented as an I2C bus, for example.
A first serial bus 216 may be used to communicate the output image data from the first imager 212 to a day/night multiplexer 220 on the image flow processor (IFP) 211, and a second serial bus 218 may be used to communicate the output image data from the second imager 214 to the day/night multiplexer 220 on the IFP 211. For example, the first and second serial buses may be implemented as MIPI buses.
Based on a multiplexer select (mux select) signal, the multiplexer 220 selects the image data from either the first serial bus 216 (i.e. from the first imager) or the second serial bus 218 (i.e. from the second imager). The selected image data is then received by the IFP core 222. The IFP core 222 may output image data via a serial bus (for example, an MIPI bus) 224 to a multi-imager video processor (MIVP). A connector may be used to connect the control bus 215 and the serial bus 224 to the MIVP.
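The text does not specify how the mux select signal is generated. One plausible arrangement, sketched below, derives it from measured scene illumination, switching to the night imager as lighting approaches the 2 Lux regime and back to the day imager near the 200 Lux regime noted above, with hysteresis in between. The helper functions read_scene_lux and set_mux_select are hypothetical placeholders, not part of any disclosed interface.

```c
/* Hypothetical day/night selection logic for the mux select signal.
 * Thresholds reuse the 2 Lux / 200 Lux figures quoted above; the
 * helper functions are placeholders, not a real sensor API.        */
#include <stdio.h>

enum mux_sel { SEL_DAY_IMAGER = 0, SEL_NIGHT_IMAGER = 1 };

/* Placeholder: in a real camera this might come from a light sensor
 * or from exposure statistics reported over the I2C control bus.    */
static double read_scene_lux(void) { return 150.0; }

/* Placeholder: would assert the mux select line on the IFP.         */
static void set_mux_select(enum mux_sel sel)
{
    printf("mux select -> %s imager\n",
           sel == SEL_DAY_IMAGER ? "day" : "night");
}

/* Hysteresis keeps the camera from flickering between imagers when
 * the scene sits between the two lighting regimes.                  */
static enum mux_sel update_day_night(enum mux_sel current, double lux)
{
    if (current == SEL_DAY_IMAGER && lux < 2.0)
        return SEL_NIGHT_IMAGER;
    if (current == SEL_NIGHT_IMAGER && lux > 200.0)
        return SEL_DAY_IMAGER;
    return current;
}

int main(void)
{
    enum mux_sel sel = SEL_DAY_IMAGER;
    sel = update_day_night(sel, read_scene_lux());
    set_mux_select(sel);
    return 0;
}
```

The hysteresis band here is wide only because the two lighting regimes quoted in the text are far apart; an actual design would choose switching thresholds to suit the sensors used.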
As shown, the video camera is configured such that the AEC (automatic exposure control) and AWB (automatic white balance) are turned on at the microcontroller (μcontroller) for only one of the five imagers, and turned off at the microcontroller for the other four imagers. In a preferred embodiment, the AEC and AWB are turned on at the microcontroller only for the middle imager of the array of imagers (for example, the imager being used, either day 122 or night 124, of face 120-2 in
As shown, the sensor array of each imager performs cropping from the source rectangular frame (CTG SOURCE-RECT) to the primary rectangular frame (CTG PRIMARY-RECT). In addition, circuitry may be included in the sensor IC for integration time control and analog gain control. The integration time control is used to provide exposure control, while the analog gain control is used to provide white balance control.
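A plausible reading of the single-imager AEC/AWB arrangement, not spelled out in the excerpt above, is that the integration time and analog gains settled on by the middle imager are then written to the other four imagers over the control bus so that exposure and white balance remain uniform across the panorama. The C sketch below illustrates that assumed propagation step; every function, structure, and value in it is a hypothetical placeholder.

```c
/* Sketch of the single-reference AEC/AWB arrangement described above.
 * Assumption (not stated explicitly in the text): the integration time
 * and analog gains computed on the middle imager are copied to the
 * other four imagers over the control bus so the panorama stays
 * uniform. All functions here are hypothetical placeholders.         */
#include <stdio.h>

#define NUM_IMAGERS 5
#define REFERENCE_IMAGER 2   /* middle imager of the five-imager array */

typedef struct {
    unsigned integration_lines;      /* exposure (integration time)     */
    unsigned gain_r, gain_g, gain_b; /* per-channel analog gains        */
} exposure_settings_t;

/* Placeholder for enabling/disabling the microcontroller's automatic
 * exposure and white-balance loops on a given imager.                 */
static void set_auto_algorithms(int imager, int enabled)
{
    printf("imager %d: AEC/AWB %s\n", imager, enabled ? "on" : "off");
}

/* Placeholders for register access over the I2C control bus.          */
static exposure_settings_t read_settings(int imager)
{
    (void)imager;
    exposure_settings_t s = { 450u, 128u, 100u, 140u }; /* dummy values */
    return s;
}

static void write_settings(int imager, exposure_settings_t s)
{
    printf("imager %d: integration=%u gains=(%u,%u,%u)\n",
           imager, s.integration_lines, s.gain_r, s.gain_g, s.gain_b);
}

int main(void)
{
    /* AEC/AWB runs only on the reference (middle) imager. */
    for (int i = 0; i < NUM_IMAGERS; i++)
        set_auto_algorithms(i, i == REFERENCE_IMAGER);

    /* Assumed propagation step: copy the reference imager's settings
     * to the other imagers via their integration-time and analog-gain
     * control registers.                                               */
    exposure_settings_t ref = read_settings(REFERENCE_IMAGER);
    for (int i = 0; i < NUM_IMAGERS; i++)
        if (i != REFERENCE_IMAGER)
            write_settings(i, ref);
    return 0;
}
```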
As further shown, the IFP for each imager may include resizing and cropping circuitry. The IFP may be incorporated onto the same integrated circuit as the sensor array, or the IFP may be implemented on a separate integrated circuit. In one embodiment, the IFP first resizes the image in the primary rectangular frame, and then performs a “horizontal” cropping of the resized image. In an alternative embodiment, the horizontal cropping may be performed in the sensor by appropriate adjustment of the primary rectangular frame so as to implement the desired horizontal crop. An example of a resizing of an image is shown in
The MIVP receives the horizontally-cropped, resized images from the multiple imagers and combines them to obtain a combined image. The MIVP may then perform a "vertical" cropping of the combined image to create a final image frame. The vertical cropping reduces a number of pixels in a vertical dimension of the image. An example of a vertical cropping of an image is shown in
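The staged geometry just described (sensor crop from the source rectangle to the primary rectangle, IFP resize and horizontal crop, then MIVP combination and vertical crop) can be summarized as a chain of frame-dimension transformations. The sketch below tracks only the width and height at each stage; all of the concrete pixel dimensions are illustrative placeholders rather than values from the text.

```c
/* Dimension-only sketch of the crop/resize pipeline described above:
 * sensor crop (source rect -> primary rect), IFP resize + horizontal
 * crop, MIVP side-by-side combination, then MIVP vertical crop.
 * All concrete sizes below are illustrative placeholders.            */
#include <stdio.h>

#define NUM_IMAGERS 5

typedef struct { int w, h; } frame_dim_t;

static frame_dim_t sensor_crop(frame_dim_t src, int w, int h)
{
    (void)src;                        /* primary rect is carved from src */
    frame_dim_t d = { w, h };
    return d;
}

static frame_dim_t ifp_resize(frame_dim_t in, int w, int h)
{
    (void)in;                         /* scale to the requested size     */
    frame_dim_t d = { w, h };
    return d;
}

static frame_dim_t ifp_horizontal_crop(frame_dim_t in, int new_w)
{
    frame_dim_t d = { new_w, in.h };  /* trims pixels in the horizontal  */
    return d;                         /* dimension only                  */
}

int main(void)
{
    /* Per-imager path (illustrative sizes). */
    frame_dim_t source  = { 1280, 960 };
    frame_dim_t primary = sensor_crop(source, 1216, 912);
    frame_dim_t resized = ifp_resize(primary, 400, 300);
    frame_dim_t hcrop   = ifp_horizontal_crop(resized, 384);

    /* MIVP combines the horizontally-cropped images side by side, then
     * vertically crops the combined image (height reduced).            */
    frame_dim_t combined = { hcrop.w * NUM_IMAGERS, hcrop.h };
    frame_dim_t final    = { combined.w, 270 };

    printf("per-imager: %dx%d -> %dx%d -> %dx%d -> %dx%d\n",
           source.w, source.h, primary.w, primary.h,
           resized.w, resized.h, hcrop.w, hcrop.h);
    printf("combined  : %dx%d, final: %dx%d\n",
           combined.w, combined.h, final.w, final.h);
    return 0;
}
```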
The above-described capability to switch views (i.e. to re-size and crop) on a frame-by-frame basis may be advantageously applied to enable the camera to simultaneously provide different views to multiple users.
The MIVP may be configured to encrypt and compress the final image frames to generate encoded image frames. The network processor 1102 may be configured to receive the encoded image frames generated by the MIVP and to generate data packets therefrom which are addressed to a plurality of user computers. The data packets addressed to a particular user computer contain the encoded image frames which are customized for that user computer.
For example, consider that the above-discussed camera is networked to a network which includes three user computers: User A; User B; and User C. Further consider that each user computer may select a different view for display by way of a user interface on the user computer. For example, the user interface may allow the user to crop and re-size the image to be viewed. In this particular example, consider that the camera obtains image data at 60 frames per second, and that the priorities among the users are such that User A is to receive video at 30 frames per second, User B is to receive video at 10 frames per second, and User C is to receive video at 20 frames per second. The Update Queues may then be configured such that successive frames are captured using the following repeating sequence of parameters:
ACABACACABACACABACACABAC . . .
For frames taken with parameters A, the network processor may packetize those frames with the network address of User A and transmit those packets to User A via the network. For frames taken with parameters B, the network processor may packetize those frames with the network address of User B and transmit those packets to User B via the network. For frames taken with parameters C, the network processor may packetize those frames with the network address of User C and transmit those packets to User C via the network.
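As a worked check of this example, the sketch below takes the repeating six-frame cycle of parameter sets from above, confirms that it yields 30, 10, and 20 frames per second for Users A, B, and C at the 60 frame-per-second capture rate, and shows a per-frame dispatch of packets to the corresponding user address. The dispatch routine and the addresses are illustrative only; the text does not describe how the update queues are programmed or how addresses are assigned.

```c
/* Worked check of the multi-user frame schedule above. The repeating
 * 6-frame cycle and the mapping from parameter set to user are taken
 * from the example in the text; the dispatch routine and the network
 * addresses are illustrative placeholders.                           */
#include <stdio.h>
#include <string.h>

#define CAPTURE_FPS 60

int main(void)
{
    const char *cycle = "ACABAC";          /* repeating parameter sets */
    const int cycle_len = (int)strlen(cycle);

    /* Count how many frames per cycle each user receives and convert
     * to frames per second at the 60 fps capture rate.                */
    const char users[] = { 'A', 'B', 'C' };
    for (int u = 0; u < 3; u++) {
        int count = 0;
        for (int i = 0; i < cycle_len; i++)
            if (cycle[i] == users[u])
                count++;
        printf("User %c: %d fps\n", users[u],
               count * CAPTURE_FPS / cycle_len);  /* A:30, B:10, C:20 */
    }

    /* Per-frame dispatch: frames taken with a user's parameter set are
     * packetized with that user's network address (addresses here are
     * placeholders).                                                   */
    for (int frame = 0; frame < cycle_len; frame++) {
        const char *addr =
            cycle[frame] == 'A' ? "10.0.0.101" :
            cycle[frame] == 'B' ? "10.0.0.102" : "10.0.0.103";
        printf("frame %d: parameters %c -> packets addressed to %s\n",
               frame, cycle[frame], addr);
    }
    return 0;
}
```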
In the above description, numerous specific details are given to provide a thorough understanding of embodiments of the invention. However, the above description of illustrated embodiments of the invention is not intended to be exhaustive or to limit the invention to the precise forms disclosed. One skilled in the relevant art will recognize that the invention can be practiced without one or more of the specific details, or with other methods, components, etc. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring aspects of the invention. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification and the claims. Rather, the scope of the invention is to be determined by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
The present application claims the benefit of U.S. Provisional Application No. 61/244,869, filed Sep. 22, 2009, the disclosure of which is hereby incorporated by reference.