A. Field of the Invention
This invention relates to recording video images at a remote location and placing them into an easily readable form at a remote facility. The method and apparatus provide adequate information about the images from both a temporal and a spatial perspective.
B. Prior Art
This particular device records video images at remote locations and places them in an easily readable form at different remote facilities. The system employs cameras, which can be mounted at a variety of different places. There are other systems that use video capture devices to record scenes. The cameras themselves may have infrared or low-level-light capability. Representative examples of this type of device in the prior art include Vernon, U.S. Pat. No. 7,016,518, and Ciolli, U.S. Pat. No. 6,546,119.
The Vernon device provides an infrared illuminator in a camera for capturing an image of a vehicle license plate. It allows the operator to determine whether or not to admit a vehicle into a secured area. The Ciolli device is an automated traffic violation monitoring and reporting system. It uses a camera system coupled to a data processing system for compiling vehicle and scene images produced by the camera system, which is important for law enforcement agencies.
Another device, which measures the speed of a vehicle, is Kupersmit, U.S. Pat. No. 5,734,337. Again, this device uses a camera and specifically detects motion.
Other examples of video surveillance in the prior art include Gates, U.S. Pat. No. 5,073,819, and Bishop, U.S. Pat. No. 4,589,140.
None of the prior art references incorporate all the features of this particular method and apparatus.
C. Summary of the Invention

This particular method and apparatus provide a temporal data interpretation mechanism that accomplishes three separate objectives.
First, the method provides the ability to conduct a search of objects and events in a scene. Second, it provides the ability to query a particular object geographically. Third, it provides a compact, easily readable presentation. The images may be captured and stored at a remote location and then viewed by individuals at a plurality of different, geographically distant locations.
Although the railroad industry is highlighted in this application, the method and apparatus may be used in a variety of other applications. In fact, any setting that involves moving objects would be suitable. Some representative venues for this type of application include highways, airports, parking garages and large shopping centers.
There are several different components to this method and apparatus. One component captures and integrates the temporal information. Another component allows a particular event to be viewed apart from, or dissected from, other events. A third is the object and event search, which is important if the time of an event is known. Another feature is the ability to read text from objects by incorporating optical character recognition software; this feature is valuable if the numbering on a specific item, such as a railroad car or container, is known.
All captured images in the method and apparatus may be stored for security, law enforcement, or forensic purposes. A search of the stored information in the database can direct the operator to a certain time and space for improved security operations.
D. Detailed Description

This is a method 7 by which events can be recorded and stored in a database 10 for later use. The events are stored in terms of both time and space. This technology will be particularly welcome in the area of train travel, but it may be used in other applications as well.
For purposes of illustration, the example of train cars 6 will be used. The train 5 passes through a portal 1, which is a structure on which a plurality of cameras 2 is mounted. These cameras 2 capture video images of the train 5 as it passes through the portal; the cameras are light sensitive 2A and may also have infrared capability 2B. The cameras may also produce a color image 2C.
Incorporated into the method and apparatus is a means by which optical flow is measured, such as the Lucas-Kanade optical flow method. This technique determines only the pixel flow through the scene; other values, including the speed of the train, are then calculated from an analysis of that pixel flow information.
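The application names Lucas-Kanade optical flow but discloses no implementation; the following is a minimal sketch of how train speed might be derived from pixel flow using OpenCV's pyramidal Lucas-Kanade tracker, where `fps` and `meters_per_pixel` are assumed calibration inputs rather than values from the application:

```python
import cv2
import numpy as np

def estimate_train_speed(prev_frame, next_frame, fps, meters_per_pixel):
    """Estimate speed from the dominant horizontal pixel flow between two frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)

    # Track corner features from one frame to the next (pyramidal Lucas-Kanade).
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.3, minDistance=7)
    if p0 is None:
        return None
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None)

    good_old = p0[status == 1]
    good_new = p1[status == 1]

    # The median horizontal displacement is robust to a few bad tracks.
    dx = np.median(good_new[:, 0] - good_old[:, 0])

    # pixels/frame -> meters/second, via the assumed camera calibration scale.
    return abs(dx) * fps * meters_per_pixel
```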
As the train 5 passes through the portal 1, the pixel flow is measured and a video image of each train car 6 is captured. The time of image capture is also recorded. The rate of capture is dependent on the speed of the train as it passes through the portal 1 and on the camera speed in terms of frames per second. These images are then stored in a database 10.
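To illustrate the relationship between train speed, frame rate, and capture rate with hypothetical figures (none of these values come from the application): at 30 frames per second, a train moving 10 m/s advances one third of a meter per frame, so a 20 m rail car would appear in roughly 60 consecutive frames.

```python
def frames_per_car(train_speed_mps, camera_fps, car_length_m=20.0):
    """Number of frames in which a car of the given length appears.

    Illustrative values only; car_length_m is a hypothetical figure,
    not one taken from the application.
    """
    meters_per_frame = train_speed_mps / camera_fps
    return car_length_m / meters_per_frame

# Example: a train at 10 m/s passing a 30 fps camera.
print(frames_per_car(10.0, 30.0))  # -> 60.0 frames per 20 m car
```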
The cameras 2, which are mounted on the portal trusses, are designed to operate in periods of low lighting 2A and will probably have infrared 2B capability as well. The camera 2 is likely to be a color 2C camera. The cameras 2 capture images of the rail cars 6, including the number of containers on the rail cars and the general condition of any particular container or car. All the video images are then fed to a remote location 20 and stored in a database 10.
Software is integrated into this method that processes the pixel flow information and produces a linear panorama 12 of the train, so that an operator at a remote facility can view the image of the entire train in an easy-to-read format. For example, it is much more difficult to interpret individual frames of a moving object; if those individual frames are combined into a linear panorama 12, the view is improved and the interpretation of the image is much better and quicker. The length of the panorama that is presented can be extended or shortened at the request of the operator, if and when desired.
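The application does not disclose the panorama algorithm; one plausible sketch is a push-broom composition in which a vertical strip, as wide as the measured per-frame pixel displacement, is cut from the center of each frame and the strips are concatenated:

```python
import numpy as np

def build_linear_panorama(frames, displacements_px):
    """Concatenate center strips whose widths match the measured pixel flow.

    frames           : list of HxWx3 arrays from the portal camera
    displacements_px : per-frame horizontal pixel displacement from optical flow
    """
    strips = []
    for frame, dx in zip(frames, displacements_px):
        width = max(1, int(round(abs(dx))))
        center = frame.shape[1] // 2
        left = center - width // 2
        strips.append(frame[:, left:left + width])
    return np.hstack(strips)
```

Because the strip width tracks the measured pixel flow, the resulting panorama stays roughly uniform in scale even as the train speeds up or slows down through the portal.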
The method also allows the database to store individual images 14 of individual rail cars and to assign a specific time to the capture of each specific video image. These images can also be separated from each other so that the operator can view a single car, if desired. This may be important in terms of the security of a particular rail car or of the train in general.
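A minimal sketch of how the database 10 might associate each captured car image with a capture time follows; the table and column names, and the example query values, are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect("portal_captures.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS car_images (
        id          INTEGER PRIMARY KEY,
        train_id    TEXT,
        car_index   INTEGER,        -- position of the car within the train
        captured_at TEXT NOT NULL,  -- ISO-8601 capture timestamp
        image_path  TEXT NOT NULL,  -- location of the stored video image
        ocr_text    TEXT            -- markings read from the car, if any
    )
""")
conn.commit()

# An operator can then pull a single car's image by train and position:
row = conn.execute(
    "SELECT image_path, captured_at FROM car_images "
    "WHERE train_id = ? AND car_index = ?", ("T-1024", 7),
).fetchone()
```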
The method allows the operator to search for a specific car with the aid of optical character recognition software 15. The optical character recognition software 15 allows a search of text, such as the number of a car or wording that may appear on a particular container or rail car. This enhances the ability of the method and apparatus to search for very specific items that are known or designated by the operator.
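The application does not name a particular OCR package; a sketch of such a text search using the open-source Tesseract engine (via pytesseract, an assumed choice) might look like this:

```python
import cv2
import pytesseract

def read_car_markings(car_image):
    """Read reporting marks and numbers painted on a rail car or container."""
    gray = cv2.cvtColor(car_image, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding helps separate painted characters from the car body.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # --psm 6: assume a single uniform block of text.
    return pytesseract.image_to_string(binary, config="--psm 6")

def find_car(images, query):
    """Return indices of captured images whose markings contain the query."""
    return [i for i, img in enumerate(images) if query in read_car_markings(img)]
```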
Although the example of train travel has been highlighted in this application, this method may be used in any application where moving objects are involved, such as port facilities, airports, warehouses or parking garages, to name just a few examples. The method may also be a useful law enforcement tool for searching an area for specific items or events.
Certain background analysis 16 is performed on the captured image, along with analysis of certain environmental parameters 17 that are built into the system. Additionally, the real car height and width 18 are estimated from the video capture. From that information a linear panorama of the moving object, in this case the train, can be provided. This provides real-time analysis of possible problems.
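A sketch of how such background analysis and size estimation might be carried out, using OpenCV's MOG2 background subtractor (an assumed technique, not one disclosed in the application) and an assumed meters-per-pixel calibration scale:

```python
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)

def estimate_car_size(frame, meters_per_pixel):
    """Estimate real-world width and height of the moving object in a frame."""
    mask = subtractor.apply(frame)  # foreground = moving train
    # Drop shadow pixels (marked as 127 by MOG2) and speckle noise.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return w * meters_per_pixel, h * meters_per_pixel  # (width_m, height_m)
```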
Certain information about the cars and the train itself may be stored in the database 10 so that, if certain rated parameters are violated, an individual at the remote facility will be alerted and corrective action may be taken.
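A minimal sketch of such a rated-parameter check; the limit values shown are hypothetical placeholders, not figures from the application:

```python
def check_rated_parameters(width_m, height_m, max_width_m=3.3, max_height_m=6.2):
    """Return violation messages; an empty list means the car is within limits.

    The default limits are hypothetical illustration values only.
    """
    violations = []
    if width_m > max_width_m:
        violations.append(f"width {width_m:.2f} m exceeds rated {max_width_m} m")
    if height_m > max_height_m:
        violations.append(f"height {height_m:.2f} m exceeds rated {max_height_m} m")
    return violations
```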
Number | Name | Date | Kind |
---|---|---|---|
4589140 | Bishop | May 1986 | A |
4847772 | Michalopoulos et al. | Jul 1989 | A |
5073819 | Gates | Dec 1991 | A |
5392034 | Kuwagaki | Feb 1995 | A |
5583765 | Kleehammer | Dec 1996 | A |
5734337 | Kupersmit | Mar 1998 | A |
5809161 | Auty et al. | Sep 1998 | A |
5938717 | Dunne et al. | Aug 1999 | A |
6067367 | Nakajima et al. | May 2000 | A |
6198987 | Park et al. | Mar 2001 | B1 |
6404902 | Takano et al. | Jun 2002 | B1 |
6496598 | Harman | Dec 2002 | B1 |
6538579 | Yoshikawa et al. | Mar 2003 | B1 |
6546119 | Ciolli et al. | Apr 2003 | B2 |
6628804 | Edanami | Sep 2003 | B1 |
6681195 | Poland et al. | Jan 2004 | B1 |
6696978 | Trajkovic et al. | Feb 2004 | B2 |
6734896 | Nobori et al. | May 2004 | B2 |
6766038 | Sakuma et al. | Jul 2004 | B1 |
6985827 | Williams et al. | Jan 2006 | B2 |
6996255 | Sakuma et al. | Feb 2006 | B2 |
7016518 | Vernon | Mar 2006 | B2 |
7027616 | Ishii et al. | Apr 2006 | B2 |
7439847 | Pederson | Oct 2008 | B2 |
20020034316 | Ishii et al. | Mar 2002 | A1 |
20020047901 | Nobori et al. | Apr 2002 | A1 |
20020094110 | Okada et al. | Jul 2002 | A1 |
20020141618 | Ciolli et al. | Oct 2002 | A1 |
20030021490 | Okamoto et al. | Jan 2003 | A1 |
20040218786 | Murakoshi et al. | Nov 2004 | A1 |
20050008194 | Sakuma et al. | Jan 2005 | A1 |
20070122058 | Kitaura et al. | May 2007 | A1 |
20080013790 | Ihara et al. | Jan 2008 | A1 |
20080211914 | Herrera et al. | Sep 2008 | A1 |