The present application claims priority from Japanese patent applications JP2003-163918, filed on Jun. 9, 2003, and JP2003-338676, filed on Sep. 29, 2003, the contents of which are hereby incorporated by reference herein.
This application relates to the subject matters of the following U.S. patent applications.
U.S. patent application Ser. No. 10/820,031 (Applicants' Ref.: US11476 (W1405-01EJ)), assigned to the same assignee as the present invention, filed on Apr. 8, 2004 in the names of Tsuyoshi Kawabe, Hirotada Ueda and Kazuhito Yaegashi, and entitled “Video Distribution Method and Video Distribution System”, the disclosure of which is hereby incorporated by reference herein.
U.S. patent application Ser. No. 10/840,366 (Applicants' Ref.: US11480 (W1503-10EJ)), assigned to the same assignee as the present invention, filed on May 7, 2004 in the names of Tsuyoshi Kawabe and Hirotada Ueda, and entitled “Change Detecting Method and Apparatus”, the disclosure of which is hereby incorporated by reference herein.
The present invention relates to a change detecting technology for detecting and notifying the occurrence of an image change, and more particularly to a change detecting method and apparatus, and a monitoring system using the method or apparatus, that transmit monitor information, generated upon detection of an image change in a monitoring system, to personal computers (PCs) or portable or mobile terminals connected via a network.
In recent years, video accumulation and video distribution technologies based on the network technology of the Internet or a LAN have been developed for monitoring intruders with a monitor camera. Techniques have also been developed for accumulating images as digital data in a storage unit such as a hard disk or a DVD (Digital Versatile Disk).
A technology for detecting a change in an image captured by a monitor camera using image recognition technology and for sending information on the change as monitor information to a PC or a portable terminal connected to a network is disclosed, for example, in Japanese Patent Application No. JP2002-347202. Further, a monitor information transmission technology for specifying monitor schedules and monitor regions by providing a table for holding parameters for the times and regions is disclosed in U.S. patent application Ser. No. 10/840,366 (Applicant's Ref.: US11480 (W1503-01EJ)) and its corresponding Korean patent application No. 2004099144A (Applicant's Ref.: KR61199 (W1503-02EJ)) (claiming priority from JP-A-2003-139179).
An image recognition technology is also known that traces the moving direction of a moving object by detecting an image change, calculating the size of the moving object and its center of gravity, and continuously processing them for a plurality of frames. For example, see U.S. Pat. No. 6,445,409.
According to conventional monitor information transmission technologies, the notification destination to which the monitor information detected at the time of an abnormality is to be sent is predetermined, and there is no means for automatically changing the notification destination based on the detected monitor information. To change the notification destination, the notification destination used by the monitor information transmission technologies must be rewritten manually.
However, there is a need for transmitting monitor information, acquired from images captured by one monitor camera, to different notification destinations according to the location where a moving object is detected, the size of the moving object, or a combination thereof.
It is an object of the present invention to provide a change detecting method and apparatus capable of transmitting monitor information to different notification destinations according to the location of a moving object, the size of a moving object, and so on.
It is another object of the present invention to provide a monitoring system having the above-described change detecting apparatus as a notification apparatus.
According to one aspect of the present invention, there is provided a change detecting apparatus comprising an input unit that receives a monitor image picked up by a pickup unit; a region specification unit that specifies N regions (N is a positive integer equal to or larger than 2) in the monitor image; a notification destination specification unit that specifies notification destinations of image changes in the monitor image in advance according to characteristics or features of the image changes; a change detection unit that detects an image change in each of the N regions; a characteristics extraction unit that extracts at least one characteristic or feature of the image changes from the change detection unit; a monitor information generation unit that generates monitor information related to each of the detected image changes; and a transmission unit that transmits the monitor information, wherein the transmission unit transmits the monitor information to a predetermined notification destination, which is set in the notification destination specification unit, based on the detected characteristic or feature of the image change.
In one embodiment, the characteristic or feature extracted by the characteristics extraction unit includes identification information on a region in which an image change was detected, and monitor information related to the image change is transmitted to a predetermined notification destination based on the identification information.
In one embodiment, the characteristic extracted by the characteristics extraction unit further includes size information on the region in which the image change was detected, in addition to the identification information on the region, and monitor information related to the image change is transmitted to a predetermined notification destination based on the identification information and the size information on the region.
In one embodiment, the characteristic or feature extracted by the characteristics extraction unit includes a moving direction of the image change in the monitor image, and monitor information related to the image change is transmitted to a predetermined notification destination based on the moving direction of the image change.
Although the term “monitor information” is used in this specification, terms such as “alarm information” and “detection information” are equivalent, and those terms are encompassed by the present invention. The “monitor information” is a generic term for information transmitted from a notification apparatus to other apparatuses and is not limited by the meaning of “monitor”. The notification apparatus in this specification refers to a change detecting apparatus having a function to transmit information such as monitor information.
Although the term “monitoring system” is used in this specification, terms such as “notification system” and “object detecting system” are equivalent, and those terms are encompassed by the present invention.
Other objects, features, and advantages of the present invention will be made more apparent by the description of the embodiments of the present invention given below, taken in conjunction with the accompanying drawings.
Some embodiments of the present invention will be described with reference to the drawings. In the drawings, the same reference numerals denote the same structural elements.
First, referring to
In
The numeral 1002 indicates a transmission path of video signals such as a LAN (Local Area Network), and the numerals 1003-1, 1003-2, . . . , 1003-n (n=1, 2, . . . ) indicate Web encoders. The numeral 1004 indicates an image accumulation unit having the function of accumulating the images from monitor cameras.
The numerals 1005-1, 1005-2, . . . , 1005-m (m=1, 2, . . . ) indicate browser PCs having the function of managing the whole monitoring system. The numeral 1006 indicates a hub, the numeral 1007 indicates a notification apparatus, the numeral 1008 indicates a modem, the numeral 1009 indicates a transmission path implemented by a public line, the numeral 1010 indicates a WAN (Wide Area Network) such as the Internet, the numeral 1011 indicates a mobile phone service provider's exchange system, the numerals 1012-1, 1012-2, . . . , 1012-l (l=1, 2, . . . ) indicate portable terminals, and the numerals 1013-1, 1013-2, . . . , 1013-p (p=1, 2, . . . ) indicate client PCs.
The configuration may have only one monitor camera and one Web encoder, or a plurality of monitor cameras may be connected to one Web encoder. It is also possible to use a unit in which the functions of the monitor camera, the Web encoder, the image accumulation unit, and the notification apparatus are integrated. The system described with reference to
The monitor camera 1001, Web encoder 1003, image accumulation unit 1004, hub 1006, notification apparatus 1007, modem 1008, and client PC 1013 are interconnected via the transmission path 1002 such as a LAN. The mobile phone service provider's exchange system 1011 is connected to the modem 1008 via the transmission path 1009 and the network 1010. The mobile phone service provider's exchange system 1011 is connected wirelessly to the portable terminal 1012.
The numeral 1104 indicates a storage unit. The storage unit 1104, used as the storage unit of the image accumulation unit 1004 to record the images captured by the monitor camera 1001, uses a large-capacity recording medium, for example, a VTR. Random access recording media, such as a magnetic disk (HD: hard disk) and a DVD (Digital Versatile Disc), are also preferable. The numeral 1105 indicates an input interface, the numeral 1108 indicates an input device such as a keyboard, the numeral 1109 indicates a pointing device such as a mouse, the numeral 1106 indicates a video interface, the numeral 1107 indicates a monitor, and the numeral 1110 indicates a bus.
All the devices from the CPU 1101 to the video interface 1106 are interconnected via the bus 1110. The monitor 1107 is connected to the bus 1110 via the video interface 1106. The input device 1108 and the pointing device 1109 are connected to the bus 1110 via the input interface 1105. Also, the network interface 1103 is connected to the LAN transmission path 1002. In addition, the network interface 1103 may be connected to the transmission path 1009 of a public line as necessary. When the configuration in
Assume that the monitor camera 1001 is installed at a predetermined monitor position. This monitor camera constantly picks up images, and the images thus picked up are accumulated in the image accumulation unit 1004 via the LAN transmission path 1002, Web encoder 1003, and hub 1006.
The notification apparatus 1007 has the function of retrieving images from the image accumulation unit 1004 and comparing a previously retrieved image with the most recently retrieved image to detect an image change. The notification apparatus 1007 thus has the function of detecting and accumulating an abnormality through the so-called image recognition technique. Detecting an abnormality through image recognition, for example by detecting a change in the brightness components of the preceding and following frames or by comparing the video signal spectra, is a well-known technique and, therefore, is not explained in detail.
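By way of a non-limiting illustration only, the following is a minimal Python sketch of the kind of brightness-difference comparison mentioned above. It assumes that consecutive frames are available as grayscale NumPy arrays of the same size; the function name and threshold values are illustrative and are not part of the disclosed apparatus.

```python
import numpy as np

def detect_change(prev_frame: np.ndarray, curr_frame: np.ndarray,
                  pixel_threshold: int = 25, min_changed_pixels: int = 50):
    """Compare the brightness of two consecutive frames and return a change mask.

    prev_frame, curr_frame: 2-D uint8 arrays (grayscale monitor images).
    pixel_threshold: minimum per-pixel brightness difference to count as changed.
    min_changed_pixels: minimum number of changed pixels to report a change.
    """
    # Absolute brightness difference between the preceding and following frames.
    diff = np.abs(prev_frame.astype(np.int16) - curr_frame.astype(np.int16))
    change_mask = diff > pixel_threshold
    changed = int(change_mask.sum()) >= min_changed_pixels
    return changed, change_mask
```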
If an image change is found as a result of comparison and an abnormality is detected, the notification apparatus 1007 determines that there is an abnormality and stores therein the image in which the abnormality is detected and the date/time at which the abnormality is detected, as well as a required message. At the same time, the notification apparatus 1007 selects the notification destination according to the contents of the change and delivers the information to the portable terminal 1012 and the client PC 1013, which are the notification destinations, as the monitor information. That is, the monitor information is distributed from the notification apparatus 1007, via the hub 1006, modem 1008, and network 1010, to the portable terminal 1012 via the mobile phone service provider's exchange system 1011 or to the client PC 1013 via the modem 1008. The message described above may be, for example, “Abnormality occurred: month/day/hour/minute/second”.
Next, the notification apparatus (change detecting apparatus) 1007 will be described with reference to
Referring to
An image receiving unit 1202 retrieves an image, captured by the monitor camera, from the image accumulation unit 1004 and outputs the retrieved image to a detection processing unit 1203.
The detection processing unit 1203 consists of a plurality of image recognition processing units, that is, image recognition processing unit 1203-1, image recognition processing unit 1203-2, . . . , image recognition processing unit 1203-q (q=1, 2, . . . ), and those image recognition processing units read the region tables stored in the memory 1201-1, memory 1201-2, . . . , and memory 1201-q, respectively.
The detection processing unit 1203 performs image processing on an image received from the image receiving unit 1202, based on the monitor regions defined by the region tables, to detect an intruding object. That is, the detection processing unit 1203 performs known image recognition processing, for detecting an image change, only on those regions of the image picked up by the monitor camera that are defined by the region tables stored in the memory unit 1201. When an image change is detected, the changed part of the image, produced as the result of detection by the detection processing unit 1203, is output to a characteristics extraction unit (i.e., feature extraction unit) 1204. In the description below, the detected changed part of an image is treated as a monitor target.
Based on the detection result received from the detection processing unit 1203, the characteristics extraction unit 1204 detects the characteristics or features of the detected target and outputs them to a conversion unit 1205 as characteristic or feature information on the target. The characteristics or features include the size of the target, the shape of the target, the color of the target, the moving speed of the target, the direction in which the target moves, the region in which the target is detected, and so on. The characteristics extraction unit 1204 identifies whether the detected target is a person or a car and, if the target is a car, the color of that car. It is easily understood that other distinctions can also be made as necessary. The characteristics or feature information may also include the time of day and year/month/day at which the detection processing unit 1203 detected the image change, the monitor camera number, and so on. It is to be understood that the characteristics or features need not always include all of the items described above; only the required number of required items need be included according to the monitor target and the monitor purpose. For the detection of the characteristics or features of a target, technology known in the art, for example, the technology disclosed in U.S. Pat. No. 6,445,409, can be used.
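As a purely illustrative sketch of the kind of characteristic or feature information described above, the following Python code derives the size, center of gravity, and bounding box of a detected target from a binary change mask such as the one produced by the previous sketch. The record layout and field names are assumptions made for illustration only.

```python
import numpy as np
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class TargetFeatures:
    region_id: int         # monitor region in which the change was detected
    size: int              # number of changed pixels
    centroid: tuple        # (row, col) center of gravity of the change
    bbox: tuple            # (top, left, bottom, right) bounding box
    detected_at: datetime  # time of detection
    camera_no: int         # monitor camera number

def extract_features(change_mask: np.ndarray, region_id: int,
                     camera_no: int) -> Optional[TargetFeatures]:
    """Summarize a binary change mask into target characteristics."""
    rows, cols = np.nonzero(change_mask)
    if rows.size == 0:
        return None
    return TargetFeatures(
        region_id=region_id,
        size=int(rows.size),
        centroid=(float(rows.mean()), float(cols.mean())),
        bbox=(int(rows.min()), int(cols.min()), int(rows.max()), int(cols.max())),
        detected_at=datetime.now(),
        camera_no=camera_no,
    )
```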
The conversion unit 1205, which consists of a monitor information generation unit 1206, a notification destination determination unit 1207, a transmission unit 1209, and a notification destination table A1208, generates monitor information, determines the notification destination, and transmits the monitor information based on the output result of the characteristics extraction unit 1204.
The monitor information generation unit 1206 generates monitor information based on the characteristics or feature information on the target received from the characteristics extraction unit 1204.
The notification destination determination unit 1207 searches the notification destination table A1208, based on the characteristics or feature information on the target received from the characteristics extraction unit 1204, to acquire the notification destination to which the monitor information is to be transmitted. The notification destination table A1208 contains information identifying where the monitor information is to be transmitted, such as the mail address of the notification destination. The notification destination table A1208 also contains conditions for determining the notification destination. That is, the table contains in advance the notification destinations to be selected according to the monitor region in which the target is detected, the size of the target, and so on.
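A minimal sketch of the kind of lookup that could be performed against such a notification destination table is given below; the rule fields, size thresholds, and mail addresses are assumptions made for illustration and do not reflect the actual contents of the table A1208.

```python
from dataclasses import dataclass

@dataclass
class DestinationRule:
    region_id: int    # monitor region in which the target was detected
    min_size: int     # lower bound on target size (inclusive)
    max_size: float   # upper bound on target size (exclusive)
    destination: str  # mail address or telephone number of the notification destination

# Example contents of a notification destination table (illustrative only).
DESTINATION_TABLE = [
    DestinationRule(region_id=1, min_size=6,  max_size=12,            destination="frontdesk@example.com"),
    DestinationRule(region_id=1, min_size=12, max_size=float("inf"),  destination="guardroom@example.com"),
    DestinationRule(region_id=2, min_size=6,  max_size=float("inf"),  destination="guardroom@example.com"),
]

def lookup_destination(region_id: int, size: int, table=DESTINATION_TABLE):
    """Return the notification destination matching the detected characteristics, or None."""
    for rule in table:
        if rule.region_id == region_id and rule.min_size <= size < rule.max_size:
            return rule.destination
    return None
```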
The transmission unit 1209 transmits the monitor information, generated as described above, to the notification destination determined by the notification destination determination unit 1207.
It is also possible to store the detection result of the detection processing unit 1203, the transmitted monitor information, and so on in the memory in the notification apparatus as a log.
The function of the notification apparatus 1007 shown in
An example of the operation of the monitoring system shown in
In step 201, the monitor operation of the monitoring system is started. The Web encoder 1003 digitally compresses a monitor image from the predetermined monitor camera 1001 to generate image compression data. This image compression data is accumulated in the image accumulation unit 1004 via the hub 1006.
The image compression data stored in the image accumulation unit 1004 is a digitally compressed image stored with information such as the pickup date/time, the channel number of the monitor camera 1001, and the compression format. Which monitor camera's image is to be captured is determined in various ways; for example, capture is scheduled in advance through the management of the browser PC 1005 or is selected based on abnormality detection information.
In step 202, the image receiving unit 1202 of the notification apparatus 1007 acquires one frame of image from the image accumulation unit 1004. In this step, all images input from the monitor camera 1001 to the image accumulation unit 1004 are read in order of input and supplied to the image receiving unit 1202 of the notification apparatus 1007.
In step 203, the detection processing unit 1203 of the notification apparatus 1007 performs image recognition processing and compares the previous image with the current image received from the image receiving unit 1202, for example, in the brightness value, to detect an image change. As described above, the detection processing unit 1203 performs image recognition processing only for the monitor region defined by the region table stored in the memory unit 1201 to detect an image change in the monitor region.
In step 204, the detection processing unit 1203 checks if an image change is detected as the result of the image recognition processing in step 203. Of course, whether or not an image change is detected is determined for each monitor region. Whether or not there is an image change is determined, for example, by detecting a change in the brightness value. In this case, the occurrence of a notification error can be minimized, for example, by establishing a predetermined threshold value for abnormality detection as necessary so that a change smaller than the predetermined value is not treated as abnormal.
If it is determined that there is an image change as the result of detection, control is passed to step 205; if it is determined that there is no change, control is returned to step 202 to perform the same processing for the next input image.
In step 205, the characteristics extraction unit 1204, which has received the detection result of the detection processing unit 1203 in step 204, extracts the characteristics of the changed part of the image, that is, of the detected target (for example, the region in which the change is detected, the size of the region, etc.). The detected characteristics information is transmitted to the monitor information generation unit 1206 and the notification destination determination unit 1207 provided in the conversion unit 1205.
In step 206, the monitor information generation unit 1206, which has received the characteristics or feature information from the characteristics extraction unit 1204, generates monitor information. The generated monitor information is output to the transmission unit 1209. The contents of the monitor information may be a message describing at least one of the time of day at which the image change was detected, year/month/day, the monitor camera number, the characteristics or features of the detected target, and so on. Of course, the monitor information may include, as necessary, the still image and/or the moving image of the image captured by the monitor camera when the image changed.
The message described above may be superimposed on the still images and other images captured by the monitor camera. It is also possible to change the size, that is, the number of pixels or compression rate, of the image depending upon the size of data receivable by the client PC 1013 or the capacity of the communication line so that the user can receive the data.
In step 207, the notification destination determination unit 1207 selects the notification destination of the monitor information. The notification destination determination unit 1207 searches the notification destination table A1208 to select the notification destination based on the characteristics information received from the characteristics extraction unit 1204. The selected notification destination is output to the transmission unit 1209. If there is no notification destination, no destination is output to the transmission unit 1209.
In step 208, the transmission unit 1209 transmits the monitor information to the portable terminal 1012 and the client PC 1013. That is, the monitor information generated by the monitor information generation unit 1206 is transmitted to the notification destination selected by the notification destination determination unit 1207. The monitor information is usually transmitted via electronic mail; however, any method other than electronic mail may also be used as long as the portable terminal 1012 and the client PC 1013 can receive the monitor information. In step 209, the monitor processing ends.
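Because the monitor information is typically sent by electronic mail, the following is a minimal sketch using Python's standard smtplib and email modules; the SMTP host, sender address, and subject line are placeholders, and attachments and non-mail transports are omitted.

```python
import smtplib
from email.mime.text import MIMEText

def send_monitor_mail(destination: str, message: str,
                      smtp_host: str = "localhost",
                      sender: str = "monitor@example.com"):
    """Send monitor information to the selected notification destination by e-mail."""
    mail = MIMEText(message)
    mail["Subject"] = "Monitor information"
    mail["From"] = sender
    mail["To"] = destination
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(mail)

# Example (placeholder address and message):
# send_monitor_mail("frontdesk@example.com", "Abnormality occurred: 2003/06/09 12:34:56")
```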
As described above, the notification apparatus 1007 in this embodiment allows a plurality of monitor regions to be set in the image area picked up by the monitor camera and allows the monitor information to be transmitted according to the detection result in each monitor region. Setting a plurality of monitor regions in this way gives more detailed detection information about the area picked up by the monitor camera and allows monitor information to be transmitted to the notification destination where it is needed. At the same time, this apparatus performs image recognition processing more quickly than when the whole area of the image is processed and reduces the amount of memory required for image recognition processing.
In the above embodiment, one or more parts of the image area are established in advance as monitor regions, and information is transmitted to a predetermined notification destination when an image change, that is, a target, is detected in the monitor regions. Another method is also possible in which image recognition processing is performed for the whole image area, the position of a target is determined from the center of gravity of the detected target, a corresponding notification destination is selected from the notification destination table based on the position, and the notification destination of the monitor information is switched.
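A minimal sketch of this alternative, in which the notification destination is chosen from the center of gravity of a target detected anywhere in the image, might look as follows; the rectangular region bounds are illustrative assumptions.

```python
def region_from_centroid(centroid, region_bounds):
    """Map a target's center of gravity to a monitor region number.

    centroid: (row, col) of the detected target.
    region_bounds: dict mapping region number -> (top, left, bottom, right).
    Returns the matching region number, or None if the centroid lies outside all regions.
    """
    row, col = centroid
    for region_id, (top, left, bottom, right) in region_bounds.items():
        if top <= row < bottom and left <= col < right:
            return region_id
    return None

# Illustrative layout: region (1) covers the path toward the first building,
# region (2) the path toward the restricted zone (coordinates are arbitrary).
REGION_BOUNDS = {1: (0, 0, 120, 160), 2: (0, 160, 120, 320)}
```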
Note that the processing steps in
Next, a notification apparatus in another embodiment of the present invention will be described with reference to FIG. 1–
In this example, image recognition processing is performed to detect to which place, either the first building or the restricted zone, the person entering at the front gate is going. When the person is going to the first building, the monitor information is transmitted to the front desk of the first building; when the person is going to the restricted zone, the monitor information is transmitted to the guardroom.
A blank block indicates a block for which no image recognition processing is performed. In this example, when the person 104 intrudes into the monitor region (1) 102, the blocks indicated by “1” enter the detection state. Similarly, when the person 104 intrudes into the monitor region (2) 103, the blocks indicated by “2” enter the detection state.
A monitor region is set by the operator of the browser PC 1005, for example, using the pointing device 1109 such as a mouse provided on the browser PC 1005. More specifically, a desired monitor region can be set by specifying the range by clicking the blocks with the mouse or by performing the drag & drop operation with the mouse. The monitor region (1) and the monitor region (2), which are set, are stored respectively in the memory 1201-1 and the memory 1201-2, shown in
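For illustration only, a region table of this kind could be represented as a set of block coordinates selected with the mouse, as in the following sketch; the grid size and block coordinates are arbitrary assumptions.

```python
# Each monitor region is stored as a set of (block_row, block_col) coordinates
# on a coarse grid laid over the camera image (illustrative 8 x 10 grid).
GRID_ROWS, GRID_COLS = 8, 10

monitor_region_1 = {(5, 2), (5, 3), (6, 2), (6, 3)}   # e.g. path toward the first building
monitor_region_2 = {(5, 7), (5, 8), (6, 7), (6, 8)}   # e.g. path toward the restricted zone
REGION_TABLES = {1: monitor_region_1, 2: monitor_region_2}

def regions_hit(changed_blocks, region_tables=REGION_TABLES):
    """Return, for each monitor region, the blocks in which a change was detected."""
    return {region_id: blocks & changed_blocks
            for region_id, blocks in region_tables.items()
            if blocks & changed_blocks}
```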
The notification destination is specified by a mail address when monitor information is transmitted via electronic mail, and by a telephone number when a telephone call is used. Any other notification means and form may also be used as long as the place to which the monitor information is to be transmitted is specified by identifiable information. In this way, the notification destinations of the monitor information are set in advance in the notification destination table A1208 when the monitoring system is installed. It is also possible to set the notification destination table A1208 from the browser PC 1005, portable terminal 1012, or client PC 1013 after installing the monitoring system.
The following describes an example of processing with reference to the flowchart in
In step 401, the detection processing unit 1203 (
In step 402, the characteristics extraction unit 1204 determines the number of the region in which the image change was detected. In the example in
When, for example, an image change is detected by the image recognition processing unit 1203-1, the characteristics extraction unit 1204 can determine that the image change was detected in the monitor region (1). It is of course possible to use a method other than this to determine the region in which an image change was detected.
In step 403, the conversion unit 1205 generates monitor information if the image change was detected in the monitor region (1), references the notification destination table A1208 shown in
In step 404, the conversion unit 1205 generates monitor information if the image change was detected in the monitor region (2), references the table in
In step 405, whether the monitor processing is to be continued or ended is judged. The CPU of the notification apparatus judges whether to continue or end the monitor processing, for example, based on whether a monitor end instruction has been received from the user or based on a table (not shown) in which the operation schedule of the notification apparatus is stored. Control is passed to step 401 if the monitor processing is to be continued; if the monitor processing is to be ended, the monitor processing ends.
The judgment in step 405 as to whether the monitor processing is to be continued or ended may be made for each monitor region or for each notification destination. For example, for the front desk of the first building, which is the notification destination of the monitor region (1), the monitor information is not transmitted, or image processing is not performed, for the monitor region (1) when the first building is closed outside business hours. On the other hand, because guards are stationed around the clock in the guardroom, which is the notification destination of the monitor region (2), the monitor information for the monitor region (2) is transmitted, and the image processing for that region is performed, continuously.
Similarly, in step 403 and step 404, it is also possible to take into consideration the time at which an image change was detected.
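A minimal sketch of such a time-dependent, per-destination check is shown below; the business hours and the idea of attaching a schedule to each destination are assumptions made for illustration.

```python
from datetime import time, datetime

# Illustrative schedules: the front desk is staffed only during business hours,
# while the guardroom is staffed around the clock.
NOTIFICATION_SCHEDULE = {
    "frontdesk@example.com": (time(9, 0), time(17, 30)),
    "guardroom@example.com": None,   # None means always notify
}

def should_notify(destination: str, detected_at: datetime) -> bool:
    """Return True if monitor information should be sent to this destination now."""
    window = NOTIFICATION_SCHEDULE.get(destination)
    if window is None:
        return True
    start, end = window
    return start <= detected_at.time() <= end
```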
In the example shown in
In the example shown in
According to the method described above, when a plurality of monitor regions are set up in advance for the image captured by a monitor camera and an image change is detected in one of them, the notification destination of the monitor information can be switched on a monitor region basis.
Because a plurality of monitor regions are provided for the image processing of the notification apparatus, one camera achieves the same effect as a plurality of cameras. This enables the notification apparatus to give more detailed monitor information and to transmit the monitor information to notification destinations as needed.
Next, a notification apparatus in still another embodiment of the present invention will be described with reference to
In this example, the image captured by the monitor camera shown in
In the example shown in
In the example shown in
In step 601, the detection processing unit 1203 determines if there is an image change in the monitor region (1) 102 or the monitor region (2) 103. Control is passed to step 602 if an image change is detected; otherwise, control is passed to step 608.
In step 602, the characteristics extraction unit 1204 determines the number of the region in which the image change was detected. Control is passed to step 603 if the image change was detected in the monitor region (1), and to step 606 if the image change was detected in the monitor region (2).
In step 603, the characteristics extraction unit 1204 determines the number of blocks in which the image change was detected. Control is passed to step 608 if the number of blocks in which the image change was detected is less than six, to step 604 if the number of blocks is six or more but less than 12, and to step 605 if the number of blocks is 12 or more.
In step 604, the conversion unit 1205 generates monitor information, acquires the notification destination of the monitor information corresponding to six or more but less than 12 blocks, in which the image change was detected, from the notification destination table for the monitor region (1) shown in
In step 605, the conversion unit 1205 generates monitor information, acquires the notification destination of the monitor information corresponding to 12 or more blocks, in which the image change was detected, from the notification destination table for the monitor region (1) shown in
In step 606, the characteristics extraction unit 1204 determines the number of blocks in which the image change was detected as in step 603. Control is passed to step 608 if the number of blocks in which the image change was detected is less than six, and to step 607 if the number of blocks is six or more.
In step 607, the conversion unit 1205 generates monitor information, acquires the notification destination of the monitor information corresponding to six or more blocks, in which the image change was detected, from the notification destination table for the monitor region (2) shown in
In step 608, the conversion unit 1205 determines whether the monitor processing is to be continued or ended. Control is passed to step 601 if the monitor processing is continued.
The method described above allows the notification apparatus to determine the type of a moving object based on the number of blocks in which an image change was detected, that is, based on the size of the moving object, and to switch the notification destination of the monitor information according to the type of the moving object.
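For illustration, the block-count branching of steps 603 through 607 could be written as follows; the thresholds of six and twelve blocks follow the example above, while the destination addresses and the interpretation of block counts as a person or a car are assumptions.

```python
def select_destination(region_id: int, changed_block_count: int):
    """Choose a notification destination from the region and the size of the change.

    Mirrors the flow of steps 603-607: in the monitor region (1), a change of
    6 to 11 blocks and a change of 12 or more blocks go to different
    destinations; in the monitor region (2), any change of 6 or more blocks is reported.
    """
    if region_id == 1:
        if changed_block_count >= 12:
            return "guardroom@example.com"   # large object, e.g. a car (assumed)
        if changed_block_count >= 6:
            return "frontdesk@example.com"   # person-sized object (assumed)
    elif region_id == 2:
        if changed_block_count >= 6:
            return "guardroom@example.com"
    return None  # change too small: treated as noise, no notification
```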
Although a plurality of monitor regions are set in the description of this embodiment, the same processing may also be applied, of course, when only one monitor region is set for an image captured by the monitor camera or when the whole area of an image captured by the monitor camera is used as one monitor region.
In the two embodiments described above, the notification destination of the monitor information can be switched when an image change is detected in the monitor region (1) or the monitor region (2). However, it is impossible to determine in which direction a person detected in the monitor region (1) is going: from the front gate to the first building or, conversely, from the first building to the front gate.
That is, when a person detected in the monitor region (1) is going from the first building to the front gate, there is no need to transmit the monitor information to the front desk of the first building, and it is desirable to eliminate such an unnecessary transmission. The next embodiment, which satisfies this need, will be described with reference to
The characteristics extraction unit 1204′ has a timer unit 1401 that measures elapsed time. In this embodiment, simple processing is performed using the time history of the monitor regions in which a moving object is detected, to trace the moving direction of the moving object and to reduce unnecessary transmissions. This embodiment will be described below in more detail.
Similarly, a person going from the front gate to the restricted zone is detected in the monitor region (3) 701 and then in the monitor region (2) 103, and a person going from the restricted zone to the front gate is detected in the monitor region (2) 103 and then in the monitor region (3) 701.
In the example in
That is, both a person going from the front gate to the first building and a person going from the restricted zone to the first building are detected in the monitor region (3) 701 and then in the monitor region (1) 102. This applies to other places.
Next, the processing flow of this embodiment will be described with reference to the flowchart shown in
In step 801, the detection processing unit 1203 detects if there is an image change. Control is passed to step 802 when there is a change, and to step 811 when there is no change.
In step 802, the characteristics extraction unit 1204′ determines if the region in which the image change was detected is the monitor region (3) 701. Control is passed to step 803 if the region is the monitor region (3); otherwise, control is passed to step 811. This is because, when a person goes to the first building or to the restricted zone, the person is detected first in the monitor region (3) 701. As described above, this also holds when the person goes from the first building to the restricted zone: the person is detected first in the monitor region (3) 701.
In step 803, the characteristics extraction unit 1204′ determines the number of blocks in which the image change was detected. Control is passed to step 804 if the number of detected blocks is six or more but less than 12, that is, if the person 104 is detected. Otherwise, control is passed to step 811.
In step 804, measurement of the elapsed time since the person was detected in the monitor region (3) 701 is started. The characteristics extraction unit 1204′ resets the timer unit 1401 and newly starts measuring the elapsed time.
In step 805, the characteristics extraction unit 1204′ determines if the person 104 is detected in the monitor region (1). Control is passed to step 807 if the person is detected in monitor region (1); otherwise, control is passed to step 806.
Step 805 is executed when the detection processing unit 1203 detects an image change in processing (not shown for brevity) performed on an image sent from the image receiving unit 1202 after step 804. In the description below, the processing for determining the number of blocks in which the image change was detected, which is required after step 805 and step 806, is also omitted.
In step 806, the characteristics extraction unit 1204′ determines if the person 104 was detected in the monitor region (2). If the person 104 was detected in the monitor region (2), control is passed to step 809; otherwise, control is passed to step 810.
In step 807, the characteristics extraction unit 1204′ determines the elapsed time from the time when the image change was detected in the monitor region (3) to the time when the image change was detected in the monitor region (1). The characteristics extraction unit 1204′ reads the current elapsed time from the timer unit 1401. Control is passed to step 808 if the elapsed time is within a predetermined set time; otherwise control is passed to step 811.
In step 808, the conversion unit 1205 generates monitor information, acquires the notification destination of the monitor information, which is to be transmitted when an image change is detected in the monitor region (1), from the notification destination table A1208 shown in
In step 809, the characteristics extraction unit 1204′ determines the elapsed time from the time the image change was detected in the monitor region (3) to the time when the image change was detected in the monitor region (2). As in step 807 described above, the characteristics extraction unit 1204′ reads the current elapsed time from the timer unit 1401. Control is passed to step 810 if the elapsed time is within a predetermined set time; otherwise control is passed to step 811.
The set time used in the determination in step 809 may be different from that used in step 807. For example, the time used in the determination in step 807 or step 809 is defined in advance according to the distance from the monitor region (3). In addition, the elapsed time need not be measured using the timer unit 1401; instead, the elapsed time may be determined by recording the time at which the image change was detected in each monitor region. In that case, detection history information, composed of the number of the monitor region in which an abnormality was detected and information such as an abnormality detection time, is stored in the memory of the notification apparatus.
In step 810, the conversion unit 1205 generates monitor information, acquires the notification destination of the monitor information, which is to be transmitted when an image change is detected in the monitor region (2), from the notification destination table A1208 shown in
In step 811, whether the monitor processing is to be continued or ended is determined. When the monitor processing is continued, control is passed to step 801.
According to the method described above, sophisticated processing is possible in which the moving direction is determined from the time history of the monitor regions where a moving object is detected and in which, depending upon the moving direction of the moving object, the notification destination of the monitor information is switched or the monitor information is not transmitted at all. That is, from the moving direction, it is possible to predict toward which destination the moving object is moving.
In step 807 and step 809 described above, the detection interval is also taken into account. For example, when a person goes from the front gate directly to the first building, the time required to move from the monitor region (3) to the monitor region (1) is short. On the other hand, when a person merely strolls from the front gate toward the first building with no business there, the time required to move from the monitor region (3) to the monitor region (1) is long and the detection interval is long. Therefore, by considering not only the moving direction but also the detection interval, the notification apparatus transmits only the needed monitor information to the notification destination and reduces unnecessary transmissions.
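A minimal sketch of this time-history approach, which records when the target was detected in the monitor region (3) and notifies only when the monitor region (1) or (2) is reached within a set time, is given below; the time limit and destination addresses are assumptions made for illustration.

```python
from datetime import datetime, timedelta

class DirectionTracker:
    """Infer the moving direction of a target from the order and timing of region detections."""

    def __init__(self, max_interval_sec: float = 30.0):
        self.max_interval = timedelta(seconds=max_interval_sec)
        self.last_seen_in_region3 = None   # time the target was last detected in region (3)

    def on_detection(self, region_id: int, detected_at: datetime):
        """Return a notification destination, or None if no notification is needed."""
        if region_id == 3:
            # Start (or restart) timing from the region nearest the front gate.
            self.last_seen_in_region3 = detected_at
            return None
        if self.last_seen_in_region3 is None:
            return None
        if detected_at - self.last_seen_in_region3 > self.max_interval:
            return None   # detections too far apart: not treated as the same target
        if region_id == 1:
            return "frontdesk@example.com"   # heading toward the first building (assumed address)
        if region_id == 2:
            return "guardroom@example.com"   # heading toward the restricted zone (assumed address)
        return None
```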
Note that the monitor information may, of course, be transmitted when a target is detected in the monitor region (1) or in the monitor region (2) after being detected in the monitor region (3), without checking the detection interval. Even in this case, the transmission frequency of the monitor information can be reduced compared with the case where the monitor information is always transmitted upon detection in any one of the monitor regions.
In this embodiment, the moving direction of a moving object is determined not with complex known tracking technologies such as template matching, but through simple processing that uses the time history of the monitor regions where the moving object is detected. This reduces the load on the CPU of the notification apparatus. Of course, it is also possible to combine conventional image recognition technology with tracking technology to identify an object, to trace the identified object more accurately, and to determine its moving direction and destination, so that unnecessary transmissions can be reduced and the monitor information can be transmitted to the notification destination where it is needed.
It is also possible to transmit preliminary alarm information to a predetermined notification destination as a temporary alarm when a moving object is detected in one of the monitor region (1) 102, monitor region (2) 103, and monitor region (3) 701.
In the above embodiments, image processing is performed on monitor regions, each set as a part of the whole image area, to detect an intruding object. It is also possible to perform image processing on the whole image area and, when an object is detected, transmit low-level preliminary monitor information to a predetermined notification destination as a temporary alarm. In this case, the load on the notification apparatus, such as its memory capacity and CPU usage, increases because image processing is also performed on the whole image area. However, because detection through image processing is still performed for each monitor region separately, coarse detection processing using large pixel blocks may be used as the image processing for the whole area.
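For illustration, such coarse whole-area detection could reduce each frame to the mean brightness of large pixel blocks before differencing, as in the following sketch; the block size and threshold are arbitrary assumptions, and grayscale NumPy frames are assumed as before.

```python
import numpy as np

def coarse_block_change(prev_frame: np.ndarray, curr_frame: np.ndarray,
                        block: int = 32, threshold: float = 20.0):
    """Detect changes over the whole image on a coarse grid of large pixel blocks.

    Each frame is reduced to per-block mean brightness, so the comparison is
    far cheaper than pixel-level processing of the whole area.
    """
    h, w = prev_frame.shape
    h, w = h - h % block, w - w % block          # crop to a whole number of blocks

    def block_means(frame):
        return frame[:h, :w].reshape(h // block, block, w // block, block).mean(axis=(1, 3))

    diff = np.abs(block_means(prev_frame.astype(float)) - block_means(curr_frame.astype(float)))
    return diff > threshold                       # boolean grid of changed blocks
```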
The characteristics extraction unit can detect not only the characteristics or features of an object as described in the above embodiments but also the color of the object, the moving speed of the object, and so on to switch the destination of the monitor information. Similarly, the notification destination of the monitor information may also be switched according to the time zone in which the object is detected.
As described above, monitor information can be transmitted to different notification destinations according to the location where a moving object is detected or the size of the moving object in the above embodiments.
The application of the present invention is not limited to the field described above but includes various fields. For example, the present invention can be applied to a field other than monitoring.
While the embodiments have been described above, it is to be understood that the present invention is not limited to those embodiments and that various modifications and changes will be apparent to those skilled in the art without departing from the spirit and scope of the appended claims.
U.S. Patent Documents:

Number | Name | Date | Kind
---|---|---|---
5,412,708 | Katz | May 1995 | A
6,445,409 | Ito et al. | Sep. 2002 | B1
6,466,258 | Mogenis et al. | Oct. 2002 | B1
6,496,592 | Lanini | Dec. 2002 | B1
6,587,046 | Joao | Jul. 2003 | B1

Foreign Patent Documents:

Number | Date | Country
---|---|---
2001-169270 | Jun. 2001 | JP
2001-283225 | Oct. 2001 | JP