This application claims priority from Japanese Patent Applications No. 2003-386481 filed Nov. 17, 2003, No. 2003-387882 filed Nov. 18, 2003, No. 2003-380733 filed Nov. 11, 2003 and No. 2004-238444 filed Aug. 18, 2004, which are hereby incorporated by reference herein.
1. Field of the Invention
The present invention relates to the field of processing of images captured by a camera, and, more particularly, to an image processing apparatus, a network camera system, an image processing method and a program for enabling an image captured by a camera to be displayed by a display device that is connected to the camera via a network.
2. Description of Related Art
With the recent spread of the Internet and intranets and the increasing speed of networks, transmission of still images and moving images via a network has become commonplace. For such transmission, network camera systems have been put on the market that are capable of capturing a surrounding image in real time and that allow the captured image to be displayed on a display device via the network so as to be viewable by a remote user. One example of such a network camera system is the WebView Livescope® System using the Network Camera Server VB150 produced by Canon® Inc.
The network camera system typically includes a camera unit, a camera server and a display unit. The camera unit is controllable for panning, tilting and zooming in response to commands received from the user side. The camera server distributes images captured by the camera unit over the network. The display unit, which may be a personal computer, is connected to the network. The network camera system thus enables a user at the display unit to view an image acquired at the remote place where the camera unit is located, and to control the operation of the camera unit for capturing the image.
There is a known technology for generating a panoramic image or normal image from an image captured using a wide-angle optical system or omnidirectional image-capture system, such as a fish-eye lens or solid-of-revolution mirror, and for allowing a user to view the panoramic image or normal image via a network.
For example, in the article entitled “Telepresence by Real-time View-dependent Image Generation from Omnidirectional Images”, by Y. Onoe, K. Yamazawa, N. Yokoya, and H. Takemura, in Technical Report of the Institute of Electronics, Information and Communication Engineers, PRMU97-20, May 1997, there is a disclosure of a telepresence system for transmitting an omnidirectional image captured using a solid-of-revolution hyperbolical mirror to a remote user and for generating a perspective projection image corresponding to the visual line of the user. In addition, in U.S. Pat. No. 6,043,837, assigned to Be Here Corporation, there is a disclosure of a method for transmitting a designated fan-like partial area of an omnidirectional image and for transforming the fan-like area to a rectangular area so as to be displayed on the user side.
At the center of the omnidirectional image shown in
The omnidirectional image shown in
In conventional network camera systems in which the performance of a display unit, such as a mobile phone or a portable terminal, is insufficient or the performance of a network is insufficient, it is very difficult for a user to understand which area of a captured image transmitted from a camera unit is changing.
The present invention is directed to overcoming the above-described drawbacks. The present invention provides an image processing apparatus, a network camera system, an image processing method and a program, for enabling a user to adequately understand a change in circumstances of a captured image even in a system in which the performance of a display unit is insufficient or the performance of a network is insufficient.
In an aspect of the present invention, there is provided an image processing apparatus for processing an image captured by a camera. The image processing apparatus includes: a detection device for detecting a change in circumstances surrounding the camera; an acquisition device for acquiring an image captured by the camera in response to detection by the detection device; an extraction device for extracting, from the captured image, a partial image corresponding to an area in which the change in circumstances has occurred; and an output device for outputting the partial image extracted by the extraction device.
The above and further features and advantages of the present invention will become apparent to those skilled in the art upon reading of the following detailed description of embodiments thereof when taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Embodiments of the invention will be described in detail below with reference to the drawings.
The optical system 111 is used for capturing an image. The sensor 113 detects a change in circumstances surrounding the camera unit 11 and a direction in which an area where such a change has occurred exists. The image capture portion 112 includes a CCD (charge-coupled device) or a CMOS (complementary metal oxide semiconductor) image sensor. The camera control portion 114 performs a camera control operation including focusing, aperture setting, white balance, shutter release, etc., processing of a signal from the sensor 113, compression of image data from the image capture portion 112, and extraction of a partial image corresponding to an area where a change has occurred. The wireless I/F 115 is adapted for transmitting, through wireless communication, the extracted partial image to the server unit 12.
When the sensor 113 detects a change in circumstances surrounding the camera unit 11 and a direction in which an area where such a change has occurred exists, the image capture portion 112 and the camera control portion 114 capture a surrounding image formed by the optical system 111. The camera control portion 114 then extracts, from the captured image, a partial image located in the direction detected by the sensor 113 and transmits the extracted partial image to the server unit 12 via wireless communication. The server unit 12 transmits the received image data to the network 13. The network 13 may be the Internet, an intranet or the like. The viewer 14 receives the image data from the network 13 and displays an image on the display device 143. The viewer 14 can be located anywhere as long as it is connectable to the network 13. Thus, a remote user can find a change in circumstances at the place where the camera unit 11 is located.
Communication between the camera unit 11 and the server unit 12 is performed via wireless communication. Thus, the camera unit 11 can be separated from the server unit 12, whose placement is restricted by its connection to the network 13, which usually employs wired communication. Accordingly, the camera unit 11 can be freely placed at any position the user wishes to monitor. Depending on the application or usage of the system, communication between the camera unit 11 and the server unit 12 may instead be performed through wired communication by cables or through direct connection.
One example of a suitable wireless communication method is the Bluetooth standard, a low-cost communication method developed for consumer use that employs spread spectrum communication technology. The Bluetooth standard uses frequency-hopping spread spectrum modulation in the 2.4 GHz band and is suited for transmitting data at about 700 kbps over a range of 10-100 m. Bluetooth can be implemented with a small-sized, low-cost and low-power-consumption circuit element, which can, therefore, be incorporated into a small-sized apparatus.
The optical system 111 enables a wide range of surveillance with a single camera unit by employing a fish-eye lens having an angle of view of about 180 degrees or a solid-of-revolution mirror having an angle of view of 360 degrees on one side and reflecting an omnidirectional image. An optical system for use in an ordinary camera can be used as the optical system 111. In the following discussion, an omnidirectional optical system using a solid-of-revolution mirror is taken as an example of the optical system 111.
In the server unit 12, the wireless I/F 121 receives, through wireless communication, image data from the camera unit 11. The server control portion 122 processes the received image data to correct distortion of a captured image caused by the solid-of-revolution mirror of the camera unit 11 and performs a network server function. The network I/F 123 transmits distortion-corrected, rectangular image data to the network 13.
As an example of the network server function, WebView Protocol produced by Canon® Inc. is usable with WWW (World Wide Web) browsers widely used in the Internet.
The viewer 14 receives rectangular image data from the server unit 12 via the network 13 and displays an image represented by the image data on the display device 143. In the example shown in
In addition, a wireless portable terminal that is typified by a mobile phone using a wireless public network can be used as the viewer 24. In such a case, the user can find a change in the monitored place wherever he is within a coverage area of the mobile phone service. The function of a wireless router in that case may be performed by a network router, a telephone exchange, a wireless local station, etc., that belong to the telephone carrier.
The server unit 52 includes a wireless communication I/F 521, a communication I/F 522, a memory 523, a processor 524, a charging portion 525, a network interface 526 and a power source portion 527. The wireless communication I/F 521 receives data via wireless communication from the camera unit 51. The communication I/F 522 is used when the camera unit 51 is connected directly to the server unit 52. The processor 524 receives data from the wireless communication I/F 521 or the communication I/F 522, converts image data distorted by the solid-of-revolution mirror (optical system) into distortionless rectangular image data, and transmits the rectangular image data to the network interface 526. Further, the processor 524 functions as an image server on a network 53. The memory 523 is used for processing by the processor 524. The network interface 526 performs transmission and reception of data via the network 53. The charging portion 525 charges the battery 5111 of the camera unit 51 when the camera unit 51 is connected directly to the server unit 52. The power source portion 527 supplies electric power to the entirety of the server unit 52.
In the camera unit 51, electric power is normally supplied only to a very limited number of parts, such as the sensors 515A to 515D and the sensor control portion 516 for detecting a change in surrounding circumstances. The other parts are normally in a sleep mode so as to reduce power consumption of the battery 5111.
When at least one of the sensors 515A to 515D has detected a change in circumstances surrounding the camera unit 51, the whole camera unit 51 transitions from the sleep state to an operating state and instantaneously captures an omnidirectional image. The image compression portion 513 converts image data obtained by the CCD 511 and the image-capture processing portion 512 into compressed data in the JPEG or MPEG format. Then, the memory 514 stores the compressed data.
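For illustration, the sleep/wake behavior described above can be sketched as follows in Python. The sensor, camera and memory objects are hypothetical stand-ins; the embodiment specifies the behavior, not a firmware interface.

```python
import time

def monitor(sensors, camera, memory, poll_s=0.1):
    """Minimal sketch of the camera unit's low-power monitoring cycle."""
    while True:
        # Normally only the sensors (and their controller) are powered.
        if any(s.triggered() for s in sensors):
            camera.wake()                              # leave the sleep state
            frame = camera.capture_omnidirectional()   # instantaneous capture
            memory.store(camera.compress(frame))       # JPEG/MPEG-compressed data
            camera.sleep()                             # return to low power
        time.sleep(poll_s)                             # sensor polling interval
```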
An example of each of the sensors 515A to 515D is a pyroelectric motion sensor, which detects a change in infrared rays emitted from a human body or the like. Since the pyroelectric motion sensor has an angular directivity of several tens of degrees, a plurality of sensors are required, as shown in
The sensor control portion 516 detects a direction in which an area where a change has occurred exists, by determining which of the four sensors 61A to 61D (515A to 515D) has detected the change. If an increased number of sensors having finer directivity are used, the precision of detection of the direction can be increased.
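As a rough illustration of this direction detection, the following sketch maps the four sensors, assumed to be spaced 90 degrees apart, to a detected direction; a circular mean handles the case where two adjacent sensors fire at once. The sensor labels and spacing are assumptions based on the four-sensor arrangement described.

```python
import math

# Assumed center angles (degrees) of the four directional sensors.
SENSOR_CENTER_ANGLES = {"61A": 0.0, "61B": 90.0, "61C": 180.0, "61D": 270.0}

def detected_direction(fired):
    """Circular mean of the center angles of the sensors that fired."""
    x = sum(math.cos(math.radians(SENSOR_CENTER_ANGLES[s])) for s in fired)
    y = sum(math.sin(math.radians(SENSOR_CENTER_ANGLES[s])) for s in fired)
    return math.degrees(math.atan2(y, x)) % 360.0

print(detected_direction(["61B"]))          # 90.0
print(detected_direction(["61A", "61D"]))   # 315.0 (wrap-around handled)
```

Using more sensors with finer directivity simply means more entries in the angle table and a finer-grained result.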
As another example of each of the sensors 515A to 515D, an audio sensor using a microphone can also be used. In such a case, a plurality of audio sensors having directivity characteristics in the same manner as shown in
If infrared sensors typified by pyroelectric motion sensors and audio sensors typified by microphones are used in combination, detection of an intruder or the like can be performed more accurately.
Referring to
The processor 517 extracts a partial image located in an area centered in the direction detected by one of the sensors 515A to 515D. The extracted partial image is transmitted to the server unit 52 and is then subjected to processing for removing image distortion. Considering that the extracted partial image is displayed on the viewer after transmission via the network, it is appropriate to extract a fan-shaped image 85 as shown in
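A minimal sketch of this fan-shaped extraction, assuming the omnidirectional image is a NumPy array whose optical axis is at the image center (the 90-degree fan width is illustrative):

```python
import numpy as np

def extract_fan(image, center_deg, width_deg=90.0):
    """Zero out everything except the fan-shaped sector centered on
    the direction detected by the sensors."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    angle = np.degrees(np.arctan2(yy - cy, xx - cx)) % 360.0
    # Circular angular distance from the fan's center direction.
    diff = np.abs((angle - center_deg + 180.0) % 360.0 - 180.0)
    mask = diff <= width_deg / 2.0
    out = np.zeros_like(image)
    out[mask] = image[mask]
    return out
```

In practice the masked image would also be cropped to the sector's bounding box so that only the fan-shaped data is transmitted.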
At step S13, the processor 517 transmits the extracted partial image to the server unit 52 via the wireless communication I/F 518 or the communication I/F 519.
The wireless communication I/F 518 is used for transmission of the extracted partial image in cases where the camera unit 51 is used separately from the server unit 52 so as to allow the camera unit 51 to be freely disposed at any location. Electric power to the camera unit 51 is supplied from the battery 5111 via the battery control portion 5110. Thus, the camera unit 51 is used in a wireless condition.
In such a wireless condition, the camera unit 51 is driven with power of the battery 5111. Therefore, it is important to reduce power consumption of the camera unit 51. In the extraction process described above, the processing required of the processor 517 is limited to extracting a fan-shaped image centered in the detected direction. Thus, a processor for use in a low-power-consumption portable device or the like suffices for such processing.
In cases where the camera unit 51 is directly connected to the server unit 52, the communication I/F 519 is used to transmit data to the server unit 52. The communication I/F 519 can use a variety of communication standards, for example, USB (universal serial bus), IEEE1394, etc.
These communication standards enable high-speed data communication as compared to wireless communication. Therefore, in the case of direct connection, a large amount of image data, such as a moving image, can be transmitted. Further, in the case of direct connection, the battery charging function of the server unit 52 (the charging portion 525) is used to charge the battery 5111 inside the camera unit 51.
As described above, according to the first embodiment, the camera unit captures a surrounding image at the timing of detection by the plurality of directional sensors 61A to 61D, which detect a change in circumstances surrounding the camera unit and the direction in which the area where the change has occurred exists. The camera unit then extracts a partial image located in the area corresponding to the detected direction and transmits data of the extracted partial image to the network. Accordingly, observation from a remote location can be performed by simply placing the camera unit in an arbitrary monitoring position, for example, in the center of a room.
Furthermore, in addition to use of wireless communication, a battery is used as the power source of the camera unit, so that no connection cables are required. Accordingly, the freedom of placement of the camera unit increases dramatically, and specific work for installation of the camera unit is unnecessary. The aim of monitoring of circumstances can be achieved by simply placing the camera unit in an intended location when needed.
Furthermore, electric power is normally supplied only to the sensors that monitor the surrounding circumstances, and the whole camera unit is activated only when the sensors have detected a change in circumstances. In addition, the processing operation of the camera unit is simplified. Accordingly, low power consumption is attained, and long-term monitoring of circumstances can be performed even with the battery-powered camera unit.
While, in the first embodiment, a system in which the camera unit is network-connected to the viewer via the server unit has been described, the system is not limited to such a construction. For example, the camera unit may be connected directly to a wireless public network, thereby making it possible to further increase locations where the camera unit can be placed.
Change-indicating image data extracted from an image captured when a change in surrounding circumstances has been detected is transmitted directly to the wireless public network, and is then received by a viewer, such as a mobile phone, for display.
A network camera system according to a second embodiment of the invention differs from the first embodiment in the method of detecting a change in circumstances and extracting a partial image.
In the network camera system according to the second embodiment, the hardware arrangement, the appearance of a camera unit and a server unit, the construction of the camera unit and the server unit, and the arrangement of sensors are the same as those of the first embodiment shown in
In the second embodiment, a surrounding image is captured at preset timing in the normal situation where there is no change. Then, the surrounding image is stored in the memory 514 shown in
Referring to
At step S23, the processor 517 compares the captured image data with the comparative surrounding image so as to determine if a change has occurred. This determination is made for each image area by checking whether the difference between the data values of the two images exceeds a predetermined threshold value. If it is determined that there is no change, the flow returns to step S22, where the processor 517 waits for detection by the sensors 515A to 515D.
If it is determined at step S23 that a change has occurred, the flow proceeds to step S24. At step S24, the processor 517 specifies pixels or blocks of pixels where the change has occurred and extracts a change-indicating partial image from the captured image on the basis of the specified pixels or blocks of pixels.
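The comparison and extraction of steps S23 and S24 can be sketched as block-wise frame differencing; the block size and threshold below are illustrative, not values given in the embodiment:

```python
import numpy as np

def changed_blocks(current, reference, block=16, threshold=12.0):
    """Return (row, col) origins of blocks whose mean absolute
    difference from the comparative surrounding image exceeds the
    threshold (step S23)."""
    diff = np.abs(current.astype(np.int16) - reference.astype(np.int16))
    if diff.ndim == 3:
        diff = diff.mean(axis=2)                    # collapse color channels
    h, w = diff.shape
    return [(y, x)
            for y in range(0, h - block + 1, block)
            for x in range(0, w - block + 1, block)
            if diff[y:y + block, x:x + block].mean() > threshold]

def change_bbox(hits, block=16):
    """Bounding box (y0, x0, y1, x1) of the changed blocks, from which
    the change-indicating partial image is cut out (step S24)."""
    ys, xs = [y for y, _ in hits], [x for _, x in hits]
    return min(ys), min(xs), max(ys) + block, max(xs) + block
```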
Image data for use in detecting a change may be data obtained before image compression or data obtained after image compression.
At step S25, the processor 517 transmits the extracted partial image to the server unit 52 via the wireless communication I/F 518 or the communication I/F 519.
While, in the second embodiment, detection of a change in surrounding circumstances is performed by the sensors 515A to 515D, image data captured by the CCD 511 may be used to detect a change in surrounding circumstances.
In this case, the CCD 511, the image-capture processing portion 512, the image compression portion 513, the processor 517 and the memory 514 in the camera unit 51 are always kept in an operating state so as to capture a surrounding image continuously or at short intervals on the order of seconds. The processor 517 compares the latest captured image data with the previously-captured image data and detects a change in surrounding circumstances when a difference data value exceeds a predetermined threshold value.
A network camera system according to a third embodiment of the invention differs from the first embodiment and the second embodiment in an image transmission process and an image display process.
In the network camera system according to the third embodiment, the hardware arrangement, the appearance of a camera unit and a server unit, the construction of the camera unit and the server unit, and the arrangement of sensors are the same as those of the first embodiment shown in
Operation of the network camera system according to the third embodiment is described below with reference to
The server unit 52 converts the circular omnidirectional image 1308 into a rectangular image 1302, which is easy for an observer to recognize. The rectangular image 1302 is a horizontally long image with a resolution of 400×1600 pixels.
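This conversion is essentially a polar-to-rectangular resampling. A minimal nearest-neighbor sketch follows; the inner-radius fraction is an assumption about the mirror's central blind spot, and whether the outer rim maps to the top or bottom of the panorama depends on the mirror geometry:

```python
import numpy as np

def unwrap_omni(omni, out_h=400, out_w=1600, r_min_frac=0.2):
    """Resample a circular omnidirectional image (optical axis at the
    image center) onto a horizontally long 400x1600 rectangle."""
    h, w = omni.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = min(cy, cx)
    radii = np.linspace(r_max, r_max * r_min_frac, out_h)   # outer rim -> top row
    thetas = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return omni[ys, xs]
```

An actual implementation would also correct the radial distortion introduced by the solid-of-revolution mirror rather than sampling linearly in radius.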
An image 1304 results from subjecting the rectangular image 1302 to reduction processing in accordance with the display resolution (for example, 120×160 pixels) of the small-sized display device 1204 of the viewer 1200.
An omnidirectional image 1301 is captured by the camera unit 51 when a change in surrounding circumstances has been detected by the sensors 515A to 515D. On the basis of a comparison between the omnidirectional image 1301 and the pre-captured normal omnidirectional image 1308, a change-indicating partial image in the form of a fan indicated by dotted lines is extracted by the camera unit 51.
The server unit 52 converts the fan-shaped change-indicating partial image extracted by the camera unit 51 into a rectangular partial image 1305.
The server unit 52 transmits the rectangular partial image 1305 to the viewer 1200 via the network 1205. The viewer 1200 stores the rectangular partial image 1305 as an image 1306 having the same resolution.
An image 1307 is obtained by superimposing the change-indicating partial image 1306 on the reduced surrounding image 1304 after adjusting their resolution and positional relationship.
In order to acquire an omnidirectional image, a user first installs the camera unit 51 in a desired place, such as a room, to be monitored. After installation of the camera unit 51, the user performs an operation for starting a monitoring action. For example, the user turns on the power supply of the camera unit 51 and the server unit 52. The camera unit 51 causes the processor 517, etc., to wait for a predetermined delay time from the turning-on of the power supply. After the elapse of the delay time, the camera unit 51 captures an omnidirectional image for one frame and sets it as a normal omnidirectional image 1308. Providing such a delay time allows the user who has placed the camera unit 51 to leave the scene, so that the acquired normal omnidirectional image contains no image of the user himself.
Then, the camera unit 51 transmits the omnidirectional image 1308 to the server unit 52. After completion of this transmission, the camera unit 51 goes into a sleep state, i.e., a standby state. The server unit 52 converts the circular image 1308 into a rectangular image 1302 and stores the rectangular image 1302 in the memory 523. The resolution of the rectangular image 1302 is high compared with the display resolution of an ordinary mobile phone or the like (for example, 120×160 pixels). Therefore, the rectangular image 1302, if left as it is, can be displayed only in part on the mobile phone or the like. Accordingly, the server unit 52 performs a reduction process for converting the rectangular image 1302 into an image 1304 having a resolution coinciding with the vertical resolution of the display device 1204 of the viewer 1200 (for example, 120 pixels).
The server unit 52 then transmits the reduced rectangular image 1304 to the viewer 1200 via the network 1205.
The viewer 1200 receives the reduced rectangular image 1304, stores it as a normal surrounding image in the memory 1202, and displays the normal surrounding image on the display device 1204. Such a function of displaying on the viewer 1200 a surrounding image obtained at the time of installation of the camera unit 51 makes it possible to inform the user of completion of the correct installation of the camera unit 51.
Furthermore, in this case, the user may be informed of completion of the installation of the camera unit 51 with characters displayed on the display device 1204 or sound produced by the viewer 1200 in addition to the displayed surrounding image.
A method for displaying the reduced rectangular image on the viewer 1200 according to the third embodiment is described below with reference to FIGS. 12 to 15.
At step S31, the processor 517 sets, as a comparative surrounding image, surrounding image data stored in the memory 514. This surrounding image data is, as described above, an image in a normal condition which has been captured at the time of installation of the camera unit 51. Then, the processor 517 waits for detection of a change in circumstances by the sensors 515A to 515D.
This surrounding image is initially used as the normal comparative image. In order to periodically detect a change in surrounding circumstances occurring with time, a method may be employed in which, for example, a timer is used to cause the camera unit 51 to capture an image at intervals of a predetermined period of time, and each captured image is used as a new normal surrounding image.
At step S32, when the sensors 515A to 515D have detected a change in circumstances surrounding the camera unit 51, the camera unit 51 transitions from the sleep state to an operating state and instantaneously captures an omnidirectional image. Image data obtained by the CCD 511 is then stored in the memory 514 via the image-capture processing portion 512 and the image compression portion 513.
At step S33, the processor 517 compares the image data captured at the timing of detection by the sensors 515A to 515D with the comparative surrounding image to determine whether a change has occurred, i.e., whether the difference in the data value of any image area exceeds a predetermined threshold value. If it is determined that there is no change, the flow returns to step S32, where the processor 517 waits for detection by the sensors 515A to 515D.
If it is determined at step S33 that a change has occurred, the flow proceeds to step S34. At step S34, the processor 517 specifies pixels or blocks of pixels where the change has occurred and extracts a change-indicating partial image from the captured image on the basis of the specified pixels or blocks of pixels.
As shown in
At step S36, the server unit 52 converts the extracted partial image into a rectangular image 1305 and transmits the rectangular image 1305 to the viewer 1200. When transmitting the extracted fan-shaped image to the server unit 52, the camera unit 51 additionally transmits information on the location of the extracted image relative to the omnidirectional image 1301. On the basis of this location information, the server unit 52 performs conversion into the rectangular image 1305. Further, when transmitting the rectangular image 1305 to the viewer 1200, the server unit 52 additionally transmits the location information.
The rectangular image 1305 is then stored in the memory 1202 of the viewer 1200 as a change-indicating partial image 1306 having the same resolution. The change-indicating partial image 1306 could be displayed on the display device 1204 without changing its resolution. However, the viewer 1200 forms an image 1307 by superimposing the change-indicating partial image 1306 on the reduced omnidirectional image 1304 after adjusting their resolution and positional relationship. At step S37, the viewer 1200 displays the combined image 1307 on the display device 1204, which enables the user to more accurately recognize the surrounding circumstances.
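The superimposition of step S37 amounts to scaling the partial image by the panorama's reduction ratio and pasting it at the scaled location. A minimal sketch, assuming the location information gives the partial image's top-left corner in full-resolution panorama coordinates:

```python
import numpy as np

def superimpose(reduced_omni, partial, loc_xy, scale):
    """Paste the change-indicating partial image onto the reduced
    surrounding image; scale is the panorama reduction ratio
    (e.g. 120/400 = 0.3 in this embodiment)."""
    out = reduced_omni.copy()
    step = max(1, int(round(1.0 / scale)))
    small = partial[::step, ::step]          # crude subsampling "resize"
    x0, y0 = int(loc_xy[0] * scale), int(loc_xy[1] * scale)
    h = min(small.shape[0], out.shape[0] - y0)
    w = min(small.shape[1], out.shape[1] - x0)
    out[y0:y0 + h, x0:x0 + w] = small[:h, :w]
    return out
```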
The server unit 52 converts the circular omnidirectional image as received to a rectangular image (
The camera unit 51 compares the omnidirectional image captured at the time of detection by the sensors 515A to 515D (
The camera unit 51 transmits the fan-shaped extracted image (
The server unit 52 transmits the rectangular extracted image (
Therefore, the third embodiment provides the function of superimposing the rectangular extracted image on the normal omnidirectional image previously transmitted to the viewer 1200. As shown in
As described above, according to the third embodiment, when a change in circumstances, for example, intrusion of a person, at a monitoring place is detected, only a partial image corresponding to the change included in a surrounding image captured at that time is transmitted to a display device via a network. At the display device, the partial image is displayed in superimposition on a normal surrounding image previously received. Accordingly, even with the use of a small display device with a resolution of 120×160 pixels mounted on a small-sized portable apparatus (for example, a portable communication terminal typified by a mobile phone), a user can accurately recognize the circumstances of the monitoring place.
Furthermore, when a camera unit is installed, a normal surrounding image containing no image of the user himself is transmitted to a viewer and is displayed thereon. In addition, the user is informed of such transmission by the viewer. Accordingly, the user can confirm that installation of the camera unit has been properly completed.
In the second and third embodiments, the comparative surrounding image stored in the memory 514 in the initial stage of a starting operation of the system is a surrounding image in a normal condition captured at the time of installation of the camera unit 51.
In a fourth embodiment of the invention, one of, or a combination of two or more of, the following timing-defining methods (1) to (3) is employed to update the comparative surrounding image at any time in order to deal with a change in the surroundings of the camera unit 51 occurring with time. In association with updating of the comparative surrounding image, the surrounding image stored in the memory 1202 of the viewer 1200 is also updated at any time in accordance with the same method.
(1) A timer or the like is used to cause the camera unit 51 to capture an image at intervals of a predetermined period of time, and each captured image is used as a new normal surrounding image.
(2) A user operates the viewer 1200 to transmit, to the camera unit 51 via the network 53 (1205) and the server unit 52, a command for capturing a new surrounding image so as to update the existing surrounding image.
(3) The camera unit 51 is provided with a luminance sensor for detecting surrounding luminance. When a predetermined change in luminance is detected by the luminance sensor, the camera unit 51 automatically updates the surrounding image. In addition, the capturing of the comparative surrounding image and of the omnidirectional image at the time of detection of a change may always be accompanied by flash emission, which serves both the function of capturing a clear image and the function of warning an intruder.
The process based on each of the above methods (1) to (3) can be performed, for example, at step S21 shown in
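As a sketch of method (1), a timer-based update might look like the following; capture_omni() is a hypothetical stand-in for the camera unit's capture path, and the one-hour period is illustrative:

```python
import time

UPDATE_INTERVAL_S = 3600.0  # illustrative update period

def maybe_update_reference(state, capture_omni):
    """Refresh the comparative surrounding image on a timer (method (1))."""
    now = time.monotonic()
    if now - state.get("last_update", 0.0) >= UPDATE_INTERVAL_S:
        state["reference"] = capture_omni()   # new normal surrounding image
        state["last_update"] = now
        state["viewer_stale"] = True          # the viewer's copy must be updated too
```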
In the above-described embodiments, when an image display process is performed at the viewer 1200, the resolution of a surrounding image stored in the viewer 1200 may not coincide with that of an extracted image transmitted to the viewer 1200. In a fifth embodiment of the invention, a process for adjusting resolution is performed in accordance with one of the following methods (1) to (4) to appropriately display the extracted image in superimposition on the surrounding image.
(1) The surrounding image stored in the viewer 1200 is an image having a vertical resolution reduced to, for example, 120 pixels. Before the extracted image is transmitted from the server unit 52 to the viewer 1200, the server unit 52 performs, on the extracted image, a reduction process having the same reduction ratio as that of the surrounding image. After that, the server unit 52 transmits to the viewer 1200 the reduced, extracted image together with location information for superimposition.
(2) The server unit 52 transmits to the viewer 1200 the extracted image with its resolution kept unchanged without performing a reduction process on the extracted image. The viewer 1200 stores the extracted image. After that, the viewer 1200 performs on the stored, extracted image the same reduction process as that of the above method (1).
(3) The server unit 52 transmits the extracted image to the viewer 1200 with its resolution kept unchanged, without performing a reduction process, and the viewer 1200 stores the extracted image. When the user then intends to display the stored extracted image in a given size, the viewer 1200 performs a magnifying/reduction process on the extracted image, and also performs a magnifying/reduction process, at an optimum ratio, on the whole of the surrounding image stored in the viewer 1200 or on the part of it displayed on the viewer 1200.
(4) When the surrounding image is magnified to a large size by the magnifying/reduction process of the above method (3), the surrounding image may be blurred by the pixel interpolation associated with the magnifying process, since the resolution of the surrounding image stored in the viewer 1200 is low compared with that of the extracted image. To solve this problem, with regard to the part of the surrounding image displayed on the viewer 1200, after displaying the blurred surrounding image, the viewer 1200 requests the server unit 52 to transmit a surrounding image not yet subjected to the reduction process, receives that higher-resolution surrounding image, and replaces the existing surrounding image with the higher-resolution surrounding image, which has the same resolution as the extracted image.
The process based on each of the above methods (1) to (4) can be performed, for example, at step S36 and step S37 shown in
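The bookkeeping behind method (1) is simple ratio arithmetic: the extracted image's size and its superimposition location are both scaled by the same reduction ratio as the stored surrounding image. A small worked sketch using the resolutions quoted in the embodiments:

```python
PANORAMA_H, VIEWER_H = 400, 120              # panorama and viewer heights

def reduction_ratio():
    return VIEWER_H / PANORAMA_H             # 0.3 for these resolutions

def reduced_geometry(extract_wh, loc_xy):
    """Scale the extracted image's (width, height) and its paste
    location (x, y) by the panorama's reduction ratio (method (1))."""
    r = reduction_ratio()
    (w, h), (x, y) = extract_wh, loc_xy
    return (round(w * r), round(h * r)), (round(x * r), round(y * r))

print(reduced_geometry((400, 400), (800, 0)))   # ((120, 120), (240, 0))
```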
In the above-described embodiments, only one image is captured when a change in circumstances has been detected. Therefore, it is only possible to recognize a stationary state of an object causing such a change. In a sixth embodiment of the invention, a continuous shooting operation of the camera unit 51 and a continuous displaying operation of the viewer 1200 are provided to make it also possible to recognize the movement of an object.
A method for displaying a reduced rectangular image on the viewer 1200 according to the sixth embodiment is described below with reference to
On the viewer 1200 according to the sixth embodiment, an image is displayed in the same manner as shown in
At step S41, the processor 517 sets, as a comparative surrounding image, surrounding image data stored in the memory 514. This surrounding image data is, as described above, an image in a normal condition which has been captured at the time of installation of the camera unit 51. Then, the processor 517 waits for detection of a change in circumstances by the sensors 515A to 515D.
At step S42, when the sensors 515A to 515D have detected a change in circumstances surrounding the camera unit 51, the camera unit 51 transitions from the sleep state to an operating state and instantaneously captures the first omnidirectional image. After that, if the sensors 515A to 515D continue to detect a change in the surrounding circumstances, the camera unit 51 captures a plurality of omnidirectional images in series, for example, at given intervals during the detection period. Image data for the plurality of captured omnidirectional images are then sequentially stored in the memory 514 via the image-capture processing portion 512 and the image compression portion 513.
At step S43, the processor 517 compares the image data for each of the plurality of omnidirectional images captured at the timing of detection by the sensors 515A to 515D with the comparative surrounding image to determine whether a change has occurred, i.e., whether the difference in the data value of any image area exceeds a predetermined threshold value. If it is determined that there is no change, the flow returns to step S42, where the processor 517 waits for detection by the sensors 515A to 515D.
If it is determined at step S43 that a change has occurred, the flow proceeds to step S44. At step S44, the processor 517 specifies pixels or blocks of pixels where the change has occurred and extracts change-indicating partial images from the captured image on the basis of the specified pixels or blocks of pixels. Thus, the processor 517 produces a plurality of extracted partial images.
At step S46, the server unit 52 converts the extracted partial images into rectangular images 1802 and 1803 and transmits the rectangular images 1802 and 1803 to the viewer 1200. When transmitting the extracted fan-shaped image to the server unit 52, the camera unit 51 additionally transmits information about the location of the extracted image relative to the omnidirectional image 1801. On the basis of this location information, the server unit 52 performs conversion into a rectangular image. Further, when transmitting the rectangular image to the viewer 1200, the server unit 52 additionally transmits the location information. The above-described process is sequentially performed for each of a plurality of omnidirectional images captured at the timing of detection of changes.
At step S47, the viewer 1200 forms and displays each of a plurality of images 1806 and 1807 by superimposing each of a plurality of change-indicating partial images 1804 and 1805 on a reduced omnidirectional image 1808 after adjusting their resolution and positional relationship.
The plurality of images 1806 and 1807, each having a superimposed partial image, are displayed sequentially over time, so that a user can accurately recognize the surrounding circumstances including the movement of an object. Such a sequential displaying operation may be performed as each extracted image is received from the server unit 52, or may be performed after the viewer 1200 has received the whole series of change-indicating partial images.
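A minimal playback sketch for this frame-advance display, reusing the superimpose() sketch given earlier; viewer.show() is a hypothetical stand-in for the display device, and the half-second pacing is illustrative:

```python
import time

def play_sequence(viewer, reduced_omni, partials, scale, frame_interval=0.5):
    """Composite each change-indicating partial image onto the stored
    surrounding image and display the results in capture order."""
    for partial, loc_xy in partials:
        frame = superimpose(reduced_omni, partial, loc_xy, scale)
        viewer.show(frame)
        time.sleep(frame_interval)
```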
In the initial stage of a starting operation of the system, the comparative surrounding image stored in the memory 514 is a surrounding image in a normal condition captured at the time of installation of the camera unit 51. After that, the comparative surrounding image may be updated at any time in accordance with one of the methods (1) to (3) described in the fourth embodiment.
The server unit 52 converts the circular omnidirectional image received to a rectangular image (
The camera unit 51 compares each of a plurality of (n=2 in the case of
The camera unit 51 transmits n fan-shaped extracted images (
If the viewer 1200 displays the rectangular extracted images as shown in
To solve this problem, the viewer 1200 produces n images as shown in
The viewer 1200 sequentially displays the n images, each having a superimposed partial image, which enables a user to correctly recognize the surrounding circumstances while monitoring the movement of a target object.
In this instance, the resolution of a surrounding image stored in the viewer 1200 may not necessarily coincide with that of an extracted image (
As described above, according to the sixth embodiment, when a change in circumstances, for example, intrusion of a person, at a monitoring place is detected, only partial images corresponding to the change included in a surrounding image captured at that time are transmitted as frame-advance images to a display device via a network. At the display device, the partial images are sequentially displayed in superimposition on a normal surrounding image previously received. Accordingly, even with the use of a small display device with a resolution of 120×160 pixels mounted on a small-sized portable apparatus (for example, a portable communication terminal typified by a mobile phone), a user can accurately recognize the circumstances of the monitoring place.
Furthermore, when a camera unit is installed, a normal surrounding image containing no image of the user himself is transmitted to a viewer and is displayed thereon. In addition, the user is informed of such transmission by the viewer. Accordingly, the user can confirm that installation of the camera unit has been properly completed.
The present invention can also be achieved by providing a system or apparatus with a storage medium that stores program code of software for realizing the functions of the above-described embodiments, and causing a computer (or a CPU (central processing unit), MPU (micro-processing unit) or the like) of the system or apparatus to read the program code from the storage medium and then to execute the program code.
In this case, the program code itself read from the storage medium realizes the functions of the embodiments.
In addition, the storage medium for providing the program code includes a flexible disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM (compact disk—read-only memory), a CD-R (compact disk—recordable), a CD-RW (compact disk—rewritable), a DVD-ROM (digital versatile disk—read-only memory), a DVD-RAM (digital versatile disk—random-access memory), a DVD-RW (digital versatile disk—rewritable), a DVD-R (digital versatile disk—recordable), a magnetic tape, a non-volatile memory card, a ROM (read-only memory), etc.
Furthermore, besides the program code read by the computer being executed to realize the functions of the above-described embodiments, the present invention includes an OS (operating system) or the like running on the computer performing an actual process in whole or in part according to instructions of the program code to realize the functions of the above-described embodiments.
Moreover, the present invention also includes a CPU or the like contained in a function expansion board inserted into the computer or in a function expansion unit connected to the computer, the function expansion board or the function expansion unit having a memory in which the program code read from the storage medium is written, the CPU or the like performing an actual process in whole or in part according to instructions of the program code to realize the functions of the above-described embodiments.
The invention has been described in detail with particular reference to exemplary embodiments thereof, but it will be understood that variations and modifications can be effected within the scope of the invention as described above, and as noted in the appended claims, by a person of ordinary skill in the art without departing from the scope of the invention.
Number | Date | Country | Kind
---|---|---|---
2003-386481 | Nov 2003 | JP | national
2003-387882 | Nov 2003 | JP | national
2003-380733 | Nov 2003 | JP | national
2004-238444 | Aug 2004 | JP | national