The present invention relates to a person count apparatus, a person count method, and a non-transitory computer-readable storage medium.
In recent years, surveillance cameras have increasingly been installed in various places. In addition, a technique of performing analysis for the purpose of security and sales promotion by installing surveillance cameras in a building and reproducing and browsing images captured by the surveillance cameras by the manager of the building or the like has recently been developed.
For example, PTL 1 describes a technique of detecting the number of specific objects such as persons from an image captured by an image capturing apparatus and outputting information corresponding to the number. Furthermore, PTL 2 describes a technique of displaying a detected result so that it is readily understood by a manager or the like. More specifically, PTL 2 discloses a system for displaying a histogram of the numbers of persons entering a predetermined area for each time period, and displaying, in response to selection of a graph of an arbitrary time period of the histogram, an image captured by an image capturing apparatus during the selected time period.
PTL 1: Japanese Patent Laid-Open No. 2014-6586
PTL 2: Japanese Patent Laid-Open No. 2010-181920
Depending on the purpose of analysis of an image, it is necessary to grasp, for each time period, the number of persons existing in a predetermined region and the number of persons passing a predetermined position in the predetermined region in consideration of movements of persons. In addition, it is important to present (display) the counted numbers of persons in a form to be readily understood by a manager or the like. However, the techniques disclosed in the above-described literatures do not satisfy these requirements.
In consideration of the above-described problem, the present invention provides a technique of detecting the number of persons existing in a predetermined region and the number of persons passing a predetermined position in the predetermined region, and performing display in accordance with each detection method.
As one solution for achieving the above object, a person count apparatus of the present invention has the following arrangement. That is, there is provided a person count apparatus comprising: an obtaining unit configured to obtain a person count result of a region person count that counts the number of persons existing in a predetermined region of an image captured by an image capturing unit, and a person count result of a passing person count that counts the number of persons passing a predetermined line set in an image captured by the image capturing unit; and a display control unit configured to display, on a display unit, a plurality of person count results for each time period based on the region person count, based on the person count result of the region person count obtained by the obtaining unit, and to display, on the display unit, a plurality of person count results for each time period based on the passing person count, based on the person count result of the passing person count obtained by the obtaining unit, wherein the display control unit displays, on the display unit, a still image captured in a time period corresponding to a person count result selected from the plurality of person count results for each time period based on the region person count, and reproduces a moving image captured in a time period corresponding to a person count result selected from the plurality of person count results for each time period based on the passing person count.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
The present invention will be described in detail below based on embodiments of the present invention with reference to the accompanying drawings. Note that arrangements to be described in the following embodiments are merely examples, and the present invention is not limited to the illustrated arrangements.
Image data 108 is image data generated by performing image capturing by a camera (not shown) serving as an image capturing unit connected to the person count management apparatus 101. The image data 108 is saved in an HDD (Hard Disk Drive) 204 (described later).
The result image creation unit 104 creates, from the image data received from the image acquisition unit 105 or from data received from a database 109 via the file input/output unit 106, a result image (person count result image) to be presented to the user. In this embodiment, the result image creation unit 104 creates a result image, but it may create a result in a form other than an image. The graph creation unit 103 receives, via the display control unit 107, information set in a display unit 208 (described later), and creates a graph based on the received information.
The CPU 201 comprehensively controls the person count management apparatus 101. The ROM 202 stores a program and the like to be used by the CPU 201 to control the person count management apparatus 101. Note that a secondary storage device may be used instead of the ROM 202. The RAM 203 is a memory for deploying a program read out from the ROM 202 and for executing processing. The RAM 203 serves as a temporary storage memory, and is also used as a storage area to temporarily store data to undergo various processes. The HDD 204 stores the database 109. The network interface 205 serves as an interface for performing communication via a network 206 to acquire the image data 108 of a connection destination. The display unit 208 is a device including a screen such as a display for displaying the information received from the image acquisition unit 105, the graph creation unit 103, the result image creation unit 104, or the like. The operation unit 207 is used by an input operation performed by the user, and includes, for example, a mouse and a touch panel.
As described above, the hardware arrangement of the person count management apparatus 101 includes the same hardware components as those mounted on a general PC (Personal Computer). Therefore, various functions implemented by the person count management apparatus 101 can be implemented as software operating on a general PC.
The operation of the person count management apparatus 101 will be described. First, the display control unit 107 displays the image data 108 received via the image acquisition unit 105 on the screen of the display unit 208.
In the image display area 302, the image of the image data 108 acquired by the image acquisition unit 105 via the network interface 205 is displayed. The region person count button 303 is a button for instructing to count the number of persons existing in a predetermined region for each time period. The passing person count button 304 is a button for instructing to count the number of persons passing a predetermined position in the predetermined region for each time period. The graph setting dialog activation button 305 is a button for instructing to display, on the display unit 208, a graph setting dialog to be used to make settings for creating/displaying a graph.
The operation of the person count management apparatus 101 when the user selects each of the region person count button 303, the passing person count button 304, and the graph setting dialog activation button 305 via the operation unit 207 will be described below.
A case in which the user selects the region person count button 303 via the operation unit 207 will first be explained. Note that the image data 108 for which the object detection unit 102 measures the number of persons when the region person count button 303 is selected is a moving image or a still image. In response to selection of the region person count button 303, the object detection unit 102 measures the number (person count) of persons in the image data 108. The object detection unit 102 continuously acquires the image data 108, and continuously measures the person count. The object detection unit 102 outputs, as measurement data, pieces of information of the image capturing time of the image data 108 as a person count measurement target, the measured person count, the coordinates and size of each detected person in the image, and the image data 108 to the file input/output unit 106 in association with each other. The file input/output unit 106 saves the measurement data received from the object detection unit 102 in the database 109. Note that the saving destination of the data is not limited to the database 109.
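The measurement data described above (image capturing time, measured person count, and the coordinates and size of each detected person, associated with one another) can be sketched as a simple record type. This is a minimal illustration under assumed names, not the actual implementation:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical record mirroring the measurement data described above.
# Field names and types are illustrative assumptions, not from the source.
@dataclass
class RegionCountRecord:
    capture_time: float  # image capturing time (e.g., epoch seconds)
    person_count: int    # measured number of persons in the region
    # (x, y, width, height) of each detected person in the image
    detections: List[Tuple[int, int, int, int]] = field(default_factory=list)

def make_record(capture_time, boxes):
    """Associate the capture time, the person count, and the per-person
    coordinates/sizes, as the object detection unit is described to do
    before handing the data to the file input/output unit."""
    return RegionCountRecord(capture_time, len(boxes), list(boxes))
```

In practice each record would also carry (or reference) the image data 108 itself, as the text states; that part is omitted here for brevity.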
A case in which the user selects the passing person count button 304 via the operation unit 207 will be described next. Note that the image data 108 which serves as a person count measurement target of the object detection unit 102 when the passing person count button 304 is selected is a moving image. In response to selection of the passing person count button 304, the object detection unit 102 measures the numbers of persons passing a detection line (passage line) set in the image of the image data 108 both in the same direction (positive direction) as a set direction and in a direction (opposite direction) opposite to the set direction. The detection line and the direction may be set by the user, as will be described later.
The object detection unit 102 continuously acquires the image data 108, and continuously measures the person count. The object detection unit 102 outputs, as measurement data, pieces of information of the image capturing time of the image data 108 as a person count measurement target, the measured person count, the coordinates and size of each detected person in the image, the set detection line, the set direction (positive direction), and the image data 108 to the file input/output unit 106 in association with each other. The file input/output unit 106 saves the measurement data received from the object detection unit 102 in the database 109. Note that the saving destination of the data is not limited to the database 109.
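The directional passage detection described above can be sketched with a standard orientation (cross-product) test on successive positions of a tracked person. This is a simplified sketch: it assumes straight-line motion between frames, treats the detection line as infinite, and which side counts as the positive ("In") direction is an assumption.

```python
def _side(line_a, line_b, p):
    """Sign (-1, 0, +1) of the cross product: which side of the
    detection line a-b the point p lies on."""
    ax, ay = line_a
    bx, by = line_b
    px, py = p
    v = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    return (v > 0) - (v < 0)

def crossing_direction(line_a, line_b, prev_pos, cur_pos):
    """Return 'in' if a tracked person moved across the detection line in
    the (assumed) positive direction, 'out' for the opposite direction,
    and None if the line was not crossed between the two positions."""
    s0 = _side(line_a, line_b, prev_pos)
    s1 = _side(line_a, line_b, cur_pos)
    if s0 == s1 or 0 in (s0, s1):
        return None
    return "in" if s1 > 0 else "out"
```

A production implementation would additionally check that the motion segment intersects the finite detection line, not just that the side changed.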
A case in which the user selects the graph setting dialog activation button 305 via the operation unit 207 will be described next. In response to selection of the graph setting dialog activation button 305, the display control unit 107 displays, on the screen of the display unit 208, the graph setting dialog to be used to make settings for creating/displaying a graph.
The graph setting dialog 601 includes a start date and time/end date and time setting region 602, a type setting region 603, a search button 604, a detection region setting region 605, a totalization unit setting region 606, and display/cancel buttons 607.
The start date and time/end date and time setting region 602 is a region where a graph creating target period is set, and the user sets the target period by specifying a start date and time and an end date and time. In the type setting region 603, the user can select one of the “region person count” and the “passing person count” from, for example, a pull-down list. If the “region person count” is selected, the graph creation unit 103 creates a graph corresponding to the person count measured in response to selection of the region person count button 303. If the “passing person count” is selected, the graph creation unit 103 creates a graph corresponding to the person count measured in response to selection of the passing person count button 304.
When the user completes the settings in the start date and time/end date and time setting region 602 and the type setting region 603 and selects the search button 604, the detection region setting region 605 and the totalization unit setting region 606, whose contents have been selected by the CPU 201 to correspond to the settings in the start date and time/end date and time setting region 602 and the type setting region 603, are displayed. The detection region setting region 605 is a region where an image capturing target region is set, and the user can select the region from, for example, a pull-down list. Note that the detection region setting region 605 may instead be configured to designate the image capturing unit used to generate the image data 108. The totalization unit setting region 606 is a region where the totalization unit time of graph creation is set, and the user can select the unit time from, for example, a pull-down list.
When the settings in the regions up to the totalization unit setting region 606 are complete and the user selects the display button of the display/cancel buttons 607, the graph creation unit 103 creates a graph based on the set information. After the graph is created, the display control unit 107 displays the created graph on the screen of the display unit 208.
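The totalization into unit-time bins that the graph creation unit performs can be sketched as follows. This is an assumed implementation, not the source's: summing per bin matches the passing person count, while a region person count graph might instead plot an average or maximum per bin.

```python
from collections import OrderedDict

def totalize(records, start, end, unit_seconds):
    """Group (capture_time, person_count) pairs falling in [start, end)
    into bins of unit_seconds, summing the counts per bin -- the per-bin
    totals a histogram graph would plot. Bin semantics are assumptions."""
    bins = OrderedDict()
    t = start
    while t < end:
        bins[t] = 0
        t += unit_seconds
    for capture_time, count in records:
        if start <= capture_time < end:
            key = start + ((capture_time - start) // unit_seconds) * unit_seconds
            bins[key] += count
    return bins
```

Each key is the start time of a totalization bin; each value would correspond to the height of one bar of the displayed graph.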
On the other hand, the graph setting dialog 611 is used to make settings for creating/displaying a graph of the passing person count.
The start date and time/end date and time setting region 612, the type setting region 613, the search button 614, the totalization unit setting region 616, and the display/cancel buttons 617 are similar to the start date and time/end date and time setting region 602, the type setting region 603, the search button 604, the totalization unit setting region 606, and the display/cancel buttons 607 in the graph setting dialog 601, respectively.
When the settings in the regions up to the totalization unit setting region 616 are complete and the user selects the display button of the display/cancel buttons 617, the graph creation unit 103 creates a graph based on the set information. After the graph is created, the display control unit 107 displays the created graph on the screen of the display unit 208.
Note that the graph setting dialogs described above are merely examples, and the arrangements are not limited to the illustrated ones.
When the user selects the bar 702 via the operation unit 207, the result image creation unit 104 selects, from among the measurement data which are measured in the time period of the totalization unit time in which the selected bar 702 exists and saved in the database 109, the measurement data whose image capturing time is earliest. Then, the result image creation unit 104 acquires, from the database 109 via the file input/output unit 106, the measured person count, the coordinates and size of each detected person in the image, and the image data, all of which are included in the selected measurement data. If the image data cannot be acquired from the measurement data, the result image creation unit 104 may acquire the image data 108 from the image acquisition unit 105. Subsequently, the result image creation unit 104 creates a result from the acquired data. In this embodiment, the result image creation unit 104 creates a region person count result image to be displayed as a result on the display unit 208.
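The selection of the earliest measurement data within the totalization unit time of the selected bar can be sketched as follows; the record shape (capture time as the first element) is an assumption for illustration.

```python
def earliest_in_bin(records, bin_start, unit_seconds):
    """From records shaped (capture_time, ...), pick the one with the
    earliest capture time inside the totalization bin that the selected
    bar represents; return None if the bin holds no measurement data."""
    in_bin = [r for r in records if bin_start <= r[0] < bin_start + unit_seconds]
    return min(in_bin, key=lambda r: r[0]) if in_bin else None
```

For example, with a 10-second bin starting at time 10, a record captured at time 11 would be chosen over one captured at time 12, and records outside the bin are ignored.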
Note that the above display form is merely an example, and the present invention is not limited to it.
When the user selects the bar 902 via the operation unit 207, the result image creation unit 104 operates as follows.
(1) The result image creation unit 104 selects the measurement data whose image capturing time is earliest, among the measurement data which have been measured in the time period of the totalization unit time in which the selected bar 902 exists and saved in the database 109.
(2) The result image creation unit 104 acquires, from the database 109 via the file input/output unit 106, the measured person count, the coordinates and size of each detected person in the image, and the image data, all of which are included in the selected measurement data. If it is impossible to acquire the image data from the measurement data, the result image creation unit 104 acquires the image data 108 from the image acquisition unit 105.
(3) The result image creation unit 104 creates, from the acquired data, a passing person count result image to be displayed as a result on the display unit 208.
(4) Returning to (1), the result image creation unit 104 selects the measurement data whose image capturing time is the second earliest in the time period of the totalization unit time in which the bar 902 exists. If there is no such measurement data, the operation ends.
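The loop (1) to (4) above amounts to iterating over the bin's measurement data from earliest to latest, creating one result image per record. A sketch, again with an assumed record shape:

```python
def result_images_in_order(records, bin_start, unit_seconds):
    """Yield the measurement data in the selected totalization bin from
    earliest to latest capture time, mirroring steps (1)-(4): repeatedly
    take the next-earliest record until none remain."""
    in_bin = sorted(
        (r for r in records if bin_start <= r[0] < bin_start + unit_seconds),
        key=lambda r: r[0],
    )
    for record in in_bin:
        # In the apparatus, a passing person count result image would be
        # created and displayed from each record at this point.
        yield record
```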
As the display form (the form of the passing person count result image 1001) of (3) above, the following forms are also possible.
(A) If a bar representing the number of persons passing the detection line in the positive direction (In person count) is selected, only an image at the instant when a person passes the detection line in the positive direction (within a predetermined time before and after the passage) is created and displayed as at least part of the passing person count result image 1001.
(B) If a bar representing the number of persons passing the detection line in the opposite direction (Out person count) is selected, only an image at the instant when a person passes the detection line in the opposite direction (within a predetermined time before and after the passage) is created and displayed as at least part of the passing person count result image 1001.
(C) If the bars representing the numbers of persons in both the positive direction and the opposite direction (In person count and Out person count) are selected, result images are created, regardless of the direction, from the data of the time period of the totalization unit time in which the bars exist, as at least parts of the passing person count result image 1001, and all the images are displayed in time series or simultaneously.
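The direction-based filtering behind display forms (A) to (C) can be sketched as follows; the record shape (time, direction) and the direction labels 'in'/'out' are illustrative assumptions.

```python
def select_for_display(records, selected_directions):
    """Filter passing-count records shaped (capture_time, direction) by
    crossing direction, keeping time order: pass {'in'} for form (A),
    {'out'} for form (B), and {'in', 'out'} for form (C)."""
    return [r for r in sorted(records) if r[1] in selected_directions]
```

Form (C) thus reduces to keeping every record of the bin in time order, while (A) and (B) keep only one direction each.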
As described above, according to this embodiment, measured results can be saved, the saved results can be called up later, the results during a specific period can be plotted and displayed as a graph, and a measurement result image corresponding to a graph element can be displayed from the displayed graph. By changing the display form in accordance with the type of person count to be measured, it is possible to present a display that the user can analyze more easily.
According to the present invention, it is possible to detect the number of persons existing in a predetermined region and the number of persons passing a predetermined position in the predetermined region, and to perform display in accordance with each detection method.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Number | Date | Country | Kind |
---|---|---|---|
JP2017-107067 | May 2017 | JP | national |
This application is a Continuation of International Patent Application No. PCT/JP2018/015324, filed Apr. 12, 2018 which claims the benefit of Japanese Patent Application No. 2017-107067, filed May 30, 2017, both of which are hereby incorporated by reference herein in their entirety.
Number | Date | Country |
---|---|---|
2005-135188 | May 2005 | JP |
2005-143016 | Jun 2005 | JP |
2005-148863 | Jun 2005 | JP
2008-016895 | Jan 2008 | JP |
2010-181920 | Aug 2010 | JP |
2014-006586 | Jan 2014 | JP |
2016-085688 | May 2016 | JP |
2018221030 | Dec 2018 | WO |
Number | Date | Country | |
---|---|---|---|
20200097736 A1 | Mar 2020 | US |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2018/015324 | Apr 2018 | US |
Child | 16697425 | US |