This application claims the benefit of People's Republic of China application Serial No. 201610142680.X, filed Mar. 14, 2016, the disclosure of which is incorporated by reference herein in its entirety.
The disclosure relates in general to a processing method and a processing system, and more particularly to an image processing method and an image processing system.
Along with the development of image processing technology, various image detections, such as people detection, object detection, motion detection and car detection, have been developed. These image detections are widely used in several applications, such as environmental monitoring, driving recording, and web video chatting.
However, in some of these applications, performing only one kind of image detection on the whole frame of the image data does not adequately meet the variety of needs. This issue thus causes a major bottleneck in the development of image processing technology.
The disclosure is directed to an image processing method and an image processing system in which a plurality of image detections are performed on a plurality of regions of an image data, such that the detections on the image data can adequately meet the variety of needs.
According to an embodiment, an image processing method is provided. The image processing method includes the following steps: An image data is cropped to obtain a plurality of regions. A plurality of image detections are performed on the regions.
According to another embodiment, an image processing system is provided. The image processing system includes a cropping unit and a processing unit. The cropping unit is for cropping an image data to obtain a plurality of regions. The processing unit is for performing a plurality of image detections on the regions.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
In one embodiment of the present invention, a plurality of image detections are performed on a plurality of regions of an image data, such that the detections on the image data can adequately meet the variety of needs.
Please refer to
The image processing system 100 includes a cropping unit 120, a processing unit 130 and an analyzing unit 150. The image processing system 100 is used for processing an image data D1. The image data D1 may be obtained from a network interface, a storage unit or an image sensor.
The cropping unit 120 is used for performing a frame cropping process. The processing unit 130 is used for performing various image detections. The analyzing unit 150 is used for analyzing the result of the image detections performed by the processing unit 130 to determine whether an event needs to be recorded or reported. Each of the cropping unit 120, the processing unit 130 and the analyzing unit 150 may be a circuit, a chip, a circuit board, a computer, or a storage device storing a plurality of program codes. Two or all three of the cropping unit 120, the processing unit 130 and the analyzing unit 150 may be integrated into a single component.
The processing unit 130 includes a plurality of detectors, such as a first detector 131, a second detector 132, etc. The first detector 131 and the second detector 132 are used for performing different image detections, such as people detection, object detection, motion detection, car detection, etc.
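As an illustration only (the class and method names below are hypothetical and not part of the disclosure), a processing unit holding several pluggable detectors, one of which performs a simple frame-difference motion detection, might be sketched as:

```python
import numpy as np

class MotionDetector:
    """Hypothetical detector: flags a region as active when the mean
    absolute difference against the previous frame exceeds a threshold."""
    def __init__(self, threshold=10.0):
        self.threshold = threshold
        self.prev = None

    def detect(self, region):
        cur = region.astype(np.float32)
        if self.prev is None or self.prev.shape != cur.shape:
            self.prev = cur          # first frame: nothing to compare against
            return False
        diff = np.abs(cur - self.prev).mean()
        self.prev = cur
        return diff > self.threshold

class ProcessingUnit:
    """Hypothetical processing unit 130: maps each region name to a detector."""
    def __init__(self, detectors):
        self.detectors = detectors   # e.g. {"R11": MotionDetector(), ...}

    def process(self, regions):
        # Run each region through its assigned detector, if any.
        return {name: self.detectors[name].detect(img)
                for name, img in regions.items() if name in self.detectors}
```

A real system would substitute trained people-detection or car-detection models for the toy detector shown here; only the dispatch structure is the point.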
The operation of the image processing system 100 is illustrated by a flowchart. Please refer to
As shown in
In the step S120, the cropping unit 120 crops the image data D1 to obtain a plurality of regions. Please refer to
As shown in
In the step S130, the processing unit 130 performs various image detections on those regions. For example, humans or pets may pass through the front door and will be shown in the region R11, so the first detector 131 of the processing unit 130 performs the motion detection on the region R11. The trees shown in the region R12 and the region R13 are easily swayed by wind, so the people detection is performed on the region R12 and the region R13 instead of the motion detection.
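The two steps above can be sketched as follows; the region coordinates and detector callables are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def crop_regions(frame, boxes):
    """Step S120 sketch: cut the frame into named rectangular regions.
    `boxes` maps region name -> (top, left, height, width)."""
    return {name: frame[t:t + h, l:l + w]
            for name, (t, l, h, w) in boxes.items()}

# Step S130 sketch: a per-region detection plan, e.g. motion detection at
# the front door (R11) and people detection on the tree regions (R12, R13)
# where wind-driven motion would cause false alarms.
DETECTION_PLAN = {"R11": "motion", "R12": "people", "R13": "people"}

def run_detections(regions, detectors):
    """Dispatch each cropped region to the detector named in the plan."""
    return {name: detectors[DETECTION_PLAN[name]](img)
            for name, img in regions.items()}
```

Because each region carries its own detection type, a single frame can be covered by several detections at once rather than one detection applied to the whole frame.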
As such, in a complex frame, appropriate image detections may be respectively performed on different regions to increase the detection accuracy and reduce false positives and false negatives.
Please refer to
Please refer to
Please refer to
Please refer to
Please refer to
In the step S240, the self-learning unit 240 adjusts the regions according to a result of the image detections. Please refer to
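One possible form of the region adjustment in step S240 is sketched below; the hit-rate heuristic and its thresholds are assumptions for illustration, not the disclosed rule:

```python
def adjust_region(box, hit_history, grow=4, shrink=2, lo=0.1, hi=0.8):
    """Step S240 sketch: widen a region whose detection hit rate is high,
    on the premise that activity may spill past its borders, and shrink a
    mostly idle region. `box` is (top, left, height, width);
    `hit_history` is a list of booleans from past detections."""
    t, l, h, w = box
    if not hit_history:
        return box
    rate = sum(hit_history) / len(hit_history)
    if rate > hi:                        # frequent hits: expand outward
        return (max(t - grow, 0), max(l - grow, 0),
                h + 2 * grow, w + 2 * grow)
    if rate < lo and h > 2 * shrink and w > 2 * shrink:
        return (t + shrink, l + shrink,  # mostly idle: contract
                h - 2 * shrink, w - 2 * shrink)
    return box
```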
Please refer to
In the step S350, the self-learning unit 340 adjusts the image detections according to a result of the image detections. Please refer to
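A minimal sketch of the detection adjustment in step S350 is shown below; the switching rule is an assumption chosen to match the earlier tree-region example, not the disclosed method:

```python
def adjust_detection(current, hit_history, always_on=0.95):
    """Step S350 sketch: if motion detection fires on nearly every frame,
    the motion is probably background noise (e.g. trees swaying in the
    wind), so fall back to people detection, which ignores non-human
    movement. `hit_history` is a list of booleans from past detections."""
    if not hit_history:
        return current
    rate = sum(hit_history) / len(hit_history)
    if current == "motion" and rate >= always_on:
        return "people"
    return current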
In one embodiment, the image processing method may include the step S240 of the
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2016 1 0142680 | Mar 2016 | CN | national |
Number | Name | Date | Kind |
---|---|---|---|
6061014 | Rautanen et al. | May 2000 | A |
7602944 | Campbell et al. | Oct 2009 | B2 |
9129385 | Xie | Sep 2015 | B2 |
9208554 | Shehata et al. | Dec 2015 | B2 |
9779331 | Bulan | Oct 2017 | B2 |
9801550 | Ferrantelli | Oct 2017 | B2 |
9819865 | Evans, V | Nov 2017 | B2 |
9928596 | Beall | Mar 2018 | B2 |
20040120581 | Ozer et al. | Jun 2004 | A1 |
20090087085 | Eaton | Apr 2009 | A1 |
20090252435 | Wen | Oct 2009 | A1 |
20090263021 | Takamori | Oct 2009 | A1 |
20100002071 | Ahiska | Jan 2010 | A1 |
20110050939 | Tsurumi | Mar 2011 | A1 |
20120033896 | Barrows | Feb 2012 | A1 |
20120327241 | Howe | Dec 2012 | A1 |
20130114703 | DeForest et al. | May 2013 | A1 |
20130230099 | DeForest et al. | Sep 2013 | A1 |
20130242079 | Zhou | Sep 2013 | A1 |
20130259385 | Xie | Oct 2013 | A1 |
20130286193 | Pflug | Oct 2013 | A1 |
20140177946 | Lim et al. | Jun 2014 | A1 |
20140355829 | Heu et al. | Dec 2014 | A1 |
20140369417 | Shi et al. | Dec 2014 | A1 |
20150125032 | Yamanaka et al. | May 2015 | A1 |
20150269427 | Kim et al. | Sep 2015 | A1 |
20150310624 | Bulan et al. | Oct 2015 | A1 |
20170126972 | Evans, V | May 2017 | A1 |
20170280055 | Kaida | Sep 2017 | A1 |
20180013953 | Evans, V | Jan 2018 | A1 |
20180211104 | Zhao | Jul 2018 | A1 |
20180330510 | Watanabe | Nov 2018 | A1 |
20190005654 | Takahashi | Jan 2019 | A1 |
Number | Date | Country |
---|---|---|
101216885 | Jul 2008 | CN |
101867699 | Oct 2010 | CN |
101901334 | Dec 2010 | CN |
102004918 | Apr 2011 | CN |
102955929 | Mar 2013 | CN |
103150552 | Jun 2013 | CN |
104364824 | Feb 2015 | CN |
104427337 | Mar 2015 | CN |
102750527 | Aug 2015 | CN |
104866842 | Aug 2015 | CN |
105118072 | Dec 2015 | CN |
Entry |
---|
CN Office Action dated Aug. 3, 2018 in corresponding Chinese application (No. 201610142680.X). |
CN Office Action dated Aug. 2, 2019 in corresponding Chinese application (No. 201610142680.X), pp. 1-6. |
CN Office Action dated Mar. 5, 2019 in corresponding Chinese application (No. 201610142680.X). |
Number | Date | Country |
---|---|---|
20170262998 A1 | Sep 2017 | US |