The present disclosure relates to computer modeling of structures and property. More specifically, the present disclosure relates to systems and methods for rapidly developing annotated computer models of structures.
It is well-known in the field of computer-aided design (CAD) to utilize software products to create computer models of structures. Indeed, for many years, computers have been used to develop models of structures and property for various purposes. For example, customized software packages have been developed over the years that allow insurance companies, contractors, and other entities to create models of structures and properties. One such example is the XACTIMATE software package, which is widely used in the insurance claims processing industry to create computerized models of buildings and materials for purposes of repairing and replacing structures due to property damage and other causes.
In addition to the above, there have been rapid advances in the field of computer-generated models of structures and property, achieved by applying computer vision techniques to digital imagery (e.g., aerial imagery, satellite imagery, etc.) to create three-dimensional (3D) models of such structures. Examples of widely-used software packages which generate such models from aerial imagery include the GEOMNI ROOF and GEOMNI PROPERTY software packages. These systems create complex, three-dimensional models of structures by processing features in aerial images.
While the advent of computer vision techniques has made the process of creating models of structures (and property) easier to accomplish than was previously possible, there is still a need to rapidly create annotated computer models of structures, e.g., models of buildings, property, and other structures which not only accurately model the real-world structures that they represent, but also are annotated with rich information delineating real-world attributes relating to such structures. Accordingly, the systems and methods of the present disclosure address these shortcomings of existing technologies.
The present disclosure relates to systems and methods for rapidly developing annotated computer models of structures and property. The system can generate three-dimensional (3D) models of structures and property using a wide variety of digital imagery, and/or can process existing 3D models created by other systems. The system processes the 3D models to automatically identify candidate objects within the 3D models that may be suitable for annotation, such as roof faces, chimneys, windows, gutters, etc., using computer vision techniques. Additionally, for each identified object, the system also automatically searches for and identifies additional related objects that may be suitable candidates for annotation. Once the candidate objects have been identified, the system automatically generates user interface screens which gather relevant information related to the candidate objects, so as to rapidly obtain, associate, and store annotation information related to the candidate objects. Additionally, the system dynamically adjusts questions presented to users in the user interface screens so as to increase the speed with which relevant information is obtained from users and associated with the objects of the model. When all relevant annotation information has been gathered and associated with model objects, the system can create a list of materials that can be used for future purposes, such as repair and/or reconstruction of real-world structures and property.
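The dynamic question flow and resulting list of real-world items described above can be illustrated with a short sketch. The following Python fragment is a minimal, hypothetical example only; the question text, dependency rules, option values, and item names are assumptions introduced for illustration and are not taken from the disclosure.

```python
# Hypothetical sketch of dynamically adjusted annotation questions and the
# list of real-world items regenerated from the answers (names are assumptions).
QUESTIONS = {
    "roof face": [
        {"id": "material", "text": "New Material Type?",
         "options": ["asphalt shingle", "ceramic tile", "metal"]},
        # Follow-up question shown only when a prior answer makes it relevant:
        {"id": "shingle_grade", "text": "Shingle grade?",
         "options": ["3-tab", "architectural"],
         "depends_on": ("material", "asphalt shingle")},
    ],
}

def next_questions(object_type, answers):
    """Return only the questions that still apply given the answers so far."""
    pending = []
    for q in QUESTIONS.get(object_type, []):
        dep = q.get("depends_on")
        if dep and answers.get(dep[0]) != dep[1]:
            continue  # skip follow-ups whose precondition is not met
        if q["id"] not in answers:
            pending.append(q)
    return pending

def items_from_answers(object_type, answers):
    """Regenerate the list of real-world items whenever an answer changes."""
    items = []
    if object_type == "roof face" and "material" in answers:
        items.append(f"{answers['material']} roofing")
        if answers.get("shingle_grade"):
            items.append(f"{answers['shingle_grade']} shingles")
    return items

answers = {"material": "asphalt shingle"}
print(next_questions("roof face", answers))    # prompts for shingle grade next
print(items_from_answers("roof face", answers))
```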
The foregoing features of the disclosure will be apparent from the following Detailed Description, taken in connection with the accompanying drawings, in which:
The present disclosure relates to systems and methods for rapid development of annotated computer models of structures and property, as described in detail below in connection with
In step 14, the images and metadata (package) can be processed by the system using one or more computer vision algorithms to create a three-dimensional (3D) model of the property/structure, as well as damage to such property/structure. It is noted that the system need not create a 3D model from aerial images, and indeed, the system could receive and process a previously-created 3D model that is transmitted to the system from another computer, if desired. If it is desired to create a 3D model from multiple images of the same property, there are numerous ways that such a 3D model can be generated. Known software tools can be used which perform sophisticated image processing algorithms to automatically extract the information from the images and generate the model. Other software tools allow operators to manually generate the models with some computer assistance. Still other tools use a combination of automatically generated and manually generated models. In any case, the result is a raw geometric model consisting of polygons, line segments, and points. If the system is utilized to generate a 3D model from the aerial imagery, various techniques could be used by the system to carry out such modeling, including, but not limited to, one or more of the techniques disclosed in issued U.S. Pat. Nos. 9,679,227 and 9,501,700; published PCT Application No. PCT/US2016/065947; and U.S. patent application Ser. No. 15/277,359, the entire disclosures of which are expressly incorporated herein by reference, or any other suitable techniques.
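For illustration, a raw geometric model of the kind described above (polygons, line segments, and points) could be represented in memory as follows. This Python sketch is a hypothetical representation; the class and field names are assumptions introduced for this example and are not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Illustrative types for a raw geometric model consisting of polygons,
# line segments, and points (all names are assumptions).
Point3D = Tuple[float, float, float]

@dataclass
class LineSegment:
    start: Point3D
    end: Point3D

@dataclass
class Polygon:
    vertices: List[Point3D]  # ordered boundary of a planar face

@dataclass
class RawGeometricModel:
    points: List[Point3D] = field(default_factory=list)
    line_segments: List[LineSegment] = field(default_factory=list)
    polygons: List[Polygon] = field(default_factory=list)

# Example: a single rectangular roof face reconstructed from imagery.
model = RawGeometricModel(
    polygons=[Polygon(vertices=[(0.0, 0.0, 3.0), (10.0, 0.0, 3.0),
                                (10.0, 8.0, 5.0), (0.0, 8.0, 5.0)])]
)
```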
In step 16, the system identifies attributes of objects in the property, and annotates the 3D model of the property/structure. In particular, in this step, the system automatically identifies components of the 3D model, such as points, lines, panels, geometric shapes, and free shapes, as candidates for annotation. For example, the system could automatically include points of the model as candidate structures for annotation as roof vents, lines of the model as candidate structures for annotation as gutters, panels (planar sections) of the model as candidate structures for annotation as skylights or windows, and various geometric shapes as candidates for annotation as other structures such as trees, vegetation, etc. Such automatic selection of objects of the 3D model as candidates for annotation could be accomplished using known computer vision techniques such as edge detection and classification techniques, region growing techniques, machine learning, etc. Such attributes and associated annotation(s) can include, but are not limited to:
Any of these elements can have specific features associated therewith. For example, exterior walls could be made from brick, stone, stucco, metal or some other material. Roofs could be made from asphalt shingles, ceramic tile, shake, metal, etc. Roof vents could be turbine, turtle, or some other type of vent. When the features are associated with a candidate object for annotation, they are stored as part of the annotation. The annotation information can be generated manually, through an automated process, or through some combination of the two. The automated process utilizes computer vision and machine learning techniques. The automated annotations can be broken into two types: structural and non-structural annotations. Structural annotations are elements that are attached to the 3D model of a structure. Examples of structural annotations are: roof vents, skylights, solar panels and roof materials. Non-structural annotations are those not related to any 3D model. Examples of non-structural annotations are: pools, trees, trampolines, and concrete flatwork.
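One possible, purely illustrative representation of structural and non-structural annotations and their associated features is sketched below in Python; the enumerated kinds, field names, and example values are assumptions introduced for clarity rather than details of the disclosure.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, Optional

class AnnotationKind(Enum):
    STRUCTURAL = "structural"          # attached to the 3D model (e.g., roof vents, skylights)
    NON_STRUCTURAL = "non_structural"  # not tied to the 3D model (e.g., pools, trees)

@dataclass
class Annotation:
    label: str                                               # e.g., "roof face", "pool"
    kind: AnnotationKind
    features: Dict[str, str] = field(default_factory=dict)   # e.g., {"material": "asphalt shingle"}
    model_element_id: Optional[int] = None                   # set only for structural annotations

# A structural annotation attached to a model element (here, a roof face):
roof = Annotation("roof face", AnnotationKind.STRUCTURAL,
                  features={"material": "asphalt shingle"}, model_element_id=62)

# A non-structural annotation with no model attachment:
pool = Annotation("pool", AnnotationKind.NON_STRUCTURAL, features={"type": "in-ground"})
```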
When annotations are entered manually, the system assists the operator in identifying, locating and entering elements associated with real-world items into the model. The system projects properly-oriented models onto different images of the property from different sources and orientations, to assist the user with annotation. The operator can then interact with the tool by adding a new property feature to the model, removing an existing property feature, or adding additional information about an existing property feature. These manual annotations can be entered at the time of the model creation or anytime afterward.
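Projecting a properly-oriented model onto an image can be illustrated with a standard pinhole-camera projection. The following Python sketch is a generic example using an assumed 3x4 projection matrix; it is not the specific projection method of the disclosure, and the matrix values and function name are hypothetical.

```python
import numpy as np

def project_model_vertices(vertices_world, camera_matrix):
    """Project 3D model vertices into image coordinates using a standard
    3x4 pinhole-camera projection matrix (a generic sketch of overlaying
    an oriented model on an image)."""
    pts = np.asarray(vertices_world, dtype=float)            # (N, 3)
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])     # homogeneous (N, 4)
    proj = (camera_matrix @ pts_h.T).T                       # (N, 3)
    return proj[:, :2] / proj[:, 2:3]                        # perspective divide

# Example with an assumed camera matrix (intrinsics combined with pose):
P = np.array([[1000.0,    0.0, 640.0, 0.0],
              [   0.0, 1000.0, 480.0, 0.0],
              [   0.0,    0.0,   1.0, 5.0]])
roof_corners = [(0.0, 0.0, 3.0), (10.0, 0.0, 3.0), (10.0, 8.0, 5.0)]
pixels = project_model_vertices(roof_corners, P)  # image locations for overlay
```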
In step 18, the system refines the 3D model after attributes of objects in the model have been identified and annotated, as noted above. An interactive process is implemented in step 18, and is illustrated in greater detail in connection with
Turning back to
As shown in
The annotation process enabled by the system of the present disclosure will now be described in greater detail. When the user clicks on the object (roof face) 62, the system automatically generates a user interface screen 70 as shown in
Note that the “New Material Type” question shown in the user interface 70 in
As part of the iterative process, a set of real-world items is generated each time an answer to a question is changed. This is illustrated in the version of the user interface 70 shown in
Having thus described the system and method in detail, it is to be understood that the foregoing description is not intended to limit the spirit or scope thereof. It will be understood that the embodiments of the present disclosure described herein are merely exemplary and that a person skilled in the art may make variations and modifications without departing from the spirit and scope of the disclosure. All such variations and modifications, including those discussed above, are intended to be included within the scope of the disclosure.
This application claims the benefit of U.S. Provisional Patent Application No. 62/585,078, filed on Nov. 13, 2017, the entire disclosure of which is expressly incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
5701403 | Watanabe | Dec 1997 | A |
6446030 | Hoffman et al. | Sep 2002 | B1 |
6448964 | Isaacs et al. | Sep 2002 | B1 |
8533063 | Erickson | Sep 2013 | B2 |
8843304 | Dupont et al. | Sep 2014 | B1 |
8868375 | Christian | Oct 2014 | B1 |
8983806 | Labrie et al. | Mar 2015 | B2 |
9158869 | Labrie et al. | Oct 2015 | B2 |
9501700 | Loveland et al. | Nov 2016 | B2 |
9679227 | Taylor et al. | Jun 2017 | B2 |
10127670 | Lewis et al. | Nov 2018 | B2 |
10181079 | Labrie et al. | Jan 2019 | B2 |
10289760 | Oakes, III et al. | May 2019 | B1 |
10387582 | Lewis et al. | Aug 2019 | B2 |
10445438 | Motonaga et al. | Oct 2019 | B1 |
10529028 | Davis et al. | Jan 2020 | B1 |
11314905 | Childs et al. | Apr 2022 | B2 |
20020116254 | Stein et al. | Aug 2002 | A1 |
20030009315 | Thomas et al. | Jan 2003 | A1 |
20070080961 | Inzinga et al. | Apr 2007 | A1 |
20070276626 | Bruffey | Nov 2007 | A1 |
20090179895 | Zhu | Jul 2009 | A1 |
20100110074 | Pershing | May 2010 | A1 |
20100114537 | Pershing | May 2010 | A1 |
20100296693 | Thornberry et al. | Nov 2010 | A1 |
20110056286 | Jansen | Mar 2011 | A1 |
20110157213 | Takeyama et al. | Jun 2011 | A1 |
20110191738 | Walker et al. | Aug 2011 | A1 |
20120026322 | Malka et al. | Feb 2012 | A1 |
20120179431 | Labrie et al. | Jul 2012 | A1 |
20120253725 | Malka et al. | Oct 2012 | A1 |
20120253751 | Malka et al. | Oct 2012 | A1 |
20130201167 | Oh et al. | Aug 2013 | A1 |
20130206177 | Burlutskiy | Aug 2013 | A1 |
20130226451 | O'Neill et al. | Aug 2013 | A1 |
20130262029 | Pershing | Oct 2013 | A1 |
20130314688 | Likholyot | Nov 2013 | A1 |
20140043436 | Bell et al. | Feb 2014 | A1 |
20140195275 | Pershing et al. | Jul 2014 | A1 |
20140301633 | Furukawa et al. | Oct 2014 | A1 |
20140320661 | Sankar et al. | Oct 2014 | A1 |
20150029182 | Sun et al. | Jan 2015 | A1 |
20150073864 | Labrie et al. | Mar 2015 | A1 |
20150093047 | Battcher et al. | Apr 2015 | A1 |
20150116509 | Birkler et al. | Apr 2015 | A1 |
20150153172 | Starns et al. | Jun 2015 | A1 |
20150193971 | Dryanovski et al. | Jul 2015 | A1 |
20150213558 | Nelson | Jul 2015 | A1 |
20150227645 | Childs et al. | Aug 2015 | A1 |
20150269438 | Samarasekera et al. | Sep 2015 | A1 |
20150302529 | Jagannathan | Oct 2015 | A1 |
20160098802 | Bruffey et al. | Apr 2016 | A1 |
20160110480 | Randolph | Apr 2016 | A1 |
20160246767 | Makadia | Aug 2016 | A1 |
20160282107 | Roland et al. | Sep 2016 | A1 |
20170124713 | Jurgenson et al. | May 2017 | A1 |
20170132711 | Bruffey et al. | May 2017 | A1 |
20170132835 | Halliday et al. | May 2017 | A1 |
20170169459 | Bruffey et al. | Jun 2017 | A1 |
20170193297 | Michini et al. | Jul 2017 | A1 |
20170206648 | Marra et al. | Jul 2017 | A1 |
20170221152 | Nelson et al. | Aug 2017 | A1 |
20170316115 | Lewis et al. | Nov 2017 | A1 |
20170330207 | Labrie et al. | Nov 2017 | A1 |
20170345069 | Labrie et al. | Nov 2017 | A1 |
20180053329 | Roberts | Feb 2018 | A1 |
20180067593 | Tiwari et al. | Mar 2018 | A1 |
20180089833 | Lewis et al. | Mar 2018 | A1 |
20180286098 | Lorenzo | Oct 2018 | A1 |
20180330528 | Loveland | Nov 2018 | A1 |
20180357819 | Oprea | Dec 2018 | A1 |
20180373931 | Li | Dec 2018 | A1 |
20190114717 | Labrie et al. | Apr 2019 | A1 |
20190221040 | Shantharam et al. | Jul 2019 | A1 |
20190340692 | Labrie et al. | Nov 2019 | A1 |
20190377837 | Lewis et al. | Dec 2019 | A1 |
20200100066 | Lewis et al. | Mar 2020 | A1 |
20200143481 | Brown et al. | May 2020 | A1 |
20210076162 | Wang et al. | Mar 2021 | A1 |
20210103687 | Harris et al. | Apr 2021 | A1 |
20210350038 | Jenson et al. | Nov 2021 | A1 |
20220309748 | Lewis et al. | Sep 2022 | A1 |
Number | Date | Country |
---|---|---|
2014151122 | Sep 2014 | WO |
2016154306 | Sep 2016 | WO |
2017100658 | Jun 2017 | WO |
Entry |
---|
International Search Report of the International Searching Authority dated Feb. 11, 2019, issued in connection with International Application No. PCT/US18/60762 (3 pages). |
Written Opinion of the International Searching Authority dated Feb. 11, 2019, issued in connection with International Application No. PCT/US18/60762 (7 pages). |
U.S. Appl. No. 62/512,989, filed May 31, 2017, entitled "Systems and Methods for Rapidly Developing Annotated Computer Models of Structures" (47 pages). |
Extended European Search Report dated Jul. 1, 2021, issued by the European Patent Office in connection with European Application No. 18876121.7 (8 pages). |
Invitation to Pay Additional Fees issued by the International Searching Authority dated Feb. 2, 2022, issued in connection with International Application No. PCT/US21/63469 (2 pages). |
Extended European Search Report dated Feb. 18, 2022, issued in connection with European Patent Application No. 19866788.3 (9 pages). |
Office Action dated Dec. 27, 2021, issued in connection with U.S. Appl. No. 16/580,741 (13 pages). |
Notice of Allowance dated Dec. 16, 2021, issued in connection with U.S. Appl. No. 14/620,004 (12 pages). |
International Search Report of the International Searching Authority dated Mar. 27, 2017, issued in connection with International Application No. PCT/US2016/65947 (3 pages). |
Written Opinion of the International Searching Authority dated Mar. 27, 2017, issued in connection with International Application No. PCT/US2016/65947 (7 pages). |
Office Action dated Sep. 26, 2018, issued in connection with U.S. Appl. No. 15/374,695 (33 pages). |
Notice of Allowance dated May 13, 2019, issued in connection with U.S. Appl. No. 15/374,695 (7 pages). |
Extended European Search Report dated Jun. 11, 2019, issued in connection with European Patent Application No. 16873975.3 (8 pages). |
Communication Pursuant to Article 94(3) EPC issued by the European Patent Office dated Apr. 22, 2020, issued in connection with European Patent Application No. 16873975.3 (6 pages). |
International Search Report of the International Searching Authority dated Dec. 12, 2019, issued in connection with International Application No. PCT/US2019/52670 (3 pages). |
Written Opinion of the International Searching Authority dated Dec. 12, 2019, issued in connection with International Application No. PCT/US2019/52670 (5 pages). |
Office Action dated Feb. 5, 2020, issued in connection with U.S. Appl. No. 16/580,741 (15 pages). |
International Search Report of the International Searching Authority dated May 14, 2015, issued in connection with International Application No. PCT/US15/015491 (3 pages). |
Written Opinion of the International Searching Authority dated May 14, 2015, issued in connection with International Application No. PCT/US15/015491 (9 pages). |
Fung, et al., “A Mobile Assisted Localization Scheme for Augmented Reality,” Department of Computer Science and Engineering, The Chinese University of Hong Kong, 2012 (76 pages). |
Sankar, et al., “Capturing Indoor Scenes With Smartphones,” UIST'12, Oct. 7-10, 2012, Cambridge, Massachusetts (9 pages). |
Office Action dated Aug. 8, 2017, issued in connection with U.S. Appl. No. 14/620,004 (26 pages). |
Office Action dated Aug. 28, 2018, issued in connection with U.S. Appl. No. 14/620,004 (33 pages). |
Farin, et al., “Floor-Plan Reconstruction from Panoramic Images,” Sep. 23-28, 2007, MM '07, ACM (4 pages). |
Office Action dated Mar. 29, 2019, issued in connection with U.S. Appl. No. 14/620,004 (22 pages). |
Office Action dated Dec. 10, 2019, issued in connection with U.S. Appl. No. 14/620,004 (27 pages). |
Zhang, et al., "Walk&Sketch: Create Floor Plans with an RGB-D Camera," Sep. 5-8, 2012, UbiComp '12, ACM (10 pages). |
Office Action dated Jul. 8, 2020, issued in connection with U.S. Appl. No. 14/620,004 (27 pages). |
Office Action dated Sep. 22, 2020, issued in connection with U.S. Appl. No. 16/580,741 (14 pages). |
Office Action dated Feb. 2, 2021, issued in connection with U.S. Appl. No. 14/620,004 (28 pages). |
Communication Pursuant to Article 94(3) EPC issued by the European Patent Office dated Feb. 18, 2021, issued in connection with European Patent Application No. 16873975.3 (5 pages). |
Examination Report No. 1 dated Mar. 30, 2021, issued by the Australian Patent Office in connection with Australian Patent Application No. 2016366537 (6 pages). |
Office Action dated Apr. 21, 2021, issued in connection with U.S. Appl. No. 16/580,741 (15 pages). |
Notice of Allowance dated Aug. 19, 2021, issued in connection with U.S. Appl. No. 14/620,004 (11 pages). |
Examiner-Initiated Interview Summary dated Aug. 10, 2021, issued in connection with U.S. Appl. No. 14/620,004 (1 page). |
Office Action dated Mar. 25, 2022, issued in connection with U.S. Appl. No. 16/545,607 (56 pages). |
International Search Report of the International Searching Authority dated Apr. 8, 2022, issued in connection with International Application No. PCT/US21/63469 (5 pages). |
Written Opinion of the International Searching Authority dated Apr. 8, 2022, issued in connection with International Application No. PCT/US21/63469 (6 pages). |
Dino, et al., “Image-Based Construction of Building Energy Models Using Computer Vision,” Automation in Construction (2020) (15 pages). |
Fathi, et al., “Automated As-Built 3D Reconstruction of Civil Infrastructure Using Computer Vision: Achievements, Opportunities, and Challenges,” Advanced Engineering Informatics (2015) (13 pages). |
International Search Report of the International Searching Authority dated Jul. 25, 2022, issued in connection with International Application No. PCT/US22/22024 (3 pages). |
Written Opinion of the International Searching Authority dated Jul. 25, 2022, issued in connection with International Application No. PCT/US22/22024 (5 pages). |
Office Action dated Sep. 2, 2022, issued in connection with U.S. Appl. No. 16/580,741 (13 pages). |
Office Action dated Oct. 13, 2022, issued in connection with U.S. Appl. No. 16/545,607 (53 pages). |
International Search Report of the International Searching Authority dated Nov. 18, 2022, issued in connection with International Application No. PCT/US22/030691 (6 pages). |
Written Opinion of the International Searching Authority dated Nov. 18, 2022, issued in connection with International Application No. PCT/US22/030691 (11 pages). |
Notice of Allowance dated Dec. 9, 2022, issued in connection with U.S. Appl. No. 17/705,130 (10 pages). |
Examination Report No. 1 dated Dec. 15, 2022, issued by the Australian Patent Office in connection with Australian Patent Application No. 2021282413 (3 pages). |
Communication Pursuant to Article 94(3) EPC dated Jan. 31, 2023, issued in connection with European Patent Application No. 16873975.3 (8 pages). |
Notice of Allowance dated Feb. 14, 2023, issued in connection with U.S. Appl. No. 17/705,130 (5 pages). |
Number | Date | Country | |
---|---|---|---|
20190147247 A1 | May 2019 | US |
Number | Date | Country | |
---|---|---|---|
62585078 | Nov 2017 | US |