The present invention generally relates to property measurement. More specifically, the present invention relates to computerized tools for measuring points of interest within a property and generating associated documentation.
Determining the elevation of a structure (e.g., a home, building, tower, or sculpture), or of a particular portion of a structure (e.g., a point whose elevation relative to sea level is potentially significant for flood risk), for governmental or flood insurance rating purposes has traditionally been performed by a surveyor or survey team. Starting from a known benchmark, the survey team fore-sights or back-sights to the property in question and notes the rise or fall in elevation along the way. A prominent location on a lot next to the structure in question is typically chosen by the survey team and accurately geo-located using map coordinates. Next, the elevation is determined for the chosen location based upon the multitude of measurements taken from a reference benchmark location with a known elevation. Subsequent structure measurements, such as the height of the first floor, second floor, split-level, or basement floor, are referenced from the chosen prominent location.
Once the surveyor or the survey team completes their measurements, these measurements are typically entered into paper-based documents such as the Federal Emergency Management Agency (FEMA) “Elevation Certificate” document. The Elevation Certificate must then be certified by the engineer or surveyor responsible for taking the measurements. Photographs showing the structure are also typically taken and attached to the Elevation Certificate. The Elevation Certificate is then provided to an insurance underwriter where the measurements are reviewed and appropriate rate information is determined utilizing flood premium charts provided by a government entity.
This traditional process of measuring the property is time-consuming, cumbersome, and difficult, and it invites inaccuracy and error if the surveying process is performed incorrectly. The process must also be performed by surveyors or individuals with experience surveying property, so conducting the survey can be costly due to the expense of hiring qualified surveyors. The survey can also be delayed by the unavailability of qualified surveyors on a particular date or in a particular region. Filling out the documents is equally time-consuming and introduces the possibility of transcription errors as well as loss of documentation, especially of originals or certified forms.
There is a need in the art for improved systems and methods for conducting property measurement and associated document workflow.
One exemplary method for property measurement includes receiving a first location from a first global positioning system (GPS) receiver located near a user device. The method also includes receiving a second location from a second global positioning system (GPS) receiver located near a structure. The method also includes calculating a distance between the first location and the second location. The method also includes receiving a digital photograph depicting at least one side of the structure and a point of interest that is part of the structure. The method also includes calculating an elevation of the point of interest using the digital photograph and the calculated distance. The method also includes storing the calculated elevation of the point of interest in a memory of the user device. The method also includes generating an electronic document that includes the calculated elevation of the point of interest.
One exemplary system for property measurement includes a first global positioning system (GPS) receiver. The system also includes a second global positioning system (GPS) receiver located near a structure. The system also includes a communication transceiver. The system also includes a camera module. The system also includes a memory. The system also includes a processor coupled to the memory and the camera module and the communication transceiver. Execution of instructions stored in the memory by the processor performs a number of system operations. The system operations include receiving a first location from a first global positioning system (GPS) receiver at the communication transceiver. The system operations also include receiving a second location from a second global positioning system (GPS) receiver at the communication transceiver. The system operations also include calculating a distance between the first location and the second location. The system operations also include receiving a digital photograph from the camera module, the digital photograph depicting at least one side of the structure and a point of interest that is part of the structure. The system operations also include calculating an elevation of the point of interest using the digital photograph and the calculated distance. The system operations also include storing the calculated elevation of the point of interest in a memory of the user device. The system operations also include generating an electronic document that includes the calculated elevation of the point of interest.
One exemplary non-transitory computer-readable storage medium may have embodied thereon a program executable by a processor to perform a method for property measurement. The exemplary program method includes receiving a first location from a first global positioning system (GPS) receiver located near a user device. The program method also includes receiving a second location from a second global positioning system (GPS) receiver located near a structure. The program method also includes calculating a distance between the first location and the second location. The program method also includes receiving a digital photograph depicting at least one side of the structure and a point of interest that is part of the structure. The program method also includes calculating an elevation of the point of interest using the digital photograph and the calculated distance. The program method also includes storing the calculated elevation of the point of interest in a memory of the user device. The program method also includes generating an electronic document that includes the calculated elevation of the point of interest.
Two global positioning system (GPS) receivers, along with a user device with a camera (e.g., a smartphone or tablet), can be used to determine an elevation of a point of interest on or within a structure. The user device and the first GPS receiver can be located somewhere outside the structure from which the structure is clearly visible. The second GPS receiver can be located on, within, or near the structure. The user device receives location data from both GPS receivers and calculates a distance between the two receivers. The user device then takes a digital photograph in which the structure is visible and notes the photo capture angle. The user device then trigonometrically calculates the elevation of the point of interest using the calculated GPS distance and the photo angle. The user device can then automatically insert this elevation into a document (e.g., a FEMA Elevation Certificate) and transmit the document to a requisite recipient or device.
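By way of illustration, the two calculations summarized above, the receiver-to-receiver distance and the trigonometric elevation, can be sketched as follows. The function names, the spherical-Earth simplification, and the flat-terrain tangent model are illustrative assumptions, not part of the claimed system:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters between two GPS fixes
    (spherical-Earth haversine formula; a simplifying assumption)."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def elevation_of_point(distance_m, camera_angle_deg, camera_elevation_m):
    """Elevation of a sighted point: the known camera elevation plus the
    rise implied by the tilt angle over the horizontal distance."""
    rise = distance_m * math.tan(math.radians(camera_angle_deg))
    return camera_elevation_m + rise
```

For example, a point sighted 10 m away at a 45-degree upward tilt from a camera at 2 m elevation resolves to an elevation of roughly 12 m.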
The point of interest 130 may be located anywhere along a face of the structure or within a structure 100. For example, the point of interest 130 may represent a particular floor such as one with structural issues. Point of interest 130 could also be a particular elevation at which the structure 100 may face flood dangers or any other point or elevation that may be useful or necessary to measure.
A user 190 is illustrated in
Measurements completed by the user device 110 in concert with the GPS receivers 170/175 are used to automatically generate a document. The document might be a Federal Emergency Management Agency (FEMA) “Elevation Certificate” form as illustrated in
The user 190 conducting the elevation measurement does not need to be a surveyor or engineer in order to perform the measurements. The user 190 can use a software application 290 executed on their user device 110 as part of the measuring process. The user first deploys the “base” global positioning system (GPS) receiver 170 and the “rover” GPS receiver 175. One or both of the GPS receivers 170/175 may be specialized GPS receivers incorporating Real Time Kinematics (RTK) technology and/or Real Time Network (RTN) technology for greater accuracy. One or both of the GPS receivers 170/175 could additionally or alternatively incorporate additional sensors such as accelerometers, Wi-Fi network-based positioning componentry, and cell-tower triangulation positioning componentry for additional accuracy. One or both GPS receivers 170/175 may be capable of measuring elevation in addition to latitude and longitude.
One or both of the GPS receivers 170/175 are mounted on a base, mount, tripod, or object of a known height, which then serves as an elevation offset 115. The “base” GPS receiver 170 is generally placed at a prominent location on the property that is associated with the structure 100 in question. Said receiver 170 is placed in clear view of the sky such that sufficient signal reception with a GPS satellite 180 is present. While not shown in
The “rover” receiver 175 is located at, or next to, a relevant reference point for a structure 100. For example, the second “rover” GPS receiver 175 can be placed near or atop a bottom floor of the structure 100, inside a crawl space or basement of the structure 100, at a foundation of the structure 100, at a pier extending from the structure 100, or at a similar reference point. The rover GPS receiver 175 can also, in some cases, be placed on a tripod, rod, or other object of known height (i.e., the elevation offset 115).
The base (prominent location) GPS receiver 170 and the rover (structure 100 reference) GPS receiver 175 communicate with the software application 290 running on the user device 110. Such communication may be over a wired connection, whereby the base GPS receiver 170 may be coupled to the user device 110 with one or more cables such as USB cables. The connection may also be a wireless connection, including but not limited to Bluetooth, Bluetooth Low Energy, Bluetooth Smart, 3G, 4G, LTE, Wi-Fi, Wi-Fi Direct, Radio Frequency Communication, or Near-Field Communication, or some combination thereof.
In any event, both the base GPS receiver 170 and the rover GPS receiver 175 each communicate their precise locations and/or elevations to the user device 110. Utilizing a camera module 210 of the user device 110, a user device 110 software application 290 then instructs the user 190 to take a series of one or more digital images from the base GPS receiver 170 (at the prominent location) to the rover GPS receiver 175 (at the structure 100 reference point) whereby the distance, height, and angles can be calculated using the application 290 to arrive at the correct elevation measurements required for a valid Federal Emergency Management Agency (FEMA) Elevation Certificate or other type of document.
The series of digital images that the user device application 290 instructs the user 190 to take may be used to calculate the height of the structure 100, among other values. These values can be inserted into the document (e.g., the FEMA Elevation Certificate form or another document). The government may, in some cases, require a clear photo to be included with the FEMA Elevation Certificate. The user 190 can be instructed to acquire one or more photos of the structure 100 in question. These photos can be used not only to satisfy the government requirement for underwriting (for example) flood risk, but also to obtain a clear line of sight of the entire structure 100 with the view of the point of measurement (the lowest point adjacent to a concrete slab, for example) not obscured by landscaping (bushes or tall grass).
In one embodiment, the user 190 would take at least eight photos of a structure 100 (assuming each photo had a clear view) starting with a photo of each corner of the structure 100 and additional photos depicting the center of each wall of the structure 100. In some embodiments, it may suffice for the user to take one photo (e.g., a front photo 440) or two photos (e.g., a front photo 440 and a rear photo 445 as illustrated in
The user device application 290 can thus be used to precisely provide a height measurement 135 of a point of interest 130 of the structure 100 by making calculations based on the photos taken by the user 190. In some embodiments, one or both GPS receivers 170/175 provide a precise location in three axes (X, Y, and Z or Latitude/Longitude/Elevation) to centimeter accuracy through a technology such as RTK. Thus, an elevation 135 can be determined for a point of interest 130 of the structure 100.
To measure the height of a point of interest of the structure 100, one photo from the user device would suffice if the rover GPS receiver 175 were installed on a tripod of known height. The user device application 290 can then detect the tripod in the photo and use the known height of the tripod as an increment to compare against the height of the structure 100, similar to a “tick” of a ruler. In some embodiments, the user device application 290 can calculate the elevation at the top of the structure 100 using the calculated height and the known elevation at the rover GPS receiver 175 reference point.
It is important that the photo clearly show a side of the structure 100 as well as the rover GPS receiver 175 and tripod in such an embodiment. The application 290 would identify the associated level point on the rover tripod and calculate the offset height, knowing a priori the elevation of the level camera on the base receiver (i.e., the elevation offset 115). This method of measuring the height 135 of a point of interest 130 may be particularly useful if the point of interest 130 of the structure 100 is obscured by overhead trees, foliage, power lines, or the like, thereby necessitating an indirect measurement calculated from data obtained by photographing the point of interest from a nearby location that has a clear view of the GPS satellite constellations and an established reference location (the base or reference receiver).
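The “ruler tick” technique described above can be sketched as a simple pixel-scaling calculation. The function names and the assumption of a level, distortion-free camera are illustrative only:

```python
def meters_per_pixel(tripod_height_m, tripod_pixels):
    """Derive the photo's vertical scale from the tripod of known height
    (assumes a level camera and negligible lens distortion)."""
    return tripod_height_m / tripod_pixels

def point_elevation_from_photo(tripod_height_m, tripod_pixels,
                               point_pixels_above_base, base_elevation_m):
    """Scale the point of interest's pixel offset above the tripod base
    by the known tripod height, then add the surveyed base elevation."""
    scale = meters_per_pixel(tripod_height_m, tripod_pixels)
    return base_elevation_m + point_pixels_above_base * scale
```

For example, if a 1.5 m tripod spans 300 pixels, each pixel represents 5 mm, so a point 800 pixels above the tripod base sits 4 m above it.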
In some embodiments, a different reference height is used instead of a tripod. For instance, a ruler, yardstick, meterstick, rod, or other vertical element of known height (i.e., the elevation offset 115) may be attached to the rover GPS receiver 175, or an appliqué or “sticker” can be attached to the structure 100 at a known distance (or leveled with a laser as described earlier) relative to the elevation offset 115.
For most government flood underwriting purposes, the height of one or more points of interest on the structure 100 is needed, as is a single known GPS latitude/longitude (horizontal) measurement. Precise horizontal measurements of the point of interest 130 can be obtained if the distance between the base GPS receiver 170 and the rover GPS receiver 175 can be calculated or measured. The application 290 can estimate the distance between the base GPS receiver 170 and the rover GPS receiver 175 if the image taken by a leveled camera at or near the base GPS receiver 170 contains a full view of the rover GPS receiver 175's tripod of known height (i.e., the elevation offset 115), noting the angle at which the camera was pointed by reading the internal compass of the user device 110.
A second measurement/photo is taken by moving the base station (i.e., the user device 110 and base GPS receiver 170) to another known point and repeating the process of obtaining the height and distance to the rover GPS receiver 175 along with the corresponding angle. Each height/distance/angle measurement permits a right triangle calculation to be performed as illustrated in
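The two-station measurement admits a classical bearing-intersection calculation: two compass bearings observed from two known base positions fix the rover's horizontal position. The sketch below works on a local flat east/north grid and uses illustrative function names:

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays (compass angles, clockwise from north)
    observed from two known base positions p1 and p2 on a local
    (east, north) grid; returns the estimated rover position."""
    # Convert compass bearings to (east, north) unit direction vectors.
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    # Solve p1 + t*d1 = p2 + s*d2 for t via a 2x2 determinant.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; move the base station further")
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

For instance, a rover sighted at 45 degrees from (0, 0) and at 315 degrees from (10, 0) lies at approximately (5, 5) on the local grid.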
Although the precise height of the reference point will be known, with centimeter accuracy, the horizontal coordinates (X, Y/Lat, Long) will be less precise but wholly contained within the ellipse. Adding additional photos will shrink the size of the ellipse considerably and thus increase accuracy. In one embodiment of the present invention, an elliptical or circular error of probability calculation can be performed by the application 290 with confidence level set to a conservative level (such as 50%) to determine the horizontal coordinates of the point of interest 130. In other embodiments, the distance 120 between the two GPS receivers 170/175 can be calculated using the GPS coordinates returned by both instead of using the photo.
Signals from GPS satellites dither, or wander, over a 24-hour period. Some accuracy benefit can be obtained by utilizing GPS receivers 170/175 that communicate with more satellites and utilize faster processors to generate more accurate position measurements. To get even more precise position information, a user device 110 removes the dithering using error correction. In one embodiment, RTK and/or RTN error correction is utilized to minimize GPS dithering. Other techniques can also be employed.
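The error-correction principle behind RTK/differential techniques can be sketched simply: because the base and rover receivers see nearly the same atmospheric and dither error, the base's offset from its surveyed position can be subtracted from the rover's fix. This is a conceptual sketch of differential correction, not an implementation of any particular RTK protocol:

```python
def differential_correction(base_known, base_measured, rover_measured):
    """Apply the base receiver's observed position error (surveyed
    position minus measured position) to the rover's measured fix.
    Positions are (x, y, z) tuples in a shared local frame."""
    correction = tuple(k - m for k, m in zip(base_known, base_measured))
    return tuple(r + c for r, c in zip(rover_measured, correction))
```

For example, if the base, surveyed at (100, 200, 10), measures itself at (101, 198, 12), the same (-1, +2, -2) correction applied to a rover fix of (150, 250, 20) yields (149, 252, 18).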
The user device application 290, according to some embodiments of the present invention, can determine the distance 120 with precision from the base RTK GPS device 170 co-located with the user device 110 (smartphone, tablet, etc.) to a measurement point of interest within the digital image. The user device application 290 can display this information to the user as a measurement value, with centimeter accuracy, and store that information in memory. In some embodiments, this measurement information is directly and automatically inserted into associated fields of the survey documents (such as the FEMA Elevation Certificate) within the user device application 290, running on a user device 110 or on a remote user computing device (e.g., a laptop computer or desktop computer that can connect to and synchronize measurement data with the user device), where a user 190 might later review, interact with, and edit the data.
Typically, the user 190 would be instructed to take one digital image of one location. Additional images can be taken depending on which sensors and which processor type are being used, with the calculations and mathematical processing accomplished by the user device app 290 using the image and by processing the other additional sensor measurements that were taken at the same or a similar time as the image. For example, the user 190 could use a laser measurement tool (or another tool identified above) embedded in or attached to the user device 110, along with a measured electronic differential, to correct GPS RTK/RTN measurements. Thus, from a physical benchmark survey, a very accurate location in three dimensions can be obtained. The photo can be a single digital image, 2D or 3D, or a stereo digital image pair that the computer processes to provide the mathematical solution/calculated point from the reference site.
In some embodiments, the user device application 290, running on a user device 110, can be linked, wired or wirelessly, to a flying drone or remotely controlled flying device equipped with a camera. As described above regarding the user device 110, the flying drone or remote control flying device can carry additional sensors and allow measurements from these sensors to be integrated into calculations based on photos from the digital camera. For example, additional sensors can generate image positional information with respect to the object of interest in the photo through laser rangefinders, sonic echolocation or sonar sensors, radar sensors, acoustic sensors, and precision cell tower location triangulation. The photos can thus be captured through a camera on the ground or in the air.
In some embodiments, GPS corrections and other data can be received by the GPS receivers 170/175 from an RTN. In some embodiments, the calculation process can be accomplished in real time by the user device app 290 running on the user device 110. In other embodiments, the calculation process may instead be accomplished in post-processing, either on the user device 110 or on a remote user computing device. Such calculations can thus be used to pinpoint the precise location of objects within the digital image.
All required Elevation Certificate information is collected by the user device 110 through the user device application 290 at the property site. An “electronic” Elevation Certificate can then be prepared by the user device application 290 in an electronic document format such as a Microsoft Word (DOC or DOCX) file, an Adobe Acrobat (PDF) file, a rich text format (RTF) file, a plain text file (TXT), or another format. The Elevation Certificate can be securely encrypted before or after storage by the user device application 290 within a memory 240 of the user device 110, a memory of a remote user computing device (e.g., laptop, desktop), or a memory of one or more secure cloud storage server(s) 160 accessible to the user device 110 through a network connection (e.g., internet 150).
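The automatic population of document fields from collected measurements can be sketched as a simple field-to-text rendering step. The field names below are illustrative placeholders, not the official FEMA form layout:

```python
def render_certificate(fields):
    """Populate a plain-text draft elevation document from a mapping of
    measured/calculated values (hypothetical field names; a production
    system would target the official form's actual fields)."""
    lines = ["ELEVATION CERTIFICATE (draft)"]
    for name, value in fields.items():
        lines.append(f"{name}: {value}")
    return "\n".join(lines)
```

The rendered text could then be exported to DOCX, PDF, RTF, or TXT and encrypted before storage or transmission, as described above.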
In some embodiments, the Elevation Certificate is not encrypted. The Elevation Certificate can subsequently be transmitted to a third party's server either in an encrypted or unencrypted state. The third party may be, for example, an insurance underwriter, a Realtor, a construction company, a title company, a local city government, a state government, the federal government, a government organization (e.g., FEMA), the property owner(s), the property manager(s), or some combination thereof. The third party may be given full access to download the Elevation Certificate locally into a memory of a server associated with the third party, or may be limited to viewing the Elevation Certificate through a third party interface (e.g., an internet/intranet web page with an interface for third party access) while the Elevation Certificate is maintained at the secure cloud storage server(s) 160. The third party interface may be hosted by one or more server(s) associated with the secure cloud storage server(s) 160 or by one or more server(s) associated with the third party. Once the Elevation Certificate is transmitted to the secure cloud storage server(s) 160 and/or to the third party server(s), it can be held there for processing, review, rating, archiving, quoting, billing, and other government, insurance underwriting, or property-related tasks, for example performed through the third party interface, which may save time and reduce reliance on paper documents.
In some embodiments, the GPS receivers 170/175 are RTK GPS receivers 170/175 in order to provide the improved accuracy that the document (e.g., FEMA Elevation Certificate or insurance form) may in some cases require (for example, FEMA Elevation Certificates require an accuracy of approximately 4 to 8 cm). In other embodiments, a different technology besides RTK may be used in order to ensure that the GPS receivers 170/175 are accurate enough.
While most user devices currently possess GPS modules, their accuracy is generally limited (usually to 10 meters). It is anticipated that user device accuracy will improve in the future, as the technology to produce it exists. However, the United States Government limits access to devices that directly produce this level of accuracy based upon security concerns, terrorism worries, and criminal activities. In the event that accurate user device GPS sensors become legally allowed and commercially feasible, separate GPS receivers 170/175 (RTK or otherwise) may become unnecessary, and the user device app could be used in place of one or both of the base GPS receiver 170 and rover GPS receiver 175, gathering this GPS input from the appropriate module internal to the user device itself.
As the user 190 performs the steps directed by the user device app, digital images are taken that are aimed at the point of reference on the structure 100 with the rover GPS receiver 175 in the photo. In some embodiments, a reflective or otherwise distinctive appliqué or “sticker” can be placed at the point of interest (e.g., on the wall of the structure 100) so that the post-processing algorithms, embedded in the app, can more easily determine the distance and geometries of the structure 100 using the precise location of the rover GPS receiver 175. The user device app 290 can calculate values such as the height of the structure 100 at a point of interest. The user device app 290 can then automatically enter the measured and calculated values into a document (e.g., a FEMA Elevation Certificate form or insurance form) prepared for the structure 100.
The user device app 290 can, in some embodiments, prompt the user 190 to take additional digital images as needed to improve or finish the calculations, or to help fill out a “photos” section of the document (e.g., FEMA Elevation Certificate form or insurance form). The user device app 290 may feature video, text, audio, or graphical tutorials to aid the user 190 during this process. For example, the user device app 290 can execute such a tutorial if the user 190 has expressed confusion regarding what sort of photo to take, how to take the best quality or type of photo, what the next required measurement is, or other steps that the user 190 should take. In other embodiments, such a tutorial can be executed the first time the user device app 290 performs a measurement, with an option to disable the tutorials thereafter.
After the user device app determines that all necessary information has been collected, the digital Elevation Certificate is, in one embodiment, displayed to the user 190 through the user device app 290 in a manner that allows the user 190 to review and/or edit the Elevation Certificate prior to transmitting the Elevation Certificate to the secure cloud storage server(s) and/or third party server(s), either encrypted or not. In another embodiment, the Elevation Certificate is not displayed to the user 190 through the user device app 290, but is instead automatically transmitted to the secure cloud storage server(s) and/or third party server(s).
Once received by the underwriter, the Elevation Certificate file may be encrypted if it was not already. The encrypted file may be copied, with one copy being securely archived for possible future evidentiary use and a “work” copy made that is opened by underwriting personnel. Normally, a surveyor or engineer must manually, physically “stamp” the Elevation Certificate if their measurement services are required. However, the accuracy and reliability of the system can be periodically validated by surveyors, thus eliminating the need for their stamp each and every time a survey is conducted. In addition, the seal or certification can be handled by stamping the location with touch screen technology in the form of a digital stamp certified by the authority holding the stamp device. For example, a certified surveyor or engineer could digitally “stamp” each Elevation Certificate, or could periodically certify the accuracy of the system so that a “stamp” from the surveyor is not required for every Elevation Certificate. The stamp may include a checksum and/or a certificate verified by a certificate authority.
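The checksum component of such a digital stamp can be sketched as below. This is a minimal tamper-evidence sketch only; a production system would use a certificate-authority-backed digital signature (public-key cryptography) rather than a bare hash, and the function names are illustrative:

```python
import hashlib

def stamp_document(doc_bytes, surveyor_id):
    """Attach a tamper-evident 'stamp': a SHA-256 checksum computed over
    the document bytes plus the certifying surveyor's identifier."""
    digest = hashlib.sha256(doc_bytes + surveyor_id.encode()).hexdigest()
    return {"surveyor": surveyor_id, "checksum": digest}

def verify_stamp(doc_bytes, stamp):
    """Recompute the checksum; any edit to the document or the claimed
    surveyor identity changes the digest and fails verification."""
    expected = hashlib.sha256(doc_bytes + stamp["surveyor"].encode()).hexdigest()
    return expected == stamp["checksum"]
```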
Images of the structure 100 with precise elevation values, along with a pinpointed location on a Flood Insurance Rate Map, can in some cases be used to inform the cost of a flood insurance quote, or to obtain a premium real estate value for the property and/or structure 100. Comparable insurance values for real estate in the immediate area can be used to automatically determine whether secondary “excess” coverage is needed for the structure 100, and a premium for that type of coverage can be automatically determined as well.
In some embodiments, insurance cost and/or property value quotes can be generated based on the Elevation Certificate or the data within it. In some embodiments, such quotes can be transmitted back to the agents, insured persons, potential insurance clients, potential property buyers, real estate agents, construction surveyors, government officials or other individuals. Such quotes can be transmitted encrypted or unencrypted via e-mail, text message, phone call, audio message, videoconference, or video message.
The exemplary user interface of
The exemplary user interfaces illustrated in
The user interfaces shown in
The components shown in
Mass storage device 1130, which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 1110. Mass storage device 1130 can store the system software for implementing embodiments of the present invention for purposes of loading that software into main memory.
Portable storage device 1140 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disc, or digital video disc, to input and output data and code to and from the computer system 1100 of
Input devices 1160 provide a portion of a user interface. Input devices 1160 may include an alpha-numeric keypad, such as a keyboard, for inputting alpha-numeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys. Additionally, the system 1100 as shown in
Display system 1170 may include a liquid crystal display (LCD), a plasma display, an organic light-emitting diode (OLED) display, an electronic ink display, a projector-based display, a holographic display, or another suitable display device. Display system 1170 receives textual and graphical information, and processes the information for output to the display device. The display system 1170 may include multiple-touch touchscreen input capabilities, such as capacitive touch detection, resistive touch detection, surface acoustic wave touch detection, or infrared touch detection. Such touchscreen input capabilities may or may not allow for variable pressure or force detection.
Peripherals 1180 may include any type of computer support device to add additional functionality to the computer system. For example, peripheral device(s) 1180 may include a modem or a router.
The components contained in the computer system 1100 of
In some cases, the computer system 1100 may be part of a multi-computer system that uses multiple computer systems 1100 (e.g., for one or more specific tasks or purposes). For example, the multi-computer system may include multiple computer systems 1100 communicatively coupled together via one or more private networks (e.g., at least one LAN, WLAN, MAN, or WAN), or may include multiple computer systems 1100 communicatively coupled together via the internet (e.g., a “distributed” system), or some combination thereof.
The processes or methods depicted in the preceding figures can be performed by processing logic that comprises hardware (e.g. circuitry, dedicated logic, etc.), software (e.g., embodied on a non-transitory computer readable medium), or a combination of both. Although the processes or methods are described above in terms of some sequential operations, it should be appreciated that some of the operations described can be performed in a different order. Moreover, some operations can be performed in parallel rather than sequentially.
While various flow diagrams provided and described above may show a particular order of operations performed by certain embodiments of the invention, it should be understood that such order is exemplary (e.g., alternative embodiments can perform the operations in a different order, combine certain operations, overlap certain operations, etc.).
The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.
The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 15/051,569 filed Feb. 23, 2016, now U.S. Pat. No. 11,481,854, which claims the priority benefit of U.S. provisional application 62/119,703 filed Feb. 23, 2015, the disclosures of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4876597 | Roy et al. | Oct 1989 | A |
5343527 | Moore | Aug 1994 | A |
5553609 | Chen et al. | Sep 1996 | A |
5737491 | Allen et al. | Apr 1998 | A |
6038295 | Mattes | Mar 2000 | A |
6122526 | Parulski et al. | Sep 2000 | A |
6182219 | Feldbau et al. | Jan 2001 | B1 |
6256059 | Fichtner | Jul 2001 | B1 |
6278466 | Chen et al. | Aug 2001 | B1 |
6304211 | Boman | Oct 2001 | B1 |
6370568 | Garfinkle | Apr 2002 | B1 |
6584564 | Olkin et al. | Jun 2003 | B2 |
6662226 | Wang et al. | Dec 2003 | B1 |
6751454 | Thornton | Jun 2004 | B2 |
6784925 | Tomat et al. | Aug 2004 | B1 |
6847334 | Hayhurst et al. | Jan 2005 | B2 |
6995789 | Mcintyre et al. | Feb 2006 | B2 |
7028184 | Hind et al. | Apr 2006 | B2 |
7034880 | Endsley et al. | Apr 2006 | B1 |
7170551 | Fichtner | Jan 2007 | B2 |
7188307 | Ohsawa | Mar 2007 | B2 |
7239346 | Priddy | Jul 2007 | B1 |
7251343 | Dorrell et al. | Jul 2007 | B2 |
7343049 | Butterworth | Mar 2008 | B2 |
7526718 | Samadani et al. | Apr 2009 | B2 |
8224178 | Keane | Jul 2012 | B2 |
8634712 | Mullins | Jan 2014 | B1 |
9094543 | Mullins | Apr 2015 | B2 |
9300678 | Stack et al. | Mar 2016 | B1 |
9538336 | Rudow et al. | Jan 2017 | B2 |
10048378 | Gogolla et al. | Aug 2018 | B2 |
10101465 | Loomis et al. | Oct 2018 | B2 |
10282562 | Speasl | May 2019 | B1 |
10318110 | Naaman et al. | Jun 2019 | B2 |
10360705 | Cervelli et al. | Jul 2019 | B2 |
10444941 | Cervelli et al. | Oct 2019 | B2 |
10726098 | Brama | Jul 2020 | B2 |
11146381 | Miller et al. | Oct 2021 | B2 |
RE48867 | Schneider | Dec 2021 | E |
11212416 | Speasl | Dec 2021 | B2 |
11227070 | Speasl | Jan 2022 | B2 |
11468198 | Speasl | Oct 2022 | B2 |
11481854 | Speasl | Oct 2022 | B1 |
11550960 | Speasl | Jan 2023 | B2 |
11553105 | Speasl | Jan 2023 | B2 |
11741219 | Zeng et al. | Aug 2023 | B2 |
20020108118 | Cohen et al. | Aug 2002 | A1 |
20020122568 | Zhao | Sep 2002 | A1 |
20020147618 | Mezrah et al. | Oct 2002 | A1 |
20020186412 | Murashita | Dec 2002 | A1 |
20030085989 | Tay | May 2003 | A1 |
20040012811 | Nakayama | Jan 2004 | A1 |
20040125208 | Malone et al. | Jul 2004 | A1 |
20040174434 | Walker et al. | Sep 2004 | A1 |
20040217884 | Samadani et al. | Nov 2004 | A1 |
20040218894 | Harville et al. | Nov 2004 | A1 |
20040218895 | Samadani et al. | Nov 2004 | A1 |
20040218910 | Chang et al. | Nov 2004 | A1 |
20040221227 | Wu | Nov 2004 | A1 |
20040264542 | Kientz | Dec 2004 | A1 |
20050036034 | Rea et al. | Feb 2005 | A1 |
20050062851 | Silverbrook et al. | May 2005 | A1 |
20050110880 | Parulski et al. | May 2005 | A1 |
20050114459 | Tu et al. | May 2005 | A1 |
20060105806 | Vance et al. | May 2006 | A1 |
20060114338 | Rothschild | Jun 2006 | A1 |
20060248348 | Wakao et al. | Nov 2006 | A1 |
20070008321 | Gallagher et al. | Jan 2007 | A1 |
20070063033 | Silverbrook et al. | Mar 2007 | A1 |
20070073937 | Feinberg et al. | Mar 2007 | A1 |
20070074035 | Scanlon et al. | Mar 2007 | A1 |
20080101784 | Hsu | May 2008 | A1 |
20080204317 | Schreve et al. | Aug 2008 | A1 |
20080219658 | Keane et al. | Sep 2008 | A1 |
20080305856 | Walker et al. | Dec 2008 | A1 |
20090031425 | Basson et al. | Jan 2009 | A1 |
20110137561 | Kankainen | Jun 2011 | A1 |
20110235923 | Weisenburger et al. | Sep 2011 | A1 |
20110276423 | Davidson | Nov 2011 | A1 |
20120086971 | Bisbee et al. | Apr 2012 | A1 |
20130046461 | Balloga | Feb 2013 | A1 |
20130080051 | Gribkov et al. | Mar 2013 | A1 |
20140049653 | Leonard et al. | Feb 2014 | A1 |
20140114691 | Pearce | Apr 2014 | A1 |
20140125822 | Mullins | May 2014 | A1 |
20140152854 | Iwaki et al. | Jun 2014 | A1 |
20140176733 | Drooker et al. | Jun 2014 | A1 |
20140281520 | Selgas et al. | Sep 2014 | A1 |
20140300722 | Garcia | Oct 2014 | A1 |
20140304184 | Fletcher | Oct 2014 | A1 |
20150098021 | O'Sullivan et al. | Apr 2015 | A1 |
20150304300 | Bender | Oct 2015 | A1 |
20150312227 | Follis et al. | Oct 2015 | A1 |
20150317368 | Rhoads et al. | Nov 2015 | A1 |
20150334257 | Woods | Nov 2015 | A1 |
20160042767 | Araya et al. | Feb 2016 | A1 |
20160070892 | Leonard et al. | Mar 2016 | A1 |
20160138919 | Green et al. | May 2016 | A1 |
20160169856 | Sung | Jun 2016 | A1 |
20160210734 | Kass et al. | Jul 2016 | A1 |
20170140492 | Leonard et al. | May 2017 | A1 |
20180357632 | Jammikunta et al. | Dec 2018 | A1 |
20190097812 | Toth | Mar 2019 | A1 |
20190325164 | Speasl | Oct 2019 | A1 |
20200014816 | Speasl | Jan 2020 | A1 |
20200151363 | Speasl | May 2020 | A1 |
20200184465 | Kislev et al. | Jun 2020 | A1 |
20200403796 | Sapena Solar | Dec 2020 | A1 |
20210150066 | Speasl | May 2021 | A1 |
20210312561 | Speasl | Oct 2021 | A1 |
20210400161 | Alrahaili | Dec 2021 | A1 |
20220004666 | Speasl | Jan 2022 | A1 |
20220070330 | Speasl | Mar 2022 | A1 |
20220078522 | Zeng et al. | Mar 2022 | A1 |
20220116511 | Speasl | Apr 2022 | A1 |
20230351011 | Zeng et al. | Nov 2023 | A1 |
20240187539 | Speasl et al. | Jun 2024 | A1 |
Number | Date | Country |
---|---|---|
108040050 | May 2018 | CN |
109460732 | Mar 2019 | CN |
110866224 | Mar 2020 | CN |
WO 2020010355 | Jan 2020 | WO |
Entry |
---|
PCT Application No. PCT/US2019/040852 International Preliminary Report on Patentability dated Jan. 12, 2021. |
PCT Application No. PCT/US2019/040852 International Search Report and Written Opinion dated Oct. 22, 2019. |
U.S. Appl. No. 15/051,569 Final Office Action mailed Mar. 2, 2022. |
U.S. Appl. No. 15/051,569 Office Action mailed Aug. 27, 2021. |
U.S. Appl. No. 15/051,569 Final Office Action mailed Oct. 20, 2020. |
U.S. Appl. No. 15/051,569 Office Action mailed Apr. 29, 2020. |
U.S. Appl. No. 15/051,569 Final Office Action mailed Aug. 14, 2019. |
U.S. Appl. No. 15/051,569 Office Action mailed Feb. 8, 2019. |
U.S. Appl. No. 15/052,774 Final Office Action mailed Jun. 1, 2018. |
U.S. Appl. No. 15/052,774 Office Action mailed Aug. 7, 2017. |
U.S. Appl. No. 16/399,785 Final Office Action mailed Nov. 6, 2020. |
U.S. Appl. No. 16/399,785 Office Action mailed Aug. 9, 2019. |
U.S. Appl. No. 16/741,605 Final Office Action mailed Jul. 24, 2020. |
U.S. Appl. No. 16/741,605 Office Action mailed Mar. 20, 2020. |
U.S. Appl. No. 17/162,629 Office Action mailed Oct. 18, 2021. |
U.S. Appl. No. 11/715,049 Office Action mailed Dec. 14, 2011. |
U.S. Appl. No. 11/715,049 Final Office Action mailed Jul. 8, 2011. |
U.S. Appl. No. 11/715,049 Office Action mailed Jun. 12, 2009. |
U.S. Appl. No. 13/491,026 Office Action mailed Mar. 5, 2013. |
U.S. Appl. No. 14/154,156 Office Action mailed Feb. 28, 2014. |
U.S. Appl. No. 14/809,068 Office Action mailed Dec. 18, 2015. |
U.S. Appl. No. 16/505,305 Office Action mailed Mar. 3, 2021. |
U.S. Appl. No. 16/505,305 Final Office Action mailed Nov. 17, 2020. |
U.S. Appl. No. 16/505,305 Office Action mailed Jul. 22, 2020. |
U.S. Appl. No. 17/556,071 Office Action mailed Sep. 26, 2022. |
U.S. Appl. No. 17/008,568 Office Action mailed Mar. 9, 2022. |
U.S. Appl. No. 18/094,519, filed Jan. 9, 2023, Jerry Speasl. |
U.S. Appl. No. 18/786,090, filed Jul. 26, 2024, Jerry Speasl, Property Measurement with Automated Document Production. |
U.S. Appl. No. 18/785,787, filed Jul. 26, 2024, Jerry Speasl, Secure Digital Data Collection. |
U.S. Appl. No. 18/785,812, filed Jul. 26, 2024, Jerry Speasl, Secure Digital Data Collection. |
U.S. Appl. No. 18/785,843, filed Jul. 26, 2024, Jerry Speasl, Secure Digital Data Collection. |
Friedman, “The Trustworthy Digital Camera: Restoring Credibility to the Photographic Image”, IEEE Transactions on Consumer Electronics, 39(4): 430-435, 1993. |
Collomosse et al., “To Authenticity and Beyond! Building Safe and Fair Generative AI upon the Three Pillars of Provenance,” IEEE Computer Society, pp. 1-9, May/Jun. 2024. |
Earnshaw, et al., “Fighting Misinformation with Authenticated C2PA Provenance Metadata,” Proceedings of the 2023 NAB Broadcast Engineering and Information Technology (BEIT) Conference, 2023. |
Number | Date | Country | |
---|---|---|---|
20230281737 A1 | Sep 2023 | US |
Number | Date | Country | |
---|---|---|---|
62119703 | Feb 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15051569 | Feb 2016 | US |
Child | 17967554 | US |